Position- ETL Architect
Location- Bengaluru, Chennai
Experience- 8 to 11 Years
Notice period- 0 to 60 Days or Serving Notice Period
Job Description
6+ years of experience developing, deploying, and monitoring end-to-end data integration pipelines, with extensive knowledge of evaluation metrics and best practices.
6+ years Data Warehouse / Data Lake Architecture and Development.
5+ years Data Modeling & Architecture with strong programming skills.
3+ years’ experience with Cloud-Native solutions (AWS, Azure, GCP).
ETL / ELT, Data Pipelines, Data Quality, and blueprint development.
Data streaming frameworks (Kafka, AWS Kinesis, AWS SQS, or similar).
Non-Relational Database experience (Document DB, Graph DB, etc.)
Cloud-Native data integration & analytics services (AWS Glue, S3, Lambda, EMR, Azure Synapse, Azure Data Factory, Azure Data Lake, Databricks, etc.).
AI / ML experience is a strong advantage (Jupyter / Python / Apache Spark).
Experience with Data APIs, Embedded Analytics, and Event-Driven Microservices architecture.
Experience in SQL, Apache Spark, Python, Java / Scala.
Data Integration (AWS Glue), Data Streaming (Kafka), and data transformation.
A strong DataOps, DevOps, MLOps, Data & Analytics background
Understanding of data structures, data modeling and data architecture
Cloud-Native databases (SQL / NoSQL): RDS, DynamoDB, Snowflake, Redshift, BigQuery.
Strong communication skills and ability to work with ambiguity.
Able to provide direction / support to data engineering team members.
Strong architectural knowledge for data integration patterns and their pros and cons.
Strong knowledge of data quality, metadata management, and security frameworks / tools for data in cloud architectures.
Ability to understand customer needs and deliver targeted solutions that are scalable, reliable, and highly available.