Position Summary:
We are seeking a highly skilled ETL Developer with 5–8 years of experience in data integration, transformation, and pipeline optimization. This role is a key part of our Data Engineering function within the Business Intelligence team, responsible for enabling robust data flows that power enterprise dashboards, analytics, and machine learning models. The ideal candidate has strong SQL and scripting skills, hands-on experience with cloud ETL tools, and a passion for building scalable data infrastructure.
Education Qualification:
B.Tech (CS, Elec), MCA, or higher.
Key Responsibilities:
Design, develop, and maintain ETL pipelines that move and transform data across internal and external systems.
Collaborate with data analysts, BI developers, and data scientists to support reporting, modeling, and insight generation.
Build and optimize data models and data marts to support business KPIs and self-service BI.
Ensure data quality, lineage, and consistency across multiple source systems.
Monitor and tune the performance of ETL workflows; troubleshoot bottlenecks and failures.
Support the migration of on-premises ETL workloads to cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
Implement and enforce data governance, documentation, and operational best practices.
Work with DevOps/DataOps teams to implement CI/CD for data pipelines.
Required Qualifications:
5–8 years of hands-on experience in ETL development or data engineering roles.
Advanced SQL skills and experience with data wrangling on large datasets.
Proficiency with at least one ETL tool (e.g., Informatica, Talend, AWS Glue, SSIS, Apache Airflow, or Domo Magic ETL).
Familiarity with data modeling techniques (star/snowflake schemas, dimensional models).
Experience working with cloud data platforms (e.g., AWS, Azure, GCP).
Strong understanding of data warehouse concepts, performance optimization, and data partitioning.
Experience with Python or other scripting languages for data manipulation and automation.
Preferred Qualifications:
Exposure to BI platforms such as Domo, Power BI, or Tableau.
Knowledge of CI/CD practices in a data engineering context (e.g., Git, Jenkins, dbt).
Experience working in Agile/Scrum environments.
Familiarity with data security and compliance standards (GDPR, HIPAA, etc.).
Experience with API integrations and external data ingestion.
ETL Developer • Delhi, India