Job Title: ETL Developer – DataStage, AWS, Snowflake
Experience: 5–7 Years
Location: Remote
Job Type: Full-time
About the Role
We are looking for a talented and motivated ETL Developer / Senior Developer to join our data engineering team. You will build scalable and efficient data pipelines using IBM DataStage (on Cloud Pak for Data), AWS Glue, and Snowflake, and collaborate with architects, business analysts, and data modelers to ensure timely and accurate delivery of critical data assets supporting analytics and AI/ML use cases.
Key Responsibilities
Design, develop, and maintain ETL pipelines using IBM DataStage (CP4D) and AWS Glue/Lambda for ingestion from varied sources such as flat files, APIs, Oracle, and DB2.
Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic.
Participate in code reviews, performance tuning, and defect triage sessions.
Work closely with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
Troubleshoot and resolve issues in QA/UAT/Production environments as needed.
Follow agile delivery practices, participate in sprint planning, and meet documentation requirements.
Required Skills and Experience
4+ years of experience in ETL development, including at least 1–2 years with IBM DataStage (preferably on CP4D).
Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-driven processing.
Experience working with Snowflake: loading strategies, Streams and Tasks, zero-copy cloning, and performance tuning.
Proficiency in SQL, Unix scripting, and basic Python for data handling or automation.
Familiarity with Amazon S3, version control systems (Git), and job orchestration tools.
Experience with data profiling, cleansing, and quality validation routines.
Understanding of data lake/data warehouse architectures and DevOps practices.
Good to Have
Experience with Collibra, BigID, or other metadata/governance tools
Exposure to Data Mesh / data domain models
Experience with Agile/Scrum delivery and Jira/Confluence tools
AWS or Snowflake certification is a plus