About the Role:
We are looking for a highly skilled ETL Senior Engineer with AWS expertise to design, build, and optimize robust data pipelines that power enterprise decision-making.
This role involves working with modern cloud-native tools to build scalable, secure, and high-performance ETL workflows that integrate structured and unstructured data from diverse sources.
The ideal candidate will have hands-on expertise in AWS services, SQL/PL-SQL, and Python, along with strong data engineering and orchestration skills.
If you are passionate about creating data-driven solutions that enable analytics, insights, and business transformation, this is the role for you.
Key Responsibilities:
- Design, build, and optimize end-to-end ETL pipelines on AWS for large-scale data integration.
- Work with AWS services including S3, Glue, Athena, EMR, and Redshift to process and transform datasets.
- Develop and maintain ETL scripts using SQL/PL-SQL and Python.
- Implement data orchestration workflows using Airflow to manage scheduling and dependencies.
- Build dashboards and reports in Tableau to enable real-time business insights.
- Collaborate with data scientists, analysts, and business teams to understand requirements and translate them into scalable data solutions.
- Ensure data quality, validation, and governance across ingestion and transformation layers.
- Optimize pipelines for performance, cost efficiency, and scalability in cloud environments.
- Support production systems with monitoring, troubleshooting, and root cause analysis.
- Document ETL processes, workflows, and data flow diagrams for operational reference.
Required Skills:
- 6-10 years of experience in ETL development and data engineering.
- Strong proficiency in SQL/PL-SQL and Python for data manipulation and automation.
- Hands-on expertise with AWS cloud services: S3, Glue, Athena, EMR, and Redshift.
- Experience with Apache Airflow for data pipeline orchestration.
- Proficiency in building dashboards using Tableau.
- Solid understanding of data warehouse concepts, schemas, and data modeling.
- Strong analytical, debugging, and problem-solving skills.
Preferred Skills:
- Familiarity with CI/CD pipelines and DevOps practices for data engineering.
- Experience with other orchestration frameworks (Luigi, Prefect).
- Exposure to NoSQL or big data technologies like Spark or Hadoop.
- Certifications in AWS (AWS Data Analytics Specialty, AWS Solutions Architect).
Why Join Us?
- Work on enterprise-scale cloud data transformation programs.
- Opportunity to design pipelines that directly impact business decisions and strategy.
- Collaborate with cross-functional global teams in a fast-paced environment.
- Remote flexibility with strong learning and certification support.
- Competitive compensation with career growth into Data Architect or Lead Data Engineer roles.

(ref: hirist.tech)