Senior Data Engineer
Job Title : Senior Data Engineer
Location : Remote (willingness to travel required)
Job Type : Full-time
Experience Level : 8+ years
About the Role :
We are seeking a highly skilled Senior Data Engineer to join our team in building a modern data platform on AWS. You will play a key role in transitioning from legacy systems to a scalable, cloud-native architecture using technologies like Apache Iceberg, AWS Glue, Redshift, and Atlan for governance. This role requires hands-on experience across both legacy (e.g., Siebel, Talend, Informatica) and modern data stacks.
Responsibilities :
- Design, develop, and optimize data pipelines and ETL/ELT workflows on AWS.
- Migrate legacy data solutions (Siebel, Talend, Informatica) to modern AWS-native services.
- Implement and manage a data lake architecture using Apache Iceberg and AWS Glue.
- Use Redshift for data warehousing solutions, including performance tuning and data modeling.
- Apply data quality and observability practices using Soda or similar tools.
- Ensure data governance and metadata management using Atlan (or other tools like Collibra, Alation).
- Collaborate with data architects, analysts, and business stakeholders to deliver robust data solutions.
- Build scalable, secure, and high-performing data platforms supporting both batch and real-time use cases.
- Participate in defining and enforcing data engineering best practices.
Required Qualifications :
- 8+ years of experience in data engineering and data pipeline development.
- Strong expertise with AWS services, especially Redshift, Glue, S3, and Athena.
- Proven experience with Apache Iceberg or similar open table formats (such as Delta Lake or Hudi).
- Experience with legacy tools like Siebel, Talend, and Informatica.
- Knowledge of data governance tools like Atlan, Collibra, or Alation.
- Experience implementing data quality checks using Soda or equivalent.
- Strong SQL and Python skills; familiarity with Spark is a plus.
- Solid understanding of data modeling, data warehousing, and big data architectures.
- Strong problem-solving skills and the ability to work in an Agile environment.
- Expertise in Databricks, including designing and optimizing workflows within the Databricks environment.