Job Description:
Key Responsibilities:
Design, build, and maintain scalable data pipelines and ETL processes.
Work on data integration and transformation using Apache Airflow and AWS Glue.
Manage and optimize data storage solutions using Snowflake and AWS S3.
Ensure high performance and reliability of data systems.
Collaborate with cross-functional teams to support analytical and business intelligence needs.
Required Skills:
Strong experience with Snowflake, Apache Airflow, AWS S3, AWS Glue, and SQL Server.
Good understanding of data warehousing concepts and ETL design patterns.
Proficiency in writing complex SQL queries and performance tuning.
Hands-on experience with cloud data environments (preferably AWS).
Nice to Have:
Experience with Python or other scripting languages.
Exposure to data governance, security, and CI/CD pipelines.
Data Engineer • MH, India