We are looking for a skilled and motivated AWS Data Engineer with 3–4 years of hands-on experience in modern data and application development environments. The ideal candidate will have strong technical expertise in Python, MongoDB, Snowflake, DBT, and Airflow, with a focus on designing, developing, and optimizing scalable data pipelines and integrated business solutions.

Key Responsibilities:
- Design, develop, and maintain data integration and ETL workflows using tools like Airflow and DBT.
- Build, automate, and optimize data pipelines to support analytics and application needs.
- Write efficient, reusable, and reliable Python code for backend services and data processing.
- Work with Snowflake for data warehousing and performance optimization.
- Use MongoDB for data storage, retrieval, and modeling in dynamic, high-traffic applications.
- Collaborate closely with cross-functional teams, including Data Engineers, Analysts, and Product Managers, to translate business requirements into technical solutions.
- Implement best practices in version control, CI/CD pipelines, and code documentation.
- Troubleshoot and resolve software defects, performance bottlenecks, and production issues.

Required Skills & Qualifications:
- 3–4 years of proven experience in data engineering roles.
- Strong programming proficiency in Python.
- Practical experience with DBT, Airflow, Snowflake, and MongoDB.
- Solid understanding of ETL processes, data modeling, and API integration.
- Familiarity with Git, CI/CD pipelines, and Agile/Scrum methodologies.
- Strong problem-solving, analytical, and debugging skills.
- Exposure to data governance, data quality, and metadata management.
- Familiarity with DevOps and containerization (Docker/Kubernetes).
AWS Data Engineer • Bengaluru, Karnataka, India