🌟 We're Hiring: Data Platform Engineer! 🌟
We are seeking an experienced Data Platform Engineer to design, build, and maintain scalable data infrastructure using AWS cloud services. The ideal candidate will have expertise in Python, PySpark, EMR, and Apache Airflow to develop robust data pipelines and analytics solutions that drive business insights.
📍 Location: Pune, India
⏰ Work Mode: Work from anywhere
💼 Role: AWS, Python, PySpark, EMR, Apache Airflow - Data Platform Engineer
What You'll Do:
🎯 Design and implement scalable data pipelines using Apache Airflow
☁️ Build and optimize AWS EMR clusters for big data processing
🐍 Develop data processing applications using Python and PySpark
📊 Create ETL workflows for data ingestion and transformation
🔧 Monitor and troubleshoot data platform performance
🤝 Collaborate with data scientists and analysts on data requirements
What We're Looking For:
✅ 6+ years of experience in data engineering
✅ Strong expertise in AWS services (EMR, S3, Glue, Lambda)
✅ Proficiency in Python and PySpark for big data processing
✅ Hands-on experience with Apache Airflow for workflow orchestration
✅ Knowledge of data warehousing and ETL best practices
✅ Experience with SQL and NoSQL databases
Ready to make an impact? 🚀 Apply now and let's grow together!