Job Title: Data Engineer (Python + AWS)
Experience: 5+ Years
Location: Kolkata (Onsite)
Employment Type: Full-Time / Permanent
Notice Period: Immediate / 15 Days
About the Role
We are looking for an experienced Data Engineer with strong expertise in Python, SQL, AWS, and modern data engineering frameworks. The ideal candidate will build scalable, secure, and high-performance data pipelines, ensuring smooth data integration and transformation across multiple systems.
This role requires hands-on experience in designing, developing, and optimizing ETL workflows, managing data lakes and warehouses, and ensuring data quality and governance for enterprise-scale applications.
Key Responsibilities
- Design, build, and manage robust and scalable data pipelines (batch and real-time).
- Develop ETL processes to acquire, transform, and integrate data from multiple sources.
- Build and maintain data warehouses, data lakes, and other storage solutions.
- Optimize data systems for performance, reliability, and scalability.
- Collaborate with cross-functional teams to understand data requirements and deliver efficient solutions.
- Ensure data quality, consistency, and integrity across all systems and pipelines.
- Implement data governance, privacy, and security best practices.
- Monitor and troubleshoot data flows and pipelines, ensuring high availability and minimal downtime.
- Document data architecture, workflows, and system designs for maintainability and scalability.
Required Technical Skills
✅ Must Have Skills
- Strong proficiency in Python programming and SQL.
- Hands-on experience in ETL design and development using tools such as AWS Glue, Apache Airflow, Luigi, or Talend.
- Expertise in AWS cloud services (S3, Redshift, Lambda, Glue).
- Experience with Big Data frameworks such as Apache Spark, Hadoop, or Kafka.
- Strong understanding of data modeling, data warehousing, and data architecture principles.
- Proficiency in Git version control and CI/CD automation pipelines.
⭐ Good to Have Skills
- Exposure to GCP or Azure cloud platforms.
- Familiarity with data orchestration tools and data governance frameworks.
- Experience working with structured and unstructured data at scale.
Professional Attributes
- Excellent problem-solving and analytical thinking skills.
- Strong communication and collaboration across teams.
- Proven ability to deliver end-to-end data solutions with minimal supervision.
- Detail-oriented mindset with a passion for data quality and automation.
Educational Qualification
🎓 Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.
📘 Minimum 15 years of full-time education required.