Job Title : Data Engineer
Experience : 6+ Years
Location : Bangalore / Hyderabad / Pune / Noida
Job Description :
We are seeking a highly skilled Data Engineer with over 6 years of experience in building and optimizing scalable data pipelines and architectures. The ideal candidate will have strong expertise in Kafka, Python, PySpark, and Snowflake, and will be responsible for designing, developing, and maintaining data solutions that enable analytics and business insights at scale.
Key Responsibilities :
- Design, develop, and optimize scalable data pipelines and ETL workflows.
- Work with Kafka for real-time data ingestion and streaming.
- Develop data processing solutions using Python and PySpark.
- Implement and manage data warehousing solutions in Snowflake.
- Collaborate with cross-functional teams (data scientists, analysts, product teams) to deliver high-quality data solutions.
- Ensure data quality, reliability, and security across all data platforms.
- Troubleshoot, optimize, and improve performance of data pipelines.
- Stay updated with emerging data engineering tools and best practices.
Required Skills & Experience :
- 6+ years of proven experience in Data Engineering.
- Strong programming experience in Python.
- Hands-on expertise with PySpark for big data processing.
- Proficiency in Kafka for real-time data streaming.
- Experience with Snowflake (data modeling, performance optimization, SQL).
- Solid understanding of ETL / ELT concepts, data warehousing, and distributed data systems.
- Experience working in cloud environments (AWS / Azure / GCP is a plus).
- Strong problem-solving, communication, and collaboration skills.
Nice-to-Have Skills :
- Exposure to Airflow or similar orchestration tools.
- Knowledge of CI / CD for data pipelines.
- Familiarity with containerization (Docker, Kubernetes).
(ref : hirist.tech)