Job Description
We are seeking a skilled Data Engineer with hands-on experience in Google Cloud Platform (GCP), specifically BigQuery, Dataflow, Airflow, and Python. The ideal candidate will be responsible for developing scalable data pipelines, transforming and ingesting large-scale data, and ensuring data quality and security across workflows.
Roles and Responsibilities
Design, develop, and deploy scalable data pipelines using GCP services.
Utilize Python to build, optimize, and automate data engineering workflows.
Work with large-scale data ingestion, processing, and storage architectures.
Monitor and troubleshoot data ingestion and processing issues to maintain system reliability.
Optimize data workflows for performance and cost efficiency.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements.
Implement security best practices for data handling and storage.
Document solutions and contribute to knowledge sharing within the team.
Stay updated with the latest trends in data engineering and GCP technologies.
Mandatory Skills
Python – Strong programming skills for data processing and automation.
GCP – Hands-on experience with BigQuery, Dataflow, and Airflow.
Data Transformation & Ingestion – Proven expertise in building and managing data pipelines.
Notice Period: Immediate to 60 days
Experience: 5 to 12 years
Location: Pan India
If you are interested, kindly share your resume at vidhiya.v@ltimindtree.com
Data Engineer • Rajkot, Gujarat, India