Experience : 5.5+ years
Location : Pan India
Immediate Joiners Preferred
Role Overview :
We are looking for a skilled and experienced GCP Data Engineer to join our team. The ideal candidate will have strong expertise in Google Cloud Platform (GCP), Python, PySpark, and SQL, and will be responsible for building scalable data pipelines and solutions in a cloud-native environment.
Key Responsibilities :
- Design, develop, and maintain data pipelines and ETL processes on GCP
- Write efficient and optimized code using Python, PySpark, and SQL
- Integrate and transform data from multiple sources into GCP-based data warehouses or data lakes
- Optimize data pipelines for performance, scalability, and cost-efficiency
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs
- Ensure data quality, reliability, and compliance with security standards
Required Skills :
- 5+ years of experience as a Data Engineer
- Strong hands-on experience with Google Cloud Platform (GCP) and its services (e.g., BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Proficiency in Python and PySpark for data engineering tasks
- Advanced knowledge of SQL for data manipulation and analytics
- Experience with data modeling, data warehousing, and big data processing
- Familiarity with CI/CD tools and agile development practices
Preferred Qualifications :
- GCP certification (e.g., Professional Data Engineer) is a plus
- Experience with Airflow, Terraform, or other orchestration/automation tools
- Exposure to real-time data processing and streaming (Kafka, Pub/Sub, etc.)