We are seeking an experienced GCP Data Engineer with strong expertise in designing, building, and managing scalable data pipelines and analytical solutions on Google Cloud Platform (GCP). The ideal candidate will have a solid background in big data technologies, data warehousing, and cloud-based data architecture, with hands-on experience using GCP-native tools.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes on GCP using services such as Dataflow, Dataproc, Pub/Sub, Cloud Functions, and Composer (Airflow).
- Build and optimize data warehouses and data lakes using BigQuery, Cloud Storage, and Cloud SQL.
- Collaborate with data analysts, data scientists, and business stakeholders to define and deliver scalable data solutions.
- Implement data quality, monitoring, and validation frameworks to ensure data integrity and reliability.
- Optimize data models and SQL queries for performance and cost efficiency.
- Work with Terraform or Deployment Manager to automate infrastructure provisioning.
- Ensure data security and compliance with organizational and regulatory standards.
- Stay current with the latest GCP and big data technologies and best practices.
Required Skills and Experience:
- Minimum 6 years of experience in data engineering, with at least 3 years of hands-on experience on GCP.
- Strong proficiency in Python, SQL, and data pipeline orchestration tools (Airflow/Composer).
- Experience with BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Proficiency in ETL/ELT design, data modeling, and data warehousing concepts.
- Experience with CI/CD pipelines, Git, and infrastructure-as-code tools (Terraform preferred).
- Strong analytical, problem-solving, and communication skills.
- Familiarity with machine learning pipelines, data governance, or real-time streaming is an added advantage.