We are looking for a skilled GCP Data Engineer to design, build, and optimize scalable data pipelines and solutions on Google Cloud Platform. The candidate should have strong experience with BigQuery, Dataflow, Cloud Storage, Pub/Sub, and data integration frameworks.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL/ELT workflows on GCP.
- Optimize BigQuery queries and manage large datasets efficiently.
- Implement data integration solutions using Cloud Storage, Pub/Sub, Dataflow, and Dataproc.
- Build and maintain workflow orchestration using Airflow (Cloud Composer).
- Collaborate with data analysts, data scientists, and engineering teams to deliver high-quality data solutions.
- Ensure CI/CD best practices, version control (Git), and cloud governance in data projects.
Required Skills
- Hands-on experience with GCP services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage.
- Strong knowledge of SQL, data modeling, and ETL/ELT pipeline development.
- Experience in Python/Java for data engineering tasks.
- Familiarity with Airflow (Cloud Composer) for workflow orchestration.
- Good understanding of CI/CD, version control (Git), and cloud best practices.
- Ability to handle large datasets and troubleshoot performance issues.
Skills Required
GCP BigQuery, ETL/ELT pipelines, Dataflow & Pub/Sub, Python/Java programming, Airflow workflow orchestration, data modeling & SQL