GCP Data Engineer

Fractal • Bangalore, Karnataka, India
Job description

We are seeking a GCP Data Engineer with 6+ years of experience to design, develop, and manage cloud-based data solutions on Google Cloud Platform (GCP). The ideal candidate will have expertise in BigQuery, Dataflow, Pub/Sub, Cloud Composer (Apache Airflow), and Terraform, along with strong experience in ETL/ELT pipelines, data modeling, and performance optimization.

Experience: 6-14 years

Locations: Bangalore, Mumbai, Pune, Chennai, Gurgaon, Noida

Key Responsibilities:

  • Design & Implement Data Pipelines: Develop and optimize ETL/ELT pipelines using Dataflow, BigQuery, and Cloud Composer (Airflow).
  • Data Integration: Work with structured and unstructured data sources, integrating data from on-premise and cloud-based systems.
  • Data Warehousing & Modeling: Design high-performance data models in BigQuery, ensuring scalability and cost efficiency.
  • Automation & Infrastructure as Code (IaC): Use Terraform to provision GCP resources and automate deployments.
  • Streaming & Batch Processing: Work with Pub/Sub, Dataflow (Apache Beam), and Kafka for real-time and batch data processing.

Required Skills & Qualifications:

  • Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • 6+ years of experience in data engineering, cloud data solutions, and pipeline development.
  • GCP Expertise: Hands-on experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer (Airflow), Vertex AI, and IAM policies.
  • Programming: Proficiency in Python, SQL, and Apache Beam (Java or Scala is a plus).