Talent.com
Big Data Developer

Talentmatics · New Delhi, Delhi, IN
Job description

Immediate Joiners Only

Job Title: GCP Big Data Engineer

Location: Bangalore / Gurugram

Experience: 4–12 years

About the Role

We are seeking a GCP Big Data Engineer with strong expertise in building, optimizing, and managing large-scale data solutions on Google Cloud Platform (GCP). The ideal candidate will have hands-on experience with GCP Big Data services, PySpark, SQL, and ETL design, along with the ability to lead and deliver complex data engineering initiatives.

Key Responsibilities

  • Design, develop, and maintain scalable and efficient data pipelines using GCP Big Data technologies.
  • Implement ETL workflows using PySpark, SQL, and GCP-native services.
  • Drive architecture discussions and ensure adherence to best practices in performance, governance, and cost optimization.
  • Collaborate with cross-functional teams to enable seamless data ingestion, transformation, and analytics.
  • Lead and mentor team members in delivering end-to-end data engineering projects.
  • Ensure robust data quality, reliability, and performance across systems.

Required Skills

Mandatory Technical Expertise:

  • GCP Services: Cloud Storage, BigQuery, Dataproc, Cloud Composer, DMS, Datastream, Analytics Hub, Workflows, Dataform, Data Fusion, Pub/Sub, Dataflow
  • Programming & Data Processing: Java, Python, Scala, PySpark, ETL design
  • Big Data Ecosystem: Hadoop, Spark, Hive, etc.
  • Data Querying: Strong ANSI SQL skills
  • Workflow Orchestration: Apache Airflow
Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
  • 4–12 years of overall experience in data engineering, with at least 3 years on GCP.
  • Proven ability to lead data engineering initiatives and deliver large-scale data solutions.