We are seeking an experienced GCP Big Data Engineer with 8–10 years of experience in designing, developing, and optimizing large-scale data processing solutions. The ideal candidate will bring strong leadership capabilities, technical depth, and a proven track record of delivering end-to-end big data solutions in cloud environments.
Key Responsibilities:
- Lead and mentor teams in designing scalable and efficient ETL pipelines on Google Cloud Platform (GCP).
- Drive best practices for data modeling, data integration, and data quality management.
- Collaborate with stakeholders to define data engineering strategies aligned with business goals.
- Ensure high performance, scalability, and reliability in data systems using SQL and PySpark.
Must-Have Skills:
- GCP expertise in data engineering services (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage).
- Strong programming skills in SQL and PySpark.
- Hands-on experience in ETL pipeline design, development, and optimization.
- Strong problem-solving and leadership skills with experience guiding data engineering teams.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Relevant certifications in GCP Data Engineering preferred.