We are looking for 7 Senior Data Engineers experienced in building large-scale, high-performance data pipelines and solutions on Google Cloud Platform (GCP).
Experience: 8+ years in Data Engineering
Key Responsibilities
- Design, develop, and deploy data pipelines (batch and streaming) on GCP.
- Build data lake/lakehouse solutions and framework-based ingestion/processing systems.
- Work with BigQuery, Vertex AI, Pub/Sub, Cloud Functions, and other GCP-native tools.
- Implement transformations using dbt and orchestrate workflows with Apache Airflow (see the short sketch after this list).
- Collaborate with stakeholders for data modeling, operational support, and performance tuning.
- Apply CI/CD best practices using Jenkins and GitHub Actions.
- Conduct code reviews, unit testing, and automation for reliable data delivery.
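To give candidates a concrete sense of the dbt/Airflow work above, here is a minimal, hypothetical sketch of a daily Airflow DAG that builds and tests a dbt project. The DAG id, schedule, and project path are illustrative placeholders, not our actual codebase:

```python
# Hypothetical sketch only: a daily Airflow DAG that runs dbt, then dbt tests.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",        # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Build dbt models in the warehouse (BigQuery in this role's stack).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project",   # placeholder path
    )

    # Run dbt tests so bad data never reaches downstream consumers.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project",
    )

    dbt_run >> dbt_test  # tests run only after a successful build
```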
Must-Have Skills
- Strong hands-on experience with the GCP data ecosystem (BigQuery, Pub/Sub, Vertex AI, Cloud Functions)
- Expertise in dbt and Apache Airflow
- Proficiency in Snowflake, Redshift, or BigQuery
- Experience with GitHub/Git toolkit and CI/CD pipelines
- Strong understanding of data frameworks, performance optimization, and data quality management