GCP Data Engineer
Experience : 3 to 6 years overall
Key Skills :
- Google BigQuery – Must have worked on migration and pipeline creation
- SQL – Strong in both basic and advanced queries
- Python scripting
- Airflow / Cloud Composer
- Cloud platform exposure – GCP (primary), AWS (secondary acceptable)
- Terraform – for Infrastructure as Code (good to have)
- Cloud Functions and event-driven architecture
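To make the Cloud Functions / event-driven expectation concrete, below is a minimal sketch of a Pub/Sub-triggered background function that lands events in BigQuery. The project, dataset, and table names are hypothetical placeholders, not part of this posting.

```python
import base64
import json

from google.cloud import bigquery

# Hypothetical destination table; replace with your own project/dataset/table.
TABLE_ID = "my-project.analytics.events"

bq_client = bigquery.Client()

def handle_event(event, context):
    """Background Cloud Function triggered by a Pub/Sub message.

    Decodes the base64-encoded payload and streams it into BigQuery.
    """
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE_ID, [payload])
    if errors:
        # Surface streaming-insert failures so the platform retries/logs them.
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```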
Roles & Responsibilities :
- Develop and maintain scalable, cloud-native data processing pipelines on GCP.
- Work extensively with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Airflow for orchestration (a minimal DAG sketch follows this list).
- Automate infrastructure using Terraform and follow agile development practices.
- Implement data solutions for enterprise-scale data lakes and data warehouses.
- Write clean, efficient, and production-ready code using Python and SQL.
- Handle data quality issues, including data duplication, and debug code failures.
- Collaborate with cross-functional teams to build resilient and reliable data platforms.
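As a rough illustration of the BigQuery and Airflow / Cloud Composer orchestration work described above, here is a minimal DAG sketch; the DAG id, schedule, and table names are assumptions for illustration only.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily load: raw events -> curated table.
with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_curated = BigQueryInsertJobOperator(
        task_id="load_curated_events",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `my-project.curated.events`
                    SELECT * FROM `my-project.raw.events`
                    WHERE DATE(event_ts) = '{{ ds }}'
                """,
                "useLegacySql": False,
            }
        },
    )
```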
Additional Screening Points :
- Hands-on experience with debugging and analyzing production issues
- Exposure to identifying and resolving data duplication or consistency issues (see the sketch after this list)
- Should have worked on end-to-end data pipeline creation and monitoring
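For the data-duplication screening point, one common BigQuery pattern is keeping the latest row per business key with ROW_NUMBER(); the sketch below runs such a query from Python. Table and column names (order_id, updated_at) are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical dedup: keep the most recent row per order_id.
DEDUP_SQL = """
CREATE OR REPLACE TABLE `my-project.curated.orders_dedup` AS
SELECT * EXCEPT(rn)
FROM (
    SELECT
        *,
        ROW_NUMBER() OVER (
            PARTITION BY order_id
            ORDER BY updated_at DESC
        ) AS rn
    FROM `my-project.raw.orders`
)
WHERE rn = 1
"""

client.query(DEDUP_SQL).result()  # .result() blocks until the job finishes
```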