Job Description – Lead GCP Data Engineer (Python + GCP)
We are looking for an experienced Lead Data Engineer with strong skills in Python and Google Cloud Platform (GCP). You will design and maintain scalable ETL pipelines, work with key GCP services, and ensure high-quality, reliable data delivery for analytics and business needs.
Key Responsibilities
- Build and maintain ETL data pipelines using Python.
- Work with GCP services such as:
  - Dataflow (batch & streaming data processing)
  - BigQuery (data warehouse & analytics)
  - Cloud Functions, Cloud Run (serverless compute)
  - Cloud Composer / Airflow (workflow orchestration)
  - GCS (cloud storage)
  - IAM (access & security)
- Develop data ingestion, transformation, and cleaning logic.
- Implement data quality checks and monitoring.
- Work with APIs using FastAPI, streaming tools like Kafka, and processing engines like Spark.
- Use storage systems such as MongoDB, Redis, or Bigtable.
- Write complex SQL queries (SQL Server, Oracle, PostgreSQL).
- Participate in GitHub version control and CI/CD deployments.
- Document pipeline designs and support procedures.
- Collaborate with data science, analytics, and engineering teams.
Required Skills
- 10+ years of experience in Python-based backend or data engineering.
- Strong experience with GCP (Dataflow, BigQuery, Cloud Functions, Composer, GCS).
- Good knowledge of data pipeline architecture, ETL, and data integration.
- Hands-on experience with:
  - Spark, Kafka, Airflow, FastAPI, Redis
  - CI/CD and GitHub
- Strong SQL skills with enterprise databases.
- Experience with on-prem to cloud data migration.
Nice-to-Have Skills
- Snowflake experience
- GKE or Cloud Run deployment experience
- Databricks experience
- Exposure to Azure (ADF, Azure data tools)
Soft Skills
- Strong problem-solving and analytical abilities
- Good communication and teamwork skills