Senior Data Engineer – Python & GCP
Location: Hyderabad
Interview Date: November 15 (Face-to-Face)
About the Role:
We are seeking an experienced Senior Data Engineer with strong expertise in Python and Google Cloud Platform (GCP) to join our dynamic data engineering team. The ideal candidate will design, develop, and maintain scalable ETL pipelines and data systems that drive our analytics and business decision-making processes.
Experience Level:
8–10 years of relevant IT experience
Key Responsibilities
- Design, develop, test, and maintain robust ETL data pipelines using Python.
- Work extensively with GCP services, including Dataflow, BigQuery, Cloud Functions, Cloud Composer (Airflow), IAM, Cloud Run, and Google Cloud Storage.
- Implement data ingestion, transformation, and validation logic to ensure data quality and consistency.
- Collaborate with cross-functional teams, including data scientists and analysts, to deliver reliable data solutions.
- Manage version control through GitHub and contribute to CI/CD pipelines for data projects.
- Write and optimize complex SQL queries across databases such as SQL Server, Oracle, and PostgreSQL.
- Create and maintain documentation, including data flow diagrams and process documentation.
Technical Expertise
- Strong proficiency in Python for backend or data engineering projects.
- Deep working knowledge of GCP services, especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer.
- Experience with data orchestration and workflow tools such as Airflow.
- Proficiency in Apache Spark and Kafka for data processing and streaming.
- Hands-on experience with FastAPI, MongoDB, and Redis/Bigtable.
- Sound understanding of CI/CD practices and version control systems (GitHub).
- Advanced SQL skills and experience with enterprise-grade relational databases.
- Solid experience in cloud migration and large-scale data integration.
Nice to Have
- Experience with Snowflake or Databricks for big data analytics.
- Familiarity with GKE, Cloud Run deployments, or Azure Data Factory.