Job Title: Senior Data Engineer (L4) – Python & GCP
Experience: 7+ years
Location: Hyderabad
Overview:
We’re looking for an experienced Senior Data Engineer skilled in Python and Google Cloud Platform (GCP) to design and maintain scalable ETL pipelines. You’ll work with a range of GCP services, manage data flows, ensure data quality, and support analytics and data science teams with reliable data solutions.
Key Responsibilities:
- Design, build, and maintain ETL pipelines using Python (a minimal sketch follows this list).
- Work with GCP services such as Dataflow, BigQuery, Cloud Functions, Cloud Composer, GCS, and IAM.
- Develop and manage data ingestion, transformation, and cleansing processes.
- Use tools like Apache Spark, Kafka, Airflow, and FastAPI for data processing and orchestration.
- Work with databases (SQL Server, Oracle, PostgreSQL, MongoDB, Redis).
- Write and optimize SQL queries for data validation and reporting.
- Collaborate with data scientists and analysts to deliver high-quality data solutions.
- Use GitHub for version control and contribute to CI/CD pipelines.
- Create documentation for data pipelines and workflows.
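To make the first responsibility concrete, here is a minimal, illustrative Python ETL sketch of the kind of work involved: extract from an on-premise PostgreSQL source, apply basic cleansing, and load into BigQuery. The connection string, project, dataset, and table names are assumptions for the example, not details from this posting.

```python
"""Illustrative ETL sketch only: the DSN, project, dataset, and table IDs are hypothetical."""
import pandas as pd
import sqlalchemy
from google.cloud import bigquery


def run_pipeline() -> None:
    # Extract: read source rows from an on-premise PostgreSQL instance (hypothetical DSN).
    engine = sqlalchemy.create_engine("postgresql+psycopg2://user:pass@onprem-host:5432/sales")
    orders = pd.read_sql("SELECT order_id, customer_id, amount, created_at FROM orders", engine)

    # Transform: basic cleansing before the warehouse load.
    orders = orders.dropna(subset=["order_id", "customer_id"])
    orders["amount"] = orders["amount"].astype(float)

    # Load: append the cleansed frame to a BigQuery table (hypothetical destination).
    client = bigquery.Client(project="example-project")
    job = client.load_table_from_dataframe(
        orders,
        "example-project.analytics.orders",
        job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
    )
    job.result()  # Wait for the load job to finish; raises on failure.


if __name__ == "__main__":
    run_pipeline()
```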
Required Skills:
- 7–10 years of experience in Python development for data engineering.
- Strong hands-on experience with GCP (Dataflow, BigQuery, Cloud Functions, Composer, GCS).
- Expertise in data pipelines, ETL processes, and data integration.
- Knowledge of Spark, Kafka, Redis, FastAPI, and Airflow (a minimal DAG sketch appears at the end of this posting).
- Strong SQL skills with enterprise databases.
- Experience migrating data from on-premise to cloud.
- Familiarity with GitHub and CI/CD practices.
Nice to Have:
- Experience with Snowflake, Databricks, Azure Data Factory, or GKE/Cloud Run deployments.
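As a companion to the Airflow/Composer orchestration skills listed above, here is a minimal DAG sketch; the DAG id, schedule, and task callables are hypothetical and only illustrate how an extract-then-load pipeline might be wired up in Airflow.

```python
"""Illustrative Airflow DAG sketch only: the DAG id, schedule, and callables are hypothetical."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull rows from the source system")  # placeholder step


def load():
    print("write cleansed rows to BigQuery")  # placeholder step


with DAG(
    dag_id="orders_daily_etl",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```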