Data Engineer (Python + DBT + GCP)
Role: Data Engineer
Experience: 5–8 years
Location: Hyderabad
Employment Type: Full-time
⭐ Role Overview
We are looking for an experienced Data Engineer with strong hands-on expertise in Python, DBT, and Google Cloud Platform (GCP). The ideal candidate will design and build scalable data pipelines, modernize data platforms, implement data transformation frameworks using DBT, and work closely with analytics and product teams to enable high-quality, reliable data delivery.
📌 Key Responsibilities
Design, build, and maintain scalable ETL/ELT pipelines on GCP using tools such as Cloud Composer, Dataflow, BigQuery, and Cloud Storage.
Develop modular, version-controlled DBT models (staging, intermediate, and mart layers) aligned with DBT transformation best practices.
Write efficient, production-grade Python code for data processing, automation, orchestration, and validation.
Optimize BigQuery performance using partitioning, clustering, cost-efficient query strategies, and data modeling.
Implement CI/CD workflows for DBT and ETL pipelines using Git-based development, testing, and deployment automation.
Collaborate with data analysts, data scientists, and cross-functional teams to deliver reliable, high-quality datasets.
Ensure data quality through validation frameworks, testing, auditing, and monitoring pipelines with GCP-native tools.
Troubleshoot pipeline failures, optimize system performance, and ensure end-to-end data availability.
Maintain documentation, lineage visibility, and metadata using GCP and DBT tools.