About the Company : Our Client is a leading Indian multinational IT services and consulting firm. It provides digital transformation, cloud computing, data analytics, enterprise application integration, infrastructure management, and application development services. The company caters to over 700 clients across industries such as banking and financial services, manufacturing, technology, media, retail, and travel & hospitality.
Its industry-specific solutions are designed to address complex business challenges by combining domain expertise with deep technical capabilities. The company has a global workforce of over 80,000 professionals and a presence in more than 50 countries.
Job Title : Senior Data Engineer
Locations : Pan India
Experience : 7-15 Years
Employment Type : Contract to Hire
Work Mode : Work From Office
Notice Period : Immediate to 15 Days
Job Summary :
We are seeking an experienced and highly motivated Senior Data Engineer to join our data engineering team. The ideal candidate will have strong experience in GCP data services, BigQuery, Python programming, and building scalable data pipelines using Apache Airflow and Dataflow. You will be responsible for designing and implementing end-to-end data solutions that are robust, scalable, and efficient, helping power key business decisions with accurate and timely data.
Key Responsibilities :
- Design, develop, and maintain large-scale, reliable, and scalable data pipelines on Google Cloud Platform.
- Build and manage ETL/ELT pipelines using Apache Airflow, Dataflow, and BigQuery.
- Develop custom data ingestion and transformation logic using Python.
- Collaborate with data scientists, analysts, and product teams to understand data needs and deliver solutions.
- Optimize data flows for performance, cost, and scalability across GCP services.
- Implement data quality checks, monitoring, and alerting for critical pipelines.
- Apply best practices for data security, governance, and compliance.
- Mentor junior engineers and lead design/code reviews.
Required Skills & Qualifications :
- 7–13 years of professional experience in data engineering or a related role.
- Strong hands-on experience with Google Cloud Platform (BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, Dataflow).
- Deep expertise in BigQuery: advanced SQL, partitioning, clustering, performance optimization.
- Proficiency in Python, with experience writing modular, reusable code for data processing.
- Experience orchestrating workflows using Apache Airflow (Cloud Composer is a plus).
- Experience working with Apache Beam or Google Dataflow for stream and batch processing.
- Strong understanding of data warehousing concepts, ETL design patterns, and data modeling.
- Familiarity with CI/CD pipelines and version control systems (Git, Cloud Build).
- Strong problem-solving and communication skills.
Preferred Qualifications :
- GCP certifications (e.g., Professional Data Engineer, Cloud Architect) are a plus.
- Experience with Terraform or other IaC tools for infrastructure automation.
- Knowledge of other data tools (e.g., Looker, dbt, Kafka, Spark) is a plus.
- Experience in agile development and working with cross-functional teams.