About the Role:
We are looking for an experienced Senior Data Engineer with strong expertise in Google Cloud Platform (GCP) and BigQuery to join our growing data team. You will be responsible for designing, building, and optimizing data pipelines and architecture to support analytics, reporting, and data-driven decision-making across the organization.
Key Responsibilities:
Design and implement scalable, reliable, and high-performance data pipelines on GCP.
Build and maintain data warehouse solutions using BigQuery.
Apply advanced skills in SQL, Python, and Kafka, and possibly Scala or Java.
Develop ETL/ELT processes to ingest structured and unstructured data from multiple sources.
Make architectural decisions, optimize performance, and manage data security and governance.
Collaborate with data analysts, data scientists, and business stakeholders to define data needs.
Optimize query performance and manage data partitioning and clustering in BigQuery.
Ensure data quality, governance, and security best practices.
Monitor and troubleshoot pipeline performance and data flow issues.
Work with tools such as Dataflow, Cloud Composer (Apache Airflow), Pub/Sub, and Cloud Storage.
Required Qualifications:
7+ years of experience in Data Engineering or a related field.
Strong hands-on experience with Google Cloud Platform (GCP) services, especially BigQuery, as well as Kafka.
Proficiency in SQL and at least one programming language, such as Python or Go.
Experience with data modeling, ETL/ELT processes, and building data pipelines.
Experience with GCP tools such as Cloud Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
Understanding of CI/CD practices and version control systems like Git.
Strong problem-solving and communication skills.
Preferred Qualifications:
GCP Professional Data Engineer Certification.
Experience with data lakes, streaming data, and real-time analytics.
Exposure to other cloud platforms (AWS, Azure) is a plus.
Senior Data Engineer • Delhi, India