We are seeking a seasoned GCP Data Warehouse Architect to lead the design, development, and governance of scalable, secure, and high-performance data architectures on Google Cloud Platform (GCP). The ideal candidate will have a proven track record in architecting enterprise-grade data warehouses and large-scale data pipelines that support real-time analytics, business intelligence, machine learning, and AI-driven decision-making across global organizations.
This role is pivotal in shaping the future of our data platform: enabling data democratization, accelerating time-to-insight, and ensuring compliance with data governance, security, and performance standards. You will work closely with data scientists, engineers, business stakeholders, and cloud architects to deliver robust, future-ready data solutions.
Required Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field.
8+ years of experience in data engineering, with 5+ years focused on GCP and large-scale data warehouse implementations.
Deep expertise in Google Cloud Platform (GCP) services:
BigQuery (including federated queries, partitioning, clustering, materialized views, BI Engine)
Cloud Storage (buckets, lifecycle policies, IAM)
Dataflow (Apache Beam, streaming and batch)
Cloud Composer (Airflow workflows)
Pub/Sub, Dataproc, Dataplex, Data Catalog, Cloud Functions, Cloud SQL, Firestore
Proven experience in designing and managing enterprise data warehouses (e.g., 100+ TB data volumes, 1,000+ daily jobs).
Strong understanding of data modeling, ETL/ELT patterns, data quality, and metadata management.
Experience with infrastructure-as-code (Terraform, Pulumi) and CI/CD pipelines (Cloud Build, GitHub Actions).
Familiarity with data governance tools (e.g., Collibra, Informatica, Alation) and data observability platforms.
Excellent communication skills with the ability to translate technical concepts for business stakeholders.
Preferred Qualifications
Google Cloud Professional Data Engineer or Google Cloud Professional Architect certification.
Experience with multi-cloud or hybrid data architectures (e.g., integrating GCP with AWS or Azure).
Hands-on experience in AI/ML pipeline integration with BigQuery ML, Vertex AI, or custom models.
Knowledge of real-time analytics, stream processing, and event-driven architectures.
Experience in regulated industries (e.g., finance, healthcare, energy, logistics) with strict compliance requirements.
Technical Architect • Delhi, India