Talent.com
Capco - GCP Data Architect - Big Data Technologies

Capco Technologies Pvt Ltd, Bangalore
30+ days ago
Job description

Job Title : GCP Data Architect

Location : Bangalore

Job Summary :

We are seeking a highly skilled GCP Data Architect with extensive experience in architecting and implementing large-scale, cloud-native data solutions. The ideal candidate will have strong expertise in Google Cloud Platform (GCP), data modeling, data governance, ETL / ELT pipelines, and advanced analytics ecosystems. This role involves designing enterprise-grade data platforms that power business intelligence, AI / ML, and real-time decision-making across structured, semi-structured, and unstructured datasets.

Key Responsibilities :

Enterprise Data Architecture & Modeling :

  • Design end-to-end data architecture frameworks on GCP to support enterprise data strategy.
  • Develop logical, physical, and canonical data models supporting OLTP, OLAP, and streaming workloads.
  • Ensure scalability, security, and performance alignment with business use cases.
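As an illustration of the modeling work in scope, here is a minimal sketch of a star-schema (OLAP-style) model in plain Python dataclasses; all table and field names are hypothetical, not taken from this posting:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical star schema: one fact table referencing two dimension
# tables. Names are illustrative only.

@dataclass(frozen=True)
class DimCustomer:
    customer_id: int
    name: str
    region: str

@dataclass(frozen=True)
class DimProduct:
    product_id: int
    category: str

@dataclass(frozen=True)
class FactSale:
    sale_date: date
    customer_id: int   # FK -> DimCustomer
    product_id: int    # FK -> DimProduct
    amount: float

def sales_by_region(sales, customers):
    """A typical OLAP rollup: total sales amount per customer region."""
    region_of = {c.customer_id: c.region for c in customers}
    totals = {}
    for s in sales:
        region = region_of[s.customer_id]
        totals[region] = totals.get(region, 0.0) + s.amount
    return totals
```

The same logical model would back both a normalized OLTP schema and a denormalized warehouse layout; the physical design differs per workload.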

Data Integration & Pipelines :

  • Architect and implement ETL / ELT pipelines using Cloud Dataflow, Apache Beam, Dataproc, and Composer (Airflow).
  • Enable real-time and batch data ingestion using Pub / Sub, Kafka, or equivalent.
  • Optimize data pipeline orchestration and reliability for mission-critical workloads.
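The orchestration concerns above (dependency ordering, retries for reliability) can be sketched in plain Python. This toy DAG runner is illustrative only and assumes nothing about the actual Composer / Airflow APIs:

```python
# Toy DAG runner: executes tasks in dependency order with simple retry
# logic. Purely illustrative -- a real pipeline would define these as
# Composer (Airflow) operators instead.

def run_dag(tasks, deps, retries=2):
    """tasks: name -> callable; deps: name -> list of upstream names."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):  # run upstreams first
            run(upstream)
        for attempt in range(retries + 1):
            try:
                tasks[name]()
                break
            except Exception:
                if attempt == retries:  # exhausted retries: surface failure
                    raise
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order
```

For example, with `deps = {"transform": ["extract"], "load": ["transform"]}` the runner always executes extract before transform before load, regardless of declaration order.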

Data Warehousing & Analytics :

  • Design, implement, and optimize data lakes and warehouses leveraging BigQuery, Cloud Storage, and Dataform.
  • Develop partitioning, clustering, and materialized views to improve query performance and cost efficiency.
  • Integrate GCP solutions with BI tools (e.g., Looker, Tableau, Power BI).
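To make the partitioning and clustering point concrete, here is an illustrative BigQuery DDL statement (shown as a Python string); the dataset, table, and column names are hypothetical:

```python
# Illustrative BigQuery DDL combining date partitioning with clustering.
# Dataset/table/column names are hypothetical examples.
ddl = """
CREATE TABLE analytics.sales_fact (
  sale_ts   TIMESTAMP,
  region    STRING,
  sku       STRING,
  amount    NUMERIC
)
PARTITION BY DATE(sale_ts)   -- prunes scans to the partitions a query touches
CLUSTER BY region, sku       -- co-locates rows for selective filters
OPTIONS (partition_expiration_days = 365);
"""
print(ddl)
```

Partition pruning caps the bytes scanned (and therefore cost) of date-bounded queries, while clustering helps filters and aggregations on the clustered columns.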

Cloud Migration & Modernization :

  • Lead migration of legacy data platforms to GCP, ensuring minimal downtime and high availability.
  • Re-engineer monolithic data systems into cloud-native, microservices-driven architectures.

Performance & Optimization :

  • Perform SQL query tuning, indexing strategies, and workload management for high-performance data solutions.
  • Conduct capacity planning, monitoring, and auto-scaling to ensure SLA compliance.

Data Governance, Security & Compliance :

  • Implement data governance frameworks, metadata management, and data catalogs (Data Catalog, Collibra, or equivalent).
  • Ensure compliance with GDPR, HIPAA, SOC 2, or RBI guidelines for financial data.
  • Enforce IAM roles, VPC Service Controls, encryption (KMS), and DLP policies for secure data handling.
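One piece of the DLP-style handling above, field-level pseudonymization, can be sketched with the standard library. This is a toy illustration only; production de-identification would use Cloud DLP templates and KMS-managed keys, and the field names here are hypothetical:

```python
import hashlib

# Toy field-level pseudonymization: hash direct identifiers with a salt
# before data leaves a controlled zone. Illustrative only -- not the
# Cloud DLP API.

SENSITIVE_FIELDS = {"email", "phone"}  # hypothetical field names

def pseudonymize(record, salt):
    """Replace sensitive fields with stable, salted hash tokens."""
    out = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
            out[key] = digest[:16]  # stable token; not reversible from here
        else:
            out[key] = value
    return out
```

Because the same salt and value always yield the same token, joins on the pseudonymized column still work downstream without exposing the raw identifier.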

Collaboration & Leadership :

  • Partner with business stakeholders, product teams, and data scientists to translate analytical needs into scalable architectures.
  • Provide technical leadership, mentorship, and best practices to data engineers and developers.

Qualifications :

    Education : Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.

    Experience : 8+ years in data architecture, including at least 3 years on the GCP ecosystem.

    Proven track record in :

  • Big Data technologies (Hadoop, Spark, Kafka, Flink).
  • Cloud-native services (BigQuery, Dataflow, Dataproc, Pub / Sub, Composer).
  • Databases : Proficiency in both SQL (PostgreSQL, MySQL, Oracle) and NoSQL (MongoDB, Cassandra, Firestore).
  • ETL / ELT tools : Informatica, Talend, dbt, or equivalent.
  • Strong expertise in data modeling, data warehousing, and data migration strategies.
  • Hands-on with DevOps practices : CI / CD pipelines (Cloud Build, Jenkins, GitLab) and Infrastructure as Code (Terraform, Deployment Manager).
  • Excellent problem-solving, stakeholder management, and communication skills.

Preferred Skills & Certifications :

    Certifications :

  • Google Cloud Professional Data Engineer / Cloud Architect.
  • AWS or Azure certifications (multi-cloud exposure preferred).

    Additional Skills :

  • Knowledge of machine learning pipelines (Vertex AI, TensorFlow Extended).
  • Familiarity with containerization and orchestration (Docker, Kubernetes, GKE).
  • Experience with data governance tools such as Collibra, Alation, or Atlan.
