
Auxo AI - Data Architect - Python Programming

AuxoAI - Mumbai
2 days ago
Job description

Description:

AuxoAI is hiring a Data Architect (GCP) to lead enterprise data platform design, architecture modernization, and solution delivery across global client engagements. In this client-facing role, you will architect scalable data platforms using GCP-native services, guide onshore/offshore data engineering teams, and define best practices across the ingestion, transformation, governance, and consumption layers.

This role is ideal for someone who combines deep GCP platform expertise with leadership experience and is confident working with both engineering teams and executive stakeholders.

Responsibilities:

  • Design and implement enterprise-scale data architectures using GCP services, with BigQuery as the central analytics platform
  • Lead end-to-end implementation of medallion architecture (Raw → Processed → Curated) patterns
  • Oversee data ingestion pipelines using Cloud Composer, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage (a minimal pipeline sketch follows this list)
  • Implement scalable ELT workflows using Dataform and modular SQLX transformations
  • Optimize BigQuery workloads through advanced partitioning, clustering, and materialized views (see the table-definition sketch after this list)
  • Lead architectural reviews, platform standardization, and stakeholder engagements across engineering and business teams
  • Implement data governance frameworks leveraging tools like Atlan, Collibra, and Dataplex
  • Collaborate with ML teams to support Vertex AI-based pipeline design and model deployment
  • Enable downstream consumption through Power BI, Looker, and optimized data marts
  • Drive adoption of Infrastructure-as-Code (Terraform) and promote reusable architecture templates
  • Manage a distributed team of data engineers; set standards, review code, and ensure platform stability
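As an illustration of the ingestion pattern referenced above, the sketch below shows a minimal Apache Beam (Python SDK) streaming pipeline that reads events from Pub/Sub and appends them to a raw-layer BigQuery table. The project, topic, and table names are placeholders, and a production pipeline would add schema validation, error handling, and dead-letter routing.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder resource names -- substitute real project/topic/table IDs.
PUBSUB_TOPIC = "projects/example-project/topics/events"
RAW_TABLE = "example-project:raw_zone.events"


def run():
    # Streaming mode so the pipeline keeps consuming from Pub/Sub.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=PUBSUB_TOPIC)
            | "DecodeJson" >> beam.Map(json.loads)  # message bytes -> dict rows
            | "WriteToRawTable" >> beam.io.WriteToBigQuery(
                RAW_TABLE,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```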
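Likewise, the BigQuery optimization work mentioned above typically starts with partitioned and clustered table definitions. The following is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, table, and column names are illustrative only, and materialized views would be layered on top of such tables separately.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses the ambient project and credentials

# Illustrative curated-layer table; all names are placeholders.
table = bigquery.Table(
    "example-project.curated_zone.orders",
    schema=[
        bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("order_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)

# Daily partitioning on the event timestamp keeps scans bounded by date filters;
# clustering on customer_id co-locates rows that are commonly filtered together.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_ts",
)
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```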

Requirements:

  • 10+ years of experience in data architecture and engineering
  • 4+ years of hands-on GCP experience, including BigQuery, Dataflow, Cloud Composer, Dataform, and Cloud Storage
  • Deep understanding of streaming + batch data patterns, event-driven ingestion, and modern warehouse design
  • Proven leadership of cross-functional, distributed teams in client-facing roles
  • Strong programming skills in Python and SQL
  • Experience working with data catalog tools (Atlan, Collibra), Dataplex, and enterprise source connectors
  • Excellent communication and stakeholder management skills
Preferred Qualifications:

  • GCP Professional Data Engineer or Cloud Architect certification
  • Experience with Vertex AI Model Registry, Feature Store, or ML pipeline integration
  • Familiarity with AlloyDB, Cloud Spanner, Firestore, and enterprise integration tools (e.g., Salesforce, SAP, Oracle)
  • Background in legacy platform migration (Oracle, Azure, SQL Server)
(ref: hirist.tech)
