Description :
AuxoAI is hiring a Data Architect GCP to lead enterprise data platform design, architecture modernization, and solution delivery across global client engagements. In this client-facing role, you will architect scalable data platforms using GCP-native services, guide onshore/offshore data engineering teams, and define best practices across ingestion, transformation, governance, and consumption layers.
This role is ideal for someone who combines deep GCP platform expertise with leadership experience and is confident working with both engineering teams and executive stakeholders.
Responsibilities :
- Design and implement enterprise-scale data architectures using GCP services, with BigQuery as the central analytics platform
- Lead end-to-end implementation of medallion architecture (Raw → Processed → Curated) patterns
- Oversee data ingestion pipelines using Cloud Composer, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage
- Implement scalable ELT workflows using Dataform and modular SQLX transformations
- Optimize BigQuery workloads through advanced partitioning, clustering, and materialized views
- Lead architectural reviews, platform standardization, and stakeholder engagements across engineering and business teams
- Implement data governance frameworks leveraging tools like Atlan, Collibra, and Dataplex
- Collaborate with ML teams to support Vertex AI-based pipeline design and model deployment
- Enable downstream consumption through Power BI, Looker, and optimized data marts
- Drive adoption of Infrastructure-as-Code (Terraform) and promote reusable architecture templates
- Manage a distributed team of data engineers; set standards, review code, and ensure platform stability
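To give a flavor of the BigQuery optimization work described above, here is a minimal Python sketch that emits the DDL for a date-partitioned, clustered table. All table names, column names, and the helper function itself are hypothetical and for illustration only; in practice this kind of template would live in Dataform SQLX or Terraform rather than ad-hoc Python.

```python
def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement with partitioning and clustering.

    columns: mapping of column name -> BigQuery type.
    require_partition_filter forces queries to prune partitions, which
    is a common cost-control setting on large fact tables.
    """
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
    return (
        f"CREATE TABLE `{table}` (\n  {col_defs}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}\n"
        "OPTIONS (require_partition_filter = TRUE);"
    )

# Hypothetical curated-layer fact table.
ddl = partitioned_table_ddl(
    "analytics.curated.orders",
    {"order_id": "STRING", "order_date": "DATE",
     "region": "STRING", "amount": "NUMERIC"},
    "order_date",
    ["region", "order_id"],
)
print(ddl)
```

Partitioning on the date column bounds scan cost per query, while clustering on the most common filter columns (here `region`, then `order_id`) improves block pruning within each partition.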
Requirements :
- 10+ years of experience in data architecture and engineering
- 4+ years of hands-on GCP experience, including BigQuery, Dataflow, Cloud Composer, Dataform, and Cloud Storage
- Deep understanding of streaming + batch data patterns, event-driven ingestion, and modern warehouse design
- Proven leadership of cross-functional, distributed teams in client-facing roles
- Strong programming skills in Python and SQL
- Experience working with data catalog tools (Atlan, Collibra), Dataplex, and enterprise source connectors
- Excellent communication and stakeholder management skills
Preferred Qualifications :
- GCP Professional Data Engineer or Cloud Architect certification
- Experience with Vertex AI Model Registry, Feature Store, or ML pipeline integration
- Familiarity with AlloyDB, Cloud Spanner, Firestore, and enterprise integration tools (e.g., Salesforce, SAP, Oracle)
- Background in legacy platform migration (Oracle, Azure, SQL Server)
(ref : hirist.tech)