Job Title: Data Solutions Architect - Google Cloud Platform (GCP)
Location: Noida / Bangalore
Job Type: Full Time
Experience: 8+ Years
Industry: IT Services & Analytics
Role and Responsibilities
We are seeking a highly experienced and strategic Data Solutions Architect - Google Cloud Platform (GCP) - to join our technology team. In this pivotal role, you will lead the design, architecture, and implementation of highly scalable, secure, and performant data solutions on GCP.
The ideal candidate will possess deep expertise in GCP's comprehensive suite of data services, modern cloud data architecture patterns, and data engineering best practices. You will be instrumental in translating complex business requirements into robust technical solutions, collaborating with cross-functional teams to drive innovation, ensure data reliability, and support advanced analytics and AI initiatives that deliver significant business value.
Preferred Technical and Professional Experience
- Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field (or equivalent practical experience).
Experience:
- 8+ years of extensive experience in data engineering, data architecture, or analytics.
- At least 3 years in a dedicated data architect or solutions architect role.
- Minimum of 5 years of hands-on experience designing and implementing enterprise-scale data solutions specifically on Google Cloud Platform.

GCP Expertise: Proven expert-level proficiency with core and advanced GCP data services, including but not limited to:
- BigQuery (advanced SQL, optimization, data modeling, partitioning, clustering)
- Cloud Storage (data lake design, lifecycle management)
- Dataflow (Apache Beam for batch and streaming processing)
- Dataproc (managed Apache Spark / Hadoop)
- Pub/Sub (real-time messaging)
- Cloud Composer (Apache Airflow for workflow orchestration)
- Data Catalog (metadata management, data discovery)
- Cloud Functions / Cloud Run (serverless compute for data processing)
- Looker (BI integration)

Technical Proficiency:
- Exceptional proficiency in SQL (with a focus on BigQuery optimization) and Python (including libraries such as Pandas and NumPy).
- Extensive experience with data modeling techniques (dimensional, Kimball, Inmon) and designing data warehousing / data lakehouse architectures.
- Hands-on experience with ETL/ELT tools, orchestration frameworks, and API-driven data integration.
- Proficiency with Infrastructure as Code (IaC) tools such as Terraform or Cloud Deployment Manager for provisioning GCP resources.
- Familiarity with event-driven architectures and messaging systems (e.g., Kafka).
- Understanding of containerization technologies (Docker, Kubernetes, GKE) and CI/CD pipelines (e.g., Cloud Build, Cloud Deploy) for data workloads.
- Exposure to NoSQL databases (e.g., Firestore, Bigtable, MongoDB) and various file formats (JSON, Avro, Parquet).
- Knowledge of machine learning workflows and MLOps practices on GCP (e.g., Vertex AI) is a plus.

Certifications (Highly Preferred): Google Cloud Professional Data Engineer and/or Google Cloud Professional Cloud Architect.