Role Overview:
We are seeking a highly skilled Data Architect - GCP with 6–8 years of experience designing, developing, and managing enterprise data solutions on Google Cloud Platform (GCP). The ideal candidate will have a strong background in cloud data architecture, data warehousing, big data processing, and data integration, with proven expertise in data governance and a track record of delivering scalable, secure, and efficient data platforms.
Key Responsibilities:
Design and architect end-to-end data solutions on GCP, aligning with business and technical requirements.
Apply hands-on data governance expertise throughout solution design and delivery.
Define data models, storage strategies, data ingestion, processing, and consumption frameworks.
Implement data lakes, data warehouses, and data marts using services such as BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Cloud Composer.
Collaborate with business stakeholders, data scientists, and engineering teams to understand data needs and translate them into scalable architectures.
Design and implement data governance, security, and compliance frameworks for cloud-based data platforms.
Optimize data workflows, query performance, and storage costs in the GCP environment.
Lead data migration and modernization initiatives from on-premises or other cloud platforms to GCP.
Stay updated with GCP services, features, and industry best practices to recommend improvements and innovation.
Provide technical leadership and mentoring to data engineering teams.
Required Skills and Qualifications:
Educational Background:
Bachelor’s degree in Computer Science, Software Engineering, or a related field.
Experience:
6–8 years of experience in data architecture and engineering roles, including at least 3 years of hands-on GCP experience.
Hands-on Data Governance experience.
Strong expertise in GCP data services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, Cloud Composer, and Data Catalog.
Proficient in data modeling, data warehousing concepts, ETL/ELT pipelines, and big data processing frameworks.
Experience with SQL and Python; Terraform experience (preferred) for infrastructure as code.
Hands-on experience in data security, encryption, access control, and governance on GCP.
Experience in integrating with real-time data pipelines and event-driven architectures.
Strong understanding of DevOps, CI/CD pipelines for data workflows, and cloud cost optimization.
Google Cloud Professional Data Engineer or Professional Cloud Architect certification is a plus.
Soft Skills:
Strong problem-solving and analytical skills.
Excellent communication and collaboration abilities.
Ability to work independently and within a team in an Agile/Scrum environment.
Traits We Are Looking For:
Curiosity-driven : Passionate about exploring new tools, technologies, and methods to solve problems creatively.
Problem-Solver : Thrives on identifying and fixing complex issues, with a natural inclination for troubleshooting.
Self-starter : Takes initiative, requires minimal supervision, and demonstrates ownership of tasks from start to finish.
Innovative Mindset : Enjoys experimenting with different approaches and is not afraid to break things to learn.
Continuous Learner : Actively seeks to expand their knowledge through reading, online courses, and experimenting with side projects.
Adaptability : Quick to adjust to new technologies, tools, or changes in requirements.
Detail-Oriented : Pays attention to both the big picture and the small details, ensuring quality in execution.
Good to Have:
Exposure to AI/ML workflows and data preparation for ML models.
Experience with tools such as Apache Airflow, Looker, or Dataplex.
Knowledge of other cloud platforms (AWS, Azure) for hybrid/multi-cloud strategies.
What We Offer:
Competitive salary and benefits package.
Opportunity to work on cutting-edge security challenges.
A collaborative and growth-oriented work environment with opportunities for career development.
Data Architect • Pune, India