Inviting applications for the role of Senior GCP Data Modeler
We are seeking a highly experienced Senior GCP Data Modeler to architect, design, and build scalable, modular data solutions on Google Cloud Platform (GCP). This role requires deep technical expertise in GCP services, a strong grounding in data engineering principles, and a modular mindset that enables data components to be reused across multiple use cases. You will play a critical role in shaping our cloud data architecture, leading development efforts, and mentoring junior engineers.
Responsibilities:
- Lead the design and development of modular, reusable, and scalable data pipeline components and frameworks using GCP native services.
- Architect end-to-end data solutions leveraging BigQuery, Dataflow, Cloud Composer, Pub/Sub, Cloud Functions, and Vertex AI (optional).
- Define and enforce best practices in modular data pipeline development, CI/CD, version control, testing, and deployment.
- Collaborate with cross-functional teams including data architects, analysts, and business stakeholders to design data models and transformations aligned with business goals.
- Mentor and guide junior data engineers, setting coding standards and conducting code reviews.
- Work with DevOps to implement infrastructure as code (IaC) using Terraform and integrate data pipelines into automated workflows.
- Ensure solutions are compliant with data governance, privacy, and security standards using tools like Data Catalog, Dataplex, and IAM.
- Continuously evaluate and improve the performance, cost, and scalability of data platforms and components.
Qualifications we seek in you:
Minimum Qualifications/Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of progressive experience in data engineering roles, with a strong focus on cloud technologies.
- 6+ years of hands-on data engineering experience, including at least 3 years building solutions on GCP.
- Proven expertise in developing modular data pipelines and componentized frameworks.
- Advanced proficiency in Python, SQL, and Apache Beam or Dataflow.
- Strong knowledge of BigQuery optimization techniques and scalable data modeling (e.g., star/snowflake schemas, partitioning, clustering).
- Experience in orchestration with Cloud Composer (Airflow) and messaging with Pub/Sub.
- Hands-on experience with Terraform, Git, and CI/CD pipelines (e.g., Cloud Build, Jenkins, GitHub Actions).
- Familiarity with data governance, metadata management, and data quality best practices.
- Deep and demonstrable expertise with Google Cloud Platform (GCP) and its core data engineering services (e.g., BigQuery, Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, Cloud Functions).
- Extensive experience designing, building, and managing large-scale data pipelines and ETL/ELT workflows on GCP.
- Strong proficiency in SQL and at least one programming language relevant to data engineering on GCP (e.g., Python).
- Comprehensive understanding of data warehousing concepts, data modeling techniques optimized for BigQuery, and NoSQL database options on GCP (e.g., Cloud Bigtable, Firestore).
- Solid grasp of data governance principles, data security best practices within GCP (IAM, KMS), and compliance frameworks.
- Excellent problem-solving, analytical, and debugging skills within a cloud environment.
- Exceptional communication, collaboration, leadership, and presentation skills, with the ability to articulate technical concepts clearly to various audiences.
Preferred Qualifications/Skills:
- GCP Professional Data Engineer or Cloud Architect certification.
- Experience implementing data mesh or data product strategies using modular components.
- Background in advanced analytics, ML pipelines, or real-time streaming solutions.
- Familiarity with modern data stack tools such as dbt, Looker, Dataform, or Fivetran.
(ref: hirist.tech)