About the Role:
We are seeking a skilled and proactive Data Engineer with hands-on experience in DBT (Data Build Tool) and Google Cloud Platform (GCP).
The ideal candidate will be responsible for building and maintaining robust, scalable, and efficient data pipelines and transforming raw data into actionable insights that drive business decisions.
This role demands a strong foundation in data engineering practices, cloud data architecture, and experience with the modern data stack.

Responsibilities:
- Design, build, and maintain scalable ETL/ELT pipelines using DBT on GCP BigQuery.
- Develop and optimize data models, ensuring clean, well-documented, and testable transformation logic.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver reliable data solutions.
- Manage data ingestion, data transformation, and data validation workflows across multiple sources and formats.
- Implement CI/CD practices for data pipeline development and deployment (a minimal sketch follows this list).
- Monitor and troubleshoot data jobs, ensuring data quality and consistency across environments.
- Maintain and optimize performance of data pipelines and BigQuery tables, minimizing costs and ensuring efficient query execution.
- Ensure compliance with data governance, security, and privacy policies.
- Keep up to date with industry trends and propose the adoption of new tools and techniques as appropriate.
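To give a flavor of the CI/CD step referenced above, here is a minimal sketch (not a prescribed implementation) of a Python wrapper that builds and tests DBT models before deployment. The project directory and model selector are hypothetical placeholders, and it assumes the dbt CLI is installed with a BigQuery profile configured:

```python
import subprocess
import sys

def run(cmd: list[str]) -> None:
    """Run a dbt CLI command, failing the pipeline on a non-zero exit code."""
    print("Running:", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(result.returncode)

if __name__ == "__main__":
    project = ["--project-dir", "analytics"]  # hypothetical dbt project directory
    run(["dbt", "deps"] + project)                          # install package dependencies
    run(["dbt", "run", "--select", "staging+"] + project)   # build the selected models
    run(["dbt", "test"] + project)                          # run schema and data tests
```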
Skills & Qualifications:
- 4-7 years of experience as a Data Engineer or in a similar role.
- Strong hands-on experience with DBT (Data Build Tool) including modular SQL modeling, testing, and documentation.
- Expertise in Google Cloud Platform (GCP), especially BigQuery, Cloud Storage, and Cloud Functions.
- Solid SQL skills and experience in writing complex, optimized queries.
- Proficient in Python or Shell scripting for building automation around data workflows (see the sketch after this list).
- Experience working with structured and semi-structured data (JSON, CSV, Parquet, etc.).
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Strong understanding of data warehousing concepts, data modeling (star/snowflake schemas), and data quality frameworks.
- Ability to work in Agile environments and collaborate across multidisciplinary teams.
- Excellent problem-solving, analytical, and communication skills.
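As an illustration of the Python-based automation mentioned in the list above, here is a small sketch of a data-quality check against BigQuery using the google-cloud-bigquery client. The project, dataset, table, and threshold are hypothetical, and credentials are assumed to come from the environment (application-default credentials):

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

def null_rate(client: bigquery.Client, table: str, column: str) -> float:
    """Return the fraction of NULL values in `column` of `table`."""
    sql = f"""
        SELECT COUNTIF({column} IS NULL) / COUNT(*) AS null_rate
        FROM `{table}`
    """
    row = next(iter(client.query(sql).result()))
    return row["null_rate"]

if __name__ == "__main__":
    # Hypothetical project/dataset/table names for illustration only.
    client = bigquery.Client(project="my-gcp-project")
    rate = null_rate(client, "my-gcp-project.analytics.orders", "customer_id")
    assert rate < 0.01, f"customer_id null rate too high: {rate:.2%}"
```

A check like this could run as a scheduled job or as a gate in the CI/CD pipeline sketched earlier.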
(ref: hirist.tech)