Description:
We are looking for an experienced GCP BigQuery Data Engineer (Contract) to design, build, and maintain scalable data pipelines and solutions on the Google Cloud Platform.
The ideal candidate will have strong hands-on experience with BigQuery, DBT, and Python, along with a background in production support.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using BigQuery, DBT, and Python.
- Implement and optimize ETL/ELT workflows on GCP using tools like Cloud Composer, Dataflow, and Cloud Storage.
- Provide production support for data workflows, ensuring performance, availability, and data integrity.
- Monitor, troubleshoot, and resolve issues in batch and streaming pipelines.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Automate operational tasks and build alerting mechanisms using Python and GCP monitoring tools.
- Maintain and enhance DBT models, ensuring modularity, documentation, and test coverage.
- Ensure compliance with data governance, security, and privacy standards.
 
Required Skills:
- Strong hands-on experience with GCP BigQuery, DBT, and Python.
- Solid understanding of ETL/ELT processes, data modeling, and workflow orchestration.
- Experience in production support and troubleshooting data pipelines.
- Working knowledge of GCP services such as Cloud Composer, Dataflow, and Cloud Storage.
- Excellent analytical and problem-solving skills.

(ref: hirist.tech)