Location :
Experience : 5-10 years
Job Type : Full-time
Domain : Data Engineering / GCP / Cloud Analytics
We are hiring a GCP Data Engineer to design and manage data pipelines and Enterprise Data Warehousing (EDW) solutions using Google BigQuery and related GCP services. The ideal candidate will have experience in ingesting data from both SQL sources and APIs, orchestrating workflows using Google Composer (Airflow), and ensuring secure, scalable pipeline execution.
Key Responsibilities :
- Design and build ingestion pipelines to move data into Google BigQuery from SQL databases and external APIs.
- Develop and schedule workflows using Google Composer (Airflow).
- Handle structured and semi-structured data ingestion and transformation.
- Implement secure access controls, secrets management, and authentication workflows.
- Optimize BigQuery performance for large-scale data analysis and reporting.
- Collaborate with cross-functional teams to support data integration and business needs.
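To give a flavor of the first few responsibilities, the extract and transform steps of such a pipeline can be sketched as plain Python callables that a Composer (Airflow) DAG would wire together. This is a minimal illustration only; the function names (`fetch_api_records`, `flatten_record`) and the endpoint are hypothetical, and a real pipeline would load the resulting rows into BigQuery with the `google-cloud-bigquery` client.

```python
import json
from urllib import request  # stdlib HTTP client; production code might use requests


def fetch_api_records(url: str) -> list[dict]:
    """Pull JSON records from an external API endpoint (hypothetical source)."""
    with request.urlopen(url) as resp:
        return json.load(resp)


def flatten_record(record: dict) -> dict:
    """Flatten one semi-structured record into a flat row for a BigQuery load job."""
    row = {}
    for key, value in record.items():
        if isinstance(value, dict):
            # Nested objects become prefixed columns, e.g. "address" -> "address_city".
            for sub_key, sub_value in value.items():
                row[f"{key}_{sub_key}"] = sub_value
        else:
            row[key] = value
    return row


# In Composer, these callables would typically be wrapped in PythonOperator
# tasks (or @task-decorated functions), with the flattened rows handed to
# a BigQuery load job such as load_table_from_json.
```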
Required Skills :
- Hands-on experience with Google BigQuery, Google Cloud Storage, and Google Composer (Airflow)
- Expert-level proficiency in SQL; strong analytical and transformation logic
- Intermediate knowledge of Python, especially for scripting and Airflow DAGs
- Understanding of GCP authentication, service accounts, and secrets management
- Familiarity with Firestore is a plus
- Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or a related discipline
(ref : hirist.tech)