About the Role:
We are looking for an experienced Data Engineer (GCP) to join our growing team.
If you're passionate about building scalable data solutions and working on impactful projects with a forward-thinking organization, this opportunity is for you!
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures on Google Cloud Platform (GCP).
- Work extensively with BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Airflow.
- Develop and optimize ETL processes using Python and SQL.
- Collaborate with cross-functional teams for data modeling, integration, and orchestration.
- Manage data warehousing and enable real-time data processing.
- Implement CI/CD practices and containerization tools such as Docker and Kubernetes (added advantage).
Required Skills & Experience:
- Strong hands-on experience with GCP data services.
- Expertise in SQL, Python, and modern ETL frameworks.
- Proven experience in data pipeline design and data modeling.
- Familiarity with data orchestration tools such as Airflow or Cloud Composer.
- Knowledge of CI/CD pipelines, Docker, and Kubernetes is a plus.
Why Join Us:
- Work on exciting projects across diverse domains.
- Be part of a collaborative, growth-oriented culture.
- Opportunity to work with a highly skilled and passionate team.
(ref: hirist.tech)