About the Role:
We are seeking a highly skilled Data Engineer to design, develop, and optimize scalable data pipelines and solutions on Google Cloud Platform (GCP). The ideal candidate has strong expertise in BigQuery, SQL, and Python/Java, along with hands-on experience building robust data pipelines that support advanced analytics and business intelligence.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL/ELT processes.
- Work extensively on GCP services including BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Composer.
- Optimize data models and queries for performance and cost efficiency in BigQuery (see the sketch after this list).
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, governance, and security across all data pipelines and storage systems.
- Troubleshoot and resolve data pipeline issues in real time.
- Contribute to architecture design discussions and provide best practices for data engineering.
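To make the BigQuery optimization responsibility concrete, here is a minimal sketch of the kind of partitioned, clustered table definition that keeps scan costs down. The dataset, table, and column names (analytics.events, event_ts, customer_id) are hypothetical illustrations, not part of this posting.

```python
# Minimal sketch: create a partitioned, clustered BigQuery table to reduce
# query cost. All dataset/table/column names are hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client()  # uses default project credentials

# Partitioning by event date means queries filtering on the partition column
# scan only the relevant days; clustering by customer_id further prunes
# storage blocks within each partition.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
    event_ts    TIMESTAMP,
    customer_id STRING,
    payload     JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
OPTIONS (require_partition_filter = TRUE)  -- reject full-table scans
"""
client.query(ddl).result()  # run the DDL and wait for completion
```

With `require_partition_filter` enabled, any query missing a filter on `DATE(event_ts)` is rejected outright, which is one common guardrail against accidental full-table scans and runaway costs.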
Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.
- Proven expertise in building and maintaining data pipelines.
- Strong SQL skills for query optimization and large-scale data manipulation.
- Proficiency in Python or Java for developing scalable ETL/ELT solutions.
- Good understanding of data modeling, partitioning, and performance tuning.
- Experience with workflow orchestration tools (e.g., Airflow/Cloud Composer) is a plus (see the sketch after this list).
- Familiarity with CI/CD, version control (Git), and agile methodologies.
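As a companion to the orchestration bullet, here is a minimal sketch of the kind of Airflow DAG that Cloud Composer runs. The DAG id, schedule, and task body are hypothetical examples, not requirements from the posting.

```python
# Minimal sketch of an Airflow DAG of the sort Cloud Composer executes.
# DAG id, schedule, and task bodies are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load() -> None:
    # Placeholder for a real ETL/ELT step, e.g. loading files from
    # Cloud Storage into a BigQuery staging table.
    print("extract + load step")


with DAG(
    dag_id="daily_events_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",      # one run per day
    catchup=False,                   # skip backfill of past dates
) as dag:
    PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```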