Job Title : GCP Data Engineer
Location : Bangalore (Devanahalli)
Experience : 3 to 7 Years
Work Mode : Hybrid (2 days onsite, 3 days remote)
Joining : Immediate Joiners Only
Interview Process :
- 1 HR Round
- 2 Technical Rounds

Job Summary :
We are seeking a skilled GCP Data Engineer with 3 to 7 years of experience to join our team on a hybrid basis. The ideal candidate will have hands-on experience designing and developing data analytics and data integration solutions on Google Cloud Platform, with a focus on BigQuery, Airflow (DAGs), Python, and SQL. You will work on complex datasets and ETL/ELT pipelines and collaborate with cross-functional teams to enable data-driven decision-making.
Key Responsibilities :
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows on Google Cloud Platform.
- Develop and optimize SQL queries for data extraction, transformation, and analysis on large-scale datasets.
- Build data analytics solutions using GCP services like BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Composer.
- Design and implement data models including star schemas to support analytical use cases.
- Develop, schedule, and maintain DAGs in Apache Airflow for orchestrating data workflows (a minimal sketch follows this list).
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and translate them into technical solutions.
- Manage code using version control tools like GitHub and implement best practices for CI/CD pipelines.
- Troubleshoot data pipeline issues, optimize performance, and ensure data quality and consistency.
- Document data workflows, architecture, and technical processes.
- Follow industry best practices in data governance, security, and compliance.
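
For context, here is a minimal sketch of the kind of Airflow DAG this role involves, as referenced in the orchestration responsibility above. It assumes Airflow 2.4+; the DAG id, task ids, and callables are hypothetical placeholders, not part of the posting:

```python
# Minimal Airflow DAG sketch: one extract task followed by one load task.
# All names below (dag_id, task ids, functions) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: a real task might pull files from Cloud Storage or an API.
    print("extracting orders")


def load_to_bigquery():
    # Placeholder: a real task might load the extracted data into BigQuery.
    print("loading into BigQuery")


with DAG(
    dag_id="orders_daily_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day (Airflow 2.4+ syntax)
    catchup=False,                    # do not backfill past runs
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)

    extract >> load                   # load runs only after extract succeeds
```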
Required Skills & Experience :
- 3 to 7 years of professional experience in Data Engineering (strictly no support profiles).
- Strong understanding of database concepts, ETL/ELT processes, star schema design, and data modeling.
- Proficient in advanced SQL for querying and manipulating large, complex datasets in business environments.
- Hands-on experience with Google Cloud Platform services :
  - BigQuery
  - Cloud Storage
  - Dataflow
  - Pub/Sub
  - Cloud Composer (Airflow)
- Experience in authoring and managing Airflow DAGs for data orchestration.
- Strong programming skills in Python for data manipulation and pipeline development (see the sketch after this list).
- Familiarity with source code management tools such as GitHub.
- Strong communication and interpersonal skills to collaborate effectively with technical and non-technical teams.
- Proven troubleshooting and problem-solving abilities in data engineering contexts.
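
For context, here is a minimal sketch of querying BigQuery from Python with the official google-cloud-bigquery client, touching the Python, SQL, and BigQuery skills above. The project ID and table name are hypothetical placeholders:

```python
# Minimal BigQuery query sketch using the official Python client.
# Project ID and table name are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

query = """
    SELECT order_date, COUNT(*) AS order_count
    FROM `my-analytics-project.sales.orders`  -- hypothetical table
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 10
"""

# Run the query and print one row per day.
for row in client.query(query).result():
    print(row.order_date, row.order_count)
```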
Preferred Qualifications :
- Prior experience in or knowledge of the insurance industry, including its products and services, is highly desirable.
- Exposure to data governance, security frameworks, and compliance standards.
- Experience working in Agile/Scrum environments.
(ref : hirist.tech)