Job Title : GCP Data Engineer
Location : Chennai, Bangalore, Hyderabad, Pune (Hybrid mode)
Experience : 5+ years
Employment Type : Full-time
Shift : 2pm - 11pm IST
Role Summary :
We are seeking an experienced GCP Data Engineer to design, build, and maintain scalable batch and streaming data pipelines on Google Cloud Platform, working with services such as BigQuery, Dataflow, Pub/Sub, and Cloud Composer.
Key Responsibilities :
- Develop and maintain ETL/ELT workflows for structured and unstructured data sources
- Optimize BigQuery SQL queries for performance and cost efficiency
- Build streaming data solutions using Apache Beam and Pub/Sub
- Implement data quality checks, validation rules, and monitoring mechanisms
- Collaborate with data architects to define data models, schemas, and partitioning strategies
- Integrate data from various sources including APIs, flat files, relational databases, and third-party platforms
- Automate workflows using Cloud Composer (Airflow) and CI/CD pipelines
- Ensure data security, governance, and compliance with organizational and regulatory standards
- Troubleshoot and resolve issues related to data ingestion, transformation, and availability
- Document technical designs, data flows, and operational procedures
Required Skills and Qualifications :
- 5+ years of experience in data engineering
- Strong proficiency in Python, PySpark, and SQL; Java or Scala is a plus
- Hands-on experience with BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage
- Experience with Apache Beam, Spark, or Hadoop for large-scale data processing
- Familiarity with Cloud Composer (Airflow) for orchestration
- Knowledge of data warehousing, data lakes, and dimensional modeling
- Experience with CI/CD tools such as Jenkins, GitHub Actions, or Cloud Build
- Excellent problem-solving, communication, and stakeholder management skills
Preferred Qualifications :
- GCP Professional Data Engineer Certification
- Experience with machine learning pipelines or data science workflows
- Familiarity with Looker, Tableau, or Power BI for data visualization
- Exposure to Kafka, Snowflake, or other cloud platforms (AWS, Azure)
- Experience in healthcare, finance, or retail domains is a plus
(ref : hirist.tech)