Job Summary:
We are looking for an experienced Data Engineer with strong hands-on expertise in Google Cloud Platform (GCP) and BigQuery to design, build, and manage scalable data pipelines. The ideal candidate will have a deep understanding of ETL frameworks, data modeling, and cloud-based analytics solutions.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL workflows on GCP.
- Work extensively with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer (Airflow).
- Optimize data storage, query performance, and cloud cost efficiency.
- Collaborate with data scientists, analysts, and business teams to deliver clean, structured datasets.
- Implement data quality checks, lineage, and governance across data platforms.
- Automate data workflows using Python, SQL, and orchestration tools (a minimal example sketch follows this list).
- Integrate data from multiple sources, including APIs, on-prem databases, and third-party applications.
- Ensure security, scalability, and reliability in all data architecture designs.
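As a concrete illustration of the orchestration and automation work described above, here is a minimal sketch of a Cloud Composer (Airflow) DAG that loads daily CSV files from Cloud Storage into a BigQuery staging table and then runs a SQL transform. This is a sketch under assumptions, not an actual pipeline from this role: the bucket, project, dataset, and table names are hypothetical, and it assumes the Airflow Google provider package is installed.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_orders_load",  # hypothetical pipeline name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load the day's raw CSV files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",  # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.analytics.stg_orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staging data into a clean, analyst-facing table with SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_id, customer_id, order_ts, total_amount
                    FROM `example-project.analytics.stg_orders`
                    WHERE total_amount IS NOT NULL
                """,
                "destinationTable": {
                    "projectId": "example-project",  # hypothetical project
                    "datasetId": "analytics",
                    "tableId": "fct_orders",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```

The same pattern extends to Pub/Sub-triggered or Dataflow-based steps; Airflow only coordinates the GCP services that do the heavy lifting.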
Required Skills & Experience:
- Hands-on experience with Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage); see the sketch after this list
- Proficiency in SQL and Python
- Strong understanding of ETL concepts, data modeling, and data warehousing
- Familiarity with CI/CD pipelines (Git, Jenkins, etc.)
- Experience with workflow orchestration tools such as Apache Airflow or Cloud Composer
- Working knowledge of DataOps and DevOps best practices
- Excellent problem-solving, debugging, and communication skills
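As a small sketch of the BigQuery performance and cost-efficiency skills listed above (assuming the google-cloud-bigquery Python client; the project, dataset, and schema names are hypothetical), the snippet below creates a day-partitioned, clustered table so that typical date-filtered queries scan only the partitions they need rather than the full table.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

table = bigquery.Table(
    "example-project.analytics.events",  # hypothetical table
    schema=[
        bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("event_name", "STRING"),
    ],
)

# Partition by day on the event timestamp and cluster on common filter
# columns so queries prune partitions instead of scanning the full table.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["event_name", "user_id"]

client.create_table(table, exists_ok=True)
```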
Good to Have:
- Experience with Snowflake, Databricks, or Terraform
- Knowledge of machine learning pipelines or streaming data (Kafka)
- Exposure to data visualization tools (Tableau, Power BI, Looker)