Role : GCP Dataflow
Location : Chennai (Hybrid)
Experience : 7-10 Years
Must-Have :
- Experience with GCP and Workflow-based ETL.
Key Responsibilities :
ETL Development & Support :
- Design, develop, and maintain robust ETL pipelines using GCP Dataflow, BigQuery, and Workflow.
- Perform complex data extraction, transformation, and loading from various data sources into cloud-based systems.
- Support and troubleshoot existing ETL processes to ensure high reliability and performance.
Cloud & DevSecOps Integration :
- Collaborate with DevSecOps teams to align data pipelines with CI/CD practices.
- Implement security and compliance best practices in the data engineering processes.
Module Development :
- Independently develop modules as part of larger applications or data solutions.
- Ensure modules are aligned with project and business objectives.
Testing & Quality Assurance :
- Execute unit tests for developed components.
- Participate in peer code reviews, inspections, and other quality assurance activities.
Impact Analysis & Maintenance :
- Analyze the impact of new features or changes on existing systems.
- Provide maintainable and scalable solutions aligned with business needs.
Business Understanding :
- Work closely with business analysts and stakeholders to understand data requirements.
- Translate business requirements into technical specifications and reliable solutions.
(ref : hirist.tech)