Mode of Work : 5-10 : 2 PM - 11 PM
Job Summary :
- Design, develop, and optimize ETL workflows using DataStream, Cloud Composer, and Dataplex (a Composer DAG sketch follows this list).
- Integrate and transform data from diverse sources into data warehouses.
- Ensure data quality, governance, and performance tuning of pipelines.
- Collaborate with cross-functional teams to define requirements and deliver solutions.
- Support deployment, monitoring, and troubleshooting of data integration pipelines.
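The Composer work above centers on authoring Airflow DAGs (Cloud Composer is managed Airflow). Below is a minimal sketch of such a DAG, assuming Airflow 2.4+ with the Google provider installed; the DAG id, schedule, and the project/dataset/table names in the query are hypothetical placeholders, not taken from this posting.

```python
# Minimal Airflow DAG sketch for Cloud Composer (managed Airflow).
# Assumes Airflow 2.4+ with the Google provider; all names (DAG id,
# dataset, tables) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="example_etl_pipeline",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # One transform step: aggregate a raw table into a reporting table.
    transform = BigQueryInsertJobOperator(
        task_id="transform_raw_to_reporting",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE reporting.daily_sales AS "
                    "SELECT order_date, SUM(amount) AS total "
                    "FROM raw.orders GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```

In a full pipeline, DataStream would land the raw tables and Dataplex would govern them; this sketch covers only the transform step.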
Responsibilities :
- Infrastructure support & monitoring of ETL tasks.
- Performance optimization and proactive production support.
- Develop automation scripts (Python, Bash, PowerShell); a monitoring sketch follows this list.
- Manage security & compliance for data / report access.
- Build ad-hoc Tableau dashboards for executive management.
- Ensure cost-effective cloud implementations (GCP BigQuery).
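The automation-script responsibility above might look like the following Python sketch: a table-freshness check of the kind that could back proactive production support. It requires the google-cloud-bigquery client; the table id and the 24-hour threshold are illustrative assumptions.

```python
# Hypothetical automation sketch: flag a stale BigQuery table so a
# pipeline failure is caught proactively. Requires google-cloud-bigquery;
# the table id and 24-hour threshold are illustrative assumptions.
import datetime

from google.cloud import bigquery


def table_is_fresh(table_id: str, max_age_hours: int = 24) -> bool:
    """Return True if the table was modified within max_age_hours."""
    client = bigquery.Client()
    table = client.get_table(table_id)  # metadata lookup, no query cost
    age = datetime.datetime.now(datetime.timezone.utc) - table.modified
    return age < datetime.timedelta(hours=max_age_hours)


if __name__ == "__main__":
    if not table_is_fresh("my-project.reporting.daily_sales"):
        print("ALERT: reporting.daily_sales has not refreshed in 24h")
```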
Skills & Qualifications :
- Hands-on with DataStream, Cloud Composer, Dataplex.
- Strong PL/SQL expertise (BigQuery preferred).
- Proficiency in Google Cloud BigQuery (a query sketch follows this list).
- Hands-on with Python, Tableau, SAP Business Objects (Webi).
- Strong problem-solving, analytical, and communication skills.
- Education : Bachelor's in CS/Engineering (8+ years in Data Analytics).
- Experience in data pipelines & cloud migration (on-prem → cloud DWs).
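As a small illustration of the BigQuery proficiency asked for above, here is a minimal query sketch using the google-cloud-bigquery Python client; the project, dataset, and column names are hypothetical.

```python
# Minimal BigQuery query sketch using the Python client library;
# project, dataset, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT order_date, SUM(amount) AS total
    FROM `my-project.raw.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 7
"""
for row in client.query(query).result():  # result() waits for the job
    print(row.order_date, row.total)
```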
Key Skills :
- Python
- PL/SQL (BigQuery preferred)
- GCP services : DataStream, Cloud Composer, Dataplex
(ref : hirist.tech)