Job Summary
We are seeking a skilled and motivated Data Integration Engineer with hands-on experience in HVR (High Volume Replicator) and Fivetran, and solid knowledge of DBT and Snowflake, to join our One Data Platform team. The ideal candidate will be responsible for building and maintaining robust data pipelines and transformation workflows across cloud and on-premises environments.
Key Responsibilities
- Configure and manage HVR replication for Oracle, SQL Server, MySQL, and other enterprise databases.
- Design and maintain Fivetran connectors for cloud and SaaS data sources.
- Develop and maintain DBT models for data transformation in Snowflake.
- Monitor and troubleshoot data pipelines to ensure reliability and performance.
- Collaborate with data architects and business stakeholders to understand data requirements.
- Optimize data flows for latency, throughput, and cost efficiency.
- Document integration processes and maintain operational runbooks.
- Participate in on-call rotations and incident resolution when needed.
Required Skills & Experience
- 3+ years of experience in data integration or ETL/ELT engineering.
- Strong hands-on experience with HVR (configuration, troubleshooting, performance tuning).
- Solid understanding of Fivetran connectors, transformations, and destination configurations.
- Experience with DBT (data modeling, testing, documentation) and Snowflake (warehouse management, performance tuning).
- Familiarity with relational databases (Oracle, SQL Server, MySQL) and data warehousing concepts.
- Proficiency in SQL and scripting (Python, Bash).
- Good communication skills in English (written and verbal).