Position Overview
Summary:
- Design, develop, and implement scalable batch and real-time data pipelines (ETL) to integrate data from a variety of sources into the Data Warehouse and Data Lake.
- Design and implement data model changes that align with warehouse dimensional modeling standards.
- Apply proficiency in Data Lake and Data Warehouse concepts and dimensional data modeling.
- Maintain and support all database environments; design and develop data pipelines, workflows, and ETL solutions in both on-prem and cloud-based environments.
- Design and develop SQL stored procedures, functions, views, and triggers.
- Design, code, test, document, and troubleshoot deliverables.
- Collaborate with others to test and resolve issues with deliverables.
- Maintain awareness of and ensure adherence to Zelis standards regarding privacy.
- Create and maintain design documents, source-to-target mappings, unit test cases, and data seeding.
- Perform data analysis and data quality tests, and create audits for the ETLs.
- Perform continuous integration and deployment using Azure DevOps and Git.
Requirements:
- 3+ years Microsoft BI Stack (SSIS, SSRS, SSAS).
- 3+ years data engineering experience, including data analysis.
- 3+ years programming SQL objects (procedures, triggers, views, functions) in SQL Server.
- Experience optimizing SQL queries.
- Advanced understanding of T-SQL, indexes, stored procedures, triggers, functions, views, etc.
- Experience designing and implementing a Data Warehouse.
- Working knowledge of Azure/AWS architecture and Data Lake.
- Must be detail-oriented and able to work under limited supervision.
- Must demonstrate strong analytical skills as they relate to data identification and mapping, along with excellent oral communication skills.
- Must be flexible, able to multi-task, and able to work within deadlines; must be team-oriented but also able to work independently.
Preferred Skills:
- Experience working with an ETL tool (DBT preferred).
- Working experience designing and developing Azure/AWS Data Factory pipelines.
- Working understanding of a columnar MPP cloud data warehouse using Snowflake.
- Working knowledge of managing data in the Data Lake.
- Business analysis experience to analyze data, write code, and drive solutions.
- Working knowledge of Git, Azure DevOps, Agile, Jira, and Confluence.
- Healthcare and/or payment processing experience.