Job Description
Responsible for the design, development, implementation, testing, documentation, and support of analytical and data solutions and projects requiring data aggregation, data pipelines, and ETL / ELT from multiple sources into an efficient reporting mechanism, database, or data warehouse, using appropriate tools such as Informatica, Azure Data Factory, and SSIS. This includes interacting with the business to gather requirements, performing analysis, creating functional and technical specs, and handling testing, training, escalation, and follow-up.
- Development, maintenance, and enhancement of Data Pipelines (ETL / ELT) and processes with thorough knowledge of star / snowflake schemas
- Developing complex SQL queries and optimizing SQL performance
Development experience must cover the full life cycle, including business requirements gathering, data sourcing, testing / data reconciliation, and deployment within a Business Intelligence / Data Warehousing architecture.
- Understanding of data architecture
- Knowledge of ETL and data engineering standards and best practices for the design and development of data pipelines and data extract, transform, and load processes
- Design, build, and test data products based on feeds from multiple systems, using a range of different storage technologies, access methods, or both
- Knowledge of data warehousing concepts, including multi-dimensional models and ETL logic for maintaining star schemas
- Good understanding of the concepts and principles of data modelling
- Ability to produce, maintain, and update relevant data models for specific needs
- Ability to reverse-engineer data models from a live system
- SQL programming desirable (e.g., stored procedure development)
- Proficient in data analysis, defect identification, and resolution
- Strong professional verbal and written communication skills
- Ability to work with little supervision and within changing priorities
- Ability to analyze requirements and troubleshoot problems

Requirements
IDMC Admin, Data Engineer, ETL, PowerCenter