Key Responsibilities
- Design, build, optimize, and support new and existing data models and ETL processes based on client requirements.
- Build, deploy, and manage scalable data infrastructure to handle the needs of a growing data-driven organization.
- Coordinate data access and security to ensure data scientists and analysts have timely access to required data.
- Design, develop, and maintain Ab Initio graphs that extract, transform, and load data from diverse sources into various targets.
- Implement data quality and validation processes within Ab Initio.
- Collaborate with data architects and business analysts to translate requirements into efficient ETL processes.
- Analyze and model data to ensure optimal ETL design and performance.
- Utilize Ab Initio components (Transform Functions, Rollup, Join, Normalize, etc.) to build scalable, efficient data integration solutions.
- Implement best practices for reusable Ab Initio components.
- Optimize Ab Initio graph performance, tuning and troubleshooting as required.
- Work closely with cross-functional teams (analysts, DBAs, QA) to ensure smooth integration of ETL processes.
- Participate in design reviews and provide technical expertise to enhance solution quality.
- Document processes and designs to ensure maintainability and knowledge sharing.
Skills Required
Data Engineering, Ab Initio, ETL, Performance Tuning, Data Processing, Data Infrastructure