About the Job :
The ideal candidate will be responsible for designing, developing, and maintaining robust ETL processes. The role requires expertise in handling large-scale data integration, ensuring data accuracy, and delivering scalable solutions aligned with business objectives.

Responsibilities :
- Design, develop, and implement efficient ETL processes to support data integration and business intelligence needs.
- Analyze, troubleshoot, and maintain existing ETL workflows and pipelines.
- Optimize ETL performance and ensure scalability of data processes.
- Ensure data quality, consistency, and reliability across systems.
- Collaborate with data architects, analysts, and business stakeholders to gather requirements and deliver solutions.
- Perform unit testing and support UAT to validate ETL processes.
- Maintain documentation for ETL processes, workflows, and data mappings.
- Monitor ETL jobs and proactively resolve failures or performance issues.
- Contribute to continuous improvement by recommending new tools, technologies, and best practices.

Skills & Requirements :
- Strong hands-on experience with ETL tools (e.g., Informatica, Talend, DataStage, SSIS, or equivalent).
- Proficiency in SQL and relational database management systems (Oracle, MySQL, SQL Server, PostgreSQL, etc.).
- Solid understanding of data warehousing concepts, data modeling, and schema design.
- Experience in handling large datasets, data transformation, and integration.
- Knowledge of performance tuning and optimization in ETL workflows.
- Familiarity with scripting languages (Python, Shell, or similar) for automation.
- Good understanding of cloud data platforms (AWS, Azure, GCP) is a plus.

(ref : hirist.tech)