Key Responsibilities:
- Design and develop ETL jobs using Talend to extract, transform, and load data from various sources (e.g., databases, APIs, flat files).
- Collaborate with data architects, business analysts, and stakeholders to understand business requirements.
- Optimize existing ETL workflows for performance, scalability, and reliability.
- Ensure data quality, integrity, and consistency across systems.
- Monitor and troubleshoot ETL jobs in development, test, and production environments.
- Work with various data sources including SQL Server, Oracle, MySQL, PostgreSQL, and cloud-based platforms.
- Create technical documentation for ETL processes and data flows.
- Participate in code reviews and adhere to development best practices and standards.
Required Skills:
- Hands-on experience with Talend Open Studio or Talend Data Integration.
- Strong proficiency in SQL and relational databases.
- Knowledge of data modeling, data warehousing concepts, and ETL best practices.
- Familiarity with big data platforms (e.g., Hadoop, Spark) is a plus.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and data lakes is advantageous.
- Understanding of API integration (REST/SOAP).
- Ability to write and debug Java code used within Talend.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines is a plus.
Skills Required:
SQL Server, Oracle, MySQL, PostgreSQL, AWS, Azure, GCP