Design, develop, and implement ETL processes using industry-standard tools and technologies.
Extract data from various sources such as databases, flat files, APIs, and cloud applications.
Transform data using techniques such as data cleansing, data enrichment, and data aggregation.
Load data into target systems such as data warehouses, data marts, and data lakes (see the pipeline sketch after this list).
Monitor and maintain ETL pipelines to ensure data quality and integrity.
Troubleshoot and resolve data integration issues.
Optimize ETL processes for performance and efficiency.
Collaborate with data analysts, data engineers, and business stakeholders to understand data requirements and translate them into technical specifications.
Stay abreast of the latest advancements in data integration technologies and best practices.
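For illustration only, here is a minimal sketch of the extract-transform-load flow described above, using only Python's standard library. The source file orders.csv, the cleansing rule, and the target table customer_revenue are hypothetical stand-ins, not a prescribed stack.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw records from a flat-file source (hypothetical orders.csv)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: cleanse (drop rows missing a customer id) and aggregate revenue per customer."""
    totals = {}
    for row in rows:
        if not row.get("customer_id"):
            continue  # cleansing: skip incomplete records
        totals[row["customer_id"]] = totals.get(row["customer_id"], 0.0) + float(row["amount"])
    return totals

def load(totals, db_path):
    """Load: write aggregates into a target table (a stand-in for a warehouse)."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS customer_revenue (customer_id TEXT PRIMARY KEY, revenue REAL)"
    )
    con.executemany("INSERT OR REPLACE INTO customer_revenue VALUES (?, ?)", totals.items())
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")), "warehouse.db")
```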
Required Skills:
3+ years of experience in ETL development and data warehousing.
Strong proficiency in SQL and experience with relational databases (Oracle, SQL Server, MySQL).
Experience with ETL tools such as Informatica PowerCenter, Talend, SSIS, or Apache NiFi.
Experience with scripting languages such as Python and shell (see the data-quality sketch after this list).
Knowledge of data warehousing concepts and best practices.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
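As a small illustration of combining SQL with Python scripting for the monitoring duties above, here is a hypothetical data-quality gate; the database path, table name, and threshold are made-up examples.

```python
import sqlite3

def row_count_check(db_path, table, expected_min):
    """Simple data-quality gate: fail the pipeline run if the target table is
    suspiciously small. The table name comes from a trusted constant, not user input."""
    con = sqlite3.connect(db_path)
    (count,) = con.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    con.close()
    if count < expected_min:
        raise ValueError(f"{table}: expected at least {expected_min} rows, found {count}")
    return count

# Hypothetical usage after a nightly load:
# row_count_check("warehouse.db", "customer_revenue", expected_min=1)
```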
Preferred Skills:
Experience with cloud platforms (AWS, Azure, GCP) and cloud data services (AWS Glue, Azure Data Factory).
Experience with big data technologies such as Hadoop, Spark, and Hive (see the PySpark sketch after this list).
Knowledge of data modeling and data quality best practices.
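For the big-data items above, a minimal PySpark sketch of a warehouse-style rollup; it assumes pyspark is installed, and the input path, column names, and output location are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal Spark job: read Parquet, cleanse, aggregate, and write a mart table.
# The S3 paths and the order_date/amount columns are hypothetical examples.
spark = SparkSession.builder.appName("revenue_rollup").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")
daily = (
    orders
    .where(F.col("amount").isNotNull())           # cleanse: drop null amounts
    .groupBy("order_date")                        # aggregate by day
    .agg(F.sum("amount").alias("daily_revenue"))
)
daily.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_revenue/")
spark.stop()
```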