Data Engineer - Informatica & Cloud
Work Location: Bangalore / Pune
Experience: 6-9 Years
Notice Period: Immediate Joiners Only
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience across both on-premises and cloud data integration platforms, with a strong focus on IICS, Python, AWS, and Snowflake. This is an excellent opportunity for a hands-on engineer who is passionate about building robust and scalable data solutions.
Responsibilities:
- Design, develop, implement, and maintain ETL processes using Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) to extract, transform, and load data from various sources into target systems, including Snowflake DB.
- Develop and optimize Python scripts for data manipulation, automation, and integration tasks.
- Work extensively with AWS services for data storage, processing, and analytics (e.g., S3, Redshift, Glue).
- Implement and manage data pipelines within the Snowflake ecosystem, ensuring data quality, consistency, and performance.
- Utilize Unix and Shell Scripting for job scheduling, file manipulations, and process automation.
- Collaborate with data architects, business analysts, and other stakeholders to understand data requirements and translate them into technical specifications.
- Troubleshoot, debug, and optimize existing data solutions for performance and reliability.
- Ensure adherence to best practices in data governance, security, and data quality.
Required Skills (All Mandatory):
- Informatica PowerCenter: Proven experience with ETL development, mapping, workflow design, and performance tuning.
- Informatica Intelligent Cloud Services (IICS) / Cloud Data Integration (CDI): Strong hands-on experience in building cloud-native data integration solutions.
- Python: Proficient in Python for data processing, scripting, and API integrations.
- AWS (Amazon Web Services): Solid understanding and practical experience with core AWS data services (e.g., S3, Redshift, Glue).
- Snowflake DB: Expertise in Snowflake data warehousing, including SQL, data loading, and optimization.
- Unix / Linux & Shell Scripting: Competency in Unix / Linux commands and shell scripting for automation and system interaction.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6-9 years of progressive experience in data engineering, ETL development, and cloud data platforms.
- Ability to work independently and collaboratively in a fast-paced environment.
- Strong analytical, problem-solving, and communication skills.
(ref: hirist.tech)