Job description

Responsibilities:
- Design, develop, and maintain ETL/ELT pipelines using Informatica and Snowflake.
- Collaborate with data architects, analysts, and business stakeholders to understand requirements and translate them into robust data solutions.
- Perform data modeling (conceptual, logical, and physical) and implement efficient warehouse structures in Snowflake.
- Optimize Informatica mappings, sessions, and workflows for performance and scalability.
- Manage data ingestion, transformation, and integration from multiple source systems (structured and unstructured).
- Implement data quality, data governance, and data management best practices.
- Monitor, troubleshoot, and tune ETL/ELT jobs for performance and reliability.
- Support migration and modernization initiatives (e.g., from on-premise ETL to cloud-based Snowflake environments).
- Provide technical leadership and mentorship to junior data engineers.

Requirements:
- Total experience: minimum 10–12 years in data engineering or related roles.
- Informatica: minimum 8 years of hands-on experience in ETL development, workflow design, and performance tuning.
- Snowflake: minimum 4 years of practical experience in data warehousing, data modeling, and ELT pipeline development.
- Strong experience with SQL and performance optimization techniques.
- Solid understanding of data integration, data quality frameworks, and metadata management.
- Experience with cloud platforms (AWS, Azure, or GCP) and data pipeline orchestration tools.
- Excellent analytical, problem-solving, and communication skills.