Technical / Functional Role
Design, develop, and maintain Informatica ETL workflows (PowerCenter, IDQ, IICS) for large-scale data integration.
Work with cloud platforms (Azure, AWS, GCP) and integrate data with Snowflake, Synapse, Redshift, BigQuery, or other cloud data warehouses.
Implement data quality, profiling, and cleansing using Informatica IDQ.
Optimize ETL/ELT pipelines for high performance and scalability.
Develop real-time and batch data pipelines using Informatica CDC, Kafka, Spark, or other streaming technologies.
Collaborate with data architects, analysts, and business teams to gather requirements and design robust data solutions.
Ensure data governance, security, and compliance practices are followed.
Support and troubleshoot issues in production environments, ensuring minimal downtime.
Mentor junior engineers and contribute to best practices for data engineering and DevOps automation.
Primary Skills (Must-Have)
Strong hands-on experience with Informatica PowerCenter (ETL/ELT) and Informatica Data Quality (IDQ).
Expertise in SQL and procedural SQL (e.g., PL/SQL, T-SQL) across major databases (Oracle, SQL Server, Teradata, DB2, etc.).
Experience with Informatica Intelligent Cloud Services (IICS): Data Integration (DI), Application Integration (AI), and API Management.
Strong understanding of cloud platforms (Azure, AWS, or GCP) and their data services.
Proficiency in integrating with Cloud Data Warehouses (Snowflake, Synapse, Redshift, BigQuery).
Hands-on knowledge of data modeling (star and snowflake schemas, OLTP, OLAP).
Proven ability to handle large-scale data integration and performance tuning.
Secondary Skills (Good-to-Have)
Programming & Automation : Python, Java, or Shell scripting.
Big Data Ecosystem : Hadoop, Spark, or Databricks for large-scale data processing.
DevOps / CI-CD : Jenkins, Git, Azure DevOps for deployment automation.
Informatica • Bangalore, India