Role Overview:
We are seeking a talented Informatica BDM (Big Data Management) Developer to design, develop, and optimize data integration workflows on modern data platforms. The ideal candidate will have hands-on experience in ETL development, data pipelines, and data lake integrations using Informatica BDM, with a strong understanding of Hadoop or cloud ecosystems.
Key Responsibilities:
- Design, develop, and maintain ETL / ELT workflows using Informatica BDM for data ingestion, transformation, and loading.
- Integrate data from multiple structured and unstructured sources into enterprise data lakes or warehouses.
- Optimize mappings and workflows for high performance, scalability, and fault tolerance.
- Collaborate with business analysts and data architects to understand data requirements and translate them into technical solutions.
- Work with Hadoop, Hive, Spark, or cloud-based data platforms (AWS, Azure, GCP) to process large datasets.
- Perform data quality checks, validation, and documentation for all data flows.
- Support deployment and troubleshooting in development and production environments.
Required Skills & Experience:
- At least 3-4 years of hands-on experience with Informatica BDM / PowerCenter / IICS.
- Strong expertise in ETL development, data warehousing concepts, and performance tuning.
- Experience with Hadoop ecosystems (HDFS, Hive, Spark) or cloud-native data services (AWS Glue, Azure Data Factory, BigQuery).
- Solid understanding of SQL, data modeling, and data governance principles.
- Good communication and problem-solving skills with the ability to work in cross-functional teams.
Preferred Skills (Good to Have):
- Experience with Python / Scala scripting for data processing.
- Knowledge of Informatica Data Quality (IDQ) or Enterprise Data Catalog (EDC).
- Exposure to Master Data Management (MDM) or Data Governance frameworks.
- Experience working in Agile or DevOps environments.
Education:
- Bachelor's degree in Computer Science, Information Systems, or a related field. Informatica or Cloud certifications are a plus.
(ref: hirist.tech)