Function
Support and enhancement of existing applications, and migration activities
Development of data pipelines in Informatica DEI and Databricks
Documentation: documenting end-to-end data flow processes, including workflows
Qualifications & Experience
- Graduation (BE / B.Tech / MCA) with 3-5 years of experience
- Must have a minimum of 5+ years of experience with the Informatica Data Engineering tool in a Hadoop environment and the big data ecosystem.
- Must have a minimum of 5+ years of experience designing data pipelines in Azure Data Lake and Informatica Data Engineering.
- Must have an understanding of the Azure stack (Azure Databricks, Azure Stream Analytics & Azure Synapse).
- Must have Unix knowledge for writing shell scripts and troubleshooting existing shell scripts.
- Experience in a programming language such as Python or Scala.
- Must have SQL knowledge for writing and tuning complex queries/scripts in RDBMS/Hive.
- Understanding of Informatica components, services and repository.
- Good understanding of the different types of Informatica connectors.
- Experience in the financial & banking domain will be an added advantage.
Skills Required
Unix, Hadoop, Azure Databricks, Scala, SQL, Azure Synapse, Hive, RDBMS, Databricks, Python