Key Responsibilities :
- Hands-on experience with Azure Data Factory, Databricks, Azure Data Lake, and Azure SQL Server.
- Develop ETL/ELT processes using SSIS and/or Azure Data Factory.
- Design, build, and implement complex data pipelines and dataflows using Azure Data Factory (ADF).
- Improve the functionality and performance of existing data pipelines.
- Tune the performance of processes that handle very large data sets.
- Configure and deploy ADF packages.
- Proficiency with ARM Templates, Key Vault, and Integration Runtimes.
- Adapt to and work within established ETL frameworks and standards.
- Strong analytical and troubleshooting skills to root-cause issues and identify solutions.
- Propose innovative, feasible solutions that best meet business requirements.
- Knowledge of Azure technologies and services such as Blob Storage, ADLS, Logic Apps, Azure SQL, and WebJobs.
Additional Skills :
- Expert in ServiceNow, incident management, and JIRA.
- Exposure to Agile methodology.
- Expert in understanding and building Power BI reports using the latest methodologies.
Competencies :
- Strong analytical skills and problem-solving abilities.
- Excellent communication and collaboration with cross-functional teams.
- Innovative thinking and the ability to provide effective business solutions.
Key Skills :
- Azure, ADF, and Databricks.
- Migration project experience.
Soft Skills :
Same as the Data Engineer role:
- Strong problem-solving and analytical thinking.
- Ability to work effectively in a collaborative team environment.
- Excellent communication skills.
Qualification :
- Engineering graduate.
Certifications (Preferred) :
- Azure certification.
- Databricks certification.
Skills Required :
Azure Data Factory, Databricks, Azure Data Lake