Note: If shortlisted, you will be invited for the initial rounds on 8th November '25 (Saturday) in:
About the role

A Data Engineer is responsible for designing and modelling data lake solutions and data ingestion pipelines by collaborating with our customers and understanding their digital transformation requirements.
You should be able to navigate ambiguous situations and adapt to a rapidly changing environment.
The drive to collaborate, gather feedback, solve problems, and tackle challenges through a test-and-learn approach is highly valuable in this position.
You will work closely with Software Engineers, Data Scientists, and Azure & Network Administrator teams to build a scalable and compliant data system, and act as the SME for Data Engineering work with different stakeholders.
Responsibilities:
- Own data engineering solutions within Healthcare Intelligence and ensure the continuity of data processes and the associated batch jobs.
- Identify, design, and implement (or coordinate the implementation of) scalable processes and infrastructure to ensure good governance of automated data processes.
- Manage our customers' end-to-end data solutions: from raw data analysis to data flows and predictive framework configurations.
- Work on problems independently and prepare client-ready deliverables with minimal supervision. Elicit, analyze, and validate customer data from ingestion to production.
- Monitor all data update processes and outputs to ensure predictive quality
- Communicate with customers about issues in received data and help them identify and fix those issues
- Solve day-to-day data problems and customer challenges
- Own the automation, deployment and operation of data pipelines on MS Azure
- Build tools and mechanisms to monitor and optimize different parts of the system
- Build custom integrations between cloud-based systems using APIs
Skills and qualifications

Required:
- Expertise in ETL tools like Informatica / DataStage / Ab Initio
- Familiarity with Linux / Unix scripting, Python, SQL queries, and database concepts
- Exposure to onboarding legacy data sets to the cloud
- Working experience on the Azure Data Platform and cloud computing
- Expertise in technologies like Data Lake (ADLS Gen2), Databricks, Azure Data Factory, Azure SQL, Azure Synapse, etc.
- Experience in creating data orchestrations using ADF and optimizing them through regular monitoring

Good to have:
- Experience in batch scheduling & rationalization through Control-M
- Exposure to reporting tools like Power BI / Tableau, etc.

(ref: hirist.tech)