Responsibilities and Accountabilities:
- Data Pipeline Development: Design, build, and optimize data pipelines using DWH technologies, Databricks, Qlik, and other platforms as required, ensuring data quality, reliability, and scalability.
- Application Transition: Support the migration of internal applications to Databricks-based (or equivalent) solutions. Collaborate with application teams to ensure a seamless transition.
- Manage Continuous Improvement, Continuous Development, DevOps, and RunOps activities at the application, data, and infrastructure levels, in the cloud or on premises.
- Mentorship and Leadership: Lead and mentor junior data engineers. Share best practices, provide technical guidance, and foster a culture of continuous learning.
- Data Strategy Contribution: Contribute to the organization's data strategy by identifying opportunities for data-driven insights and improvements.
- Participate in smaller, focused mission teams to deliver value-driven solutions aligned with our global and bold-move priority initiatives and beyond.
- Design, develop, and implement robust, scalable data analytics solutions using modern technologies.
- Collaborate with cross-functional teams and practices across the organization, including Commercial, Manufacturing, Medical, FoundationX, and GrowthX, and support other X (transformation) Hubs and Practices as appropriate, to understand user needs and translate them into technical solutions.
- Provide technical support to internal users, troubleshooting complex issues and restoring system uptime as quickly as possible.
- Champion continuous improvement initiatives, identifying opportunities to optimize the performance, security, and maintainability of existing data and platform architecture and other technology investments.
- Participate in the continuous delivery pipeline, adhering to DevOps best practices for version control, automation, and deployment, and ensuring effective management of the FoundationX backlog.
- Leverage your knowledge of data engineering principles to integrate with existing data pipelines and explore new possibilities for data utilization.
- Stay up to date on the latest trends and technologies in data engineering and cloud platforms.
Requirements
Experience:
At least 5 years of demonstrable experience in:
- Data engineering, with a strong understanding of PySpark and SQL for building and optimizing data pipelines.
- Data engineering and integration tools (e.g., Databricks, Change Data Capture).
- Cloud platforms (AWS, Azure, GCP); deeper knowledge of, or certification in, AWS and Azure is considered a plus.
- Relational and non-relational databases.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field (Master's preferred), or equivalent experience.
Any relevant cloud-based integration certification at associate or professional level, for example:
- AWS Certified DevOps Engineer (Associate or Professional)
- AWS Certified Developer (Associate or Professional)
- Databricks Certified Engineer
- Qlik Sense Data Architect / Business Analyst (or similar platform)
- MuleSoft Certified Integration Architect Level 1
- Microsoft Certified: Azure Integration and Security
- Proficient in RESTful APIs
- AWS, CDMP, MDM, DBA, SQL, SAP, TOGAF, API, CISSP, VCP (any relevant certification)

MuleSoft:
- Understanding of MuleSoft's Anypoint Platform and its components
- Experience designing and managing API-led connectivity solutions
- Knowledge of integration patterns and best practices

AWS:
- Experience provisioning, operating, and managing AWS environments
- Experience developing code in at least one high-level programming language
- Understanding of modern development and operations processes and methodologies
- Ability to automate the deployment and configuration of infrastructure using AWS services and tools
- Experience with continuous integration and continuous delivery (CI/CD) methodologies and tools

Microsoft Azure:
- Fundamental understanding of Microsoft Azure and AWS and the data services they provide
- Experience with Azure services related to computing, networking, storage, and security
- Knowledge of general IT security principles and best practices
- Understanding of cloud integration patterns and Azure integration services such as Logic Apps, Service Bus, and API Management

Preferred Qualifications:
- Subject Matter Expertise: Strong understanding of data architecture, engineering, operations, and reporting within the Life Sciences / Pharma industry across the Commercial, Manufacturing, and Medical domains. Experience in other complex, highly regulated industries (e.g., healthcare, government, or financial services) will also be considered.
- Data Analysis and Automation Skills: Proficient in identifying, standardizing, and automating critical reporting metrics and modeling tools.
- Analytical Thinking: Demonstrated ability to lead ad hoc analyses, identify performance gaps, and foster a culture of continuous improvement.
- Technical Proficiency: Strong coding skills in SQL, R, and/or Python, coupled with expertise in machine learning techniques, statistical analysis, and data visualization.
- Agile Champion: Adherence to DevOps principles and a proven track record with CI/CD pipelines for continuous delivery.

Other Critical Skills Required:
- Cross-Cultural Experience: Work experience across multiple cultures and regions, facilitating effective collaboration in diverse environments.
- Innovation and Creativity: Ability to think innovatively and propose creative solutions to complex technical challenges.
- Global Perspective: Demonstrated understanding of global pharmaceutical or healthcare technical delivery, providing exceptional customer service and enabling strategic insights and decision-making.