Position Description
Data Engineering at Elanco is growing across ingestion, integration, transformation, and consumption capabilities to deliver data products that will transform how the organization leverages data. The Data Engineering and Platforms team is seeking an experienced Data Engineer to provide technical expertise and guidance to both internal and partner teams working within our Enterprise Data environment. This role will be instrumental in shaping technical roadmaps, defining best practices, and influencing architectural decisions to drive innovation at scale.
This role demands deep technical expertise, strategic contribution, and architectural insight across data engineering initiatives. It will require collaboration with leadership teams, enterprise architects, and key stakeholders to contribute to enterprise-wide data engineering strategies, ensuring scalable, resilient, and cost-optimized data solutions.
Success in an engineering role at Elanco requires a highly motivated individual with an innovative mindset and a willingness to drive tangible outcomes. The individual must be able to articulate complex technical topics, collaborate with internal and external partners, and ensure quality delivery of the required data products.
Reporting to the Associate Director - Data Engineering, the Consultant is responsible for unlocking and orchestrating the smooth flow of data, ensuring stable pipelines and data products, and communicating our capabilities and patterns in easily consumable, compelling ways. This role focuses on speed to value, improving our organization's access to useful data, and championing continual improvement.
Responsibilities
Strategic Technical Expertise: Provide expert guidance and hands-on contribution to the vision, design, and execution of scalable, high-performance data platforms and solutions.
Business Impact & Value Creation: Drive data-driven decision-making, ensuring solutions deliver measurable business impact across the organization.
Enterprise Data Architecture: Define and implement data architecture principles, ensuring alignment with Elanco's enterprise-wide data strategy.
Innovation & Modernization: Contribute to modernization initiatives to transition legacy data products to modern data architectures, ensuring optimal performance and scalability.
Technical Governance: Establish enterprise standards, frameworks, and patterns for data engineering, ensuring alignment with security, compliance, and performance best practices.
Hands-on Technical Contribution: Provide architectural guidance and hands-on technical expertise to engineering teams, ensuring best-in-class data product development.
Data Pipeline Optimization: Architect and oversee the development of highly efficient, fault-tolerant, and cost-optimized data pipelines across Azure, Databricks, and GCP.
Security and Compliance: Partner with security teams to ensure data engineering solutions adhere to security and regulatory standards, implementing governance best practices.
Cross-Functional Collaboration: Work closely with Product Owners, Data Architects, and Engineering Squads to deliver robust data solutions in agile sprints.
Future-Proofing Data Engineering Capabilities: Continuously evaluate new tools, frameworks, and industry trends to future-proof Elanco's data engineering landscape. Advise on the adoption of AI-driven automation, DataOps, and DevSecOps methodologies.
Provide architectural expertise and support in Agile delivery, contributing to sprint planning, backlog refinement, and iterative solution design. Embed a culture of "working out loud" to drive transparency and collaboration.
Drive proof-of-concept initiatives, rapid prototyping, and pilot implementations to test and validate new data engineering approaches. Offer hands-on guidance for the development of highly scalable and reliable data products.
Provide expert-level technical solutions and support for complex data engineering challenges, diagnosing issues across ingestion, processing, and storage layers.
Collaborate with Data Architects and Engineering Teams to ensure consistency in data engineering patterns across multiple domains, enabling a unified data ecosystem with seamless data interoperability.
Leverage modern product approaches to influence and shape the business, e.g. discovery, rapid prototyping, and embedding a culture of working out loud.
Qualifications
Bachelor's Degree in Computer Science, Software Engineering, or equivalent professional experience.
8-12 years of experience engineering and delivering enterprise-scale data solutions, with demonstrated cloud experience (especially Databricks, Azure, and GCP) strongly preferred.
Additional Skills / Preferences
Proven track record of contributing to and delivering complex data projects.
Expertise in data management, information integration, and analytics practices and capabilities.
Experience working with modern data architecture and engineering methodologies (domain-driven data architecture, scalable data pipelines, DataOps (CI/CD), API-centric design, SQL/NoSQL, FAIR data principles, etc.)
Profound expertise in Azure Data Factory (ADF) for developing robust and scalable data pipelines.
Demonstrated proficiency and extensive experience with GitHub for version control, collaboration, and CI/CD workflows.
Experience working within a "DevSecOps" culture, including modern software development practices covering Continuous Integration and Continuous Delivery (CI/CD), Test-Driven Development (TDD), etc.
Familiarity with machine learning workflows, data quality, and data governance.
Experience working in complex, diverse landscapes (business, technology, regulatory, partners, providers, geographies, etc.).
Good interpersonal and communication skills; proven ability to work effectively within a team.
Data Engineer • KA, India