The Data Engineer plays a pivotal role in this vision.
You won't just be moving data from point A to point B; you'll be designing the pipelines and platforms that power predictive services, anomaly detection, and intelligent automation.
From ServiceNow tickets to mainframe telemetry, you'll turn raw, messy signals into high-quality, AI/ML-ready datasets that fuel real-time insights and proactive operations.
This is not a back-office role; it is a frontline enabler of transformation.
The work you do directly impacts uptime, cost optimization, and Ensono's ability to move from manual, reactive support to a zero-touch, predictive model.
We are looking for engineers who don't just architect pipelines but get stuff done: builders who can deliver working solutions, iterate quickly, and collaborate with data scientists, ML engineers, and ops teams to make sure models don't just run in notebooks but actually change how work gets done.
If you want to be part of the team that's rewiring managed services for the AI era, this is your role.
What You Will Do:
- Data Pipeline Development: Build, optimize, and maintain ELT/ETL pipelines that move, clean, and organize data from ServiceNow, mainframe, distributed, and cloud systems.
- Integration with ServiceNow: Develop robust data extraction, transformation, and ingestion patterns tailored for operational data (incidents, alerts, changes, requests) to make it AI/ML-ready.
- Data Infrastructure & Architecture: Design scalable data models, storage frameworks, and integration layers in Snowflake and other modern platforms.
- Data Quality & Governance: Implement standards, monitoring, and validation frameworks to ensure clean, trustworthy data across all pipelines.
- Collaboration with AI/ML Teams: Partner with Data Scientists, ML Engineers, and MLOps to deliver production-grade datasets powering predictive models, anomaly detection, and intelligent runbooks.
- Automation & Optimization: Identify opportunities to streamline data workflows, reduce manual intervention, and lower costs while improving reliability.
- Cross-functional Enablement: Work with Finance, Procurement, Cloud Ops, Mainframe Ops, and Service Operations teams to ensure data is aligned to high-value business outcomes.
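To make the ServiceNow integration work above concrete: a typical first step is normalizing raw incident records from the ServiceNow Table API into flat, typed rows before loading them into Snowflake. The sketch below is purely illustrative, not Ensono's actual pipeline; the field list and the derived `minutes_to_resolve` label are assumptions chosen for the example.

```python
from datetime import datetime

# Hypothetical field selection; a real pipeline would drive this from config.
FIELDS = ["number", "priority", "opened_at", "resolved_at", "assignment_group"]
TS_FORMAT = "%Y-%m-%d %H:%M:%S"  # ServiceNow's default datetime format

def flatten_incident(record: dict) -> dict:
    """Normalize one raw ServiceNow incident row into a flat, typed dict."""
    row = {}
    for field in FIELDS:
        value = record.get(field)
        # With sysparm_display_value=all, the Table API wraps fields as
        # {"display_value": ..., "value": ...} objects; unwrap the raw value.
        if isinstance(value, dict):
            value = value.get("value")
        row[field] = value or None  # empty strings become NULLs downstream
    # Derive an ML-friendly label: minutes from open to resolution.
    try:
        opened = datetime.strptime(row["opened_at"], TS_FORMAT)
        resolved = datetime.strptime(row["resolved_at"], TS_FORMAT)
        row["minutes_to_resolve"] = (resolved - opened).total_seconds() / 60
    except (TypeError, ValueError):
        row["minutes_to_resolve"] = None  # unresolved or malformed timestamps
    return row
```

Rows shaped this way can then be batch-loaded into a Snowflake staging table, with validation checks (null rates, timestamp sanity) applied before the data is exposed to model training.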
Required Skills & Experience:
- Strong SQL and data modeling skills.
- Expertise in ELT/ETL pipeline development and orchestration.
- Python (must-have), plus at least one of Java, Scala, or C++.
- Hands-on experience with Snowflake or equivalent cloud data warehouse platforms.
- Proven experience extracting, transforming, and operationalizing data from ServiceNow, common monitoring platforms, and other enterprise systems (Workday, Concur, etc.).
- Familiarity with observability tooling and distributed data systems.
- Knowledge of enterprise data governance, compliance, and lineage.
- 4+ years of experience preferred.
- Bonus: experience working directly with AI/ML feature pipelines.
Mindset & Values:
- Get Stuff Done: You're biased toward execution and results, not endless design cycles.
- Business Impact Driven: You build pipelines that move the needle on uptime, cost reduction, and predictive operations.
- Collaborative Partner: You thrive in a cross-functional environment, sitting at the intersection of Ops, AI/ML, and business stakeholders.
- Continuous Learner: Always looking for ways to apply new tools and technologies to accelerate delivery.
Success Looks Like:
- Reliable pipelines that pull ServiceNow data into Snowflake for real-time incident prediction.
- Faster transition of AI/ML proofs of concept into production pipelines.
- Demonstrated cost savings through automated workload optimization and capacity forecasting.
- Predictive services that scale seamlessly across mainframe, distributed, and cloud environments.
(ref: hirist.tech)