Job Title : ETL Developer - Data Lake Integration with ServiceNow.
Location : Remote.
Experience Required : 6+ years.
Work Hours : 6:30 PM to 3:30 AM IST.
Employment Type : Full-time.
Job Overview :
We are seeking a highly skilled ETL Developer with strong experience in designing and building data pipelines for enterprise Data Lakes and integrating with ServiceNow.
The ideal candidate will have hands-on expertise in ETL tools and frameworks, excellent knowledge of data ingestion patterns, and a solid understanding of the ServiceNow data model.
This role involves designing, developing, and maintaining scalable ETL solutions that support real-time and batch data loads into centralized Data Lakes.
Key Responsibilities :
- Design and develop robust, scalable ETL pipelines for structured and unstructured data into enterprise Data Lake platforms (e.g., Azure Data Lake, AWS S3, or GCP BigQuery).
- Build integrations and data extraction routines from ServiceNow, including ITSM, ITOM, CMDB, and related modules.
- Perform data transformation, cleansing, mapping, and enrichment using ETL / ELT tools.
- Develop and maintain end-to-end data ingestion workflows for operational and analytics purposes.
- Collaborate with architects, ServiceNow developers, and business teams to define data needs and ensure high data quality.
- Optimize ETL processes for performance, scalability, and reliability.
- Implement job scheduling, logging, error handling, and alerting mechanisms.
- Create technical documentation, data flow diagrams, and maintain metadata management.
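As an illustration of the ServiceNow extraction work described above, here is a minimal sketch of pulling records in batches through the ServiceNow Table API using sysparm_limit / sysparm_offset pagination. The instance name, credentials, and table name are hypothetical placeholders, not details from this role.

```python
# Sketch only: batched record extraction from the ServiceNow Table API.
# Instance, user, password, and table values below are placeholders.
import base64
import json
import urllib.parse
import urllib.request

def table_api_url(instance: str, table: str, limit: int, offset: int) -> str:
    """Build a Table API request URL with sysparm pagination parameters."""
    query = urllib.parse.urlencode(
        {"sysparm_limit": limit, "sysparm_offset": offset}
    )
    return f"https://{instance}.service-now.com/api/now/table/{table}?{query}"

def fetch_records(instance, table, user, password, batch_size=1000):
    """Yield records one page at a time until the API returns an empty page."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    offset = 0
    while True:
        req = urllib.request.Request(
            table_api_url(instance, table, batch_size, offset),
            headers={
                "Accept": "application/json",
                "Authorization": f"Basic {token}",
            },
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            rows = json.load(resp).get("result", [])
        if not rows:
            break
        yield from rows  # hand each record to the downstream ETL step
        offset += batch_size
```

In practice the same pattern applies to any table named in this posting (incident, cmdb_ci, task, alm_asset), and production pipelines would add retry, logging, and alerting around the request loop.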
Required Skills & Experience :
- 6+ years of experience in ETL development, preferably with tools such as Informatica, Talend, Apache NiFi, Azure Data Factory, AWS Glue, or similar.
- Proven expertise in working with Data Lakes (Azure Data Lake, AWS Lake Formation, GCP, etc.) - mandatory.
- Solid SQL skills and experience with relational and non-relational databases.
- Experience in extracting data from ServiceNow via REST / SOAP APIs, direct DB connectors, or ServiceNow Data Export tools.
- Good understanding of the ServiceNow data model, especially for ITSM, CMDB, or ITOM.
- Proficiency in scripting languages (e.g., Python, Shell) to support automation and transformation tasks.
- Familiarity with cloud storage, data warehouses, and modern data lakehouse architecture.
- Knowledge of data governance, data lineage, and security best practices.
Preferred Qualifications :
- Experience in building or supporting Business of IT Data Lakes using ServiceNow data.
- Familiarity with ServiceNow tables like cmdb_ci, incident, task, alm_asset, etc.
- Prior experience with data cataloging and metadata management tools.
- Working knowledge of DevOps, CI / CD pipelines, and source control (e.g., Git).
- ServiceNow Certified System Administrator or ETL / data certifications are a plus.
Soft Skills :
- Strong problem-solving and analytical skills.
- Excellent communication and documentation abilities.
- Ability to work independently in a fast-paced environment and collaborate across teams.
(ref : hirist.tech)