About the Role:
We are seeking an experienced Enterprise Data Manager with deep expertise in Azure Data Factory (ADF) and cloud-based data management solutions.
The ideal candidate will lead enterprise-wide data initiatives, oversee ETL / ELT pipelines, and drive data strategy across business units.
This role requires a strong combination of hands-on technical skill, leadership, and stakeholder management to ensure high-quality, scalable, and reliable data solutions.
The Enterprise Data Manager will work closely with business stakeholders, architects, and engineering teams to manage data integration, governance, and analytics pipelines on Azure Cloud.
Key Responsibilities:
- Lead enterprise data management initiatives including data integration, transformation, and migration using Azure Data Factory.
- Define and enforce data governance, quality, and security standards across the organization.
- Develop and implement data architecture standards, ensuring alignment with overall enterprise architecture.
- Act as a subject matter expert in Azure cloud data services, including ADF, Data Lake, Synapse Analytics, and Databricks.
- Design, develop, and maintain ETL / ELT pipelines in Azure Data Factory to move and transform data efficiently.
- Monitor, optimize, and troubleshoot ADF pipelines, ensuring high availability and performance.
- Collaborate with Data Engineers and Data Analysts to ensure data is accessible, accurate, and reliable.
- Implement reusable templates, modular pipelines, and best practices for scalable data integration.
- Lead and mentor a team of Data Engineers and ETL developers, fostering a culture of innovation and best practices.
- Work closely with Data Architects, Business Analysts, and Product Owners to translate business requirements into technical solutions.
- Ensure effective project management and timely delivery of data solutions in line with business priorities.
- Conduct code reviews, pipeline audits, and ensure adherence to coding, documentation, and operational standards.
- Define data lineage, data cataloging, and metadata management standards.
- Implement security and compliance measures to safeguard enterprise data (e.g., GDPR, HIPAA).
- Collaborate with IT Security and Compliance teams to enforce access controls and audit requirements.
- Monitor pipeline performance, storage utilization, and data throughput.
- Identify bottlenecks and implement performance tuning, cost optimization, and automation strategies.
- Establish metrics, dashboards, and reporting mechanisms to track data pipeline health and usage.
Required Skills & Expertise:
- 8+ years of experience in data engineering, ETL / ELT, or data integration.
- Strong hands-on experience with Azure Data Factory, Azure Data Lake, Synapse Analytics, Databricks, and related cloud services.
- Proficiency in SQL, Python, or other data transformation languages.
- Experience designing and implementing scalable, high-performance ETL pipelines.
- Strong understanding of data governance, data quality, and metadata management.
- Familiarity with version control (Git), CI / CD pipelines, and data deployment best practices.
- Excellent analytical, problem-solving, and communication skills.