Description :
Experience : 10-15 Years
Role Type : Full-Time / Leadership
Domain : Data Engineering, BI & Analytics
1. Role Overview :
The Data Warehouse Delivery Manager will lead the end-to-end delivery of enterprise data engineering, analytics, and BI platforms.
This role is responsible for managing multiple data projects across Azure Data Factory (ADF), Databricks, Power BI, and modern Data Warehouse architectures.
The manager will oversee requirement analysis, solution design, development, deployment, release management, and stakeholder communication while ensuring on-time, high-quality delivery.
2. Key Responsibilities :
A. Delivery & Project Management :
- Own the full delivery lifecycle for Data Warehouse, ADF pipelines, Databricks notebooks, and Power BI solutions.
- Define delivery plans, sprint schedules, resource allocations, risks, dependencies, and timelines.
- Ensure adherence to SDLC, DevOps, CI/CD, data governance, and release management best practices.
- Track project KPIs, velocity, backlog grooming, and ensure milestone compliance.
- Facilitate daily stand-ups, sprint reviews, and release planning sessions.
B. Technical Leadership :
- Lead architecture discussions for ADF pipelines, ETL/ELT frameworks, Data Lakehouse patterns, and DWH transformations.
- Review high-level and low-level design (HLD/LLD) documents for data pipelines, Delta Lake tables, medallion architecture, and Power BI semantic models.
- Define and enforce coding standards, pipeline patterns, and QA/automation checkpoints.
- Provide technical guidance on :
  - ADF : orchestration, pipeline optimization, triggers, parameterization, IR management.
  - Databricks : PySpark, Delta Lake, structured streaming, performance tuning.
  - Data Warehouse : dimensional modeling, SCD types, fact/dim design, incremental loads, CDC.
  - Power BI : data modeling, DAX governance, RLS/OLS, gateway configuration, performance tuning.
C. Stakeholder & Client Management :
- Act as the primary liaison between business stakeholders, SMEs, data architects, and development teams.
- Convert ambiguous high-level business requirements into detailed technical specifications.
- Present architectural options, progress updates, risk registers, mitigation plans, and dashboards.
- Manage escalations, SLAs, quality adherence, and deliverable review cycles.
D. Team Leadership & People Management :
- Lead a multi-disciplinary team of Data Engineers, BI Developers, QA, and DevOps engineers.
- Mentor and upskill team members on ADF, Databricks, SQL performance tuning, DAX optimization, and dataset governance.
- Conduct performance reviews, identify training needs, and develop capability maturity within the team.
E. Governance, Quality & Compliance :
- Establish frameworks for data validation, data quality checks, schema enforcement, and data lineage.
- Ensure compliance with data security, PII handling, encryption, and Azure policies.
- Implement monitoring frameworks using Azure Monitor, Log Analytics, Databricks metrics, and Power BI usage analytics.
3. Required Skills & Experience :
Core Technical Skills
- 8+ years of experience in modern Data Warehouse engineering in the Azure ecosystem.
- Hands-on expertise in :
  - Azure Data Factory (ADF) : pipelines, dataflows, triggers, parameterization, CI/CD.
  - Databricks : PySpark, SQL, Delta Lake, notebooks, clusters, optimization.
  - Power BI : data modeling, DAX, Power Query, incremental refresh, RLS/OLS.
  - Azure Data Lake Storage (ADLS) : folder structures, ACLs, best practices.
  - SQL : T-SQL, UDFs, stored procedures, performance tuning.
  - Data Warehouse concepts : dimensional modeling, SCD, fact/dim design, ETL/ELT.
Management & Leadership Skills
- Minimum 3-5 years in delivery or program management roles.
- Proven ability to manage cross-functional teams in agile and hybrid delivery models.
- Strong experience in stakeholder management, escalation handling, and client communication.
- Ability to prioritize tasks, manage budgets, and deliver complex multi-stream programs.
4. Good to Have :
- Experience with Azure Synapse, Fabric, or Snowflake.
- Exposure to metadata-driven frameworks or DataOps automation.
- Experience with enterprise data governance (Purview).
- Understanding of DevOps (Azure DevOps, Git, CI/CD).
- Knowledge of API, Logic Apps, or Event Hub integrations.
5. Education :
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or equivalent.
6. Personal Attributes :
- Strong problem-solving skills with a data-driven mindset.
- Excellent communication, documentation, and presentation skills.
- Ability to work under pressure and manage multiple priorities.
- High ownership, accountability, and commitment to delivery excellence.
(ref : hirist.tech)