Job Description
When you join any Marmon Group company, you become part of a winning team!

Who We Are
Marmon Holdings, Inc., part of Berkshire Hathaway Inc., is a global industrial organization comprising more than 100 autonomous manufacturing and service businesses. Each Marmon business operates independently within a group structure that provides access to the expertise of other businesses with related products and services, or that serve the same customers or markets. Marmon businesses operate more than 400 manufacturing, distribution, and service facilities and employ more than 30,000 people worldwide. Revenues exceeded $12 billion in 2024. Our global headquarters is in Chicago, IL, USA.

Marmon Technologies India, located in Bengaluru, is a subsidiary of Marmon Holdings, providing both engineering and non-engineering services to various Marmon businesses. In addition to hosting several trading businesses, Marmon Technologies is poised to soon explore manufacturing and assembly operations for Marmon businesses. The Global Innovation Centre (GIC), an intercompany vertical under Marmon Technologies, supports many of these business groups in engineering and non-engineering functions.

Skills and Capabilities at Marmon GIC: Video link for Marmon GIC

Designation: Sr. Lead Data Engineer
Reporting to: Manager, Software
Location: Bangalore, Full Time
Qualification: B.Tech. / M.Tech. in CS or a related degree
Experience: 8–10 years

Position Overview:
We are seeking an experienced Lead Data Engineer to drive the execution of enterprise data migration from diverse ERP systems into the Microsoft Fabric platform. This role focuses on delivering high-quality, scalable data solutions leveraging Fabric's Lakehouse architecture, Delta Lake, and modern data engineering tools.
The Lead Data Engineer will collaborate closely with technical leads and business stakeholders to understand requirements and ensure the delivery of robust data pipelines and models that support advanced reporting needs. This position demands strong technical expertise combined with effective cross-team collaboration to achieve successful data modernization outcomes.

Key Responsibilities:
- Lead data migration efforts from diverse ERP systems into Microsoft Fabric, leveraging the Lakehouse / OneLake architecture to support Fabric-based reporting.
- Design, build, and maintain robust data pipelines and ETL / ELT workflows using Fabric tools such as Dataflows Gen2, Data Pipelines (built on Azure Data Factory), notebooks, and Delta Lake.
- Implement data modeling and transformation using PySpark, Spark SQL, T-SQL, and Power Query (M), feeding structured datasets into semantic models and Power BI.
- Optimize Power BI datasets and performance, utilizing DAX, semantic modeling, and Direct Lake connectivity to minimize resource utilization and enhance report performance.
- Establish and enforce data governance and security protocols, including role-based access, data lineage, and integration with tools such as Microsoft Purview.
- Collaborate with business teams to translate reporting requirements into technical specifications, oversee data validation, and support testing phases.
- Create technical documentation, data mappings, and migration guides, and deliver training to stakeholders and internal users.
- Mentor and coach team members, enabling knowledge sharing on Microsoft Fabric tools, migration patterns, and performance optimization techniques.
Preferred Experience & Qualifications
- Proven experience as a Data Lead or in a similar role using Microsoft Fabric components: Lakehouse / OneLake, Synapse Data Engineering (Spark), Fabric data pipelines, and Power BI.
- Strong command of SQL / T-SQL, PySpark, Python, or Power Query (M) for transformation tasks.
- Expertise with data warehouse modeling (star / snowflake schemas) and modern architecture patterns (e.g., medallion layers: Bronze / Silver / Gold).
- In-depth proficiency in Power BI, including custom visuals, DAX optimization, and dataset tuning for high-volume reporting.
- Familiarity with DevOps for data, including CI / CD pipelines using Azure DevOps or GitHub Actions for Fabric artifacts and deployments.
- Practical experience with ERP-to-analytics migration, ideally involving tools such as ADF, Synapse, or Fabric migration assistant components.
- Excellent stakeholder collaboration, documentation, and mentoring abilities.