Wissen Technology is Hiring for MHR Azure DE Lead
About Wissen Technology:
At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset—ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.

Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don’t just meet expectations—we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives. We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more.

Wissen stands apart through its unique delivery models. Our outcome-based projects ensure predictable costs and timelines, while our agile pods give clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact—the first time, every time.
Job Summary: We are seeking a highly skilled Senior Data Engineer with deep expertise in Microsoft Fabric to design, build, and optimize scalable data solutions. In this role, you will be responsible for end-to-end architecture and implementation of modern data pipelines, Lakehouse solutions, and data transformation processes that drive business insights and advanced analytics.
Experience: 10-12 Years
Location: Bangalore
Mode of Work: Full Time
Key Responsibilities:
Data Architecture & Engineering
- Design and implement scalable data ingestion pipelines using Microsoft Fabric Data Factory (Dataflows) and OneLake.
- Build and manage Lakehouse and Warehouse architectures leveraging Delta Lake, Spark Notebooks, and SQL Endpoints.
- Define and enforce Medallion Architecture (Bronze, Silver, Gold) standards for structured data refinement and quality control.
Data Modeling & Transformation
- Develop robust and reusable data transformation logic using PySpark, Scala, and Fabric SQL.
- Implement advanced data modeling strategies, including:
  - Slowly Changing Dimensions (SCD Type 2)
  - Change Data Capture (CDC)
  - Time-windowed aggregations and historical trend analysis
Performance Optimization
- Monitor and optimize data pipeline performance with a focus on throughput, cost-efficiency, and reliability.
- Apply tuning techniques including partitioning, indexing, caching, and parallelism across Lakehouse and Warehouse compute.
Data Quality & Governance
- Integrate Microsoft Purview for metadata management, lineage tracking, and data cataloging.
- Establish automated data quality rules, anomaly detection mechanisms, and proactive alerting systems.
DevOps & Automation
- Implement Infrastructure-as-Code (IaC) using ARM templates or Terraform to provision Microsoft Fabric resources.
- Set up and maintain CI/CD pipelines using Git and Azure DevOps for deployment and version control across environments.
Stakeholder Collaboration & Support
- Collaborate with data scientists, BI developers, and business stakeholders to understand requirements and deliver impactful data solutions.
- Provide production support, troubleshoot pipeline issues, and lead root-cause analysis to ensure system stability.
Requirements:
- 10–12 years of professional experience in data engineering, with a minimum of 1 year of hands-on experience with Microsoft Fabric.
- Proficiency in:
  - Languages: SQL (T-SQL), Python, Scala
  - Microsoft Fabric components: Data Factory Dataflows, OneLake, Spark Notebooks, Lakehouse, Warehouse
  - Data formats: Delta Lake, Parquet, CSV, JSON
- Strong understanding of data modeling techniques: star schema, snowflake schema, normalized/denormalized structures.
- Experience with CI/CD practices and Infrastructure-as-Code (IaC) using Git, ARM templates, or Terraform.
- Familiarity with data governance platforms such as Microsoft Purview.
- Excellent problem-solving and analytical skills with the ability to articulate complex technical concepts to diverse audiences.
Good To Have Skills:
- Microsoft Certified: Fabric Analytics Engineer or related certifications.
- Experience in agile development environments.
- Knowledge of cost management and optimization strategies in cloud environments.
Wissen Sites: