Location: Baner, Pune (Work from Office)
Experience: 7+ years
Working Days: Monday to Friday (9:00 AM to 6:00 PM)
Education: Bachelor's or Master's in Computer Science, Engineering, Mathematics, or a related field
Key Responsibilities:
- Design and develop robust data pipelines for large-scale data ingestion, transformation, and analytics.
- Implement scalable Lakehouse architectures using tools like Microsoft Fabric for structured and semi-structured data.
- Work with Python, PySpark, and Azure services to support data modelling, automation, and predictive insights.
- Develop custom KQL queries and manage data using Power BI, Azure Cosmos DB, or similar tools.
- Collaborate with cross-functional teams to integrate data-driven components with application backends and frontends.
- Ensure secure, efficient, and reliable CI/CD pipelines for automated deployments and data updates.
Skills & Experience Required:
- Strong proficiency in Python, PySpark, and cloud-native data tools
- Experience with Microsoft Azure services (e.g., App Services, Functions, Cosmos DB, Active Directory)
- Hands-on experience with Microsoft Fabric (preferred or good to have)
- Working knowledge of Power BI and building interactive dashboards for business insights
- Familiarity with CI/CD practices for automated deployments
- Exposure to machine learning integration into data workflows (nice to have)
- Strong analytical and problem-solving skills with attention to detail
Good to Have:
- Experience with KQL (Kusto Query Language)
- Background in simulation models or mathematical modeling
- Knowledge of Power Platform integration (Power Pages, Power Apps)
Benefits:
- Competitive salary
- Health insurance coverage
- Professional development opportunities
- Dynamic and collaborative work environment
(ref: hirist.tech)