Job Summary:
We are seeking a skilled Data Engineer with a solid background in Property & Casualty (P&C) Insurance data and strong expertise in data integration, migration, and Azure data services. The ideal candidate will design, develop, and optimize data pipelines and architecture to support business intelligence, analytics, and digital transformation initiatives.
MUST HAVE: P&C Insurance experience with Azure, ADF, Data Fabric, and data integration/migration projects is mandatory.
NOTE: Do not apply without P&C domain experience.
Required Skills & Qualifications:
- 5–8+ years of experience as a Data Engineer or related role.
- Strong experience in P&C Insurance domain — familiarity with policy, claims, billing, and underwriting data.
- Proven experience with data integration or migration projects (on-prem to Azure or cross-platform).
- Hands-on experience with:
  - Azure Data Factory (ADF)
  - Azure Databricks/Spark
  - Azure Synapse/Data Lake/SQL Database
  - Python, PySpark, or Scala for data processing
  - SQL and performance tuning
- Strong understanding of data modeling, warehousing concepts, and ETL/ELT frameworks.
- Experience with API-based or file-based data integrations.
- Familiarity with version control (Git), CI/CD pipelines, and Agile delivery methods.
Preferred Skills:
- Experience with insurance data models such as ACORD or ISO.
- Exposure to data governance tools (e.g., Purview, Collibra) and metadata management.
- Knowledge of Power BI or other reporting tools for data consumption.
- Azure certification (e.g., DP-203: Data Engineer Associate) preferred.