Looking for a Technical Architect – Data
Experience: 12+ years
JD:
Mandatory:
- Architectural experience with Microsoft Fabric, including Fabric Workspaces, Lakehouses, Real-Time Analytics, DirectLake, and semantic layer generation.
- Minimum 2–3 Azure data implementations involving on-prem to cloud migration and end-to-end solution delivery.
- Expertise in Azure Data Factory, Azure Databricks, Azure Synapse Analytics, ADLS Gen2, Azure SQL DB, and Blob Storage.
- Experience building batch processing solutions using Azure Data Factory and Azure Databricks.
- Proficiency in Python, SQL, or Scala for scalable transformation and data engineering.
- Strong command of Apache Spark (PySpark or Spark SQL) and Delta Lake for big data pipelines.
- Experience working with streaming data sources; Kafka preferred.
Preferred:
- Non-relational cloud data stores: solutions using Cosmos DB and the Cosmos DB APIs
- Real-time processing using Stream Analytics and Azure Databricks
- Deep understanding of data modeling, schema design, and modern data warehousing techniques
- Hands-on knowledge of CI/CD, version control with Git, and IaC with Terraform/ARM
- Data security policies and standards
- Data encryption at rest and in transit
- Data auditing and data masking
- Data privacy and data classification
- Data retention policies