Databricks Engineer
Experience: 7–12 years
Location: Remote
Overview
We are looking for an experienced Databricks Engineer with deep expertise in designing, building, and optimizing modern data platforms on Azure and Databricks. The ideal candidate will serve as the go-to expert for Databricks platform architecture, Unity Catalog, and emerging Databricks capabilities such as Genie, Vector Search, and Agent Bricks.
Key Responsibilities
- Architect, design, and implement scalable, secure, and high-performance data solutions using Azure and Databricks.
- Act as a subject matter expert on Azure Data Services and Databricks platform architecture.
- Design and implement POCs leveraging modern data architecture patterns (e.g., Lakehouse, Delta Lake, Medallion architecture).
- Build and optimize data pipelines for both real-time and batch processing, including API integrations.
- Support DevOps, CI/CD, and monitoring practices for data workloads.
- Implement self-serve data layers tailored to different user personas.
- Collaborate effectively with client architecture and platform teams, demonstrating strong communication and leadership in technical discussions.
- Evaluate and experiment with new Databricks offerings (e.g., Genie, Vector Search, Agent Bricks, LakeFlow) through rapid POCs.
Required Skills & Experience
- 8–12 years of total experience in Data Engineering / Data Architecture.
- Strong hands-on experience with Databricks, Unity Catalog, and Azure Data Services (e.g., Azure Data Factory, Synapse, ADLS).
- Solid understanding of data management, governance, and security principles.
- Experience with Delta Lake, Lakehouse architecture, and modern ETL/ELT best practices.
- Familiarity with Python, PySpark, and SQL for data engineering.
- Exposure to real-time streaming frameworks (e.g., Kafka, Event Hubs) is a plus.
- Excellent communication and client-facing skills.