Overview
We are seeking an experienced Databricks Consultant to join our Data Practice team. As part of our specialized data services group, you will leverage deep Databricks expertise to solve complex data challenges for our clients. The ideal candidate will combine technical Databricks platform knowledge with data strategy consulting skills to deliver high-value, scalable data solutions that drive business outcomes.
Responsibilities
- Serve as a key technical advisor within our Data Practice, specializing in Databricks Lakehouse Platform implementation
- Assess clients' existing data landscapes and develop strategic roadmaps for Databricks adoption
- Design and implement end-to-end data engineering solutions using Databricks
- Modernize legacy data warehouses and data lakes by migrating to Databricks Lakehouse architecture
- Develop optimized data pipelines using Apache Spark, Delta Lake, and Databricks Workflows
- Implement data governance frameworks using Unity Catalog and other Databricks tools
- Collaborate with other practice areas (Analytics, ML/AI, Cloud) to deliver integrated solutions
- Lead client workshops and discovery sessions to gather requirements and align on solutions
- Provide thought leadership through knowledge sharing, blog posts, and community participation
- Mentor junior consultants and contribute to the growth of our Data Practice capabilities
Requirements
- 10 years of experience in data engineering and analytics consulting
- 3 years of hands-on experience implementing Databricks solutions for enterprise clients
- Strong understanding of modern data architecture patterns and the Lakehouse paradigm
- Proficient in building data pipelines using Spark and Delta Lake on Databricks
- Experience integrating Databricks with broader data ecosystems and BI tools
- Ability to translate business requirements into technical solutions
- Experience in one or more cloud platforms where Databricks is deployed (AWS, Azure, GCP)
- Strong client-facing communication skills and a consulting mindset
Technical Skills
- Databricks Platform: Databricks Workflows, Unity Catalog, Databricks SQL
- Data Engineering: Apache Spark, Delta Lake, ETL/ELT pipelines
- Languages: Python, SQL, Scala (preferred)
- Cloud Platforms: AWS, Azure, GCP
- Data Integration: Experience with common data sources and targets
- Analytics & BI: Integration with tools like Power BI, Tableau, or Looker
Consulting Skills
- Requirements gathering and solution design
- Technical documentation and presentation
- Client relationship management
- Project estimation and planning
- Workshop facilitation
- Change management
- Key Skills: Databricks, Spark, Delta Lake, Workflows, Unity Catalog
- Languages: Python & SQL mandatory, Scala is optional
- Pipeline Expertise: Strong hands-on experience with ETL/ELT, Apache Spark, and Delta Lake on Databricks
- Domain Flexibility: Healthcare domain is a plus, but not mandatory
- Platform Focus: Preference for Azure Databricks, but AWS is also acceptable
- BI Tool Integration: Flexibility among Power BI, Tableau, or Looker
- Cloud Platform: Azure > AWS > GCP (flexible)
Education
BE, Any Graduate
Employment Type: Full Time
Experience: 10 years
Vacancy: 1