Inviting applications for the role of Principal Consultant, Sr. Databricks Developer!
In this role, the Sr. Databricks developer is responsible for solving real-world, cutting-edge problems and for mentoring a team of one or more junior developers to achieve that goal.
Responsibilities
Design and develop end-to-end data pipelines using PySpark, SQL / Spark-SQL, and Delta Lake.
Translate requirements into scalable, high-performance data solutions.
Participate in architecture discussions and provide technical inputs.
Implement ETL / ELT workflows in Lakehouse (Bronze, Silver, Gold layers).
Optimize pipelines, clusters, and queries for performance and cost.
Integrate Databricks with cloud services, APIs, and messaging systems.
Implement data quality checks, unit tests, and integration tests.
Develop both batch and streaming pipelines for real-time use cases.
Contribute to shared assets, frameworks, and accelerators.
Collaborate with leads, architects, and analysts during delivery.
Support data migration projects to Databricks Lakehouse.
Document technical solutions, coding standards, and best practices.
Act as a strategic advisor on Databricks adoption, modernization, and data strategy.
Define enterprise-scale architectures leveraging the Databricks Lakehouse platform.
Establish data governance frameworks with Unity Catalog (lineage, auditing, security, compliance).
Oversee multiple delivery streams, ensuring alignment with standards and best practices.
Drive reusable accelerators, frameworks, and reference architectures.
Engage with stakeholders on cost optimization, innovation, and roadmap planning.
Lead thought leadership initiatives (whitepapers, blogs, conferences).
Mentor and coach developers, senior developers, and leads.
Support Centers of Excellence (CoE) and competency frameworks.
Stay ahead of emerging technologies, trends, and Databricks innovations.
Qualifications we seek in you!
Minimum qualifications
Bachelor's degree in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
Relevant years of experience in data engineering, with hands-on Databricks experience.
End-to-end implementation of at least 2 Databricks projects (migration / integration).
Strong background in batch and streaming data pipelines.
Proficiency in Python (preferred) or Scala for Spark-based development.
Expertise in SQL & Spark-SQL, data structures, and algorithms.
Deep knowledge of Databricks components: Delta Lake, DLT, dbConnect, REST API 2.0, and Workflows orchestration.
Experience designing, developing, and optimizing large-scale batch pipelines for ingestion, transformation, and analytics using Databricks.
Experience building and managing low-latency streaming pipelines using Spark Structured Streaming, Delta Live Tables, or other Databricks-native frameworks to enable real-time insights and decision-making.
Strong in performance optimization for pipelines (efficiency, scalability, cost reduction).
Hands-on experience with Apache Spark, Hive, and Lakehouse architecture.
Cloud expertise (Azure / AWS) includes storage (ADLS / S3), messaging (ASB / SQS), compute (ADF / Lambda), and databases (CosmosDB / DynamoDB / Cloud SQL).
Experience writing unit tests and integration tests for data pipelines.
Ability to work with architects and lead engineers to design solutions meeting functional & non-functional requirements.
Excellent technical skills enabling the creation of future-proof, complex global solutions.
Team player with experience leading teams of 5+ engineers.
Strong communication and client-facing skills.
Stays updated on emerging technologies and industry trends.
Strong analytical and problem-solving abilities.
Positive attitude towards continuous learning and upskilling.
Good to have: understanding of Databricks SQL Endpoints.
Good to have: understanding of Lakeflow Connect and Lakeflow Declarative Pipelines.
Good to have: CI / CD experience building pipelines for Databricks jobs.
Good to have: experience on migration projects building a unified data platform.
Good to have: knowledge of dbt.
Good to have: knowledge of Docker and Kubernetes.
Databricks certification at the Associate or Professional level.
Any one cloud certification (AWS / Azure) at the Associate / Professional level (Data Engineer or Architect) is an added advantage.
Principal Consultant • Hyderabad / Secunderabad, Telangana