Sr. Data Engineer (Databricks)

xponent.ai · Narela, Delhi, IN
Job description

We are an Australian Data and AI company expanding our Databricks capability in India, and we are seeking a Senior Databricks Data Engineer to build scalable, production-grade data platforms that enable AI and advanced analytics for our global clients. As a team, we are open, transparent, no-nonsense, innovative problem solvers and eternal learners; individual passion and brilliance are combined with a strong team spirit to nurture everyone's journey to be their best and achieve what they aspire to.

Role Overview:

As a Senior Databricks Data Engineer, you will architect, design, and develop the data pipelines and modeling layers that power analytical and AI-driven workloads. You will work on end-to-end Databricks projects, from ingestion to curation to enablement for ML/AI, while collaborating with consulting and client delivery teams across regions.

Responsibilities:

  • Solution Ownership: Take end-to-end responsibility for project delivery, including planning, prioritization, and ensuring data solutions meet quality and performance expectations.
  • Stakeholder Collaboration: Engage with business and product teams to understand data needs, translate requirements, and align engineering solutions with strategic goals.
  • Cross-Functional Communication: Present complex data concepts in a clear, actionable manner to non-technical stakeholders and leadership.
  • Continuous Improvement: Identify process inefficiencies and lead initiatives to improve data workflows, scalability, and team productivity.
  • Design and build scalable, reliable data pipelines using Databricks Auto Loader, Delta Live Tables (DLT), and Workflows (a brief sketch follows this list).
  • Develop and maintain curated Medallion architecture layers (Bronze, Silver, Gold) using Delta Lake and Databricks SQL.
  • Orchestrate and deploy pipelines using Databricks Asset Bundles (DABs) and integrate with CI/CD tools (GitHub Actions, Azure DevOps).
  • Implement data ingestion using Lakeflow Connect and integrate with external sources such as Kafka, ADLS, and REST APIs.
  • Develop real-time and batch processing jobs with PySpark, SQL, and Delta Lake.
  • Ensure data quality, lineage, and governance using Unity Catalog and DLT Expectations.
  • Collaborate with data scientists and ML engineers to prepare feature-ready datasets for MLflow and AI workloads.
  • Optimise Databricks clusters and jobs for cost, performance, and reliability.
  • Participate in client workshops, provide architectural input, and communicate technical outcomes effectively to business stakeholders.
  • Take ownership of projects, ensuring exceptional customer outcomes.
  • Contribute to a data-driven culture that fosters innovation and agility.
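
To give a flavour of this work, here is a minimal sketch of the kind of pipeline described above, assuming it runs inside a Databricks Delta Live Tables pipeline (where a spark session is provided): Auto Loader ingestion into a Bronze table, and a DLT expectation guarding the Silver layer. The storage path, table names, and columns are illustrative placeholders only, not part of any specific project.

```python
# Minimal DLT sketch: Bronze ingestion with Auto Loader, Silver guarded by an expectation.
# Assumes execution inside a Databricks DLT pipeline, where `spark` is predefined;
# the ADLS path, table names, and columns are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw orders ingested incrementally with Auto Loader.")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")          # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("abfss://landing@examplestore.dfs.core.windows.net/orders/")  # placeholder path
    )

@dlt.table(comment="Silver: validated orders, ready for Gold aggregates and ML workloads.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # DLT expectation drops bad rows
def orders_silver():
    return dlt.read_stream("orders_bronze").withColumn("ingested_at", F.current_timestamp())
```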

Required Skills & Competencies

3-5 years of hands-on experience with Databricks, including proficiency in:

  • Demonstrated experience in client-facing delivery, including conducting online workshops, eliciting data requirements, and presenting technical findings to business and technical stakeholders.
  • Databricks Certification (Data Engineer Associate or Professional) required.
  • PySpark, SQL, and data modeling (Dimensional, ELT).
  • Medallion architecture and Delta Lake optimization (Z-Ordering, Optimize, Compaction), as sketched after this list.
  • Unity Catalog, Workflows, and Feature Store.
  • Azure Data Services (ADLS, ADF, Synapse) or equivalent on AWS/GCP.
  • Auto Loader for incremental ingestion.
  • Delta Live Tables (DLT) for declarative ETL pipelines.
  • Lakeflow Connect for source integration.
  • Databricks SQL for analytics and transformation.
  • Ability to thrive in a fast-paced, entrepreneurial team environment.
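
As an illustration of the Delta Lake optimization point above, here is a minimal sketch, assuming a Databricks notebook or job with an active Spark session; the Unity Catalog table name and Z-Order columns are placeholders.

```python
# Minimal sketch of routine Delta Lake maintenance on Databricks.
# The three-level table name and Z-Order columns are placeholders, not a real schema.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate data on frequently filtered columns (Z-Ordering).
spark.sql("OPTIMIZE main.sales.orders_gold ZORDER BY (customer_id, order_date)")

# Remove data files no longer referenced by the table (default retention window applies).
spark.sql("VACUUM main.sales.orders_gold")
```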

Preferred (Nice to have)

  • Prior experience in Retail, CPG, or Transportation & Logistics domains.
  • CI/CD skills for Databricks using GitHub Actions or Azure DevOps, and Databricks Asset Bundles (DABs) for deployment automation.
  • Experience implementing data quality monitoring and governance frameworks.

What we offer

At our core, we are building a youthful, forward-looking Data & AI brand, where engineers, consultants, and innovators thrive on solving meaningful problems with cutting-edge technologies.

We offer an environment that blends technical depth, consulting exposure, and personal growth in equal measure.

  • Flexible Work Options – Work 100% remotely or in a hybrid mode from our Chennai base — we trust our people to deliver outcomes, not fill timesheets.
  • Learning & Mentorship Culture – Work directly with senior data and AI leaders who invest in your career growth, leadership development, and technical mastery.
  • High-Calibre Clients & Projects – Collaborate with mature global clients pursuing ambitious Data and AI transformation goals — exposure to real enterprise-grade challenges.