Senior Principal Consultant - Databricks Architect!

Genpact, Bangalore, India
Job description

Ready to shape the future of work?

At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges.

If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Databricks Developer!

In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems while meeting both functional and non-functional requirements.

Responsibilities

Maintain close awareness of new and emerging technologies and their potential application to service offerings and products.

Work with architects and lead engineers to design solutions that meet functional and non-functional requirements.

Demonstrate knowledge of relevant industry trends and standards.

Demonstrate strong analytical and technical problem-solving skills.

Must have experience in the Data Engineering domain.

Qualifications we seek in you!

Minimum qualifications

Bachelor’s degree in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.

Must have excellent coding skills in either Python or Scala, preferably Python.

Must have implemented at least 4 projects end-to-end in Databricks.

Must have experience with Databricks, covering the various components listed below:

Must-have skills: Azure Data Factory, Azure Databricks, Python, and PySpark.

Expertise with database technologies and ETL tools.

Hands-on experience designing and developing scripts for custom ETL processes and automation with Azure Data Factory, Azure Databricks, Delta Lake, Databricks Workflows orchestration, Python, PySpark, etc.

Good knowledge of the Azure, AWS, and GCP cloud platform service stacks.

Good knowledge of Unity Catalog implementation.

Good knowledge of integration with other tools such as dbt and other transformation tools.

Good knowledge of Unity Catalog integration with Snowflake.

Must be well versed in the Databricks Lakehouse concept and its implementation in enterprise environments.

Must have a good understanding of how to create complex data pipelines.

Must have good knowledge of data structures and algorithms.

Must be strong in SQL and Spark SQL.

Must have strong performance optimization skills to improve efficiency and reduce cost.

Must have worked on both batch and streaming data pipelines.

Must have extensive knowledge of the Spark and Hive data processing frameworks.

Must have worked on a major cloud (Azure, AWS, or GCP) and its most common services, such as ADLS / S3, ADF / Lambda, Cosmos DB / DynamoDB, ASB / SQS, and cloud databases.

Must be strong in writing unit test cases and integration tests.

Must have strong communication skills and have worked on a team of five or more.

Must have a great attitude towards learning new skills and upskilling existing ones.

Preferred Qualifications

Good to have Unity Catalog and basic governance knowledge.

Good to have an understanding of Databricks SQL Endpoints.

Good to have CI/CD experience building pipelines for Databricks jobs.

Good to have worked on a migration project to build a unified data platform.

Good to have knowledge of dbt.

Good to have knowledge of Docker and Kubernetes.

Why join Genpact?

Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation

Make an impact – Drive change for global enterprises and solve business challenges that matter

Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities

Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day

Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up.

Let’s build tomorrow together.
