You will be based in our Bangalore or Gurugram office as a member of Periscope’s technology team.
Periscope is the asset-based arm of McKinsey’s Marketing & Sales practice and is at the leading edge of the new ways we serve clients. This integrated model of serving clients, combining our generalist consulting approach with our solutions, is proof of the firm’s commitment to continued innovation and to bringing the best of the firm to our clients.
Periscope® by McKinsey enables better commercial decisions by uncovering actionable insights. The Periscope platform combines world-leading intellectual property, prescriptive analytics, and cloud-based tools to provide more than 25 solutions focused on insights and marketing, with expert support and training. It is a unique combination that drives revenue growth both now and in the future. Customer experience, performance, pricing, category, and sales optimization are all powered by the Periscope platform. Periscope has a presence in 26 locations across 16 countries, with a team of 600+ business and IT professionals and a network of 300+ experts. To learn more about how Periscope’s solutions and experts are helping businesses continually drive better performance, visit http://www.periscopesolutions.com/
You will be a core member of Periscope’s technology team with responsibilities that range from developing and deploying our core enterprise products to ensuring that McKinsey’s craft stays on the leading edge of technology.
In this role you will design, develop, and maintain scalable data pipelines and systems using Databricks on Azure. You will implement and optimize ETL processes with a focus on performance, cost-efficiency, and reliability. You will utilize advanced Databricks technologies (DLT, Unity Catalog, Delta Sharing, SQL Warehouse) to enhance data management and sharing capabilities.
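To make the ETL responsibility above concrete, here is a minimal sketch of the extract/transform/load pattern in plain Python. On Databricks this logic would typically be written with PySpark or Delta Live Tables; plain Python is used here only so the example is self-contained, and all field names and cleaning rules are illustrative assumptions, not part of any actual Periscope pipeline.

```python
# Minimal extract -> transform -> load sketch. The record schema
# (order_id, amount, region) and validation rules are hypothetical.

def extract():
    """Stand-in for reading raw records from a source table or file."""
    return [
        {"order_id": 1, "amount": "100.50", "region": " emea "},
        {"order_id": 2, "amount": "N/A", "region": "amer"},  # bad amount
        {"order_id": 3, "amount": "42.00", "region": "apac"},
    ]

def transform(rows):
    """Clean and normalize records, dropping rows that fail validation."""
    cleaned = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # skip rows with unparseable amounts
        cleaned.append({
            "order_id": row["order_id"],
            "amount": amount,
            "region": row["region"].strip().upper(),
        })
    return cleaned

def load(rows, target):
    """Stand-in for writing to a Delta table; appends to an in-memory list."""
    target.extend(rows)
    return len(rows)

if __name__ == "__main__":
    warehouse = []
    loaded = load(transform(extract()), warehouse)
    print(f"loaded {loaded} rows")
```

In a Databricks notebook each stage would map naturally onto a DataFrame read, a chain of transformations, and a Delta table write, with the same separation of concerns.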
You will lead and execute data engineering projects from inception to completion, ensuring timely delivery and high quality, and you will continuously monitor, troubleshoot, and improve data pipelines and workflows for optimal performance and cost-effectiveness.
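One lightweight way to approach the monitoring and troubleshooting described above is to time each pipeline run and retry transient failures with exponential backoff, emitting a simple run report for alerting. The sketch below uses only the Python standard library; the pipeline step and its failure mode are hypothetical placeholders, not a description of any specific Databricks tooling.

```python
# Hedged sketch: run a pipeline step with retries, backoff, and timing.
import time

def run_with_retries(step, max_attempts=3, base_delay=0.01):
    """Run `step` up to `max_attempts` times, backing off between failures.

    Returns a small report dict suitable for logging or alerting.
    RuntimeError is treated as a transient, retryable failure here;
    a real pipeline would match on its own transient error types.
    """
    last_error = None
    for attempt in range(1, max_attempts + 1):
        start = time.monotonic()
        try:
            result = step()
            return {
                "status": "succeeded",
                "attempts": attempt,
                "duration_s": time.monotonic() - start,
                "result": result,
            }
        except RuntimeError as exc:
            last_error = str(exc)
            time.sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff
    return {"status": "failed", "attempts": max_attempts, "error": last_error}
```

For example, wrapping a flaky step that succeeds on its third attempt yields a report with `status="succeeded"` and `attempts=3`, which can feed directly into dashboards or paging rules.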
You will collaborate with cross-functional teams to understand data needs and deliver solutions that meet business requirements, and you will mentor junior engineers on best practices, emerging technologies, and efficient data engineering techniques.