Data Engineer and Operations Lead
WHO WE ARE:
Saks Global is the largest multi-brand luxury retailer in the world, comprising Saks Fifth Avenue, Neiman Marcus, Bergdorf Goodman, Saks OFF 5TH, Last Call and Horchow. Its retail portfolio includes 70 full-line luxury locations, additional off-price locations and five distinct e-commerce experiences. With talented colleagues focused on delivering on our strategic vision, The Art of You, Saks Global is redefining luxury shopping by offering each customer a personalized experience that is unmistakably their own.
By leveraging the most comprehensive luxury customer data platform in North America, cutting-edge technology, and strong partnerships with the world's most esteemed brands, Saks Global is shaping the future of luxury retail.
Saks Global Properties & Investments includes Saks Fifth Avenue and Neiman Marcus flagship properties and represents nearly 13 million square feet of prime U.S. real estate holdings and investments in luxury markets.
YOU WILL BE:
You will play a pivotal role in building the data engineering and commercial operations capabilities for Saks Global, focusing on designing, implementing, and optimizing sophisticated data workflows tailored to our fast-paced marketing environment. You will take ownership of end-to-end data solutions, from ingestion to transformation, enabling business intelligence, analytics, and data science use cases across the organization. You will lead efforts to design modern, cloud-native data pipelines, establish strong engineering standards, and ensure data quality, performance, and reliability across the organization.
WHAT YOU WILL DO:
Your responsibilities include ensuring seamless data delivery to analytics platforms, proactively improving data reliability and performance, and working closely with stakeholders to drive strategic marketing decisions through high-quality data.
Modern Data Stack Development: Implement cutting-edge data orchestration and transformation frameworks (e.g., Airflow, dbt). Using Airflow deployed on Kubernetes, develop, enhance, and oversee robust pipelines that integrate diverse and complex marketing data sources, including advanced customer segmentation models, campaign analytics, and multichannel advertising insights.
Architect and Scale: Design and evolve data architecture and pipelines that support multi-source, high-volume data processing across batch and streaming workflows.
Architect, implement, and optimize data warehouse solutions (e.g., Snowflake, BigQuery).
Work with analysts and data scientists to deliver clean, reliable, and well-modeled data sets.
Build and automate data ingestion frameworks using APIs, streaming, and batch processes.
Implement data validation, quality monitoring, and governance standards.
Manage and optimize data workflows through orchestration tools such as Airflow.
Support performance tuning, cost optimization, and cloud resource management.
Collaborate closely with engineering, product, and analytics teams to translate business requirements into data solutions.
Contribute to code reviews, best practices, and CI/CD pipelines for data engineering.
WHAT YOU WILL BRING:
Hands-on experience architecting, deploying, and managing scalable data solutions on cloud platforms such as Google Cloud Platform or AWS.
Strong knowledge of streaming technologies, event-driven architecture, and real-time data integration frameworks (e.g., Apache Kafka).
Demonstrated expertise in implementing sophisticated data quality processes, validation methods, and compliance standards within marketing analytics.
Extensive experience with agile methodologies, CI/CD processes, and continuous improvement practices.
Deep understanding of luxury retail analytics, customer lifecycle analytics, and specific business performance metrics relevant to high-end retail environments.
Exceptional interpersonal skills, with a proven track record of collaborating effectively with technical teams and business stakeholders.
Key Qualifications:
Bachelor’s degree in Computer Science, Information Systems, Engineering, or related technical discipline.
7+ years of professional experience building and managing data pipelines using Apache Airflow, particularly in Kubernetes environments.
Extensive experience integrating, managing, and optimizing complex data streams from diverse marketing sources such as Google Analytics, Salesforce Marketing Cloud, Facebook Ads, or similar platforms.
Advanced proficiency in SQL, with expertise in data transformation and complex querying.
In-depth scripting and automation experience, particularly with Python, for complex data workflows and systems.
Comprehensive understanding of data warehousing, dimensional modeling, ETL frameworks, and performance optimization strategies.
Proven problem-solving capabilities with an emphasis on diagnosing complex data issues, enhancing pipeline reliability, and ensuring data integrity.
YOUR LIFE AND CAREER AT SAKS GLOBAL:
Opportunity to work in a dynamic, fast-paced environment at a company experiencing growth and transformation
Exposure to rewarding career advancement opportunities across the largest multi-brand luxury retailer, from retail to distribution to digital or corporate
Comprehensive benefits package for all eligible full-time employees (including medical, vision and dental)
Data Engineer • India