Senior Data Engineer [T500-18489]

Albertsons Companies India, Thane, Maharashtra, India
6 days ago
Job description

About Albertsons Companies India :

At Albertsons Companies India, we're not just pushing the boundaries of technology and retail innovation; we're cultivating a space where ideas flourish and careers thrive. Our workplace in India is a vital extension of the Albertsons Companies Inc. workforce and central to the next phase of the company's technology journey, supporting millions of customers' lives every day.

At Albertsons Companies India, we are raising the bar across Technology & Engineering, AI, Digital, and other company functions to transform a 165-year-old American retailer. Associates collaborate directly with international teams on exciting, pivotal projects, enhancing decision-making processes and organizational agility. Your work will make history and help millions of people come together each day around the joys of food and inspire their well-being.

Position Title : Senior Data Engineer

The Data Engineering team at Albertsons Companies is looking for experienced Data Engineers to work for one of the most transformational food and drug retailers in the United States. Albertsons operates over 2,300 stores under 19 well-known banners, including Albertsons, Safeway, Vons, Jewel-Osco, Shaw's, Acme, Tom Thumb, Randalls, United Supermarkets, Pavilions, Star Market, Haggen, and Carrs. The company reported revenue of over $60 billion from more than 34 million weekly shoppers and is the third-largest private company in the country.

Position Purpose :

We are seeking a highly skilled and motivated Senior Data Engineer to join our dynamic and growing data team. This role is ideal for someone who thrives in a fast-paced, cloud-native environment and is passionate about building scalable, efficient data solutions. You will play a key role in designing, developing, and maintaining our data infrastructure, with a strong focus on Google BigQuery and other Google Cloud Platform (GCP) services.

Key Responsibilities :

  • Design, develop, and optimize BigQuery stored procedures and complex SQL queries.
  • Build and maintain robust ETL pipelines using Python and Google Cloud Dataflow.
  • Manage data ingestion and transformation workflows using Google Cloud Composer or Stonebranch.
  • Utilize Google Cloud Storage (GCS) for data staging and archival.
  • Implement and maintain scalable data models and data warehousing solutions.
  • Collaborate with cross-functional teams using Agile methodologies (Epics, Stories, Standups, Retrospectives).
  • Document data processes and flows using Confluence.
  • Track and manage tasks using Jira.
  • Self-starter with a demonstrated ability to learn beyond formal training and a strong aptitude for delivering quality products.
  • Ability to work with geographically distributed teams across different time zones.
  • Provide guidance to less experienced team members and/or resolve highly complex production problems.
  • Uphold and support documentation standards, procedures, and approval hierarchies.
  • Lead self and provide technical leadership on projects; foster teamwork and manage multiple delivery work streams.

Required Qualifications :

  • Advanced proficiency in SQL.
  • Experience with Python (or another scripting language) and automation.
  • Hands-on expertise with GCP services: BigQuery, GCS, Dataflow, and Cloud Composer.
  • Strong understanding of data modeling, ETL processes, and data warehousing concepts.
  • Experience with streaming systems such as Kafka.
  • Familiarity with version control systems like Git.
  • Proven experience working in Agile environments.

Nice to Have :

  • Experience with Power BI or ThoughtSpot for data visualization and analytics.
  • Exposure to other cloud platforms, especially Microsoft Azure.
  • Awareness of Snowflake, Oracle, and DB2.
  • Expertise in AI, preferably in the context of data engineering.
  • Experience with Stonebranch for job scheduling.
  • Knowledge of CI/CD pipelines and DevOps practices.