Advanced Data Infrastructure Engineer (DevOps & Data Platform Focus) [T500-21049]

Albertsons Companies India, Kottayam, IN
Job description

About Albertsons Companies Inc.:

As a leading food and drug retailer in the United States, Albertsons Companies, Inc. operates over 2,200 stores across 35 states and the District of Columbia. Our well-known banners, including Albertsons, Safeway, Vons, Jewel-Osco, and others, serve more than 36 million U.S. customers each week.

We build and shape technology solutions that solve customers’ problems every day, making things easier for them when they shop with us online or in a store. We have made bold, strategic moves to migrate and modernize our core foundational capabilities, positioning ourselves as the first fully cloud-based grocery tech company in the industry.

Our success is built on a one-team approach, driven by the desire to understand and enhance the customer experience. By constantly pushing the boundaries of retail, we are transforming shopping into an experience that is easy, efficient, fun and engaging.

About Albertsons Companies India:

At Albertsons Companies India, we're not just pushing the boundaries of technology and retail innovation; we're cultivating a space where ideas flourish and careers thrive. Our workplace in India is a vital extension of the Albertsons Companies Inc. workforce and central to the next phase of the company's technology journey, supporting millions of customers' lives every day.

At Albertsons Companies India, we are raising the bar as we grow across Technology & Engineering, AI, Digital, and other company functions to transform a 165-year-old American retailer. Associates here collaborate directly with international teams on exciting and pivotal projects, enhancing decision-making and organizational agility. Your work will make history and help millions of people come together each day around the joys of food, inspiring their well-being.

Role Overview:

We are seeking an experienced Advanced Data Infrastructure Engineer with a strong background in DevOps practices and data platform engineering. This role is ideal for professionals who specialize in cloud infrastructure, automation, and data pipeline orchestration rather than application development.

Key Responsibilities:

  • Deploy and manage scalable, secure, and cost-effective data infrastructure on Azure and Google Cloud Platform (GCP).
  • Configure and govern Databricks environments including cluster policies, job orchestration, and workspace access.
  • Build and automate Vertex AI pipelines for ML model training, deployment, and monitoring.
  • Orchestrate and manage data pipelines using Azure Data Factory (ADF) and Databricks Workflows, ensuring data quality and lineage.
  • Provision and manage VMs for custom workloads, including image creation, autoscaling, and patching.
  • Implement and manage IAM policies, service accounts, and RBAC across cloud platforms.
  • Automate infrastructure provisioning using Terraform, ARM templates, or Google Deployment Manager.
  • Build and maintain CI/CD pipelines for infrastructure and data workflows using GitHub Actions, Azure DevOps, or Cloud Build.
  • Write and maintain Python/Bash scripts for automation and operational tasks.
  • Set up and manage monitoring and alerting using Azure Monitor, Google Cloud Operations Suite, Prometheus, and Grafana.
  • Configure cloud networking (VPCs, subnets, firewalls, peering) for secure and performant infrastructure.
  • Optimize cloud resource usage through autoscaling, spot/preemptible instances, and cost monitoring.
  • Participate in incident response and post-mortem reviews to improve platform reliability.

Experience Required:

  • 3–6 years in DevOps, Cloud Infrastructure, or Data Platform DevOps Engineering roles.
  • Hands-on experience with Databricks, Vertex AI, and ADF.
  • Proven skills in VM provisioning, IAM/RBAC, and Infrastructure as Code.
  • Strong scripting skills in Python and/or Bash.
  • Experience with CI/CD pipelines, cloud monitoring, and networking.
  • Exposure to cost optimization, security best practices, and agile environments.
Core Skills:

  • Cloud Platforms: Azure, GCP
  • Data Tools: Databricks, ADF, Vertex AI
  • Infrastructure: Terraform, ARM templates, Deployment Manager
  • Scripting: Python, Bash
  • CI/CD: GitHub Actions, Azure DevOps, Cloud Build
  • Monitoring: Azure Monitor, Stackdriver, Prometheus, Grafana
  • Networking: VPCs, subnets, firewalls, private endpoints
Preferred Skills:

  • Containerization: Docker, Kubernetes (AKS/GKE)
  • Workflow Orchestration: Composer, Dataflow
  • Data Governance: Metadata management, security policies
  • Cloud Cost Optimization: Spot instances, tagging, autoscaling
  • Certifications: Cloud or Databricks certifications
  • Multi-cloud Experience: Hybrid or cross-cloud setups