DE-Cloud Platform Engineer-Data Pipeline GDSN02

EY Société d'Avocats • Delhi, Delhi, India
30+ days ago
Job description

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity

Your role will be Technology Lead or Senior Technology Lead in the Cloud Engineering team. You will be part of the delivery of IT projects for our customers across the globe.

Your key responsibilities

  • Architect, implement and manage CI/CD pipelines using Azure DevOps and AWS tools for data and analytics workloads.
  • Build and maintain CI/CD pipelines across on-premises and multi-cloud platforms (Azure, AWS, GCP), ensuring consistent delivery practices.
  • Orchestrate end-to-end DevOps workflows using tools like ArgoCD, Azure DevOps, Harness, GitHub Actions or GitLab CI.
  • Maintain and optimize source control systems (Git, SVN), enforcing branching strategies and code quality standards.
  • Integrate DevSecOps practices, including automated testing, code quality and vulnerability scanning (SonarQube, Checkmarx, Veracode, Fortify, etc.), within CI/CD pipelines.
  • Build and maintain data pipelines leveraging Azure Data Lake, Databricks and AWS data services (e.g. S3, Glue, Redshift).
  • Support migration and modernization of legacy data platforms to Azure and AWS.
  • Collaborate with data engineers, analysts and business stakeholders to deliver end-to-end data solutions.
  • Automate infrastructure provisioning using Terraform, ARM Templates, CloudFormation or Bicep.
  • Automate Kubernetes-based deployments (AKS/EKS/GKE) using Helm charts, and manage service mesh and traffic routing with Istio for enhanced observability and resilience.
  • Develop automation scripts using Python, PowerShell, Shell, Bash or Groovy, and leverage cloud-native CLI tools (Azure CLI, AWS CLI) for operational tasks.
  • Manage configuration and orchestration using Ansible, Chef or Puppet.
  • Innovate in building independent automation solutions.
  • Implement data governance, security and compliance best practices across Azure and AWS environments.
  • Monitor, troubleshoot and optimize data pipelines and platform performance using Azure Monitor, AWS CloudWatch, Log Analytics and related tools.
  • Demonstrate working knowledge of cloud-native services across Azure, AWS and GCP, including PaaS, SaaS and IaaS offerings.
  • Ensure cloud security best practices are followed, with a strong understanding of identity and access management and network security in cloud environments.
  • Maintain a comprehensive understanding of how IT operations are managed.
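Several of the responsibilities above centre on automation scripting in Python for operational tasks. As a minimal, hypothetical sketch (the resource records, tags and 30-day threshold are invented for illustration, not taken from this posting), such a script might flag stale development resources for cleanup:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical resource records, as might be returned by a cloud
# inventory API (e.g. via the Azure CLI or AWS SDK).
RESOURCES = [
    {"name": "dev-vm-01", "created": datetime(2024, 1, 10, tzinfo=timezone.utc), "tags": {"env": "dev"}},
    {"name": "prod-db", "created": datetime(2023, 6, 1, tzinfo=timezone.utc), "tags": {"env": "prod"}},
    {"name": "dev-vm-02", "created": datetime.now(timezone.utc), "tags": {"env": "dev"}},
]

def stale_dev_resources(resources, max_age_days=30, now=None):
    """Return names of dev-tagged resources older than max_age_days."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    return [
        r["name"]
        for r in resources
        if r["tags"].get("env") == "dev" and r["created"] < cutoff
    ]

if __name__ == "__main__":
    # Production resources and recently created dev resources are skipped.
    for name in stale_dev_resources(RESOURCES):
        print(f"candidate for cleanup: {name}")
```

In a real pipeline this logic would typically be wired into a scheduled CI/CD job and fed from live inventory rather than a hard-coded list.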

Skills and attributes for success

  • Strong hands-on experience with Azure DevOps, GitHub Actions, Jenkins and AWS native tools for CI/CD automation, release management and environment provisioning.
  • Deep expertise in Azure Data Lake, Databricks and AWS data services (e.g. S3, Glue, Redshift).
  • Proficiency in scripting languages: Python, PowerShell, Shell.
  • Experience with infrastructure automation tools: Terraform, ARM Templates, CloudFormation, Bicep.
  • Knowledge of data governance, security and compliance in cloud environments.
  • Familiarity with monitoring and observability tools: Azure Monitor, AWS CloudWatch, Log Analytics.
  • Ability to design scalable, secure and cost-effective data architectures.
  • Strong understanding of cloud-native services across Azure, AWS and GCP, including PaaS, SaaS and IaaS offerings.
  • Demonstrated experience in designing, building and implementing DevOps solutions for projects of varying complexity, with emphasis on Kubernetes (AKS/EKS/GKE) and automation.
  • Capability to identify, communicate and mitigate project risks.
  • Ability to create sustainable systems and services through automation and continuous improvement.
  • Experience implementing DevSecOps practices, including automated testing, code quality and vulnerability scanning (e.g. SonarQube, Checkmarx, Veracode, Fortify).
  • Strong understanding of agile methodologies.
  • Ability to deliver best practices around provisioning, operations and management of multi-cloud environments.
  • Excellent communication, analytical and problem-solving skills.
  • Ability to work collaboratively in cross-functional teams, manage communication and deliverables from offshore teams, and mentor others.
  • Capability to assist the team in debugging and troubleshooting imperative and declarative scripts.
  • Experience identifying software packages and solutions to meet client requirements, developing RFPs and assisting in proposal evaluation (business and technology fit, pricing and support).
  • Experience designing and developing AI-infused DevOps frameworks is a plus.
To qualify for the role, you must have

  • BE with 6 to 8 years of sound industry experience.
  • Strong knowledge of cloud computing in multi-cloud environments, with Azure as the primary platform and exposure to AWS.
  • DevOps: setting up CI/CD pipelines using Azure DevOps, GitHub Actions or GitLab CI.
  • Hands-on experience with Azure Data Lake and Databricks (including Spark, Delta Lake), and familiarity with AWS data services.
  • Practical experience with Docker and Kubernetes (AKS/EKS/GKE).
  • Proficiency in PowerShell, Python, Groovy and Shell scripting, and in cloud-native CLI tools.
  • Strong command of IaC tools such as Terraform, ARM Templates or Bicep.
  • Understanding of data governance, security and compliance in cloud environments.
Preferred Skills

  • Microsoft Certified: Azure DevOps Engineer Expert (AZ-400)
  • Microsoft Certified: Azure Data Engineer Associate (DP-203)
  • Microsoft Certified: Azure Solutions Architect Expert (AZ-305)
  • AWS Certified DevOps Engineer - Professional
  • AWS Certified Solutions Architect - Associate
  • Databricks Certified Data Engineer Associate / Professional
  • Experience with GitHub Actions, GitLab CI or other modern CI/CD tools
  • Experience with configuration management tools such as Ansible, Chef or Puppet
  • Experience with container orchestration and management (Kubernetes, AKS, EKS)
  • Familiarity with monitoring and observability tools (Azure Monitor, AWS CloudWatch, Prometheus, Grafana)
  • Exposure to GenAI technologies and integration with data platforms
  • Experience working in agile and cross-functional teams
  • Strong documentation, presentation and stakeholder communication skills
Your people responsibilities

  • Foster teamwork and lead by example.
  • Participate in organization-wide people initiatives.
  • Travel in accordance with client and other job requirements.
  • Excellent written and oral communication skills; writing, publishing and conference-level presentation skills are a plus.

Technologies and Tools

  • Cloud platforms: Azure, AWS
  • SDLC methodologies: Agile / Scrum
  • Version control tools: GitHub / Bitbucket / GitLab
  • CI/CD automation tools: Azure DevOps / AWS CodePipeline / GitHub Actions / GitLab CI / Jenkins / Harness / ArgoCD
  • Data platforms: Azure Data Lake, Databricks, Microsoft Fabric, AWS S3, Glue, Redshift
  • Container management tools: Docker / Kubernetes / Docker Swarm
  • Application performance management tools: Prometheus / Dynatrace / AppDynamics
  • Monitoring tools: Splunk / Datadog / Grafana
  • IaC tools: Terraform / ARM Templates / Bicep
  • Artifact management tools: JFrog Artifactory / Nexus / CloudRepo / Azure Artifacts
  • Scripting: Python / Groovy / PowerShell / Shell scripting
  • SAST / DAST: SonarQube / Veracode / Fortify
  • GitOps tools: Argo CD / Flux CD
  • GenAI technologies: ChatGPT, OpenAI
What we look for

  • Demonstrated experience in building and automating data platforms using Azure, AWS and Databricks.
  • Proven track record in implementing CI/CD for data workloads and multiple technologies, with strong use of DevOps tools and containerization.
  • Strong understanding of Microsoft Fabric, AWS data services and modern data architectures.
  • Experience with infrastructure automation (IaC tools such as Terraform, ARM Templates, CloudFormation, Bicep) and application automation using Azure DevOps and AWS DevOps tools.
  • Working experience in Azure and AWS, with a solid grasp of cloud architecture, strategy and cloud-related concepts.
  • Good exposure to cloud and container monitoring, logging and troubleshooting (Azure Monitor, AWS CloudWatch, etc.).
  • Ability to design, conduct and experiment with new technologies and approaches.
  • Ability to work collaboratively in cross-functional teams and mentor others.
  • Excellent communication, analytical and problem-solving skills.
What we offer

EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

  • Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
  • Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
  • Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
  • Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.
EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society, and to build trust in the capital markets.

Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.

Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Key Skills

Continuous Integration, Docker, Jenkins, Kubernetes, Build Automation, S3, Redshift, Spark, CI/CD, Kafka, Scala

Employment Type: Full Time

Experience: years

Vacancy: 1
