Senior GCP Data Engineer

F337 Deutsche India Private Limited, Pune Branch, Business Bay, Pune
Job type
  • Full-time
Job description

Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability.

As climate change brings new challenges and opportunities, the Bank is investing in a Sustainability Technology Platform, sustainability data products, and a range of sustainability applications that will support these goals.

As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by applying their technology skills to cloud and hybrid architectures.

We are seeking a highly motivated and experienced Senior GCP Data Engineer to join this team. In this role, you will be central to designing, developing, and maintaining robust data pipelines that transform raw data into valuable insights for the organization.

What we’ll offer you

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy

  • Best-in-class leave policy
  • Gender-neutral parental leave
  • 100% reimbursement under the childcare assistance benefit (gender neutral)
  • Sponsorship for industry-relevant certifications and education
  • Employee Assistance Program for you and your family members
  • Comprehensive hospitalization insurance for you and your dependents
  • Accident and term life insurance
  • Complimentary health screening for employees aged 35 and above

Your key responsibilities

  • Design, develop, and maintain data pipelines using GCP services such as Dataflow, Dataproc, and Pub/Sub (see the sketch after this list).
  • Develop and implement data ingestion and transformation processes using tools like Apache Beam and Apache Spark.
  • Manage and optimize data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL.
  • Implement data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center.
  • Monitor and troubleshoot data pipelines and storage solutions using GCP's operations suite (Cloud Monitoring and Cloud Logging, formerly Stackdriver).
  • Collaborate with data experts, analysts, and product teams to understand data needs and deliver effective solutions.
  • Automate data processing tasks using scripting languages like Python.
  • Participate in code reviews and contribute to establishing best practices for data engineering on GCP.
  • Stay up to date on the latest advancements and innovations in GCP services and technologies.
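
For illustration, here is a minimal sketch of the kind of streaming pipeline described above: reading events from Pub/Sub, transforming them with Apache Beam, and writing them to BigQuery, runnable on Dataflow. The project, subscription, table, schema, and field names are placeholders, not actual resources.

```python
# Minimal sketch of a Pub/Sub -> Beam transform -> BigQuery streaming pipeline.
# All resource names below are placeholders, not real projects or tables.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON Pub/Sub message into a BigQuery row."""
    event = json.loads(message.decode("utf-8"))
    return {"id": event.get("id"), "value": float(event.get("value", 0))}


def run() -> None:
    # Pass --runner=DataflowRunner, --project, --region, etc. on the command line
    # to execute on Dataflow; streaming=True enables unbounded Pub/Sub reads.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/my-subscription")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:my_dataset.events",
                schema="id:STRING,value:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```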

Your skills and experience

  • 10-15 years of experience as a Data Engineer or in a similar role.
  • Proven expertise in designing, developing, and deploying data pipelines.
  • In-depth knowledge of Google Cloud Platform (GCP) and its core data services (BigQuery, Cloud Storage (GCS), Dataflow, etc.).
  • Strong proficiency in Python and SQL for data manipulation and querying (see the sketch after this list).
  • Experience with distributed data processing frameworks like Apache Beam or Apache Spark (a plus).
  • Familiarity with data security and access control principles.
  • Excellent communication, collaboration, and problem-solving skills.
  • Ability to work independently, manage multiple projects, and meet deadlines
  • Knowledge of Sustainable Finance / ESG Risk / CSRD / Regulatory Reporting will be a plus
  • Knowledge of cloud infrastructure and data governance best practices will be a plus.
  • Knowledge of Terraform will be a plus
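
As a small illustration of the Python and SQL skills listed above, the sketch below runs an aggregation query against a BigQuery table using the google-cloud-bigquery client. The table and column names (including event_ts) are placeholders.

```python
# Minimal sketch: query BigQuery from Python with the official client library.
# Table and column names are placeholders; credentials come from
# Application Default Credentials (e.g. `gcloud auth application-default login`).
from google.cloud import bigquery


def daily_row_counts(table: str) -> None:
    """Print the number of rows per day for the last seven days."""
    client = bigquery.Client()
    query = f"""
        SELECT DATE(event_ts) AS day, COUNT(*) AS row_count
        FROM `{table}`
        GROUP BY day
        ORDER BY day DESC
        LIMIT 7
    """
    for row in client.query(query).result():
        print(row.day, row.row_count)


if __name__ == "__main__":
    daily_row_counts("my-project.my_dataset.events")
```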

How we’ll support you

  • Training and development to help you excel in your career.
  • Coaching and support from experts in your team
  • A culture of continuous learning to aid progression.
  • A range of flexible benefits that you can tailor to suit your needs.