(Client is a top-tier fintech.) Only diversity candidates are being considered for this role.
Your key responsibilities:
- You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC
- You are responsible for supporting the migration of current functionalities to Google Cloud
- You are responsible for the stability of the application landscape and for supporting software releases
- You also provide support for L3 topics and application governance
- You are responsible for coding in the CTM area as part of an agile team (Java, Scala, Spring Boot)
Your skills and experience:
- You have experience with databases (HDFS, BigQuery, etc.) and development, preferably with Big Data and GCP technologies
- You have a strong understanding of the Data Mesh approach and integration patterns
- You understand Party data and its integration with Product data
- Your architectural skills for big data solutions, especially interface architecture, allow a fast start
- You have experience with at least: Spark, Java, Scala, Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting
- You have knowledge of customer reference data and customer opening processes, and preferably of regulatory topics around Know Your Customer (KYC) processes
- You work very well in teams but also independently, and you are constructive and target-oriented
- Your English skills are good; you can communicate professionally as well as informally in small talk with the team
(ref: hirist.tech)