Core skills:
- Advanced data engineering skills with strong experience using Spark, SQL, Python and Airflow
- Strong knowledge of big data querying tools such as Presto or Trino
- Experience with data warehousing platforms such as AWS Redshift
- Experience with data architecture principles, including data access patterns and data modelling
- Experience with data quality measurement and monitoring
- Experience with metadata, including control totals and checksums for load assurance
- Comfortable taking ownership of issues and driving resolution
- Familiar with cloud computing concepts (AWS), including any or all of EC2, S3, IAM, EKS, and RDS, as well as Linux
- Experienced with DevOps approaches
- Good understanding of data warehousing/ETL concepts
- Experience with CI/CD tools
Optional but highly desirable: experience with event-based and message-driven distributed systems, e.g. Kafka or Solace Systems
Experience in banking/finance
AML / transaction monitoring
Skills Required
Spark, AWS, ETL, SQL, Python