Responsibilities :
- Create technology blueprints and engineering roadmaps for multi-year data transformation programs.
- Knowledgeable about architectural patterns, cloud-native solutions on AWS, microservices architecture, solutions integration, and containerisation.
- Extensive knowledge of AWS services, particularly those related to data storage and processing, including S3, RDS, Redshift, DataZone, and Glue.
- Expert in the data development lifecycle, with a focus on data ingestion processes, data transformation pipelines, data integration, and visualisation.
- Excellent stakeholder management skills and a 'can-do' attitude.
Mandatory Skills :
Amazon Kinesis
Apache Kafka
AWS Glue
AWS Solutions Architect
PySpark
Mandatory Skills Description :
- 8+ years of hands-on experience as an AWS Platform Engineer
- Hands-on experience with AWS compute services, including EC2, and experience building a data technology stack, including Hadoop / EMR Serverless, Glue, Redshift, Airflow, and Aurora PostgreSQL
- Experience with event-driven architecture using Apache Kafka, AWS Kinesis, or similar
- Infrastructure as code using CloudFormation or Terraform
- Proficient in Python and SQL
- Experience building CI/CD tooling (GitHub / Jenkins)
Nice-to-Have Skills Description :
- AWS Solutions Architect certification
- Containerisation (Docker, Kubernetes)
- Data visualisation tools and integration with Tableau and Power BI