Primary Responsibilities
- Responsible for designing, developing, testing, and supporting data pipelines and applications
- Industrialize data feeds
- Experience working with cloud environments such as AWS
- Creates data pipelines that integrate with existing systems (a minimal pipeline sketch appears below)
- Experience enforcing security controls and best practices to protect sensitive data within AWS data pipelines, including encryption, access controls, and auditing mechanisms (see the security sketch below)
- Improves data cleansing and facilitates connectivity of data and applied technologies across both external and internal data sources
- Establishes a continuous quality improvement process to systematically optimize data quality
- Translates data requirements from data users into ingestion activities
Qualifications
- Degree in Computer Science or a related field and 3 years of relevant industry experience
- Interest in solving challenging technical problems
- Nice to have: experience with test-driven development and CI/CD workflows
- Knowledge of version control software such as Git and experience working with major hosting services (e.g., Azure DevOps, GitHub, Bitbucket, GitLab)
- Nice to have: experience working with cloud environments such as AWS, especially creating serverless architectures and using infrastructure-as-code tools such as CloudFormation/CDK, Terraform, or ARM (a CDK sketch appears below)
- Hands-on experience with various frontend and backend languages (e.g., Python, R, Java, Scala, C/C++, Rust, TypeScript, ...)
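To make the pipeline responsibilities above concrete, here is a minimal ETL sketch in Python, assuming a hypothetical REST endpoint and PostgreSQL table; the URL, DSN, and column names are illustrative, not part of the role's actual stack:

```python
# Minimal ETL sketch: pull records from a REST API and load them into
# PostgreSQL. The endpoint, DSN, and table schema are hypothetical.
import requests
import psycopg2

API_URL = "https://api.example.com/v1/records"  # illustrative endpoint

def extract():
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    return resp.json()

def transform(records):
    # Keep only the fields the target table expects.
    return [(r["id"], r["name"], r["value"]) for r in records]

def load(rows):
    conn = psycopg2.connect("dbname=analytics user=etl")  # illustrative DSN
    with conn, conn.cursor() as cur:  # commits on success, rolls back on error
        cur.executemany(
            "INSERT INTO records (id, name, value) VALUES (%s, %s, %s) "
            "ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name, value = EXCLUDED.value",
            rows,
        )
    conn.close()

if __name__ == "__main__":
    load(transform(extract()))
```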
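Similarly, a hedged sketch of the baseline security controls mentioned above, applied to a pipeline's S3 bucket with boto3; the bucket name is a hypothetical stand-in:

```python
# Sketch: apply baseline security controls to a pipeline's S3 bucket
# using boto3 -- default encryption at rest and a public-access block.
# Auditing (e.g., CloudTrail or S3 access logging) would typically be
# configured alongside these controls.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-pipeline-data"  # illustrative bucket name

# Enforce default server-side encryption with an AWS-managed KMS key.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)

# Block all public access as an access-control baseline.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```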
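And for the infrastructure-as-code and serverless items, a minimal AWS CDK v2 sketch in Python; the stack and construct names are illustrative assumptions, not the team's actual infrastructure:

```python
# Sketch: a serverless, infrastructure-as-code stack with AWS CDK v2
# (Python) -- an encrypted S3 bucket plus a Lambda ingestion function.
from aws_cdk import App, Stack
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_s3 as s3
from constructs import Construct

class PipelineStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Encrypted bucket for pipeline data, with public access blocked.
        bucket = s3.Bucket(
            self, "PipelineBucket",
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )

        # Serverless ingestion function (inline handler for brevity).
        fn = _lambda.Function(
            self, "IngestFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_inline(
                "def handler(event, context):\n    return {'ok': True}\n"
            ),
        )
        bucket.grant_read_write(fn)  # scoped IAM grant instead of broad policies

app = App()
PipelineStack(app, "PipelineStack")
app.synth()
```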
Key Skills
APIs, Docker, Jenkins, REST, Python, AWS, NoSQL, MySQL, JavaScript, PostgreSQL, Django, Git
Employment Type: Full Time
Experience: years
Vacancy: 1