Responsibilities :
Design, develop and maintain data pipelines for collecting, transforming and loading data into various data stores.
Build and maintain data warehousing and data lake solutions.
Develop and deploy data models that support various business requirements.
Write efficient and scalable code in languages such as Python, Scala or Java (a brief sketch follows this list).
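For illustration only, the collect-transform-load responsibility above might be sketched in Python as follows. This is a minimal, hedged example rather than a prescribed implementation: the orders.csv source, its order_id/amount columns, and the SQLite target are hypothetical stand-ins for whatever real sources and stores the role involves.

    import csv
    import sqlite3

    def extract(path):
        """Collect raw rows from a CSV source (hypothetical schema: order_id, amount)."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        """Normalize types and drop rows that fail basic validation."""
        clean = []
        for row in rows:
            try:
                clean.append({"order_id": int(row["order_id"]),
                              "amount": round(float(row["amount"]), 2)})
            except (KeyError, ValueError):
                continue  # skip malformed rows rather than failing the whole batch
        return clean

    def load(rows, db_path="orders.db"):
        """Load transformed rows into a target store (SQLite as a local stand-in)."""
        with sqlite3.connect(db_path) as conn:
            conn.execute("CREATE TABLE IF NOT EXISTS orders "
                         "(order_id INTEGER PRIMARY KEY, amount REAL)")
            # INSERT OR REPLACE keeps reruns idempotent on the primary key.
            conn.executemany("INSERT OR REPLACE INTO orders VALUES (:order_id, :amount)", rows)

    if __name__ == "__main__":
        load(transform(extract("orders.csv")))

In practice a scheduler or orchestrator would invoke such a script on a cadence; the point here is only the shape of the extract/transform/load stages.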
Skills :
Extensive experience leading AWS and cloud data platform transformations.
Proven track record of delivering large-scale data and analytical solutions in a cloud environment.
Hands-on experience with end-to-end data pipeline implementation on AWS, including data preparation, extraction, transformation and loading, normalization, aggregation, warehousing, data lakes, and data governance (a PySpark sketch follows this list).
Expertise in developing Data Warehouses.
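As a rough sketch of what one end-to-end AWS pipeline step can look like, the following PySpark example reads raw CSV from an S3 landing zone, aggregates it, and writes curated Parquet back to the lake. The bucket names (example-raw-zone, example-curated-zone), the order_ts/customer_id/amount columns, and the job name are all assumptions, and the example presumes a Spark build with the hadoop-aws S3 connector and AWS credentials already configured.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Spark session; S3 access assumes credentials/roles are set up in the environment.
    spark = SparkSession.builder.appName("daily-orders-aggregation").getOrCreate()

    # Extract: read raw CSV landed in the (hypothetical) raw zone of the lake.
    raw = spark.read.csv("s3a://example-raw-zone/orders/", header=True, inferSchema=True)

    # Transform: normalize the timestamp and aggregate to daily revenue per customer.
    daily = (raw
             .withColumn("order_date", F.to_date("order_ts"))
             .groupBy("customer_id", "order_date")
             .agg(F.sum("amount").alias("daily_revenue")))

    # Load: write partitioned Parquet to the curated zone for warehouse consumption.
    (daily.write
          .mode("overwrite")
          .partitionBy("order_date")
          .parquet("s3a://example-curated-zone/daily_revenue/"))

    spark.stop()

Partitioning the curated output by date is a common choice because downstream warehouse and lake queries typically filter on it, but the right partition key depends on the actual access patterns.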
Desired requirements :
Strong knowledge of data architecture and data modelling practices.
Cost-effective management of data pipelines.
Familiarity with CI/CD-driven data pipelines and infrastructure (a small test sketch follows this list).
Agile delivery approach using Scrum and Kanban methodologies.
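One common way a CI/CD-driven pipeline guards changes is by running unit tests on every commit. Below is a small pytest-style sketch testing a transform step like the one in the first example; the transform function and its row schema are hypothetical and inlined so the test is self-contained.

    # test_transform.py -- a unit test CI could run on every commit (pytest assumed).

    def transform(rows):
        """Same hypothetical transform as in the earlier sketch, inlined here."""
        clean = []
        for row in rows:
            try:
                clean.append({"order_id": int(row["order_id"]),
                              "amount": round(float(row["amount"]), 2)})
            except (KeyError, ValueError):
                continue
        return clean

    def test_transform_drops_malformed_rows():
        rows = [{"order_id": "1", "amount": "19.99"},
                {"order_id": "oops", "amount": "5"},  # bad id: should be dropped
                {"amount": "3.50"}]                   # missing id: should be dropped
        assert transform(rows) == [{"order_id": 1, "amount": 19.99}]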
Additional skills :
Ability to scope, estimate, and deliver committed work within deadlines, both independently and as part of an agile team.
Experience supporting QA and user acceptance testing processes.
Innovative problem-solving skills and ability to provide clear recommendations.
Understanding of the impact of changes to business rules on data processes.