Are You Ready to Make It Happen at Mondelēz International
Join our Mission to Lead the Future of Snacking. Make It With Pride.
Together with analytics team leaders you will support our business with excellent data models to uncover trends that can drive long-term business results.
How you will contribute
You will:
- Execute the business analytics agenda in conjunction with analytics team leaders
- Work with best-in-class external partners who leverage analytics tools and processes
- Use models / algorithms to uncover signals / patterns and trends to drive long-term business performance
- Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver
What you will bring
A desire to drive your future and accelerate your career, plus the following experience and knowledge:
- Using data analysis to make recommendations to analytics leaders
- Understanding of best-in-class analytics practices
- Knowledge of key performance indicators (KPIs) and scorecards
- Knowledge of BI tools such as Tableau, Excel, Alteryx, R, and Python is a plus

Are You Ready to Make It Happen at Mondelēz International
Join our Mission to Lead the Future of Snacking. Make It With Pride.
In This Role
As a DaaS Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, implementing data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders while staying current with the latest cloud technologies and best practices.
Role & Responsibilities:
- Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions.
- Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes.
- Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
- Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance.
- Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices to remain current in the field.

Technical Requirements:
- Programming: Python, PySpark, Go/Java
- Database: SQL, PL/SQL
- ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran
- Data Warehousing: SCD, schema types, data marts
- Visualization: Databricks Notebook, Power BI, Tableau, Looker
- GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex
- AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis
- Supporting Technologies: Graph databases/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow
- Experience with the RGM.ai product is an added advantage

Soft Skills:
- Problem-Solving: The ability to identify and solve complex data-related challenges.
- Communication: Effective communication skills to collaborate with product owners, analysts, and stakeholders.
- Analytical Thinking: The capacity to analyse data and draw meaningful insights.
- Attention to Detail: Meticulousness in data preparation and pipeline development.
- Adaptability: The ability to stay updated with emerging technologies and trends in data engineering.

Skills Required
Data Mart, dbt, Python, SQL