Description:
Innovior is a leading boutique Digital Transformation consultancy.
We began as a specialist in Intelligent Automation and quickly expanded our services to include Salesforce, Anaplan, Data & Analytics and Managed Services.
This strategic shift enabled us to offer our clients a comprehensive suite of bespoke solutions for their digital transformation initiatives.
Since our inception in 2016, our team has expanded to include 120 dedicated professionals based in Sydney, Melbourne, India, and the Philippines.
Our expertise spans Management Consulting, Experience Design, Supply Chain Optimisation, Workforce Planning, and Data & AI.
Leveraging innovative technologies, we help clients optimise their operations, enhance customer experiences, and maintain a competitive edge.
Our client-centric approach, coupled with deep industry knowledge, allows us to deliver tailored solutions that drive meaningful outcomes and fuel success.
Join our passionate team, be part of our exciting growth journey at Innovior, and help us shape the future of digital transformation!
Job Responsibilities:
- Data Modelling & Architecture: Design and implement scalable data lake and warehouse solutions using Microsoft Fabric, Synapse, Snowflake, or Databricks.
- Develop and maintain star/snowflake schemas, partitioning strategies, and data models optimised for performance and cost.
- Ensure data architecture aligns with business needs, governance standards, and best practices.
- Scalable Data Pipelines: Design and develop robust, scalable ingestion and transformation pipelines using Azure Data Factory, Databricks, or other ETL/ELT tools such as Matillion, Fivetran, or dbt, with a solid understanding of metadata-driven ingestion pipeline frameworks.
- Delta Lake & Iceberg Tables: Demonstrate understanding of the Delta Lake and Apache Iceberg table formats, and of how to create and manage tables in these formats for scalable, ACID-compliant data processing.
- Coding Skills: Strong hands-on experience with Python and PySpark for data processing and transformation tasks.
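To illustrate what a "metadata-driven ingestion pipeline framework" (mentioned above) looks like in practice, here is a minimal sketch in plain Python, with no Spark dependency. All names, fields, and transforms here are hypothetical examples, not part of any specific framework: each source is described by a metadata record, and one generic runner drives ingestion for every source based on that metadata.

```python
# Minimal sketch of a metadata-driven ingestion framework (illustrative only).
# Source behaviour is declared as metadata; the runner code stays generic.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class SourceMetadata:
    name: str                       # logical source name (hypothetical)
    load_mode: str                  # "full" or "incremental"
    key_columns: list[str] = field(default_factory=list)

def clean_rows(rows: list[dict]) -> list[dict]:
    # Example transform: strip whitespace from string values.
    return [{k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
            for r in rows]

TRANSFORMS: dict[str, Callable] = {"clean": clean_rows}

def ingest(meta: SourceMetadata, rows: list[dict],
           transform: str = "clean") -> list[dict]:
    """Generic runner: the same code path serves every source;
    behaviour is driven entirely by the metadata record."""
    rows = TRANSFORMS[transform](rows)
    if meta.load_mode == "incremental" and meta.key_columns:
        # Deduplicate on the declared business key, keeping the last record.
        seen = {tuple(r[k] for k in meta.key_columns): r for r in rows}
        rows = list(seen.values())
    return rows

meta = SourceMetadata(name="customers", load_mode="incremental",
                      key_columns=["id"])
result = ingest(meta, [{"id": 1, "name": " Ada "}, {"id": 1, "name": "Ada"}])
print(result)  # one cleaned, deduplicated row per key
```

In a production setting the metadata records would typically live in a control table or config files, and the transforms would be PySpark jobs rather than Python functions, but the pattern is the same.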
Skills Required:
- Power BI or Tableau Development & Optimisation: Build interactive dashboards and semantic models using Power BI Desktop. Strong DAX and Power Query skills required. Responsibilities include implementing Row-Level Security (RLS), optimising dataset performance, and managing report deployment and access via Power BI Service or Tableau Server.
- Performance Optimisation: Tune SQL queries and pipeline performance with a focus on cost efficiency (storage and processing time).
- DevOps & CI/CD: Implement and manage CI/CD pipelines using Azure DevOps. Experience with infrastructure-as-code using ARM templates, Bicep, or Terraform to provision and manage cloud resources effectively. Ensure automated deployment, version control, and environment consistency across development and production.
- Mentoring & Technical Leadership: Provide guidance to junior data engineers.
- Certifications in Snowflake, Databricks, dbt, Microsoft Fabric, or relevant cloud/data platforms (e.g. Azure, AWS, GCP) would also be beneficial.
(ref: hirist.tech)