The Digital Health Technology team powers digital experiences and engagement to enhance the lives of millions of people every day through connected care. We build, deliver, and manage a portfolio of data management platforms and mobile offerings in support of our core businesses. We thrive on simple, elegant architecture and agility. You’ll be immersed in a dynamic, high-growth environment and empowered to excel, take informed risks, and drive ingenuity across the enterprise.
The primary role of the Engineering function within the Product Development team is to create specifications and designs for new products, or improvements and enhancements to existing products. The team works closely with customers, developers, and other stakeholders to understand and define customer needs in these designs. Features are aligned to a timetable and areas of responsibility. Developers may solicit customers for feedback regarding product usability and desired future enhancements. Software Engineers who design, write, and test code for the product should be matched to either Software Engineering Applications or Systems. Product developers who specialize in hardware systems should be matched to Hardware Engineering.
Let’s talk about Responsibilities
- The primary objective of the Data Engineering specialization on the Global Data Platform Analytics Engineering team is to build large-scale data processing systems that serve the analytics needs of users across ResMed.
- Key responsibilities include implementing data pipeline integrations and solutions, incorporating highly scalable cloud computing and large-scale data stores, including data lakes, data warehouses, and data marts.
- Works closely with data architects to determine which data management systems are appropriate, and with data scientists and analysts to determine which data are needed for analysis, and provides the required data analytics tools.
- Manages GitHub repositories, including working with GitHub Actions.
- Manages the infrastructure of the team's analytics platform.
- Develops analytics models using SQL and dbt on the Snowflake data platform.
- Develops data pipelines using Kafka and Flink.
- Develops Tableau and Power BI data sources.
- Orchestrates data pipeline operations using Dagster and Python (see the illustrative sketch after this list).
- Networks with contacts outside own area of expertise.
- Leads a cooperative effort among members of a project team.
- Works independently, with guidance in only the most complex situations.
- Frequently leads sub-functional teams or projects and trains and mentors junior team members.
- Serves as a best practice resource within own area of work.
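To give a flavor of the Dagster-and-Python orchestration mentioned above, here is a minimal sketch of a software-defined asset pipeline. The asset names, sample data, and aggregation logic are illustrative assumptions only, not a description of ResMed's actual platform.

```python
# Minimal Dagster sketch: two dependent assets, purely illustrative.
import pandas as pd
from dagster import Definitions, asset


@asset
def raw_device_events() -> pd.DataFrame:
    # Hypothetical ingestion step; in practice this might read from a
    # Kafka topic or a data lake landing zone.
    return pd.DataFrame({"device_id": [1, 2], "usage_hours": [7.5, 6.0]})


@asset
def daily_usage_summary(raw_device_events: pd.DataFrame) -> pd.DataFrame:
    # Hypothetical downstream analytics model; in production this kind of
    # transformation could instead live in dbt on Snowflake.
    return raw_device_events.groupby("device_id", as_index=False)["usage_hours"].sum()


defs = Definitions(assets=[raw_device_events, daily_usage_summary])
```

Dagster can then materialize these assets on a schedule or on demand, which is the style of pipeline orchestration this role involves.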
Let’s talk about Qualifications and Experience
Required:
- At least 5 years of experience with SQL on a large analytics data platform
- At least 5 years of experience with developing software requirements, software coding, and software testing
- At least 3 years of experience with Python, shared version control systems, and maintaining data pipelines
- At least 2 years of experience with GitHub and using Actions to deploy cloud applications
Preferred:
- Experience with dbt, Dagster, and the Snowflake data platform
- Experience with managing software integration and deployment on GitHub
- Bachelor’s degree
- Minimum of 8 years of related experience
- Applies functional knowledge and existing methodologies to solve complex problems or execute specialized projects
- Exercises judgment in selecting methods, techniques, and evaluation criteria for obtaining results
- Determines methods and procedures on new assignments
Joining us is more than saying “yes” to making the world a healthier place. It’s discovering a career that’s challenging, supportive, and inspiring, where a culture driven by excellence helps you not only meet your goals but also create new ones. We focus on creating a diverse and inclusive culture, encouraging individual expression in the workplace, and we thrive on the innovative ideas this generates. If this sounds like the workplace for you, apply now! We commit to responding to every applicant.