Join leading Australian organisation | Start of GCP data program | Collaborative data team | Bring your GCP experience to the team | CBD | Hybrid
- Initial 12 Month FTC + Extension Options
- Strong Data Engineering Experience within GCP
- Collaborative | Supportive | Data Function Team
Our client is seeking an additional GCP Senior Data Engineer to join the team at the start of their data journey, moving from on-prem to the Cloud. Having chosen GCP as their platform, they now need an experienced Data Engineer to help deliver this program.
As a Senior Data Engineer, you will design and deliver secure, scalable, and efficient data pipelines within the Google Cloud Platform environment. You will work closely with technical teams and business stakeholders to create robust data solutions that support analytics, reporting, and innovation across the organisation.
What You Will Do
- Develop, test, and maintain cloud-native data pipelines using BigQuery, Dataflow, Cloud Composer, and Cloud Storage services.
- Transform raw data into structured, reliable datasets that power analytics and reporting.
- Implement best practices for data governance, privacy, security, and compliance.
- Collaborate with cross-functional teams to gather requirements and translate them into actionable data solutions.
- Optimise workflows for data integration, transformation, and automation.
- Create technical documentation, solution designs, and architectural diagrams to ensure maintainability.
- Monitor and enhance system performance, ensuring service levels are consistently achieved.
- Support and resolve incidents related to data and analytics with a proactive approach.
- Contribute to workshops and stakeholder sessions, turning business needs into scalable technical solutions.
What You Bring
- Strong experience in data engineering, warehousing, and analytics using Google Cloud Platform. You MUST have specific GCP experience.
- Strong expertise with BigQuery, Dataflow, Cloud Functions, Cloud Composer, and related GCP tools.
- Proven skills in SQL development, performance tuning, and data modelling.
- Experience designing and maintaining ETL and ELT workflows in cloud-first environments.
- Understanding of DevOps, CI/CD pipelines, and metadata-driven engineering practices.
- Knowledge of governance, compliance, and security frameworks for cloud data environments.
- Strong problem-solving skills with the ability to work independently or in a collaborative team.
- Excellent communication skills, capable of engaging both technical and non-technical stakeholders.
- Tertiary qualifications in Computer Science, Information Technology, or a related field.
Desirable Extras
- Professional certification in Google Cloud (Data Engineer or Cloud Architect).
- Experience with Looker, Tableau, or Power BI connected to BigQuery.
- Familiarity with infrastructure-as-code tools such as Terraform.
- Exposure to machine learning or advanced analytics using GCP services like Vertex AI or BigQuery ML.
- Understanding of Agile delivery practices and use of tools such as Jira and Confluence.
To be considered for this role, you must be an Australian Permanent Resident or citizen.
If this position is of interest to you, please click ‘Apply’ or send your details to tracee@konnexuscg.com.au
Konnexus specialise in the recruitment of permanent and contract professionals within Data Analytics | AI / ML | Data & Platform Engineering. If this role is not right for you but you are seeking a new position or a conversation about the market, please call us on 03 9052 5900 and we will connect you with the right Consultant.