We are seeking a skilled Data Engineer to lead the development of our new data platform.
This is an exciting opportunity to work with cutting-edge tools on Google Cloud Platform (GCP) to centralize, transform, and model operational data for company-wide reporting and insights.
Responsibilities:
- Design, build, and maintain end-to-end data pipelines on GCP.
- Ingest and organize raw data in Google Cloud Storage (GCS) for a reliable data lake.
- Set up and manage large-scale data lakes, ensuring scalability, reliability, and optimized data flow.
- Automate recurring data ingestion using Cloud Functions and Cloud Scheduler.
- Develop robust ETL/ELT processes using Python and advanced SQL.
- Transform and model raw operational data into analytics-ready BigQuery tables.
- Partner with business teams to turn reporting needs into efficient data models.
- Build and maintain the data foundation for Looker Studio dashboards, ensuring performance, accuracy, and scalability.
Requirements:
- Minimum 10 years of experience in Data Engineering or a closely related field.
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, Mathematics, Statistics, or a related quantitative field.
- Extensive experience setting up and running data lakes in cloud environments (preferably GCP).
- Strong knowledge of Google Cloud Platform (GCS, BigQuery, Cloud Functions, Cloud Scheduler).
- Hands-on expertise with advanced SQL, Python scripting, and building Looker dashboards.
- Proven experience designing and optimizing ETL/ELT pipelines and data modeling.
- Deep understanding of BI tools (preferably Looker Studio) for reporting and dashboards.
About This Role
This role is an ideal opportunity to apply your technical skills and passion for data engineering to drive business growth and innovation.
We offer competitive salary and benefits based on experience and qualifications.