Project-Based Data Platform Engineer (AWS + GCP / S3 + BigQuery)
About Kikeke Vortexium Technologies
Kikeke Vortexium Technologies Private Limited is a technology company based in Indore, India, developing high-performance data and gaming platforms for the Casino Elite Group (Belgium).
We are currently expanding our data infrastructure to enable advanced analytics, business intelligence, and financial reporting across multiple business entities in Europe and Asia.
Our next milestone is to build a hybrid data platform combining AWS and Google Cloud technologies — keeping Amazon S3 as our data lake and BigQuery as our analytics warehouse.
Project Overview
We are seeking an experienced Data Platform Engineer (contract / freelance) to design and implement our next-generation data architecture.
This project will establish our full data foundation — from ingestion to analytics — connecting our transactional systems, marketing data, and payment flows into a single, governed analytics environment.
Responsibilities
- Design and implement a multi-zone data lake in Amazon S3 (raw, staging, curated)
- Develop data ingestion and transformation pipelines (Aurora → S3 → BigQuery)
- Set up orchestration and scheduling using Airflow / Cloud Composer
- Manage schema evolution, partitioning, and data cataloging (AWS Glue / GCP Data Catalog)
- Implement data validation, monitoring, and error handling
- Optimize cost and performance across both clouds
- Create data models and views in BigQuery for analytics teams
- Provide clear documentation and handover for long-term maintainability
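To make the multi-zone lake design concrete, here is a minimal sketch of the kind of partitioned S3 key convention we have in mind for the raw / staging / curated zones. The zone, source, and table names below are illustrative only; the actual bucket layout would be agreed during the design phase.

```python
from datetime import date

# Hypothetical zone names; the final convention is part of the project scope.
ZONES = ("raw", "staging", "curated")

def s3_key(zone: str, source: str, table: str, run_date: date) -> str:
    """Build a partitioned S3 object prefix for one pipeline run.

    Layout: <zone>/<source>/<table>/dt=YYYY-MM-DD/
    Hive-style dt= partitions keep the data queryable from Athena
    and easy to map onto date-partitioned BigQuery tables.
    """
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone!r}")
    return f"{zone}/{source}/{table}/dt={run_date.isoformat()}/"

print(s3_key("raw", "aurora", "transactions", date(2024, 5, 1)))
# raw/aurora/transactions/dt=2024-05-01/
```

A convention like this is one option among several; the engineer we hire would be expected to propose and document the final layout.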
Technical Requirements
- 4–8 years of experience in data engineering or data platform architecture
- Proven expertise with:
  - AWS: S3, Glue, Lambda, Athena
  - GCP: BigQuery, Dataflow, Cloud Composer
- Python & SQL for ETL / ELT pipelines
- Airflow / Cloud Composer for orchestration
- Terraform / IaC (nice to have)
- dbt or equivalent for analytics transformations (bonus)
- Experience building cross-cloud pipelines (AWS → GCP)
- Strong understanding of data lake design, governance, and optimization
- Experience integrating with marketing or transactional systems is a plus
Engagement Details
- Type: Project-based / Contract
- Location: Remote (India-based)
- Start Date: Immediate
- Estimated Duration: 2–3 months
- Budget: Competitive, depending on experience and deliverables
What We Offer
- Work on a real-world, production-grade hybrid data architecture
- Collaboration with an international team across Belgium and India
- Opportunity for follow-up or retainer work after successful delivery
How to Apply
Interested candidates should send:
- A short cover message with availability
- LinkedIn profile or CV
- Examples of previous data platform work (S3 / BigQuery / pipeline projects)
- Expected project fee
Skills Required
S3, Airflow, BigQuery, SQL, Lambda, GCP, Terraform, dbt, Glue, Dataflow, Python, AWS