Project-Based Data Platform Engineer (AWS + GCP / S3 + BigQuery)

Confidential • Indore, India
Job description

About Kikeke Vortexium Technologies

Kikeke Vortexium Technologies Private Limited is a technology company based in Indore, India, developing high-performance data and gaming platforms for the Casino Elite Group (Belgium).

We are currently expanding our data infrastructure to enable advanced analytics, business intelligence, and financial reporting across multiple business entities in Europe and Asia.

Our next milestone is to build a hybrid data platform combining AWS and Google Cloud technologies — keeping Amazon S3 as our data lake and BigQuery as our analytics warehouse.
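To give a concrete picture, a zone-based lake layout could use S3 key prefixes along these lines (a minimal sketch; the bucket, source, and table names are illustrative, not our actual configuration):

# A minimal sketch of a zoned S3 key convention (all names illustrative).
def lake_key(zone: str, source: str, table: str, dt: str, filename: str) -> str:
    """Build an object key for one of the lake zones (raw / staging / curated)."""
    assert zone in {"raw", "staging", "curated"}
    return f"{zone}/{source}/{table}/dt={dt}/{filename}"

# -> "raw/aurora/payments/dt=2024-05-01/part-0000.parquet"
print(lake_key("raw", "aurora", "payments", "2024-05-01", "part-0000.parquet"))

A dt= partition prefix like this keeps Athena scans and Glue crawls cheap and makes per-day reprocessing straightforward.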

Project Overview

We are seeking an experienced Data Platform Engineer (contract / freelance) to design and implement our next-generation data architecture.

This project will establish our full data foundation — from ingestion to analytics — connecting our transactional systems, marketing data, and payment flows into a single, governed analytics environment.

Responsibilities

  • Design and implement a multi-zone data lake in Amazon S3 (raw, staging, curated)
  • Develop data ingestion and transformation pipelines (Aurora → S3 → BigQuery; see the sketch after this list)
  • Set up orchestration and scheduling using Airflow / Cloud Composer
  • Manage schema evolution, partitioning, and data cataloging (AWS Glue / GCP Data Catalog)
  • Implement data validation, monitoring, and error handling
  • Optimize cost and performance across both clouds
  • Create data models and views in BigQuery for analytics teams
  • Provide clear documentation and handover for long-term maintainability
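
To make the intended shape concrete, here is a minimal Airflow sketch of the Aurora → S3 → BigQuery flow, built from the standard Amazon and Google provider transfer operators. Connection IDs, bucket names, and table names are placeholders, not our actual configuration:

from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.sql_to_s3 import SqlToS3Operator
from airflow.providers.google.cloud.transfers.s3_to_gcs import S3ToGCSOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="aurora_to_bigquery",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Export one day of transactions from Aurora into the raw S3 zone.
    aurora_to_s3 = SqlToS3Operator(
        task_id="aurora_to_s3",
        sql_conn_id="aurora_mysql",        # placeholder Airflow connection
        query="SELECT * FROM transactions WHERE dt = '{{ ds }}'",
        s3_bucket="example-data-lake",     # placeholder bucket
        s3_key="raw/aurora/transactions/dt={{ ds }}/data.parquet",
        file_format="parquet",
        replace=True,
    )

    # Copy the day's objects across clouds into a GCS staging bucket.
    s3_to_gcs = S3ToGCSOperator(
        task_id="s3_to_gcs",
        bucket="example-data-lake",
        prefix="raw/aurora/transactions/dt={{ ds }}/",
        dest_gcs="gs://example-gcs-staging/",   # placeholder GCS bucket
        replace=True,
    )

    # Load the staged Parquet files into BigQuery.
    gcs_to_bq = GCSToBigQueryOperator(
        task_id="gcs_to_bq",
        bucket="example-gcs-staging",
        source_objects=["raw/aurora/transactions/dt={{ ds }}/*.parquet"],
        destination_project_dataset_table="analytics.transactions",
        source_format="PARQUET",
        write_disposition="WRITE_APPEND",
    )

    aurora_to_s3 >> s3_to_gcs >> gcs_to_bq

Since Cloud Composer is managed Airflow, a DAG of this shape can run there as-is once the AWS connection is configured.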

Technical Requirements

  • 4–8 years of experience in data engineering or data platform architecture
  • Proven expertise with:
      • AWS: S3, Glue, Lambda, Athena
      • GCP: BigQuery, Dataflow, Cloud Composer
      • Python & SQL for ETL / ELT pipelines
      • Airflow / Cloud Composer for orchestration
      • Terraform / IaC (nice to have)
      • dbt or equivalent for analytics transformations (bonus; see the sketch after this list)
  • Experience building cross-cloud pipelines (AWS → GCP)
  • Strong understanding of data lake design, governance, and optimization
  • Experience integrating with marketing or transactional systems is a plus
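
Whether the transformations are managed with dbt or hand-written SQL, the deliverable is a set of curated models and views in BigQuery. A minimal sketch using the google-cloud-bigquery client (the dataset, table, and column names are illustrative only):

from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# A curated, date-partitioned fact table (cheaper scans, faster pruning).
client.query("""
    CREATE TABLE IF NOT EXISTS curated.fct_payments (
        payment_id STRING,
        entity     STRING,
        amount     NUMERIC,
        paid_at    TIMESTAMP
    )
    PARTITION BY DATE(paid_at)
    CLUSTER BY entity
""").result()

# A reporting view that exposes only what analysts need.
client.query("""
    CREATE OR REPLACE VIEW curated.v_daily_revenue AS
    SELECT DATE(paid_at) AS day, entity, SUM(amount) AS revenue
    FROM curated.fct_payments
    GROUP BY day, entity
""").result()

Partitioning and clustering choices like these are also where much of the cross-cloud cost optimization mentioned above happens.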

Engagement Details

  • Type: Project-based / Contract
  • Location: Remote (India-based)
  • Start Date: Immediate
  • Estimated Duration: 2–3 months
  • Budget: Competitive, depending on experience and deliverables

What We Offer

  • Work on a real-world, production-grade hybrid data architecture
  • Collaboration with an international team across Belgium and India
  • Opportunity for follow-up or retainer work after successful delivery

How to Apply

Interested candidates should send:

  • A short cover message with availability
  • LinkedIn profile or CV
  • Examples of previous data platform work (S3 / BigQuery / pipeline projects)
  • Expected project fee

Skills Required

S3, Airflow, BigQuery, SQL, Lambda, GCP, Terraform, dbt, Glue, Dataflow, Python, AWS
