About the Role:
We are looking for a highly experienced and hands-on GCP Data Engineer (AdTech) to lead our analytics team. This role is ideal for someone with a strong background in log-level data handling, cross-platform data engineering, and a solid command of modern BI tools. You'll play a key role in building scalable data pipelines, leading analytics strategy, and mentoring a team of analysts.
Experience Required: 4–6 years of experience in data analytics or data engineering roles, with at least 1–2 years in a leadership capacity.
Note: The default working hours are 5pm–2am.
Responsibilities:
- Lead and mentor a team of data analysts, ensuring quality delivery and technical upskilling.
- Design, develop, and maintain scalable ETL/ELT pipelines using GCP tools (BigQuery, Dataflow, Cloud Composer, Cloud Functions, Pub/Sub); a minimal pipeline sketch follows this list.
- Ingest and process log-level data from platforms like Google Ad Manager, Google Analytics (GA4/UA), DV360, and other advertising and marketing tech sources.
- Build and optimize data pipelines from diverse sources via APIs, cloud connectors, and third-party tools (e.g., Supermetrics, Fivetran, Stitch).
- Integrate and manage data across multiple cloud platforms and data warehouses such as BigQuery, Snowflake, DOMO, and AWS (Redshift, S3).
- Own the creation of data models, data marts, and analytical layers to support dashboards and deep-dive analyses.
- Build and maintain scalable, intuitive dashboards using Looker Studio, Tableau, Power BI, or Looker.
- Partner with engineering, product, revenue ops, and client teams to gather requirements and drive strategic insights from data.
- Ensure data governance, security, and quality standards are followed across the analytics ecosystem.
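For flavor, here is a minimal sketch of the kind of Composer-orchestrated load this role owns: an Airflow DAG that appends daily log-level export files from Cloud Storage into BigQuery. The DAG id, bucket, object path, and destination table are hypothetical placeholders, not references to our actual environment.

```python
# Minimal Cloud Composer (Airflow) DAG sketch: append daily log-level
# export files from Cloud Storage into BigQuery. All names below are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="adtech_log_ingest",        # hypothetical DAG name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_logs = GCSToBigQueryOperator(
        task_id="load_gam_logs",
        bucket="example-adtech-logs",                   # hypothetical bucket
        source_objects=["gam/{{ ds }}/*.csv.gz"],       # one folder per day
        destination_project_dataset_table="analytics.gam_log_level",  # hypothetical table
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```

The same pattern extends naturally to other daily feeds (for example, DV360 or GA4 exports), typically with date-partitioned destination tables.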
Required Qualifications:
- 4–6 years of experience in data analytics or data engineering roles, with at least 1–2 years in a leadership capacity.
- Deep expertise working with log-level AdTech data: Google Ad Manager, Google Analytics, GA4, programmatic delivery logs, and campaign-level data.
- Strong knowledge of SQL and Google BigQuery for large-scale data querying and transformation.
- Hands-on experience building data pipelines using GCP tools (Dataflow, Composer, Cloud Functions, Pub/Sub, Cloud Storage).
- Proven experience integrating data from various APIs and third-party connectors.
- Experience working with multiple data warehouses: Snowflake, DOMO, AWS Redshift, etc.
- Strong skills in data visualization tools: Looker Studio, Tableau, Power BI, or Looker.
- Excellent stakeholder communication and documentation skills.
Preferred Qualifications:
- Scripting experience in Python or JavaScript for automation and custom ETL development (see the sketch after this list).
- Familiarity with version control (e.g., Git), CI/CD pipelines, and workflow orchestration.
- Exposure to privacy regulations and consent-based data handling in digital advertising (GDPR, CCPA).
- Experience working in agile environments and managing delivery timelines across multiple stakeholders.
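To illustrate the scripting bullet above, here is a brief sketch of a custom ETL step in Python: pull one day of campaign rows from a reporting API and stream them into BigQuery. The endpoint URL, response shape, and table name are assumptions for illustration only.

```python
# Sketch of a custom ETL step: fetch rows from a (hypothetical)
# reporting API and stream them into a (hypothetical) BigQuery table.
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/campaign-report"  # hypothetical endpoint

def load_report(date: str) -> None:
    # Fetch one day of campaign-level rows as JSON.
    resp = requests.get(API_URL, params={"date": date}, timeout=60)
    resp.raise_for_status()
    rows = resp.json()["rows"]  # assumed response shape

    # Stream the rows into BigQuery; insert_rows_json returns any errors.
    client = bigquery.Client()
    errors = client.insert_rows_json("analytics.campaign_daily", rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")

if __name__ == "__main__":
    load_report("2024-01-01")
```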