Job Description
As an Analytics Engineer, you will lead data pipeline, data strategy, and data visualization initiatives for the Data & Analytics organization.
You’re an engineer who understands not only how to leverage big data to answer complex business questions, but also how to design semantic layers that support self-service analytics.
You will manage projects from requirements gathering to planning to implementation of full-stack data solutions (pipelines → data tables → visualizations).
You will work closely with cross-functional partners to ensure business logic is accurately represented in the semantic layer and production environments, empowering the wider Product Analytics team to drive business insights and strategy.
What Will You Do?
Design and implement data models that support flexible querying and data visualization.
Partner with Product stakeholders to understand business questions and build advanced analytical solutions.
Advance automation efforts to reduce time spent on data manipulation and increase time for analysis.
Build frameworks that enhance team productivity and are intuitive for other data teams to leverage.
Establish and support analytics development standards and best practices.
Create systematic solutions for data anomalies: identification, alerting, and root-cause analysis.
Collaborate proactively with stakeholders to prepare data solutions for new product / feature releases, ensuring data quality and addressing any nuances.
Identify and explore new opportunities through creative analytical and engineering methods.
What To Bring
Bachelor’s degree in Engineering
4–8 years of relevant experience in Business Intelligence / Data Engineering
Expertise in SQL (clean, efficient code is essential) and strong grasp of data warehousing concepts such as star schemas, slowly changing dimensions, ELT / ETL, and MPP databases
Hands-on experience with dbt for building transformations
Proven ability to transform flawed or changing data into consistent, trustworthy datasets and to develop DAGs that batch-process large data volumes
Experience with general-purpose programming languages (e.g., Python, Java, Go) and familiarity with data structures, algorithms, and serialization formats
Advanced proficiency in building reports and dashboards with BI tools such as Looker and Tableau
Experience with analytics tools such as Athena, Redshift / BigQuery, Splunk, etc.
Proficiency in Git (or similar version control systems) and CI / CD best practices
Experience managing workflows using Agile practices
Excellent documentation and communication skills, with a high degree of precision
Strong problem-solving skills and ability to work independently on ambiguous challenges
Ability to manage multiple projects and time constraints effectively
Attention to data quality — ensuring processed data is interpreted and used correctly
Experience with digital products, streaming services, or subscription-based products is preferred
Strong written and verbal communication skills
Engineer • Noida, Uttar Pradesh, India