Manager, Data Engineering
Pune, Maharashtra, India
Full-time
Region: India
Position Overview:
We're seeking a Manager - Data Engineering to lead our growing data engineering team based in Pune. This leadership role will focus on building scalable, secure, and high-performance data platforms, pipelines, and products for our global, multi-tenant SaaS applications.
You will work closely with cross-functional teams including Product, Architecture, UX, QA, and DevOps to deliver robust data solutions that power products and reporting systems - especially in the areas of ETL pipelines, cloud-native data warehousing, and unstructured data lakes.
Key Responsibilities:
Team Leadership & Development:
- Lead, mentor, and retain a high-performing team of data engineers
- Conduct regular 1:1s, performance reviews, and growth planning
- Foster a collaborative team culture and instill best practices
Project & Delivery Management:
- Drive delivery of data solutions aligned with sprint and release goals
- Ensure on-time delivery, high code quality, and scalability
- Facilitate agile ceremonies: sprint planning, retrospectives, and stand-ups
Technical Execution & Architecture:
- Architect and guide the development of scalable ETL/ELT pipelines
- Build and maintain data lake solutions using AWS tools to manage unstructured and semi-structured data
- Work with large-scale datasets from diverse sources, including APIs, logs, files, and internal systems
- Optimize the performance, security, and maintainability of data pipelines
- Promote the use of tools such as Snowflake, dbt, Python, and SQL
Data Governance & Best Practices:
- Ensure adherence to internal coding standards and data security guidelines
- Implement best practices for data modeling, quality checks, and documentation
- Collaborate with architecture and infrastructure teams on cloud cost optimization and performance
Required Skills & Experience:
- 12-15 years in Data Engineering, Data Architecture, or similar roles
- Minimum 3 years in a leadership or managerial capacity
- Proven experience building robust ETL pipelines, preferably for multi-tenant SaaS platforms
- Strong hands-on technical expertise with:
1. AWS Services: S3, Glue, Lambda, Redshift, EMR
2. Data Platforms: Snowflake, dbt
3. Programming: Python, SQL
4. Unstructured Data Handling: Data lakes, JSON, XML, log data
- Expertise in SQL-based data warehousing and RDBMS systems
- Knowledge of CI/CD, version control (GitHub), and Agile/Scrum methodologies
- Ability to balance technical depth with stakeholder communication and delivery tracking
- Understanding of modern lakehouse architecture and tools such as Apache Hudi, Iceberg, or Delta Lake
Good to Have:
- Experience in product-based companies (SaaS, ESG, or Supply Chain domains preferred)
- Familiarity with data security standards (e.g., GDPR, SOC 2)
- Experience with orchestration tools (Airflow, Step Functions), data cataloging, or cost optimization