Tezo is a new-generation Digital & AI solutions provider with a history of creating remarkable outcomes for our customers. We deliver exceptional experiences using cutting-edge analytics, data proficiency, technology, and digital excellence.
Senior Data Engineer – Azure & Snowflake
Location: Hyderabad
Experience Level: 8–13 Years
- 8+ years of experience in data engineering and ETL development.
- Strong proficiency in SQL (performance tuning, window functions, stored procedures).
- Hands-on experience with Azure Data Factory, Azure Synapse, Azure Data Lake, and Databricks.
- Snowflake expertise in data loading, transformations, and query optimization.
- Strong Python skills, including working with Pandas/PySpark dataframes in Jupyter notebooks (see the PySpark sketch after this list).
- Experience with data orchestration, workflow scheduling, and CI/CD for data pipelines.
- Good understanding of data modeling principles, data governance, and metadata management.
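To give a concrete flavour of the dataframe work referenced above, the following is a minimal PySpark sketch of a window-function transformation. The table, column names, and lake path (orders, customer_id, order_ts, amount, /lake/curated/orders) are illustrative assumptions, not part of any particular project.

```python
# Minimal sketch: rank each customer's orders and compute a running spend total
# with PySpark window functions. All table/column names and paths are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-function-sketch").getOrCreate()

orders = spark.read.parquet("/lake/curated/orders")  # assumed curated-zone path

# Most recent order per customer gets rank 1.
recency = Window.partitionBy("customer_id").orderBy(F.col("order_ts").desc())
# Running total ordered oldest-to-newest within each customer.
running = (
    Window.partitionBy("customer_id")
    .orderBy("order_ts")
    .rowsBetween(Window.unboundedPreceding, Window.currentRow)
)

enriched = (
    orders
    .withColumn("order_rank", F.row_number().over(recency))
    .withColumn("running_spend", F.sum("amount").over(running))
)

enriched.show(10)
```

The same logic could equally be expressed as a SQL window function (ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)) run through spark.sql or directly in Snowflake.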
Key Responsibilities
Data Engineering & Pipeline Development
- Design, build, and maintain scalable and reusable data pipelines for batch and real-time ingestion using Azure Data Factory, Databricks, or PySpark (see the ingestion sketch after this list).
- Develop and optimize ETL/ELT workflows to extract data from heterogeneous sources (files, APIs, databases, applications).
- Build and maintain data processing logic using Python, PySpark, and SQL.
- Work with dataframes in Jupyter notebooks for transformation, validation, and feature creation.
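As an illustration of the batch-ingestion work listed above, here is a minimal PySpark sketch that lands raw CSV files into a Parquet staging zone. The ADLS container names, paths, and columns are assumptions for illustration; in practice this kind of job is typically triggered from Azure Data Factory or a Databricks workflow.

```python
# Minimal batch-ingestion sketch: land raw CSV files into a Parquet staging zone
# with basic typing, deduplication, and a load-date partition.
# All storage paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-ingestion-sketch").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/*.csv")  # assumed ADLS path
)

staged = (
    raw
    .withColumn("load_date", F.current_date())  # audit column for lineage
    .dropDuplicates()                            # simple idempotency guard
)

(
    staged.write
    .mode("append")
    .partitionBy("load_date")
    .parquet("abfss://staging@examplelake.dfs.core.windows.net/sales/")
)
```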
Data Architecture & Cloud Integration
- Implement and maintain data solutions on Azure and Snowflake (see the Snowflake loading sketch after this list).
- Design and manage data storage layers (staging, raw, curated, and consumption zones) within a Data Lake / Lakehouse architecture.
- Integrate structured and semi-structured data (JSON, Parquet, CSV, etc.) from multiple systems and sources.
- Collaborate with architects and modelers to align pipelines with data models and governance standards.
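For the Snowflake side, below is a minimal sketch of loading staged Parquet files into a curated table with the snowflake-connector-python package. The connection parameters, stage name (RAW_STAGE), and target table (CURATED.SALES) are placeholders only.

```python
# Minimal sketch: load Parquet files from an (assumed) external stage into a
# curated Snowflake table using COPY INTO. Connection details, stage, and table
# names are placeholders, not real credentials or objects.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="********",    # use a secret store in practice
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="CURATED",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls staged Parquet files and maps columns by name.
    cur.execute("""
        COPY INTO CURATED.SALES
        FROM @RAW_STAGE/sales/
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # per-file load results returned by Snowflake
finally:
    conn.close()
```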
Performance Optimization & Data Quality
- Optimize pipelines for performance, scalability, and cost efficiency.
- Implement data validation, error handling, and audit mechanisms in ETL workflows (see the data-quality sketch after this list).
- Participate in data profiling and quality assessments to ensure reliability and accuracy.
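As a sketch of the validation and audit mechanisms mentioned above, the snippet below runs simple row-count and null-rate checks on a PySpark dataframe and records the outcome before failing fast. The thresholds, column names, and audit path are assumptions.

```python
# Minimal data-quality sketch: row-count and key-null-rate checks before
# publishing a dataframe downstream. Thresholds, columns, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()
df = spark.read.parquet("/lake/staging/sales")  # assumed staging path

total_rows = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()

checks = {
    "non_empty": total_rows > 0,
    "key_null_rate_ok": total_rows == 0 or (null_keys / total_rows) < 0.01,
}

# Record the outcome so failures are auditable, then fail fast if any check broke.
spark.createDataFrame(
    [(name, passed, total_rows) for name, passed in checks.items()],
    ["check_name", "passed", "row_count"],
).write.mode("append").parquet("/lake/audit/dq_results")

if not all(checks.values()):
    raise ValueError(f"Data quality checks failed: {checks}")
```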
Collaboration & Stakeholder Engagement
- Work closely with data modelers, analysts, and business teams to translate functional requirements into technical solutions.
- Participate in code reviews, architecture discussions, and Agile sprint ceremonies.
- Document processes, data flow diagrams, and technical designs.