Key Responsibilities:
- Architect scalable, secure, and high-performing data solutions using Snowflake, DBT, and Airflow.
- Translate business requirements into technical data models and solutions in collaboration with stakeholders, data engineers, and analysts.
- Define and enforce best practices for data modeling, ELT pipeline development, and metadata management.
- Provide architectural guidance and mentorship to data engineering teams.
- Lead technical reviews, architecture governance, and data quality assessments.
- Ensure adherence to data privacy, security, and compliance regulations.
- Monitor, troubleshoot, and optimize the performance of Snowflake workloads, Airflow DAGs, and DBT models.
- Evaluate and recommend new tools, frameworks, and architectures.

Skills & Qualifications:
- 10+ years of experience in data engineering and architecture roles.
- Proven hands-on experience with:
1. Snowflake: Data warehouse architecture, query optimization, and role-based access control (see the RBAC sketch after this list)
2. DBT: Model creation, testing, documentation, and version control (see the dbt model sketch after this list)
3. Apache Airflow: DAG development, orchestration, monitoring, and scheduling (see the DAG sketch after this list)
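To make the Snowflake RBAC item concrete, here is a minimal sketch of granting read-only access through the official snowflake-connector-python driver. The role, warehouse, database, schema, and user names are illustrative assumptions, not part of this posting.

```python
# Minimal Snowflake RBAC sketch via snowflake-connector-python.
# All object names (ANALYST_RO, ANALYTICS_WH, ...) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="admin_user",      # hypothetical administrator
    password="***",
    role="SECURITYADMIN",   # a role allowed to manage grants
)
try:
    cur = conn.cursor()
    for stmt in (
        "CREATE ROLE IF NOT EXISTS ANALYST_RO",
        "GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST_RO",
        "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
        "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
        "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_RO",
        "GRANT ROLE ANALYST_RO TO USER analyst_user",
    ):
        cur.execute(stmt)  # each statement narrows access to read-only marts
finally:
    conn.close()
```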
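For the DBT item, a minimal sketch of a Python model (supported in dbt 1.3+ on Snowflake via Snowpark); the model and column names are hypothetical, and the tests and documentation mentioned above would live in an accompanying YAML file.

```python
# models/completed_orders.py -- a minimal dbt Python model sketch
# (dbt >= 1.3 on Snowflake/Snowpark). "stg_orders" and "status" are
# hypothetical names, not a prescribed project layout.
def model(dbt, session):
    dbt.config(materialized="table")   # version-controlled like SQL models
    orders = dbt.ref("stg_orders")     # upstream staging model
    return orders.filter(orders["status"] == "completed")
```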
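And for the Airflow item, a minimal DAG sketch using the TaskFlow API (assuming Airflow 2.4+ for the `schedule` parameter); the task names and bodies are placeholders, not a prescribed pipeline.

```python
# Minimal Airflow DAG sketch (TaskFlow API, Airflow 2.4+).
from datetime import datetime
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def elt_pipeline():
    @task
    def extract() -> str:
        return "batch_2024_01_01"      # stand-in for a real extraction step

    @task
    def load_to_snowflake(batch_id: str) -> None:
        print(f"loading {batch_id}")   # stand-in for a COPY INTO / MERGE

    load_to_snowflake(extract())       # declares the task dependency edge

elt_pipeline()
```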
- Strong command of SQL, data modelling (Star/Snowflake schemas), and ETL/ELT processes (see the query sketch after this list)
- Proficiency in Python and scripting for automation and data pipeline development
- Experience with at least one major cloud platform (AWS, Azure, or GCP)
- Familiarity with CI/CD practices for data deployments
- Excellent communication, leadership, and analytical problem-solving skills
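As a sketch of the star-schema modelling called out above, the query below joins one fact table to two dimensions; all table and column names are assumptions for illustration.

```python
# Star-schema query sketch: a fact table joined to its dimensions.
# fact_orders, dim_date, dim_customer and their keys are hypothetical.
STAR_SCHEMA_QUERY = """
SELECT d.calendar_date,
       c.customer_segment,
       SUM(f.order_amount) AS total_revenue
FROM   fact_orders f
JOIN   dim_date     d ON f.date_key     = d.date_key
JOIN   dim_customer c ON f.customer_key = c.customer_key
GROUP  BY d.calendar_date, c.customer_segment
"""

if __name__ == "__main__":
    print(STAR_SCHEMA_QUERY)  # in practice, run through any Snowflake cursor
```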