Responsibilities :
- Lead and drive the design and architecture of data solutions, ensuring alignment with business goals and technical requirements.
- Oversee the implementation and automation of data pipelines and workflows, using tools such as DBT, DataVault (AutomateDV), Snowflake, Airflow, and GitLab.
- Collaborate with engineering and business teams to gather requirements and ensure solutions meet functional and non-functional expectations.
- Provide mentorship and guidance to senior and junior team members on best practices, architecture, and development techniques.
- Evaluate and recommend new technologies, tools, and frameworks to enhance the team's capabilities.
- Conduct code reviews, implement robust testing strategies, and ensure high standards of quality and performance.
- Manage and optimize data storage and processing within Snowflake to ensure scalability, reliability, and cost-effectiveness.
- Work with DevOps and Infrastructure teams to automate and streamline data workflows and deployment processes.
Requirements :
- 5+ years of experience in data engineering or related fields, with 3+ years in an architecture or leadership role.
- Extensive experience with DBT, DataVault (AutomateDV), Snowflake, Airflow, and GitLab.
- Strong experience in designing and building scalable, reliable, and efficient data pipelines and data storage solutions.
- Expertise in cloud platforms (especially Snowflake) and big data technologies.
- Strong communication skills and ability to collaborate with stakeholders at various levels.
- Ability to mentor and lead teams in building high-performance, maintainable data solutions.
Preferred Skills (a plus, but not mandatory) :
- Experience with microservices architecture and development.
- Familiarity with enterprise applications like NetSuite and Salesforce.
- Strong background in agile methodologies and DevOps practices.
(ref : hirist.tech)