Job Description
We are seeking an experienced Data Solutions Architect to join our team. The successful candidate will be responsible for designing and implementing scalable data platforms and solutions.
- The ideal candidate will have advanced hands-on experience with Snowflake, Databricks, DBT, Airflow, and Python.
- Deep expertise in building robust workflows with Apache Airflow and strong programming skills in Python are also required.
- A proven track record of implementing AI/ML solutions is essential, along with basic exposure to GenAI (LLMs, prompt engineering, and AI API integration).
Responsibilities include:
- Architecting and implementing scalable data platforms and solutions using Snowflake, Databricks, DBT, Airflow, and Python.
- Designing and implementing robust ELT/ETL workflows for batch and real-time processing.
- Mentoring data engineering teams in coding standards, CI/CD, and best practices.
- Collaborating with business stakeholders to define requirements and translate needs into technical architectures.

Qualifications and Skills
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 7+ years of professional experience in data architecture, engineering, or analytics roles.
- Advanced knowledge of cloud platforms, including Azure, AWS, or GCP.
- Strong problem-solving, communication, and stakeholder management skills.
- Certifications in Snowflake, Databricks, Python, or cloud platforms are a plus.

Benefits
This role offers a unique opportunity to work on cutting-edge projects and collaborate with experienced professionals in the field.