Job Overview :
We are looking for highly skilled Senior and Mid-to-Junior Technical Consultants with expertise in SQL, PySpark, Python, Airflow, and API development. The ideal candidates will have hands-on experience with data warehousing concepts (fact and dimension modeling) and a strong understanding of supply chain domain processes. You will work closely with cross-functional teams to develop, optimize, and implement scalable data solutions.
Key Responsibilities :
- Design, develop, and optimize data pipelines using PySpark, SQL, and Python.
- Implement and manage Airflow DAGs for workflow automation.
- Work with APIs to integrate data sources and ensure seamless data exchange.
- Develop and maintain data models based on fact and dimension tables for efficient reporting and analytics.
- Optimize query performance and data processing for large datasets.
- Collaborate with business analysts, data engineers, and stakeholders to understand business requirements and translate them into technical solutions.
- Ensure data quality, reliability, and scalability of solutions.
- Provide mentorship to junior team members (for the Senior Technical Consultant role).
Skills & Qualifications :
- Strong proficiency in SQL, PySpark, and Python.
- Hands-on experience with Airflow for scheduling and orchestrating workflows.
- Expertise in working with APIs (development and integration).
- Solid understanding of data warehousing concepts (Fact & Dimension modeling).
- Experience in the supply chain domain is highly preferred.
- Knowledge of cloud platforms (AWS, Azure, or GCP) is a plus but not mandatory.
- Excellent problem-solving skills and ability to work in an agile environment.
- Strong communication skills to collaborate effectively with cross-functional teams.
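
To give candidates a concrete sense of the fact-and-dimension modeling this role involves, here is a minimal, hypothetical sketch of a star-schema query. All table and column names are invented for illustration, and SQLite stands in for the actual warehouse engine:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes of products in the supply chain.
cur.execute(
    "CREATE TABLE dim_product ("
    "  product_id INTEGER PRIMARY KEY, name TEXT, category TEXT)"
)
# Fact table: one row per shipment event, keyed to the dimension.
cur.execute(
    "CREATE TABLE fact_shipment ("
    "  shipment_id INTEGER PRIMARY KEY, product_id INTEGER, qty INTEGER,"
    "  FOREIGN KEY (product_id) REFERENCES dim_product (product_id))"
)

cur.executemany(
    "INSERT INTO dim_product VALUES (?, ?, ?)",
    [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")],
)
cur.executemany(
    "INSERT INTO fact_shipment VALUES (?, ?, ?)",
    [(10, 1, 5), (11, 1, 7), (12, 2, 3)],
)

# Typical reporting query: aggregate the fact table,
# grouped by an attribute from the dimension.
cur.execute(
    """
    SELECT p.name, SUM(f.qty) AS total_qty
    FROM fact_shipment AS f
    JOIN dim_product AS p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
    """
)
rows = cur.fetchall()
print(rows)  # [('Gadget', 3), ('Widget', 12)]
conn.close()
```

The same pattern scales up in the role: measures live in large fact tables, descriptive attributes in smaller dimensions, and reporting queries join the two and aggregate, which is what the query-performance responsibilities above are about.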
(ref : hirist.tech)