Location: Remote
Type: Part-time (20–25 hours per week)
About the Role
We are seeking a skilled Data Engineer to support our growing data operations on a part-time basis. The ideal candidate has hands-on experience building, optimizing, and maintaining data pipelines and architectures. You’ll work closely with analysts, developers, and business teams to ensure reliable, efficient, and scalable data systems.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines to collect, process, and store data from multiple sources.
- Build and optimize data models and warehouse structures for analytics and reporting.
- Ensure data quality, integrity, and security across all stages of the data lifecycle.
- Collaborate with stakeholders to understand data requirements and provide scalable solutions.
- Automate repetitive processes to improve performance and reliability.
- Monitor, troubleshoot, and resolve issues within data workflows and integrations.
Required Skills & Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 2–4 years of experience in data engineering, data architecture, or a related role.
- Proficiency in SQL and one or more programming languages (Python preferred).
- Hands-on experience with data pipeline tools (e.g., Airflow, dbt, Luigi).
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of data warehousing (e.g., Snowflake, BigQuery, Redshift).
- Understanding of APIs, JSON, REST, and data integration techniques.
- Familiarity with version control (Git) and CI/CD workflows.
Preferred (Nice to Have)
- Experience with data visualization tools (Power BI, Tableau, Looker).
- Exposure to machine learning pipelines or data lake architectures.
- Prior experience in a part-time or remote setup.
What We Offer
- Flexible working hours (ideal for professionals balancing other commitments).
- Remote-first environment with collaborative tools.
- Opportunity to work with modern data technologies.
- Supportive and growth-focused team culture.