Looking for a Freelance Data Quality Engineer to join a team of rockstar developers. The candidate should have a minimum of 8 years of experience.
There are multiple openings. If you're looking for a freelance / part-time opportunity (alongside your day job) and a chance to work with the top 0.1% of developers in the industry, this one is for you! You will report to IIT / BITS graduates with 10+ years of development experience and work with F500 companies (our customers).
Company Background - We are a multinational software company that is growing at a fast pace. We have offices in Florida & New Delhi, and our clientele spans the US, Australia & APAC. To give you a sense of our growth rate, we've added 70+ employees in the last 6 weeks alone and expect another 125+ by the end of Q3 2025.
Key Responsibilities :
- Design, develop, and maintain scalable data pipelines using Python, SQL, and PySpark / ETL tools.
- Implement and manage data warehouse solutions using Snowflake or equivalent platforms.
- Ensure data quality, integrity, and validation through automated testing and data quality frameworks.
- Collaborate with data analysts, data scientists, and business teams to define and deliver clean, reliable datasets.
- Develop and maintain CI / CD pipelines for data workflows and manage deployments across environments.
- Work within cloud ecosystems (AWS, GCP, or Azure) for data storage, processing, and orchestration.
- Optimize data performance and troubleshoot pipeline or integration issues.
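To give candidates a concrete sense of the data-quality work described above, here is a minimal, illustrative sketch of an automated rule-based validation check. It uses only the Python standard library; the field names and rules are hypothetical examples, not part of our stack.

```python
# Illustrative data quality check: apply named validation rules to rows
# and count failures per rule. Field names and rules are hypothetical.
from dataclasses import dataclass, field


@dataclass
class QualityReport:
    total: int = 0
    failures: dict = field(default_factory=dict)  # rule name -> failing row count


def check_rows(rows, rules):
    """Run every named rule against every row; tally failures per rule."""
    report = QualityReport(total=len(rows))
    for row in rows:
        for name, rule in rules.items():
            if not rule(row):
                report.failures[name] = report.failures.get(name, 0) + 1
    return report


rows = [
    {"order_id": 1, "amount": 120.0, "country": "US"},
    {"order_id": 2, "amount": -5.0, "country": "AU"},    # fails non_negative_amount
    {"order_id": None, "amount": 40.0, "country": "IN"},  # fails order_id_present
]
rules = {
    "order_id_present": lambda r: r["order_id"] is not None,
    "non_negative_amount": lambda r: r["amount"] >= 0,
}
report = check_rows(rows, rules)
```

In production this pattern is typically expressed through a data quality framework (e.g. dbt tests or Great Expectations) rather than hand-rolled, but the underlying idea is the same: declarative rules, automated execution, per-rule failure reporting.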
Must-Have Skills :
- Python & SQL – strong hands-on coding and query optimization experience.
- PySpark or ETL Tools – experience building and maintaining ETL / ELT pipelines.
- Snowflake / Data Warehousing Expertise – schema design, data modeling, performance tuning.
- Data Quality Frameworks & Testing – implementation of data validation, reconciliation, and test automation.
- CI / CD & Cloud (AWS / GCP / Azure) – understanding of cloud-based data workflows and DevOps integration.
Good-to-Have Skills :
- Experience with Airflow, dbt, or similar orchestration tools.
- Familiarity with Docker / Kubernetes for containerized deployments.
- Exposure to data governance, cataloging, and lineage tools.
- Basic understanding of BI tools (e.g., Power BI, Tableau, Looker).
What we need :
- ~35 hours of work per week.
- 100% remote from our side.
- You will be paid out every month.
- Minimum 5 years of experience.
- Please apply only if you currently have a 100% remote job.
- If you do well, this will continue for a long time.