About Us
MyRemoteTeam, Inc. is a fast-growing distributed workforce enabler, helping companies scale with top global talent. We empower businesses by providing world-class software engineers, operations support, and infrastructure to help them grow faster and operate more effectively.
Position: Snowflake & Databricks Data Engineer / Data Science Engineer
Experience: 9+ Years
Location Preference: Any Location
Joining: Immediate
Job Summary
We are looking for an experienced professional with strong hands-on expertise in building scalable data pipelines, designing data warehousing solutions, and working with modern cloud-based data engineering frameworks.
Key Responsibilities
- Design, build, and optimise ETL/ELT data pipelines using Snowflake and Databricks
- Develop scalable data models, schemas, and data warehousing solutions
- Implement complex transformations using PySpark, SQL, and Spark
- Work with cloud platforms (AWS/Azure/GCP) for data ingestion, orchestration, and automation
- Ensure strong data quality, governance, and performance optimisation
- Collaborate with data scientists, analysts, and cross-functional teams
- Build and deploy ML workflows and notebooks (preferred)
Required Skills
- Strong hands-on experience with Snowflake (warehouses, stages, streams, tasks, stored procedures)
- Expertise in Databricks (PySpark, Delta Lake, notebooks, jobs)
- Strong proficiency in SQL, Python, and distributed data processing
- Experience with ETL tools and automated pipelines
- Familiarity with cloud tools such as AWS Glue, Azure Data Factory, or GCP equivalents
- Knowledge of CI/CD for data engineering (added advantage)
Interview Process (Remote)
- Pre-Screening (15 minutes) – Basic communication and initial technical check
- Technical Screening Round – Detailed technical discussion
- Client Technical Round – Final technical evaluation