About Tech Phoenix
Tech Phoenix is a data science and AI startup that delivers cutting-edge solutions to clients across industries. We take on challenging projects ranging from advanced analytics and machine learning to AI-driven automation, helping businesses unlock the full potential of their data. As a growing company, we're constantly pushing boundaries with the latest technologies, and we're now looking to expand our team with a motivated Data Engineer Intern who wants to work on real client projects from day one.
Why Join Us?
At Tech Phoenix, you won't be doing busywork. As a startup, we move fast and give our team members real responsibility early on. You'll work directly on client projects, gain exposure to diverse industries and use cases, and develop skills that matter in today's data-driven world. If you're looking for an internship where you can make a genuine impact while learning from experienced professionals, this is it.
Role Overview
As a Data Engineer Intern, you'll play a hands-on role in designing, building, and maintaining scalable data pipelines and systems for our clients. Working primarily on Databricks, you'll help transform raw data into actionable insights that drive business decisions. This is an excellent opportunity for someone looking to gain real-world experience in a fast-paced, remote-first startup environment.
What You'll Do
Design and develop efficient, scalable data pipelines using Databricks
Build and maintain ETL processes to integrate data from multiple sources
Create and optimize data models and warehouse structures
Support analytics and AI initiatives by preparing clean, reliable datasets
Collaborate with team members and clients to identify data needs and deliver solutions
Document processes, pipelines, and data flows for future reference
Troubleshoot and resolve data quality issues as they arise
Contribute to client-facing projects and learn how real-world data solutions are delivered
What We're Looking For
Required Skills
Solid understanding of data engineering principles and data modeling
Hands-on experience with ETL processes and data warehousing
Strong proficiency in SQL and relational databases
Working knowledge of Databricks
Familiarity with Azure and AWS cloud platforms
Experience with Python and PyCharm (or similar IDEs)
Ability to analyze large datasets and extract meaningful insights
Strong problem-solving skills with excellent attention to detail
Soft Skills
Self-motivated with the ability to work independently in a remote setting
Adaptable and comfortable with changing priorities and tight deadlines
Strong communication skills for remote collaboration with team and clients
Leadership potential, confidence, and integrity
Eagerness to learn and grow in a startup environment
Nice to Have
Experience with big data frameworks (Spark, Hadoop)
Familiarity with data visualization tools (Power BI, Tableau)
Knowledge of version control systems (Git)
Exposure to machine learning workflows or AI projects
Previous internship or project experience in data engineering
What We Offer
Fully remote position with flexible working hours
Hands-on experience with cutting-edge data and AI technologies
Direct involvement in real client projects from the start
Mentorship and guidance from experienced data professionals
A fast-paced startup culture where your contributions truly matter
Room to grow, experiment, and take ownership of your work
Duration & Career Path
This internship begins with a 6-month contract. Based on your performance and contributions, there is a clear path to permanent employment. High performers will have the opportunity to lead their own projects and potentially relocate to the Netherlands to work with our core team.
Compensation
We offer competitive compensation aligned with the labor laws and market standards of your country of residence. As your role grows and your contributions increase, so will your compensation and position within the company.
How to Apply
Send us your CV along with a brief cover letter explaining why you're interested in this role and what you can bring to Tech Phoenix. If you have a portfolio, GitHub profile, or examples of past data or AI projects, we'd love to see them.
Data • Vapi, Gujarat, India