About Us
TELUS Digital Experience
TELUS Digital (TD) Experience partners with the world’s most innovative brands, from tech startups to industry leaders in fintech, gaming, healthcare, and more. We empower businesses to scale and redefine possibilities with integrated customer experience and cutting-edge digital solutions.
Backed by TELUS, our multi-billion-dollar parent company, we offer scalable, multi-language,
and multi-shore capabilities. Our expertise spans digital transformation, AI-driven consulting, IT
lifecycle management, and more – delivered with secure infrastructure, value-driven pricing, and
exceptional service.
AI Data Solutions: Shaping the Future of AI
For nearly two decades, TELUS Digital AI Data Solutions has been a global leader in providing
premium data services for the ever-evolving AI ecosystem. From machine learning to computer
vision and Generative AI (GenAI), we empower the next generation of AI-powered experiences
with high-quality data and human intelligence to test, train, and improve AI models.
Backed by a community of over one million contributors and proprietary AI-driven tools, we deliver solutions designed to cover the training data needs of every project. From custom data collection to advanced data annotation and fine-tuning, our purpose-built tools deliver multimodal data for AI training projects of any complexity, from experimental pilots to ambitious large-scale programs. Examples include empowering GenAI models with human-aligned datasets and fine-tuning data across 20+ domains and 100+ languages, enabling autonomous driving, and advancing extended reality applications with industry-leading data labeling.
Join us to be part of an innovative team shaping the future of AI and driving digital transformation to new heights!
More: https://www.telusdigital.com/solutions/ai-data-solutions
About the Role
We are looking for a hands-on, action-oriented Data Engineer who thrives on execution.
This role is perfect for someone who enjoys rolling up their sleeves, building robust data pipelines, and delivering solutions that directly impact the business. If you prefer doing over discussing, and value outcomes over process, we want you on our team.
Key Responsibilities
Build, deploy, and manage data pipelines using Python and PySpark (see the illustrative sketch after this list)
Develop and optimize ETL/ELT processes to support data integration across systems
Work directly with GCP and AWS services to implement scalable cloud-based data solutions
Own data workflows end-to-end — from ingestion to transformation to storage
Continuously monitor and improve pipeline reliability, speed, and data quality
Use GenAI and automation tools to speed up development and reduce manual effort
Proactively debug, troubleshoot, and resolve data engineering issues
Ensure data is available and trustworthy for analytics and downstream systems
Deliver high-quality code and documentation with a bias for action
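To give a feel for the day-to-day work described above, the sketch below shows a minimal, hypothetical PySpark ETL job: read raw events, clean and aggregate them, and write curated output. The bucket paths, column names, and schema are illustrative assumptions only, not part of the role description.

# Minimal, hypothetical PySpark ETL sketch (paths and columns are assumed for illustration)
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_events_etl").getOrCreate()

# Extract: raw JSON events landed by an upstream ingestion job (assumed location)
raw = spark.read.json("gs://example-bucket/raw/events/dt=2024-01-01/")

# Transform: drop malformed rows, normalize timestamps, aggregate per user per day
clean = (
    raw.filter(F.col("user_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_time"))
)
daily_counts = (
    clean.groupBy("user_id", F.to_date("event_ts").alias("event_date"))
         .agg(F.count("*").alias("event_count"))
)

# Load: write partitioned Parquet for downstream analytics (assumed location)
daily_counts.write.mode("overwrite").partitionBy("event_date") \
    .parquet("gs://example-bucket/curated/daily_event_counts/")

spark.stop()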
Required Skills & Qualifications
Strong proficiency in Python and PySpark
Working experience with cloud platforms like GCP and/or AWS
Solid understanding of ETL/ELT, data warehouse, and data lake concepts
Proficient with relational databases (preferably PostgreSQL and MySQL) and non-relational databases, with a focus on MongoDB
Driven by delivery and results — you get things done efficiently
Self-starter attitude with minimal need for hand-holding
Excitement for automating work using GenAI or scripting tools
Familiarity with slowly changing dimensions (SCD), change data capture (CDC), and real-time streaming vs. batch processing (see the sketch after this list)
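As a quick illustration of the last point, the hypothetical sketch below contrasts a bounded batch read with a Structured Streaming read of the same event layout in PySpark. Topic of the data, schema, paths, and the console sink are assumptions for illustration; SCD and CDC patterns are separate techniques not shown here.

# Hypothetical batch vs. streaming ingestion in PySpark (schema and paths are assumed)
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("batch_vs_streaming_demo").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("event_time", TimestampType()),
])

# Batch: process a bounded snapshot of files in a single run
batch_df = spark.read.schema(schema).json("s3://example-bucket/events/2024-01-01/")
batch_df.groupBy("action").count().write.mode("overwrite") \
    .parquet("s3://example-bucket/reports/action_counts/")

# Streaming: continuously process new files as they arrive in the same layout
stream_df = spark.readStream.schema(schema).json("s3://example-bucket/events/")
query = (
    stream_df.groupBy("action").count()
             .writeStream.outputMode("complete")
             .format("console")  # sink is illustrative; a real job would target Kafka, BigQuery, etc.
             .trigger(processingTime="1 minute")
             .start()
)
# query.awaitTermination()  # uncomment to keep the streaming job running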
Nice to Have
Experience with CI/CD pipelines and Docker
Understanding of data governance and observability
Prior experience in fast-paced, execution-heavy teams
Why Join Us?
High-impact role where execution speed is valued and recognised
Freedom to build, ship, and iterate without red tape
Work with a lean, high-performing data team
Opportunities to innovate with Generative AI tools
Fast learning environment and ownership from day one
If you're someone who prefers delivering over deliberating — apply now and help us build
data infrastructure that moves the needle.
Data Engineer • Bengaluru, Karnataka, India