Job Description
We’re looking for a smart, adaptable Senior Software Engineer to help build and evolve our real-time data ingestion platform. You’ll work at the intersection of software engineering and data infrastructure, designing and maintaining scalable systems that stream high-quality, trusted data from Kafka to Snowflake via Snowpipe Streaming.
You’ll focus on data quality, observability, and pipeline reliability, helping develop robust monitoring and alerting systems. You’ll work closely with a collaborative team, contribute to architecture decisions, and help shape how data flows across Zendesk’s ecosystem.
Please read the full role description before applying. Note that Zendesk can only hire candidates who are physically located in and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based.
Key Responsibilities:
Design, build, and maintain data quality systems and pipelines.
Work with tools such as Snowflake, Docker/Kubernetes, and Kafka to enable scalable, observable data movement.
Help close skill gaps in data quality (DQ) and data platform tooling across teams.
Contribute to building internal tooling that supports schema validation, data experimentation, and automated checks.
Collaborate cross-functionally with data producers, analytics engineers, platform teams, and business stakeholders.
Own the reliability, scalability, and performance of ingestion systems deployed on AWS.
Architect and build core components of our real-time ingestion platform using Kafka and Snowpipe Streaming.
Champion software engineering excellence, including testing, observability, CI/CD, and automation.
Drive the development of platform tools that ensure data quality, observability, and lineage through Protobuf-based schema management.
Participate in the implementation of ingestion best practices and reusable frameworks across data and software engineering teams.
Core Skills:
Solid programming experience (preferably in Java)
Experience with distributed data systems (Kafka, Snowflake)
Familiarity with Data Quality tooling and concepts
Good working knowledge of SQL (especially for diagnostics and DQ workflows)
Experience with containerization (Docker, Kubernetes)
Strong debugging, observability, and pipeline reliability practices
What You Bring:
A systems mindset with strong software engineering fundamentals.
Passion for building resilient, high-throughput, real-time platforms.
Ability to influence technical direction across teams and drive alignment.
Strong communication and mentoring skills.
A bias toward automation, continuous improvement, and platform thinking.
Nice to Haves:
Experience with GenAI tools or supporting ML / AI data workflows
Familiarity with cloud-native data platforms (e.g., AWS, GCP)
Exposure to dbt or ELT frameworks
Why Join Us?
You’ll work with experienced engineers who value collaboration, pragmatism, and adaptability. We’re aiming to build resilient systems and a great team culture—not just implement buzzwords.
Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration, while also giving you flexibility to work remotely for part of the week. You will be expected to work from our local office for part of the week; the specific in-office schedule is to be determined by the hiring manager.
Senior Engineer, Data • Pune, India