Agentic Data Layer Systems Engineer

Southbridge AI · Republic of India, IN
4 days ago
Job description

Build Antibrittle Agents With Us

https://jobs.gem.com/southbridge-ai/am9icG9zdDoargQV0uUom0F1c9EN2wkd

Any expression of intelligence is a transformation of data.

At Southbridge, we make that transformation reliable. We are a team of engineers and researchers building the first agentic data layer—systems where long-horizon AI agents can ingest, structure, and retrieve enterprise data without shattering. We call these antibrittle agents: orchestration stacks that get stronger the longer they run, because we design them with room to adapt.

Why Context Comes First

AI models keep getting sharper, but most still feel like brilliant interns who can’t find anything. The bottleneck is context. Vector stores slice documents into fragments. ETL scripts drift out of date faster than they’re written. Enterprises teach new team members the shape of their data over six-year journeys—then watch the schema change.

We build for the people who live that frustration. Our walking-RAG pipelines read 1,000-page manuals, engineering diagrams, and telemetry streams without losing their relationships. Tadpole, our long-horizon runner, keeps Claude-class models productive for hours by trenching state, resetting cleanly, and capturing receipts with every step.

How We Think

  • Context, not chunks – Every table remembers its document; every value knows its column. We keep structure intact from ingestion to retrieval.

  • Agents, not brittle pipelines – Static workflows crack when reality shifts. Our agents reason about ingestion, transformation, and retrieval in real time and adapt when schemas move.
  • Receipts above claims – We care about “nines of accountability.” Costs, decisions, failures, recoveries—everything is inspectable, and everyone holds the same bar.
  • Building blocks, not black boxes – We publish what we can under Apache 2.0 because shared primitives help us move faster and keep us honest. Offmute, diagen, wishful-search, tip20, next-cursor-base—they all started as tools we needed first.
  • 180s are expected – Models evolve weekly. When the data asks us to change direction, we do it together.
    What You’ll Work On

  • Agentic orchestration – Extend Tadpole’s execution architecture so agents coordinate ingestion, schema inference, and retrieval with accountable state. You’ll do this alongside the people who sketched the first trenches.
  • Context-preserving retrieval – Move past chunks. Build walking pipelines that understand nested JSON, spreadsheets, and industrial PDFs end-to-end, with teammates who have scars from legacy ETL.
  • Human-first interfaces – Surface agent decisions with clarity: live traces, streaming diffs, receipts that help partners trust what they see, and that help us debug together.
  • Reusable primitives – Extract the tools we need into open-source micropackages—utilities that the wider AI community wants to fork tomorrow, and that we’re proud to support.
Your first few months are collaborative by design. You’ll co-own a meaningful slice of Tadpole, ship it to real users, and share at least one internal note or open-source artefact. We’ll bring you into customer deployments early so you see the stakes up close.

    What We’ve Already Built

  • Antibrittle agent runners that keep Claude, Gemini, and Kimi K2 productive for multi-hour tasks.
  • Walking RAG systems that read technical manuals, diagrams, and mixed-format archives for Fortune 150 teams.
  • Open-source tooling with real adoption—Lumentis, offmute, diagen, tip20, wishful-search, and more.
  • Evaluation harnesses that track new model releases the week they land, so we can pivot with evidence.
    Backend Engineer

    Full-time · Founding Team

    We’re building Tadpole—the execution engine that keeps long-horizon agents steady for hours instead of minutes.

    As a backend engineer at Southbridge, you’ll design the trenches, receipts, and orchestration logic that let Claude-, Gemini-, and Kimi-class systems ingest enterprise data, reason across trillions of schemas, and recover gracefully when reality shifts. You’ll be shoulder to shoulder with teammates who have lived those shifts already and are eager to share the patterns, runbooks, and cautionary tales.

    What You’ll Build

  • Agentic orchestration – Extend Tadpole’s TypeScript core so multiple agents can coordinate ingestion, transformation, and retrieval while remaining observable and interruptible. You’ll decide, with the folks who sketched the first trench diagrams, what “observable” should mean next.
  • Context-preserving ingestion – Stream nested JSON, spreadsheets, industrial PDFs, and telemetry at gigabit speeds without losing structure. Sub-millisecond decisions matter here, and you’ll have teammates profiling and tuning alongside you.
  • Long-horizon reliability – Engineer checkpointing, rollback, and receipt systems so every action is traceable. When things wobble, we want to know where and why, and you’ll help set that shared discipline.
  • Reusable primitives – Extract the tools we need into OSS micropackages under Apache 2.0—utilities the wider community wants to fork tomorrow and that we’re proud to maintain together.
  • Production deployments – Pair with customers in finance, logistics, and infrastructure to harden the system where it meets messy real-world data. Every deployment is a team sport.
    During your first few months, you’ll co-own a slice of Tadpole’s execution state or tooling APIs with another engineer, ship it to production, and share at least one internal note or open library that survives external scrutiny. We’ll bring you along for a customer deployment early—expect the rest of us on the call, comparing notes in real time.
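The reliability bullet above is built around receipts and checkpoints: every agent action leaves an inspectable record, and a failed step can roll back to the last good state. As a minimal illustrative sketch only—the `Receipt` shape and `ReceiptLog` class here are hypothetical, not Tadpole’s actual API—the idea looks roughly like this in TypeScript:

```typescript
// Hypothetical sketch of a receipt log — not Tadpole's real API.
// Each step records its cost and outcome; successful steps may also
// persist a checkpoint id that later steps can roll back to.
type Receipt = {
  step: number;
  action: string;          // e.g. "ingest", "infer-schema", "retrieve"
  tokensSpent: number;
  ok: boolean;
  checkpointId?: string;   // set when this step persisted a rollback point
};

class ReceiptLog {
  private receipts: Receipt[] = [];

  record(r: Receipt): void {
    this.receipts.push(r);
  }

  // Running token cost — one of the "trails" the posting mentions.
  totalTokens(): number {
    return this.receipts.reduce((sum, r) => sum + r.tokensSpent, 0);
  }

  // Most recent successful checkpoint, for recovery after a failure.
  lastCheckpoint(): string | undefined {
    for (let i = this.receipts.length - 1; i >= 0; i--) {
      const r = this.receipts[i];
      if (r.ok && r.checkpointId) return r.checkpointId;
    }
    return undefined;
  }
}

const log = new ReceiptLog();
log.record({ step: 1, action: "ingest", tokensSpent: 1200, ok: true, checkpointId: "ckpt-1" });
log.record({ step: 2, action: "infer-schema", tokensSpent: 800, ok: false });
console.log(log.totalTokens());     // 2000
console.log(log.lastCheckpoint());  // "ckpt-1"
```

The point of the sketch is the discipline, not the data structure: when step 2 wobbles, the log says exactly where the tokens went and which state is safe to resume from.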

    Stack & Tooling

  • Runtime – TypeScript everywhere with Bun, Hono services, Drizzle migrations, typed event streams.
  • Data – Postgres as ground truth, DuckDB and ClickHouse for analytical workloads, Redis for fast coordination.
  • AI interfaces – Direct connections to Claude, Gemini, GPT, and open checkpoints routed through our evaluation harnesses.
  • Execution fabric – WebSockets, durable queues, stateful workers, and selective serverless when latency wins.
  • Daily tools – Claude Code, Cline, Gemini CLI, a healthy bench of open-source helpers, plus Cursor, v0, and next-cursor-base when they keep the trenches tidy.
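One way to read “typed event streams” in the runtime bullet is a discriminated union over event kinds, so the compiler forces every consumer to handle every kind. The event names below are illustrative assumptions, not Southbridge’s actual schema:

```typescript
// Illustrative only — the event kinds are invented for this sketch.
// A discriminated union on `kind` lets TypeScript narrow each branch
// and flag any unhandled event kind at compile time.
type AgentEvent =
  | { kind: "tool-call"; tool: string; args: unknown }
  | { kind: "token-usage"; tokens: number }
  | { kind: "error"; message: string; recoverable: boolean };

function describe(ev: AgentEvent): string {
  switch (ev.kind) {
    case "tool-call":
      return `tool ${ev.tool} invoked`;
    case "token-usage":
      return `${ev.tokens} tokens spent`;
    case "error":
      return ev.recoverable ? `recoverable: ${ev.message}` : `fatal: ${ev.message}`;
  }
}

console.log(describe({ kind: "token-usage", tokens: 512 })); // "512 tokens spent"
```

Adding a fourth event kind to the union turns every `switch` that forgot it into a type error, which is the “type system as a design tool” habit the posting asks for.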
    How We Work Together

  • We break problems down side by side – Interfaces, isolation, and deleting accidental complexity are shared habits, not solo hero moves.
  • Receipts come above claims – “Nines of accountability” beats “nines of availability.” We gather evidence before we celebrate, and everyone keeps the same bar.
  • Ship with trails – Token costs, tool invocations, diffs, failure paths—every system we touch leaves an audit trail so agents stay trustworthy.
  • Stay wide – Model capabilities change weekly. We prototype with new releases, run evals together, and aren’t precious about switching approaches when the data nudges us elsewhere.
  • Feedback loops are short – You’ll pair with Hrishi and the rest of the founding team. Customer feedback lands in our laps in real time, and we process it together.
    What You Bring

  • Senior-level TypeScript engineer who treats the type system as a design tool.
  • Experience with distributed systems, data pipelines, or agent tooling where correctness and latency both mattered.
  • Comfort moving from proof-of-concept to production—instrumentation, alerting, operational playbooks included.
  • Ability to reason about the trade-offs between determinism and exploration in AI-driven workflows.
  • Builder of real things—startups, OSS packages, internal platforms—with receipts you can point us to.
    Signals We Love

  • You’ve made or salvaged an LLM agent that ran longer than an hour without babysitting, and you kept the logs.
  • You enjoy wiring evaluation harnesses; metrics like pass@k, coverage, and hallucination detection feel familiar.
  • You’ve navigated Fortune-100-style compliance or audit constraints and still shipped advanced tooling.
  • You have opinions about schema-on-write vs. schema-on-read, and you’re happy to share the scars.
    Why This Work Matters

    You’re building new primitives. The orchestrators you shape become the infrastructure others lean on when they want AI systems that actually understand their data. We work on real problems—1,000-page technical manuals, multi-modal telemetry, cross-border compliance datasets—where “pretty good” isn’t enough. You’ll learn at the frontier; we test major model releases as they land, implement papers while the ink dries, and swap notes with the community through our open-source work.

    How to Start the Conversation

  • Share your portfolio – Record a short walkthrough on Loom, Tella, or another video platform of your choice. We work globally, so strong remote async communication matters.
  • Share what you've shipped – Links, repos, notes, or whatever you are proud of. We value artifacts over résumés and love the context behind them.
  • Meet one of the team – 30-minute conversation to see if the work resonates and answer your initial questions.
  • Conversation with Hrishi – 45 minutes trading stories with our CEO about approaches, trade-offs, and how you think about agents; bring the questions you want answered.
  • Take-home challenge – Work through a real problem (e.g., extending Tadpole's state machine or redesigning a streaming ingestion loop). Focused and realistic, no trick questions.
  • Technical deep dive – Review your solution with Hrishi. Code and architecture together, no trivia, plenty of space for clarifications.
  • Your questions for us – Dedicated time for you to interview us on anything that matters to your decision.
  • Pairing session – Co-debug or extend a live micropackage with the team. We share the brief beforehand and treat the session as a two-way fit check.
  • Offer & onboarding plan – We tailor scope, compensation, and relocation support to what lets you do your best work, and we map out how your first months will feel.
  • Bring logs or a story about the longest agent run you’ve tamed—we’re always happy to swap notes.

