Role
LLM Workflow Builder – design and wire up prompts, tools, memory and agents with LangChain / LangGraph / LangServe.
API & Micro-Service Owner – ship FastAPI endpoints (REST + streaming) that are secure, observable and lightning fast.
End-to-End Engineer – own the full lifecycle: code → container → CI/CD → K8s → monitoring, with a bias for rapid, iterative releases.
Cross-Functional Partner – collaborate with product and design to scope MVPs, estimate effort, and keep delivery on track.
Responsibilities
Architect, implement and maintain multi-agent workflows and retrieval-augmented pipelines.
Write clean, typed, tested Python (async, pytest, Poetry).
Instrument and evaluate everything with LangSmith/Langfuse; build guardrails and regression suites for prompts.
Containerize (Docker) and deploy (Kubernetes, GitHub Actions) with automated rollbacks and alerts.
Optimize latency, throughput and cost; troubleshoot production issues end-to-end.
Mentor teammates on best practices in LLM app engineering and DevOps.
Must-Have Skills
Strong Python craftsmanship and async I/O expertise.
Proven experience with LangChain or LangGraph in real projects.
FastAPI (or similar) for building REST/streaming APIs.
Solid DevOps: Docker, basic Kubernetes, CI/CD pipelines, observability tooling.
Git proficiency: PR hygiene, code reviews, and clear documentation.
Nice-to-Haves
Lightweight fine-tuning (LoRA / QLoRA) or embedding-model selection.
Vector databases (Postgres/pgvector, Milvus, LanceDB).
Serverless / edge deploys (Fly.io, Cloudflare Workers, AWS Lambda).
Experience with multi-agent planning frameworks or autonomous tool-use research.
AI Engineer • Bhavnagar, Gujarat, India