Role Overview:
Architect and scale document processing pipelines that handle thousands of financial documents daily, ensuring high availability and cost efficiency.
Tools: GitHub Actions, Docker, Kubernetes, Terraform, Ansible, Python, AWS, GCP, Azure, Temporal, Kafka.
Key Responsibilities:
- Build scalable async processing pipelines for document identification, extraction, and validation (see the sketch after this list)
- Optimize cloud infrastructure costs while maintaining 99.9% uptime for document processing workflows
- Design and implement APIs for document upload, processing status, and results retrieval
- Manage Kubernetes deployments with autoscaling based on document processing load
- Implement monitoring and observability for complex multi-stage document workflows
- Optimize database performance for high-volume document metadata and processing results
- Build CI/CD pipelines for safe deployments
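To illustrate the async pipeline responsibility above, here is a minimal sketch of a Kafka-driven document worker. It assumes the aiokafka client and hypothetical names (the documents.inbound topic, a placeholder process_document stage) that are not part of this posting.

```python
# Minimal sketch of an async document-processing worker, assuming aiokafka.
# Topic name, group id, and pipeline stages are illustrative placeholders.
import asyncio
import json

from aiokafka import AIOKafkaConsumer


async def process_document(doc: dict) -> dict:
    # Placeholder for the identification -> extraction -> validation stages.
    await asyncio.sleep(0)  # real stages would await I/O or downstream services
    return {"id": doc.get("id"), "status": "validated"}


async def main() -> None:
    consumer = AIOKafkaConsumer(
        "documents.inbound",                # hypothetical topic name
        bootstrap_servers="localhost:9092",
        group_id="doc-pipeline",
        enable_auto_commit=False,           # commit only after successful processing
    )
    await consumer.start()
    try:
        async for msg in consumer:
            doc = json.loads(msg.value)
            result = await process_document(doc)
            print(result)                   # stand-in for publishing results downstream
            await consumer.commit()
    finally:
        await consumer.stop()


if __name__ == "__main__":
    asyncio.run(main())
```

Committing offsets manually after each document (rather than auto-committing) is one common way to get at-least-once processing in a pipeline like this.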
Technical Requirements - Must-haves:
- 5+ years of backend development experience (Python or Golang)
- Strong experience with async processing (Kafka, Temporal, or similar)
- Docker containerization and orchestration
- Cloud platforms (AWS / GCP / Azure) with cost-optimization experience
- REST API design and development
- Database optimization (MongoDB, PostgreSQL)
- Production monitoring and debugging
- Managing Kubernetes clusters

Technical Requirements - Nice to have:
- Experience with document processing or ML pipelines
- Infrastructure as code (Terraform / CloudFormation / Pulumi, etc.)
- Message queues (SQS, RabbitMQ, Kafka)
- Performance optimization for high-throughput systems