Description :

Job Summary :
- Manage and optimise AWS infrastructure (EC2, VPC, Route 53, NAT Gateway, Auto Scaling, Kubernetes, EKS) for generative AI solutions, ensuring scalable, secure deployments with Terraform, n8n, and tools like Docker.
Key Requirements (8 Points) :
- Deploy LLMs on AWS Bedrock using boto3 for scalable inference (see the boto3 sketch after this list).
- Manage SageMaker pipelines for LLM fine-tuning with EKS/ECS and Auto Scaling.
- Configure CI/CD with GitLab CI and Terraform.
- Set up RAG infrastructure with AWS OpenSearch and langchain in VPCs (see the retrieval sketch after this list).
- Deploy AI agents (crewai/autogen) on AWS Lambda with n8n.
- Orchestrate deployments with Docker on Amazon ECS/EKS using Kubernetes.
- Manage networking with Route 53 and NAT Gateway for secure access.
- Monitor infrastructure with Amazon CloudWatch and wandb (see the CloudWatch sketch after this list).
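A minimal sketch of the first requirement, invoking a Bedrock-hosted LLM through boto3. The model ID and request body follow the Anthropic messages schema and are illustrative assumptions; adjust them to the model family actually deployed.

```python
# Minimal sketch: invoke an LLM on AWS Bedrock via boto3.
# Model ID and request schema are illustrative (Anthropic messages format).
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def generate(prompt: str, max_tokens: int = 512) -> str:
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    response = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model ID
        contentType="application/json",
        accept="application/json",
        body=json.dumps(body),
    )
    payload = json.loads(response["body"].read())
    return payload["content"][0]["text"]

if __name__ == "__main__":
    print(generate("Summarise the benefits of Auto Scaling for LLM inference."))
```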
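A retrieval sketch for the RAG requirement, assuming an OpenSearch vector index reachable through a VPC endpoint. The endpoint URL, index name, and embedding model are placeholders, and langchain import paths vary between releases (shown here for langchain_community).

```python
# Minimal RAG retrieval sketch against an OpenSearch vector index inside a VPC.
# Endpoint URL, index name, and embedding model ID are placeholders.
from langchain_community.embeddings import BedrockEmbeddings
from langchain_community.vectorstores import OpenSearchVectorSearch

embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v1")  # assumed embedding model

vector_store = OpenSearchVectorSearch(
    opensearch_url="https://vpc-rag-demo.us-east-1.es.amazonaws.com",  # VPC endpoint (placeholder)
    index_name="rag-documents",
    embedding_function=embeddings,
)

# Retrieve the top-k chunks to pass to the LLM as context.
docs = vector_store.similarity_search("How do we rotate NAT Gateway EIPs?", k=4)
for doc in docs:
    print(doc.page_content[:200])
```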
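A CloudWatch sketch for the monitoring requirement, publishing a custom inference-latency metric with boto3. The namespace, metric name, and dimension are illustrative assumptions.

```python
# Minimal sketch: publish a custom inference-latency metric to CloudWatch.
# Namespace, metric name, and dimensions are illustrative.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

def report_latency(model_id: str, latency_ms: float) -> None:
    cloudwatch.put_metric_data(
        Namespace="GenAI/Inference",  # assumed namespace
        MetricData=[
            {
                "MetricName": "InvocationLatency",
                "Dimensions": [{"Name": "ModelId", "Value": model_id}],
                "Value": latency_ms,
                "Unit": "Milliseconds",
            }
        ],
    )

report_latency("anthropic.claude-3-sonnet-20240229-v1:0", 842.0)
```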
Must-Have Skills :
- 5+ years of DevOps experience with AWS and AI workflows.
- Expertise in AWS Bedrock, SageMaker, EC2, VPC, Route 53, NAT Gateway, Auto Scaling, ECS/EKS, and boto3.
- Proficiency in Terraform, Docker, Kubernetes, langchain, and n8n workflows.
- Experience with CI/CD (CodePipeline/CodeBuild) and monitoring.
Preferred Skills :
- AWS certification (DevOps Engineer or Solutions Architect).
- Familiarity with llama-index and n8n templates.