About the Job
We are looking for an experienced AI / ML & Data Engineer to design, develop, and deploy scalable machine learning models and data infrastructure on AWS. You will work closely with cross-functional teams to deliver AI-driven solutions, integrate large language models (LLMs), and optimize data workflows while ensuring security, scalability, and performance.
Core Responsibilities
- Develop and deploy ML models and algorithms, including LLMs (GPT, Llama, Gemini, Mistral) and NLP models such as BERT.
- Design and maintain scalable ETL / ELT pipelines, data lakes, and warehouses on AWS (S3, Glue, Redshift, Athena, Lambda, DMS).
- Build APIs using AWS Lambda, API Gateway, and AWS SDK.
- Containerize and deploy applications using Docker, ECR, and ECS.
- Implement Infrastructure as Code (IaC) using tools such as Terraform and CloudFormation, and adhere to CI / CD best practices.
- Integrate LLMs into workflows for classification, summarization, enrichment, and semantic search.
- Optimize data workflows and ML models for performance, scalability, and cost efficiency.
- Ensure compliance with data privacy, security, and governance standards.
Required Skills
- Proficiency in Python, SQL, and distributed processing frameworks (PySpark).
- Strong fundamentals in Generative AI and Agentic AI; hands-on experience with LLM frameworks (CrewAI, LangChain, LlamaIndex).
- Familiarity with LLM-as-a-Service offerings, e.g., AWS Bedrock, Hugging Face, or similar.
- Experience with AWS cloud services and containerized deployments.
- Knowledge of vector databases (ChromaDB, Pinecone, PGVector) is a plus.
- Excellent problem-solving, communication, and collaboration skills.
Qualifications
- Bachelor’s or higher in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in AI / ML engineering, data engineering, and AWS cloud infrastructure.