MLOps Data Architect

ICF • Bengaluru, Karnataka, India
Job description

We are seeking a highly skilled and motivated MLOps Data Architect with expertise in small language models, Azure Fabric, Data Governance, AI Model Evaluation, and Building AI Agents. The ideal candidate will have a strong background in computer science, software development, and cloud infrastructure. This role involves designing, implementing, and maintaining scalable and secure cloud infrastructure, as well as collaborating with cross-functional teams to drive innovation in AI and machine learning.

Key Responsibilities:

  • Develop and optimize data pipelines for small language models and AI agents.
  • Ensure data governance and compliance with industry standards.
  • Evaluate AI models and implement best practices for model monitoring and maintenance.
  • Develop and optimize complex data pipelines, applying machine learning engineering principles to enhance efficiency and scalability.
  • Employ experimental methodologies, statistics, and machine learning concepts to create self-running AI systems for predictive modeling.
  • Collaborate with data science teams to review model-ready datasets and feature documentation, ensuring completeness and accuracy.
  • Perform data discovery and analysis of raw data sources, applying business context to meet model development needs.
  • Perform exploratory data analysis and track data lineage during project inception or root cause analysis.
  • Engage with internal stakeholders to understand business processes and translate requirements into analytical approaches.
  • Write and maintain model monitoring scripts, diagnosing issues and coordinating resolutions based on alerts.
  • Serve as a domain expert in machine learning engineering on cross-functional teams for significant initiatives.
  • Stay updated with the latest advancements in AI/ML and apply them to real-world challenges.
  • Participate in special projects and additional duties as assigned.

Qualifications:

  • Undergraduate degree or equivalent experience; a graduate degree is preferred.
  • Minimum of 5 years of relevant work experience.
  • At least 3 years of hands-on experience designing ETL pipelines using AWS services (e.g., Glue, SageMaker).
  • Proficiency in programming languages, particularly Python (including PySpark, PySQL) and familiarity with machine learning libraries and frameworks.
  • Strong understanding of cloud technologies, including AWS and Azure, and experience with NoSQL databases.
  • Familiarity with Feature Store usage, LLMs, GenAI, RAG, Prompt Engineering, and Model Evaluation.
  • Experience with API design and development is a plus.
  • Solid understanding of software engineering principles, including design patterns, testing, security, and version control.
  • Knowledge of Machine Learning Development Lifecycle (MDLC) best practices and protocols.
  • Understanding of solution architecture for building end-to-end machine learning data pipelines.