Responsibilities
- Manage Data: Extract, clean, and organize both structured and unstructured data.
- Coordinate Pipelines: Use tools such as Airflow, Step Functions, or Azure Data Factory to orchestrate data workflows (see the sketch after this list).
- Deploy Models: Develop, fine-tune, and deploy models on platforms like SageMaker, Azure ML, or Vertex AI.
- Scale Solutions: Leverage Spark or Databricks for large-scale data processing.
- Automate Processes: Implement automation with tools like Docker, Kubernetes, CI/CD pipelines, MLflow, Seldon, and Kubeflow.
- Collaborate Effectively: Work alongside engineers, architects, and business stakeholders to solve real-world problems efficiently.
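To illustrate the kind of orchestration work described above, here is a minimal Airflow DAG sketch. The DAG name, task names, and extract/transform/load step bodies are hypothetical placeholders chosen for demonstration, not details of the role.

    # Minimal Airflow DAG sketch: a three-step extract -> transform -> load
    # pipeline. All names and step bodies are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        # Pull raw records from a source system (stubbed out here).
        print("extracting raw data")


    def transform():
        # Clean and structure the extracted records (stubbed out here).
        print("transforming data")


    def load():
        # Write the structured output to a warehouse (stubbed out here).
        print("loading data")


    with DAG(
        dag_id="example_etl",            # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Declare the linear dependency chain.
        extract_task >> transform_task >> load_task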
Qualifications
- 3+ years of hands-on experience in MLOps (4-5 years of overall software development experience).
- Extensive experience with at least one major cloud provider (AWS, Azure, or GCP).
- Proficiency with Databricks, Spark, Python, SQL, TensorFlow, PyTorch, and Scikit-learn (a minimal sketch follows this list).
- Expertise in debugging Kubernetes and creating efficient Dockerfiles.
- Experience prototyping with open-source tools and scaling solutions effectively.
- Strong analytical skills, humility, and a proactive approach to problem-solving.
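As a rough illustration of the Python, Scikit-learn, and MLflow stack named in this posting, a minimal train-and-track sketch might look like the following. The dataset, hyperparameters, and experiment name are assumptions for demonstration only.

    # Minimal sketch: train a scikit-learn model and log it with MLflow.
    # Dataset, hyperparameters, and experiment name are illustrative
    # assumptions, not requirements of the role.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    mlflow.set_experiment("demo-experiment")  # hypothetical experiment name

    with mlflow.start_run():
        model = LogisticRegression(max_iter=200)
        model.fit(X_train, y_train)

        acc = accuracy_score(y_test, model.predict(X_test))
        mlflow.log_param("max_iter", 200)
        mlflow.log_metric("accuracy", acc)
        mlflow.sklearn.log_model(model, "model")  # store the trained model with the run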
Preferred Qualifications
- Experience with SageMaker, Azure ML, or Vertex AI in a production environment.
- Commitment to writing clean code, clear documentation, and concise pull requests.
Skills Required
Airflow, Step Functions, SQL, TensorFlow, Azure ML, Azure Data Factory, PyTorch, GCP, Docker, Spark, Databricks, Azure, Kubernetes, Python, AWS