What do you contribute? As part of our AI team, you will be responsible for building &
maintaining scalable Data / ML pipelines that support our high-performance, scalable, robust SaaS
platform. You will also contribute to building and improving Data Analytics / ML insights applications
& real-time pipelines for our Martech & product discovery platform.
Must have
- Overall experience of 5-7 years in building web applications at scale
- Proficiency in at least one modern programming language such as Golang / Python
- Hands-on experience in working with at least one major cloud provider like AWS / GCP / Azure
- Proficiency in working with containerized applications using Docker, etc.
- Proficiency in managing deployment workflows using CI / CD tools like Github Actions / Jenkins / Gitlab
- Hands-on experience in the architecture, implementation, deployment, testing and scalability of large-scale distributed systems
- Proficiency in using version control tools like Github / Gitlab
Good To Have
- Exposure to Kubernetes, Helm and related ecosystems
- Hands-on experience in training large-scale ML models
- Experience in building high-throughput big data pipelines & batch data processing tools
- Experience in highly scalable tools like Kafka, Spark, Hive, Hadoop
- Experience in ML toolkits like Kubeflow / AI workflows / AWS Batch is a plus
- Understanding of networking concepts like VPC / Subnets / NAT / IG / DNS