AI Model Inference on K8s



Join this lightning talk as we walk through how to deploy AI/ML models on Kubernetes. Together we will explore the open-source communities dedicated to model inference.

We will cover the stages of an AI model's lifecycle, from the very basics through model serving in detail. Anyone with an interest in the AI/ML field is welcome to join the session.
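To give a taste of what the session covers, a trained model can be served on Kubernetes with a KServe InferenceService resource. This is a minimal sketch, assuming a cluster with KServe already installed; the storage URI points at the public scikit-learn iris sample model from the KServe documentation:

```yaml
# Minimal KServe InferenceService (assumes KServe is installed on the cluster).
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      # Public sample model from the KServe docs; replace with your own model's location.
      storageUri: gs://kfserving-examples/models/sklearn/1.0/model
```

Applying this manifest with `kubectl apply -f inferenceservice.yaml` lets KServe provision a model server and expose an HTTP inference endpoint for the model.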

About the speaker

Vaibhav Jain

I am a Senior Software Engineer at Red Hat. My most recent field of interest is AI/ML. I am an active contributor to AI/ML open-source communities such as KServe and Open Data Hub. Most of my efforts are dedicated to the model-serving component of RHODS (Red Hat OpenShift Data Science).

Want to discuss?
Post it here, and our mentors will help you out.