BentoML simplifies ML model deployment and serves your models at production scale.
00:00 Intro: BentoML and Tim Liu
00:55 Overview of the BentoML Product
01:50 Enabling Data Science Agility
02:26 Demo Start: Saving the trained model
04:10 Dependency versions saved with model
05:38 Batching input requests for inference
07:45 Creating your prediction service
10:04 Data validation and pre/post processing code
10:56 Origin of the "Bento" analogy
12:11 Bentofile.yaml for configuring your bento
13:18 Serving the Bento
14:18 Out of the box Swagger and Prometheus endpoints
15:26 Running BentoML in production mode
18:23 Different methods for scaling up a Bento service
19:55 Deployable to many cloud services
20:23 Wrapping up with links to projects and community
22:33 How can you contribute?
23:24 Future of BentoML
25:19 Advice for newcomers and experienced practitioners in MLOps
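The demo walks through packaging a model and its service into a "bento" via a bentofile.yaml (see the 12:11 chapter). As a rough illustrative sketch of such a config (the service path, included files, and package list below are assumptions, not the exact file from the video):

```yaml
# bentofile.yaml - minimal illustrative sketch, not the demo's actual file
service: "service:svc"   # import path to the bentoml.Service object (assumed module/name)
include:
  - "*.py"               # source files to package into the bento
python:
  packages:              # dependencies captured alongside the saved model
    - scikit-learn
    - pandas
```

Running `bentoml build` in the project directory would assemble the bento from a file like this, and `bentoml serve` would start the prediction service locally.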
Links:
- BentoML (Simplify Model Deployment): bentoml.com
- BentoML Github (The Unified Model Serving Framework): l.bentoml.com/github-bentoml
- Yatai Github (Model Deployment at scale on Kubernetes): l.bentoml.com/github-yatai
- Bentoctl Github (Fast model deployment on any cloud platform): link.bentoml.com/github-bentoctl
- Join our growing community on Slack: l.bentoml.com/join-slack
MLOps Zoomcamp: github.com/DataTalksClub/mlop...
Join DataTalks.Club: datatalks.club/slack.html
Our events: datatalks.club/events.html
Open-Source Spotlight - BentoML - Tim Liu