Examples
A collection of example projects for learning BentoML and building your own solutions.
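Most of these examples build on the same core pattern: wrapping model logic in a BentoML Service. As a rough, minimal sketch (assuming BentoML 1.2+ and its @bentoml.service decorator; the class name, method name, and echo logic are placeholders, not taken from any specific example project), a Service looks like this:

    import bentoml

    # Minimal, illustrative BentoML Service (assumes BentoML 1.2+).
    # Names and logic are placeholders for a real model-serving example.
    @bentoml.service
    class Echo:
        @bentoml.api
        def echo(self, text: str) -> str:
            # Echo the input back; a real example would run model inference here.
            return text

Such a Service can be run locally with the bentoml serve CLI and then deployed and scaled on BentoCloud, which is the workflow the example projects and posts below walk through in more detail.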
Dive into the transformative world of AI application development with us! From expert insights to innovative use cases, we bring you the latest in building AI systems at scale.
Neurolabs leverages BentoML's inference platform to build advanced computer vision pipelines and streamline AI deployment.
Serve, deploy and scale Jamba 1.5 Mini with BentoML.
Build and scale a multi-agent CrewAI application with BentoML.
Explore the top open-source embedding models and find answers to some FAQs about them.
Deploy a production-ready LangGraph agent application with Mistral 7B using BentoML.
Explore the top open-source VLMs and find answers to some FAQs about them.
Deploy the Llama 3.2 Vision model step by step with OpenLLM and BentoCloud.
Explore the top open-source TTS models and find answers to some FAQs about them.
Explore function calling with open-source LLMs: benefits, use cases, challenges, and more.
Best practices for tuning TensorRT-LLM inference configurations to improve the serving performance of LLMs with BentoML.
Understand the differences between serverless and dedicated LLM deployments, focusing on cost analysis, and explore strategies for optimizing LLM cost and scaling.
Explore the trend towards compound AI and how BentoML can help you build and scale compound AI systems.