Become a sponsor to vLLM
Your contribution will help fund the development and testing of the vLLM project. We strive to maintain vLLM as the best open-source, community-owned project for LLM inference. However, developing it on GPUs is expensive, and ensuring that it is production-ready requires considerable resources. Please help us sustain it!
Featured work
- vllm-project/vllm: A high-throughput and memory-efficient inference and serving engine for LLMs (Python, 30,800 stars)