
Become a sponsor to vLLM

Your contribution helps fund the development and testing of the vLLM project. We strive to keep vLLM the best open-source, community-owned project for LLM inference. However, developing it on GPUs is expensive, and making it production-ready requires considerable resources. Please help us sustain it!

Current sponsors (5)

@robertgshaw2-neuralmagic
@upstash
@mgoin
@vincentkoc
Private Sponsor
Past sponsors (6)
@AnyISalIn
@lukalafaye
@peakji
@youkaichao
@maxdebayser
@yangalan123

Featured work

  1. vllm-project/vllm

    A high-throughput and memory-efficient inference and serving engine for LLMs

    Python · 27,611 stars
