
Become a sponsor to vLLM

Your contribution helps fund the development and testing of the vLLM project. We strive to keep vLLM the best open-source, community-owned project for LLM inference. However, developing it on GPUs is expensive, and making it production-ready requires considerable resources. Please help us sustain it!

Current sponsors (9)

@robertgshaw2-neuralmagic
@upstash
@mgoin
@vincentkoc
@dvlpjrs
Private Sponsor
@HiddenPeak
@massif-01
@davedgd
Past sponsors (7)
@AnyISalIn
@lukalafaye
@peakji
@youkaichao
@maxdebayser
@yangalan123
Private Sponsor

Featured work

  1. vllm-project/vllm

    A high-throughput and memory-efficient inference and serving engine for LLMs

    Python · 30,800 stars
