Actions: triton-inference-server/vllm_backend

CodeQL

371 workflow runs

perf: Upgrade vLLM version to 0.6.3.post1
CodeQL #400: Pull request #76 synchronize by oandreeva-nv
December 20, 2024 23:11 · 1m 13s · jacky-vllm-0.6.3.post1

Setting shutdown asyncio event in a thread-safe manner
CodeQL #399: Pull request #78 synchronize by oandreeva-nv
December 20, 2024 20:07 · 1m 4s · oandreeva_asyncio_fix

Setting shutdown asyncio event in a thread-safe manner
CodeQL #398: Pull request #78 opened by oandreeva-nv
December 20, 2024 18:44 · 1m 12s · oandreeva_asyncio_fix

Followup with some fixes
CodeQL #397: Pull request #77 opened by oandreeva-nv
December 20, 2024 02:06 · 1m 6s · oandreeva_metrics_refactor

Add resolve_model_relative_to_config_file config option
CodeQL #396: Pull request #29 synchronize by Legion2
December 16, 2024 23:35 · Action required · Legion2:local-vllm-models

perf: Upgrade vLLM version to 0.6.3.post1
CodeQL #395: Pull request #76 synchronize by kthui
December 7, 2024 00:45 · 1m 32s · jacky-vllm-0.6.3.post1

perf: Upgrade vLLM version to 0.6.3.post1
CodeQL #394: Pull request #76 synchronize by kthui
December 7, 2024 00:27 · 1m 31s · jacky-vllm-0.6.3.post1

perf: Upgrade vLLM version to 0.6.3.post1
CodeQL #393: Pull request #76 opened by kthui
December 6, 2024 18:39 · 1m 32s · jacky-vllm-0.6.3.post1

feat: Add log probabilities and number of input tokens to additional outputs
CodeQL #392: Pull request #75 synchronize by kthui
December 4, 2024 01:50 · 1m 32s · jacky-vllm-logprobs

feat: Add log probabilities and number of input tokens to additional outputs
CodeQL #391: Pull request #75 synchronize by kthui
December 4, 2024 00:20 · 1m 34s · jacky-vllm-logprobs

feat: Add log probabilities and number of input tokens to additional outputs
CodeQL #390: Pull request #75 synchronize by kthui
December 3, 2024 03:25 · 1m 13s · jacky-vllm-logprobs

feat: Auto unload model if vLLM health check failed
CodeQL #388: Pull request #73 synchronize by kthui
November 27, 2024 00:04 · 1m 14s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
CodeQL #387: Pull request #73 synchronize by kthui
November 26, 2024 23:49 · 1m 15s · jacky-vllm-health

Update main branch post 24.11
CodeQL #386: Pull request #74 synchronize by mc-nv
November 26, 2024 23:31 · 1m 14s · mchornyi/after-24.11

feat: Auto unload model if vLLM health check failed
CodeQL #385: Pull request #73 synchronize by kthui
November 26, 2024 23:05 · 1m 22s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
CodeQL #384: Pull request #73 synchronize by kthui
November 26, 2024 19:10 · 1m 11s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
CodeQL #383: Pull request #73 synchronize by kthui
November 26, 2024 18:57 · 1m 16s · jacky-vllm-health

feat: Auto unload model if vLLM health check failed
CodeQL #382: Pull request #73 synchronize by kthui
November 26, 2024 18:08 · 1m 15s · jacky-vllm-health

Update main branch post 24.11
CodeQL #378: Pull request #74 opened by mc-nv
November 23, 2024 00:51 · 1m 8s · mchornyi/after-24.11

feat: Auto unload model if vLLM health check failed
CodeQL #377: Pull request #73 synchronize by kthui
November 22, 2024 03:20 · 1m 19s · jacky-vllm-health