Actions: NVIDIA/TensorRT-LLM

Blossom-CI

306 workflow runs

Cannot load built Llama engine due to KeyError with config
Blossom-CI #231: Issue comment #2555 (comment) created by JohnnyRacer
December 19, 2024 17:33 5s
T5 model, large difference in results when remove_input_padding is enabled
Blossom-CI #230: Issue comment #1999 (comment) created by 0xd8b
December 19, 2024 15:20 5s
Cannot load built Llama engine due to KeyError with config
Blossom-CI #229: Issue comment #2555 (comment) created by pcastonguay
December 19, 2024 12:55 5s
tensorrtllm_backend Support for InternVL2
Blossom-CI #228: Issue comment #2568 (comment) created by sunnyqgg
December 19, 2024 11:39 4s
Blossom-CI #227
December 19, 2024 09:57 5s
llava-onevision convert bug
Blossom-CI #226: Issue comment #2585 (comment) created by BasicCoder
December 19, 2024 08:47 5s
Blossom-CI #225
December 19, 2024 06:58 4s
llava-onevision convert bug
Blossom-CI #224: Issue comment #2585 (comment) created by BasicCoder
December 19, 2024 06:43 5s
Blossom-CI #223
December 19, 2024 06:40 5s
Blossom-CI #222
December 19, 2024 06:34 4s
Blossom-CI
Blossom-CI #221: created by niukuo
December 19, 2024 00:41 4m 51s
Cannot load built Llama engine due to KeyError with config
Blossom-CI #220: Issue comment #2555 (comment) created by JohnnyRacer
December 18, 2024 23:52 5s
Cannot load built Llama engine due to KeyError with config
Blossom-CI #219: Issue comment #2555 (comment) created by pcastonguay
December 18, 2024 19:27 5s
OOM when building engine for meta-llama/Llama-3.1-405B-FP8 on 8 x A100 80G
Blossom-CI #218: Issue comment #2586 (comment) created by HeyangQin
December 18, 2024 15:10 6s
InternVL deploy
Blossom-CI #217: Issue comment #2565 (comment) created by sunnyqgg
December 18, 2024 10:36 5s
llava-onevision convert bug
Blossom-CI #216: Issue comment #2585 (comment) created by liyi-xia
December 18, 2024 09:35 5s
llava-onevision convert bug
Blossom-CI #215: Issue comment #2585 (comment) created by DylanChen-NV
December 18, 2024 09:31 5s
llava-onevision convert bug
Blossom-CI #214: Issue comment #2585 (comment) created by liyi-xia
December 18, 2024 08:55 5s
Slicing tensor with dynamic shape in custom enc-dec architecture
Blossom-CI #213: Issue comment #2584 (comment) created by AvivSham
December 18, 2024 08:53 5s
llava-onevision convert bug
Blossom-CI #212: Issue comment #2585 (comment) created by DylanChen-NV
December 18, 2024 08:52 5s
Cannot load built Llama engine due to KeyError with config
Blossom-CI #211: Issue comment #2555 (comment) created by nv-guomingz
December 18, 2024 08:46 5s
Error in building llama with eagle for speculative decoding
Blossom-CI #210: Issue comment #2588 (comment) created by nv-guomingz
December 18, 2024 07:36 4s
Does OpenAIServer currently support only LLMs and not VLMs?
Blossom-CI #209: Issue comment #2581 (comment) created by LinPoly
December 18, 2024 07:03 4s
[feature request] Can we add H200 in infer_cluster_key() method?
Blossom-CI #208: Issue comment #2552 (comment) created by nv-guomingz
December 18, 2024 06:51 4s
trtllm-serve does not support dynamic batching like tritonserver
Blossom-CI #207: Issue comment #2549 (comment) created by nv-guomingz
December 18, 2024 06:48 4s