
qwen2-vl inference with lmdeploy reports an error #2457

Open
bonre opened this issue Nov 15, 2024 · 1 comment
Labels
bug Something isn't working

Comments


bonre commented Nov 15, 2024

Many thanks for your team's work!
I have pulled both lmdeploy and swift up to their latest versions.
When I run inference with the following script, some problems occur:

CUDA_VISIBLE_DEVICES=2,3 swift infer \
        --model_type qwen2-vl-7b-instruct \
        --ckpt_dir "/QwenVL/Full_1114/checkpoint-640" \
        --infer_backend lmdeploy \
        --load_args_from_ckpt_dir true \
        --val_dataset /home/workspace/Eval/1024/le.json \
        --show_dataset_sample -1 \
        --verbose true \
        --do_sample true \
        --max_new_tokens 7000 \
        --temperature 0 \
        --repetition_penalty 1.0 \
        --use_flash_attn true \
        --tp 2

1. When I run it, I am told there is a dependency conflict: the highest transformers version lmdeploy supports is 4.4.1, but the lowest version qwen2-vl supports is 4.5.
2. After it starts, lmdeploy cannot find the model_chat_template for this model_type, so it falls back to registering a template; at that point it cannot find the data format defined in the best-practices guide. Does this perhaps conflict with swift's data format?
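The dependency conflict in point 1 can be sanity-checked offline. A minimal sketch (the version bounds are the ones reported above, taken at face value; the `parse` helper is my own illustration, not part of lmdeploy or swift):

```python
def parse(v: str) -> tuple[int, ...]:
    # Naive version parser for plain dotted versions like "4.4.1".
    return tuple(int(p) for p in v.split("."))

# Bounds as reported in this issue (not verified against the packages):
lmdeploy_max_transformers = "4.4.1"  # highest transformers lmdeploy accepts
qwen2vl_min_transformers = "4.5"     # lowest transformers qwen2-vl needs

# If the minimum required version exceeds the maximum supported one,
# no single transformers install can satisfy both packages.
conflict = parse(qwen2vl_min_transformers) > parse(lmdeploy_max_transformers)
print(conflict)  # True
```

If the two ranges are indeed disjoint, the only resolutions are a release that relaxes one of the bounds or pinning the backends into separate environments.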


bonre commented Nov 15, 2024

Also, I tested inference with the pt backend and it produces results normally, except that the output ends with <|im_end|>.
This seems to corroborate that the problem is caused by the chat template?
I am not sure whether I have misconfigured something; my custom data format follows the settings in the best-practices guide.
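If the trailing <|im_end|> really is the template's end-of-turn token that the backend failed to register as a stop word, a post-processing workaround can hide it until the template issue is fixed. A minimal sketch (the function name and logic are my assumption, not swift's or lmdeploy's API):

```python
# Qwen-family end-of-turn marker, as seen in the pt backend's output above.
END_TOKEN = "<|im_end|>"

def clean_output(text: str) -> str:
    """Strip a trailing end-of-turn token left in the generated text."""
    text = text.strip()
    if text.endswith(END_TOKEN):
        text = text[: -len(END_TOKEN)].rstrip()
    return text

print(clean_output("The answer is 42.<|im_end|>"))  # The answer is 42.
```

This only masks the symptom; the underlying fix would be ensuring the registered chat template lists <|im_end|> among its stop tokens.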

@Jintao-Huang Jintao-Huang added the bug Something isn't working label Nov 15, 2024