Current Behavior

max_slices_num = 9 or 80
max_length = 2048 or 4096
batch_size = 1 or 9

No matter how these parameters are configured, GPU memory usage is always maxed out on both A100 80G cards. @LDLINGLINGLING

Expected Behavior

Please help locate the cause.

Steps To Reproduce

None

Environment

- OS: Linux
- Python: 3.12
- Transformers: 4.40
- PyTorch: 2.1.2
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

None
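One hedged way to check whether max_length and batch_size actually change the training footprint, independent of what nvidia-smi reports, is to log PyTorch's peak allocated memory per step. The sketch below assumes a plain PyTorch fine-tuning loop; `model`, `batch`, and `optimizer` are placeholder names, not taken from this issue:

```python
import torch

def train_step_with_memory_log(model, batch, optimizer, device="cuda:0"):
    # Reset the peak-memory counter so each step is measured in isolation.
    torch.cuda.reset_peak_memory_stats(device)

    outputs = model(**batch)   # placeholder forward pass; adapt to the actual trainer
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

    # Peak tensor memory actually allocated during this step, in GiB.
    peak_gib = torch.cuda.max_memory_allocated(device) / 1024**3
    print(f"peak allocated this step: {peak_gib:.2f} GiB")
    return outputs.loss.item()
```

If this number moves with max_length and batch_size while nvidia-smi stays pinned, the constant reading comes from the allocator's cache or another process rather than from the training workload itself.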
Is this problem occurring during fine-tuning or during inference?
@LDLINGLINGLING It happens during fine-tuning. For max_slices_num, my guess is that the image resolutions are all fairly small, so max_slices_num = 9 already covers them, which is why memory usage changes little; but I don't understand why max_length and batch_size make no difference either.
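A likely reason nvidia-smi reads near 100% regardless of these settings is PyTorch's caching allocator: memory freed by tensors is kept in a reserved pool instead of being returned to the driver, so nvidia-smi shows the reserved pool plus the CUDA context rather than the live working set. A minimal check using standard torch.cuda calls (nothing specific to this repository):

```python
import torch

device = "cuda:0"

# Tensors currently in use vs. the pool held by the caching allocator.
allocated_gib = torch.cuda.memory_allocated(device) / 1024**3
reserved_gib = torch.cuda.memory_reserved(device) / 1024**3
print(f"allocated: {allocated_gib:.2f} GiB, reserved: {reserved_gib:.2f} GiB")

# Detailed breakdown of active vs. cached blocks and fragmentation.
print(torch.cuda.memory_summary(device))
```

If allocated is far below reserved, the "maxed out" reading mostly reflects caching, and the parameters above may in fact be changing the real footprint.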