[BUG] Why is GPU memory usage close to 80G no matter how the parameters are adjusted? #660

Open

DankoZhang opened this issue Nov 8, 2024 · 2 comments
@DankoZhang

Is there an existing issue / discussion for this?

  • I have searched the existing issues / discussions

Is there an existing answer for this in the FAQ?

  • I have searched the FAQ

Current Behavior

max_slices_num = 9 or 80
max_length = 2048 or 4096
batch_size = 1 or 9
No matter how these parameters are configured, GPU memory is maxed out on both A100 80G cards. @LDLINGLINGLING
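
One generic check that may help narrow this down (a minimal sketch, not specific to this repo, using only standard PyTorch CUDA APIs): nvidia-smi reports memory *reserved* by PyTorch's caching allocator, which stays near the peak even when live tensors need far less, so comparing allocated vs. reserved memory inside the training loop shows whether the ~80G figure reflects real tensor usage or mostly cached blocks.

```python
import torch

# Allocated = memory currently held by live tensors.
# Reserved  = memory held by PyTorch's caching allocator (what nvidia-smi shows).
# Peak      = high-water mark of allocated memory since startup / last reset.
for device in range(torch.cuda.device_count()):
    allocated = torch.cuda.memory_allocated(device) / 1024**3
    reserved = torch.cuda.memory_reserved(device) / 1024**3
    peak = torch.cuda.max_memory_allocated(device) / 1024**3
    print(f"cuda:{device}  allocated={allocated:.1f} GiB  "
          f"reserved={reserved:.1f} GiB  peak={peak:.1f} GiB")
```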

Expected Behavior

Please help locate the cause.

Steps To Reproduce

Environment

- OS: Linux
- Python: 3.12
- Transformers: 4.40
- PyTorch: 2.1.2
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

@LDLINGLINGLING
Collaborator

Does this problem occur during fine-tuning or during inference?

@DankoZhang
Author

DankoZhang commented Nov 14, 2024

Does this problem occur during fine-tuning or during inference?

@LDLINGLINGLING It happens during fine-tuning.
For max_slices_num, my guess is that the images are fairly low resolution, so max_slices_num = 9 already covers them, which is why memory usage barely changes;
I don't understand why max_length and batch_size make no difference, though.
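
One way to compare the configurations directly (a rough sketch, not from this repo; `train_step` and its arguments are placeholders for whatever a single step looks like in the actual fine-tuning script): reset PyTorch's peak-memory counter, run one step, and read the peak back. If the peak barely moves when batch_size or max_length changes, the memory is dominated by static state (weights, gradients, optimizer states) rather than activations.

```python
import torch

def peak_memory_for_step(step_fn, device=0):
    """Run one training step and return its peak allocated GPU memory in GiB."""
    torch.cuda.empty_cache()
    torch.cuda.reset_peak_memory_stats(device)
    step_fn()  # one forward/backward/optimizer step under the config being tested
    return torch.cuda.max_memory_allocated(device) / 1024**3

# Hypothetical usage; train_step stands in for the real training-step function:
# print("bs=1:", peak_memory_for_step(lambda: train_step(batch_size=1)))
# print("bs=9:", peak_memory_for_step(lambda: train_step(batch_size=9)))
```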
