
About CrossLinearAttention in module_util.py #73

Open
caoyue2020 opened this issue Aug 28, 2024 · 1 comment

Comments

@caoyue2020

Hello author, I noticed CrossLinearAttention in module_util.py. It looks like it was meant to replace the attention blocks in the larger-resolution stages of the UNet, but why was it not used in the end?

@Algolzw
Owner

Algolzw commented Aug 28, 2024

Hi, at larger feature-map sizes cross linear attention still consumes a lot of GPU memory, and in the end we found that applying it in the first few (high-resolution) layers gave little improvement, so it was dropped.
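For reference, below is a minimal sketch of what a linear cross-attention block of this kind typically looks like. This is an illustrative assumption, not the actual code in module_util.py; the class name `CrossLinearAttentionSketch`, the 1x1-conv projections, and the default `heads`/`dim_head` values are all hypothetical.

```python
# Minimal linear cross-attention sketch (assumed structure, not the repo's implementation).
import torch
import torch.nn as nn


class CrossLinearAttentionSketch(nn.Module):
    """Linear-complexity cross-attention: queries come from x, keys/values from a
    conditioning feature map. Cost is O(N * d^2) in the number of spatial positions N
    (versus O(N^2 * d) for standard attention), but activations and the per-head
    d x d context matrices still grow with resolution, which is the memory cost
    mentioned above."""

    def __init__(self, dim, heads=4, dim_head=32):
        super().__init__()
        inner = heads * dim_head
        self.heads, self.dim_head = heads, dim_head
        self.to_q = nn.Conv2d(dim, inner, 1, bias=False)
        self.to_kv = nn.Conv2d(dim, inner * 2, 1, bias=False)
        self.to_out = nn.Conv2d(inner, dim, 1)

    def forward(self, x, context):
        b, _, h, w = x.shape
        q = self.to_q(x)
        k, v = self.to_kv(context).chunk(2, dim=1)

        # Reshape to (batch, heads, dim_head, N) where N = h * w.
        def split(t):
            return t.reshape(b, self.heads, self.dim_head, -1)

        q, k, v = map(split, (q, k, v))
        q = q.softmax(dim=-2)  # normalize over the feature dimension
        k = k.softmax(dim=-1)  # normalize over the spatial dimension

        # Aggregate keys/values first: per-head (dim_head x dim_head) context.
        ctx = torch.einsum('bhdn,bhen->bhde', k, v)
        # Then project queries through the context: back to (b, heads, dim_head, N).
        out = torch.einsum('bhde,bhdn->bhen', ctx, q)
        out = out.reshape(b, self.heads * self.dim_head, h, w)
        return self.to_out(out)
```

Because the key-value aggregation happens before interacting with the queries, the cost scales linearly in N, which is why such blocks are usually considered for high-resolution stages in the first place; the trade-off described above is about activation memory and the limited quality gain observed there.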
