Hi author, I noticed CrossLinearAttention in module_util.py. It looks like it was meant to replace the standard Attention in the larger-resolution stages of the UNet, so why was it not used in the end?
Hi, because at larger resolutions cross linear attention is also very memory-hungry, and we ultimately found that using it in the first few layers did not improve results much either, so we dropped it.
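For anyone wondering why linear attention looks attractive at high resolution in the first place: below is a minimal sketch of a cross linear attention block in the style of Katharopoulos et al. (2020). It is not the repository's actual CrossLinearAttention; the class name, tensor shapes, and the `elu(x) + 1` feature map are assumptions for illustration. The key idea is that reordering the matmuls avoids materializing the N×M attention matrix, turning the O(N·M) cost of softmax attention into roughly O((N+M)·d²) per head.

```python
# A minimal sketch (NOT the repository's actual CrossLinearAttention) of
# cross linear attention. Names, shapes, and the feature map are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossLinearAttentionSketch(nn.Module):
    def __init__(self, dim, context_dim, heads=4, dim_head=32):
        super().__init__()
        inner = heads * dim_head
        self.heads = heads
        self.to_q = nn.Conv2d(dim, inner, 1, bias=False)
        self.to_kv = nn.Linear(context_dim, inner * 2, bias=False)
        self.to_out = nn.Conv2d(inner, dim, 1)

    def forward(self, x, context):
        # x: (B, C, H, W) feature map; context: (B, M, context_dim) tokens
        b, _, h, w = x.shape
        q = self.to_q(x).reshape(b, self.heads, -1, h * w)            # (B, H, D, N)
        k, v = self.to_kv(context).chunk(2, dim=-1)
        k = k.reshape(b, -1, self.heads, q.shape[2]).transpose(1, 2)  # (B, H, M, D)
        v = v.reshape(b, -1, self.heads, q.shape[2]).transpose(1, 2)  # (B, H, M, D)

        # Feature map phi(x) = elu(x) + 1 keeps entries positive, letting us
        # compute phi(Q) (phi(K)^T V) instead of softmax(Q K^T) V, so the
        # N x M attention matrix is never materialized.
        q = F.elu(q) + 1
        k = F.elu(k) + 1

        kv = torch.einsum('bhmd,bhme->bhde', k, v)                    # (B, H, D, D)
        z = 1.0 / (torch.einsum('bhdn,bhd->bhn', q, k.sum(dim=2)) + 1e-6)
        out = torch.einsum('bhde,bhdn->bhen', kv, q) * z.unsqueeze(2)  # (B, H, D, N)
        out = out.reshape(b, -1, h, w)
        return self.to_out(out)


if __name__ == "__main__":
    attn = CrossLinearAttentionSketch(dim=64, context_dim=128, heads=4, dim_head=32)
    x = torch.randn(2, 64, 32, 32)    # feature map at one UNet stage
    ctx = torch.randn(2, 77, 128)     # e.g. conditioning tokens
    print(attn(x, ctx).shape)         # torch.Size([2, 64, 32, 32])
```

Even with this reordering, the per-pixel projections and intermediate activations still scale with N = H·W, so at the UNet's largest feature maps the memory footprint remains substantial, which is consistent with the author's reasoning above for dropping it.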