How to use the sliding attention window mechanism? #244
I have trained a transformer model that generates 256×256 images. How can I use the sliding attention window mechanism mentioned in the paper to generate higher-resolution images? It would be nice to have sample code!
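A minimal sketch of the idea, assuming a setup where the transformer was trained on 16×16 windows of VQGAN latent codes (256×256 pixels at a downsampling factor of 16): to go beyond the training resolution, allocate a larger grid of code indices and sample it position by position in raster order, each time cropping the window of codes around the current position, running the transformer on the flattened window, and sampling the code at that position inside the window. Everything below (the function name, the `transformer(seq)` call returning per-position logits, the window size and top-k values) is a hypothetical illustration rather than this repository's actual API; the repo's own sampling scripts are the authoritative reference.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def sample_sliding_window(transformer, target_h, target_w,
                          win_h=16, win_w=16,
                          temperature=1.0, top_k=100, device="cpu"):
    """Fill a target_h x target_w grid of VQ code indices autoregressively.

    `transformer` is assumed to map a (1, win_h*win_w) LongTensor of code
    indices to (1, win_h*win_w, codebook_size) logits, where the logits at
    position t predict the code at position t from the earlier positions
    (adjust the indexing if your model uses the usual next-token shift).
    """
    # Larger-than-training grid of code indices, initialised to 0;
    # positions not yet generated are hidden by the causal mask anyway.
    codes = torch.zeros(target_h, target_w, dtype=torch.long, device=device)

    for i in range(target_h):
        for j in range(target_w):
            # Slide the window so that (i, j) sits as far down/right in it
            # as possible (maximal already-generated context) while the
            # window stays inside the target grid.
            top = min(max(i - win_h + 1, 0), target_h - win_h)
            left = min(max(j - win_w + 1, 0), target_w - win_w)
            window = codes[top:top + win_h, left:left + win_w]

            # Flatten the window in raster order and get per-position logits.
            seq = window.reshape(1, -1)            # (1, win_h * win_w)
            logits = transformer(seq)              # (1, L, codebook_size), assumed

            # Index of (i, j) inside the flattened window.
            pos = (i - top) * win_w + (j - left)
            logits_ij = logits[0, pos] / temperature

            # Optional top-k filtering before sampling a code index.
            if top_k is not None:
                v, _ = torch.topk(logits_ij, top_k)
                logits_ij[logits_ij < v[-1]] = -float("inf")
            probs = F.softmax(logits_ij, dim=-1)
            codes[i, j] = torch.multinomial(probs, 1)

    return codes  # decode with the VQGAN decoder to obtain the image
```

After the grid is filled, decode it with the VQGAN decoder to get the high-resolution image. The paper notes that this sliding-window sampling only works well when the image statistics are roughly spatially invariant or when spatial conditioning (e.g. depth or segmentation maps) supplies global structure, so purely unconditional high-resolution samples may lack coherence.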
Comments

note: unconditional

Hi, which operating system are you running the code on, Windows or Linux? I ran the code on Windows but encountered bugs that I can't fix.

windows :)