add a new image_size parameter in train_dalle and generate #310
VAE models can be used with patch grids of any size.
For example, a model trained on a 16x16 patch grid can still be used on a 32x32 grid,
which increases the sequence length from 256 to 1024 in DALL-E.
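A minimal sketch of the relationship above, assuming the VAE halves the spatial resolution at each of a fixed number of downsampling stages (the function name and `num_layers` default here are illustrative, not the actual DALLE-pytorch API):

```python
def dalle_seq_len(image_size: int, num_layers: int = 3) -> int:
    """Hypothetical helper: tokens the DALL-E transformer must model
    for a given input image size, assuming a VAE with `num_layers`
    downsampling stages (each stage halves the spatial side)."""
    fmap_side = image_size // (2 ** num_layers)
    return fmap_side * fmap_side

# With 3 downsampling stages, a 128x128 image maps to a 16x16 grid
# (256 tokens); a 256x256 image maps to a 32x32 grid (1024 tokens),
# matching the 256 -> 1024 sequence-length jump described above.
print(dalle_seq_len(128))  # → 256
print(dalle_seq_len(256))  # → 1024
```

The same VAE weights apply at both sizes because its convolutions are resolution-agnostic; only the transformer's sequence length changes.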
This is still a draft: I think we should not store the image_size in the VAE, but rather in the DALL-E model. The DALL-E model needs a fixed sequence length and resolution, whereas the VAE works at any power-of-2 resolution. I'll make this change later.
This works as-is if anyone wants to experiment with it a bit.