Commit
support sdxl dynamic shape vanilla, lora, dreambooth finetune
wcrzlh committed Jun 28, 2024
1 parent 95a8a51 commit 5bff308
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion examples/stable_diffusion_xl/gm/modules/attention.py
@@ -347,7 +347,6 @@ def __init__(
         self.proj_out = zero_module(nn.Dense(inner_dim, in_channels))
         self.use_linear = use_linear
 
-
     def construct(self, x, context=None):
         # note: if no context is given, cross-attention defaults to self-attention
         if not isinstance(context, (list, tuple)):
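The diff's context comment notes that when no `context` is passed, cross-attention falls back to self-attention (the query sequence attends to itself). A minimal sketch of that fallback, using plain NumPy instead of MindSpore and a hypothetical `attention` helper (not the repository's actual implementation):

```python
import numpy as np

def attention(q, k, v):
    # minimal scaled dot-product attention, no masking or multi-head logic
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def construct(x, context=None):
    # if no context is given, cross-attention defaults to self-attention:
    # keys and values are drawn from x itself
    if context is None:
        context = x
    return attention(x, context, context)
```

With `context=None`, the output is identical to explicitly passing `context=x`; the real `construct` additionally accepts `context` as a list or tuple, which this sketch omits.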
