Commit

Fixed bad key naming on lora fuse I just pushed
jaretburkett committed Aug 18, 2024
1 parent 77ee709 commit 13a965a
Showing 1 changed file with 2 additions and 4 deletions.

toolkit/stable_diffusion_model.py
@@ -545,11 +545,9 @@ def load_model(self):
         double_block_key = "transformer.transformer_blocks."
         for key, value in lora_state_dict.items():
             if single_block_key in key:
-                new_key = key.replace(single_block_key, "")
-                single_transformer_lora[new_key] = value
+                single_transformer_lora[key] = value
             elif double_block_key in key:
-                new_key = key.replace(double_block_key, "")
-                double_transformer_lora[new_key] = value
+                double_transformer_lora[key] = value
             else:
                 raise ValueError(f"Unknown lora key: {key}. Cannot load this lora in low vram mode")
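The fix keeps each LoRA key's full prefixed name when partitioning the state dict, instead of stripping the block prefix. A minimal runnable sketch of the corrected logic follows; note that the value of single_block_key does not appear in the hunk, so the prefix used here, the toy state-dict keys, and the stand-in integer values are assumptions for illustration only.

```python
single_block_key = "transformer.single_transformer_blocks."  # assumed prefix, not shown in the hunk
double_block_key = "transformer.transformer_blocks."         # taken from the diff

# Stand-in LoRA state dict: real code would map these keys to tensors, not ints.
lora_state_dict = {
    "transformer.single_transformer_blocks.0.proc.lora_A.weight": 1,
    "transformer.transformer_blocks.0.proc.lora_B.weight": 2,
}

single_transformer_lora = {}
double_transformer_lora = {}

for key, value in lora_state_dict.items():
    if single_block_key in key:
        # Keep the full key: the fix stops stripping the prefix, since the
        # later fusing step expects the original prefixed names.
        single_transformer_lora[key] = value
    elif double_block_key in key:
        double_transformer_lora[key] = value
    else:
        raise ValueError(f"Unknown lora key: {key}. Cannot load this lora in low vram mode")

print(single_transformer_lora)
print(double_transformer_lora)
```

Because the single-block prefix contains "single_" between "transformer." and "transformer_blocks.", a single-block key never matches double_block_key, so checking single_block_key first is safe.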

