
Commit 37baa59

fix: attention mask device.
celll1 committed Sep 8, 2024
1 parent 40fd26b commit 37baa59
Showing 1 changed file with 1 addition and 1 deletion.
modules/model/util/clip_util.py (1 addition, 1 deletion)
@@ -32,7 +32,7 @@ def encode_clip(
             continue

         # Create attention mask (1 for non-masked, 0 for masked)
-        chunk_attention_mask = torch.ones_like(chunk, dtype=torch.bool)
+        chunk_attention_mask = torch.ones_like(chunk, dtype=torch.bool, device=chunk.device)

         # First, add BOS and EOS tokens
         bos_tokens = torch.full((chunk.shape[0], 1), text_encoder.config.bos_token_id, dtype=chunk.dtype, device=chunk.device)
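As a quick sanity check of the changed line, here is a minimal, self-contained sketch of the behavior it relies on. The chunk shape and dtype below are assumptions for illustration, not values taken from clip_util.py:

import torch

# Assumed stand-in for a token chunk; in the real code this comes from the
# tokenized prompt and may live on a CUDA device (e.g. chunk = chunk.to("cuda")).
chunk = torch.zeros((2, 75), dtype=torch.long)

# The fixed line: create the boolean mask explicitly on the chunk's device,
# so it matches the token tensor when the text encoder runs on GPU.
chunk_attention_mask = torch.ones_like(chunk, dtype=torch.bool, device=chunk.device)

assert chunk_attention_mask.device == chunk.device
assert chunk_attention_mask.shape == chunk.shape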

