Using distributed or parallel set-up in script?: no
Using GPU in script?: no
GPU type: NVIDIA A100-SXM4-80GB
adapters version: 1.0.0
Information
I am trying to set up a simple ac.Parallel composition over two LoRA modules. However, running a slightly modified example gives me RuntimeError: shape '[2, 6, 32, 128]' is invalid for input of size 98304
Adapter setup I am using (if any):
The problem arises when using:
the official example scripts: (give details below)
my own modified scripts: (give details below)
The task I am working on is:
an official GLUE/SQuAD task: (give the name)
my own task or dataset: (give details below)
To reproduce
import torch
import adapters
import adapters.composition as ac
from adapters import AutoAdapterModel, LoRAConfig
from transformers import AutoTokenizer

# Same error with "meta-llama/Llama-3.2-1B"
model_name = "meta-llama/Llama-3.1-8B"
model = AutoAdapterModel.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

input_ids = tokenizer(["Adapters are great!", "Adapters are awesome!"], return_tensors="pt")

config = LoRAConfig(r=8, alpha=16)
model.add_adapter("a", config=config)
model.add_adapter("b", config=config)
model.active_adapters = ac.Parallel("a", "b")

output1, output2 = model(**input_ids)
Environment info

transformers version: 4.43.4
adapters version: 1.0.0
Error message:
RuntimeError: shape '[2, 6, 32, 128]' is invalid for input of size 98304
Expected behavior
The same setup worked with the BERT example from the documentation, so the issue may be specific to LoRA or to the autoregressive model used here.
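The numbers in the error look suggestive: 2 × 6 × 32 × 128 = 49,152, exactly half of 98,304. Since ac.Parallel replicates the input once per adapter (2 sentences × 2 adapters → effective batch 4), it could be that an attention reshape is still using the original batch size of 2. A minimal sketch reproducing the arithmetic, assuming the dimensions are (batch, seq_len, heads, head_dim) = (2, 6, 32, 128) as inferred from the error string (not taken from the library's actual code):

```python
import torch

batch, seq_len, heads, head_dim = 2, 6, 32, 128  # inferred from the error message
n_parallel = 2  # two adapters in ac.Parallel

# Parallel composition replicates the input along the batch dimension.
hidden = torch.randn(batch * n_parallel, seq_len, heads * head_dim)
print(hidden.numel())  # 98304

# Reshaping with the *original* batch size reproduces the error:
try:
    hidden.view(batch, seq_len, heads, head_dim)
except RuntimeError as e:
    print(e)  # shape '[2, 6, 32, 128]' is invalid for input of size 98304

# Reshaping with the replicated batch size succeeds:
ok = hidden.view(batch * n_parallel, seq_len, heads, head_dim)
```

If this is the cause, the model's attention implementation would need to pick up the replicated batch size from the hidden states rather than from the original input, which is presumably what the BERT code path already does.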