FIX Dora finetuning example collate fn (#2197)
shirinyamani authored Nov 4, 2024
1 parent b5b9023 commit 4e57aa5
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions examples/dora_finetuning/dora_finetuning.py
@@ -6,7 +6,7 @@
     AutoModelForCausalLM,
     AutoTokenizer,
     BitsAndBytesConfig,
-    DataCollatorWithPadding,
+    DataCollatorForLanguageModeling,
     Trainer,
     TrainingArguments,
 )
@@ -95,7 +95,7 @@ def tokenize_function(examples):
 tokenized_datasets = dataset.map(tokenize_function, batched=True, remove_columns=dataset["train"].column_names)
 
 # Data collator to dynamically pad the batched examples
-data_collator = DataCollatorWithPadding(tokenizer)
+data_collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
 
 # Define training arguments
 training_args = TrainingArguments(
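Why this fix matters: `DataCollatorWithPadding` only pads `input_ids`, so the `Trainer` receives no `labels` and cannot compute a causal-LM loss. `DataCollatorForLanguageModeling(tokenizer, mlm=False)` additionally derives `labels` by copying `input_ids` and masking padded positions with `-100`, which the loss function ignores. A minimal pure-Python sketch of that behavior (this is an illustration, not the actual transformers implementation; the pad id and function name are hypothetical):

```python
PAD_ID = 0  # hypothetical pad token id for this sketch

def collate_for_causal_lm(batch):
    """Pad a batch of token-id lists and derive labels, mimicking mlm=False."""
    max_len = max(len(ids) for ids in batch)
    input_ids, labels = [], []
    for ids in batch:
        padded = ids + [PAD_ID] * (max_len - len(ids))
        input_ids.append(padded)
        # Labels mirror the inputs; padded positions get -100 so the
        # cross-entropy loss skips them.
        labels.append([tok if tok != PAD_ID else -100 for tok in padded])
    return {"input_ids": input_ids, "labels": labels}

batch = collate_for_causal_lm([[5, 6, 7], [8, 9]])
# batch["labels"] -> [[5, 6, 7], [8, 9, -100]]
```

With the old collator the returned dict would lack the `labels` key entirely, which is the failure this commit addresses.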
