### Information
- [X] The official example scripts
- [ ] My own modified scripts
### Tasks
- [X] An officially supported task in the `examples` folder
- [X] My own task or dataset (give details below)
### Reproduction
1. **Set up the environment**:
   - Install `trl` version `0.11.1` and `transformers` version `4.45.1`.
   - Ensure that Python `3.10.11` and CUDA `12.1` are installed.
   - Use a Linux machine with the configuration listed under System Info below.
2. **Run the following command**:

   ```bash
   trl chat --model_name_or_path meta-llama/Llama-3.2-1B-Instruct
   ```
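
For reference, here is a minimal standalone sketch that goes through the same sampling path without the `trl chat` CLI. It is my own approximation of what the chat command does internally (the model name and prompt are taken from the terminal session below); the device and generation settings are assumptions, not values read from trl.

```python
# Standalone sketch of the failing generation path (my approximation,
# not the actual `trl chat` code).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B-Instruct"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

# Same chat-style prompt as in the terminal session further down.
messages = [{"role": "user", "content": "what is AI?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(device)

# Sampling is the step where torch.multinomial raises the RuntimeError
# shown in the traceback below.
output = model.generate(input_ids, do_sample=True, max_new_tokens=64)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```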
### Expected behavior

The model should load successfully and allow interaction via the chat interface.

**Actual behavior:** the command throws a validation error regarding the model name format, and a warning related to the attention mask also appears. Generation then produces garbled tokens and crashes with a `RuntimeError` (see the full terminal output below).
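
Along the same lines as the sketch above, here is a small diagnostic sketch that checks whether the next-token distribution already contains non-finite values before `torch.multinomial` is ever called; the dtype handling here is my own assumption and not taken from the `trl chat` implementation.

```python
# Diagnostic sketch: inspect the next-token distribution for inf/nan
# before sampling (assumption: plain from_pretrained defaults, as above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B-Instruct"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).to(device)

inputs = tokenizer("what is AI?", return_tensors="pt").to(device)
with torch.no_grad():
    logits = model(**inputs).logits[:, -1, :]      # next-token logits
probs = torch.softmax(logits.float(), dim=-1)      # distribution passed to multinomial

print("non-finite logits:", (~torch.isfinite(logits)).sum().item())
print("non-finite probs: ", (~torch.isfinite(probs)).sum().item())
print("negative probs:   ", (probs < 0).sum().item())
```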
### System Info

- `trl` version: 0.11.1
- `transformers` version: 4.45.1
- Python version: 3.10.11
- Platform: Linux 4649c3747948 6.8.0-41-generic #41-Ubuntu SMP PREEMPT_DYNAMIC Fri Aug 2 20:41:06 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
- CUDA version: 12.1 (V12.1.105)
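
The version numbers above can be re-checked with a quick snippet like the following (standard `__version__` attributes; nothing trl-specific is assumed):

```python
# Print the environment details listed above.
import platform
import torch
import transformers
import trl

print("trl:", trl.__version__)
print("transformers:", transformers.__version__)
print("python:", platform.python_version())
print("torch:", torch.__version__, "| CUDA:", torch.version.cuda)
print("platform:", platform.platform())
```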
Here is the output of my terminal:

```text
what is AI?
<meta-llama/Llama-3.2-1B-Instruct>:
:
what is AI?
<meta-llama/Llama-3.2-1B-Instruct>:
?',?',
?</_Size
())
);
packageimport Imported-import nig Modern ModsipelPostalCodesmatchCondition크utchuttersutches overcome
defeatedpackagepedpackageậmortsSorting_sorted sortOrderSortablesortable_sortpackagepackage
packagingpackagefelt
Exception in thread Thread-1 (generate):
Traceback (most recent call last):
  File "/home/z004x2xz/local/python3.10/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/home/z004x2xz/local/python3.10/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/z004x2xz/WorkAssignedByMatt/trl/venvTRL/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/z004x2xz/WorkAssignedByMatt/trl/venvTRL/lib/python3.10/site-packages/transformers/generation/utils.py", line 2048, in generate
    result = self._sample(
  File "/home/z004x2xz/WorkAssignedByMatt/trl/venvTRL/lib/python3.10/site-packages/transformers/generation/utils.py", line 3044, in _sample
    next_tokens = torch.multinomial(probs, num_samples=1).squeeze(1)
RuntimeError: probability tensor contains either `inf`, `nan` or element < 0
RS PS PSpackageelfsign SignspackageasicASIpackage)!
sink themselvesurousurette."). hồ themselvespackage---
arentацииститаablylesslyậm circular226 Camcams cameracamuyoizen
```