ts/full_training/reproduce_TIGER-Lab-VLM2Vec-Full/checkpoint-1000
Traceback (most recent call last):
  File "train.py", line 89, in <module>
    main()
  File "train.py", line 81, in main
    trainer.train()
  File "/opt/conda/lib/python3.8/site-packages/transformers/trainer.py", line 2123, in train
    return inner_training_loop(
  File "/opt/conda/lib/python3.8/site-packages/transformers/trainer.py", line 2548, in _inner_training_loop
    self._maybe_log_save_evaluate(tr_loss, grad_norm, model, trial, epoch, ignore_keys_for_eval)
  File "/opt/conda/lib/python3.8/site-packages/transformers/trainer.py", line 3007, in _maybe_log_save_evaluate
    self._save_checkpoint(model, trial, metrics=metrics)
  File "/opt/conda/lib/python3.8/site-packages/transformers/trainer.py", line 3097, in _save_checkpoint
    self.save_model(output_dir, _internal_call=True)
  File "/opt/conda/lib/python3.8/site-packages/transformers/trainer.py", line 3730, in save_model
    self._save(output_dir)
  File "code/VLM2Vec-main/src/trainer.py", line 140, in _save
    self.model.encoder.save_pretrained(
  File "/opt/conda/lib/python3.8/site-packages/transformers/modeling_utils.py", line 2959, in save_pretrained
    raise RuntimeError(
RuntimeError: The weights trying to be saved contained shared tensors [{'model.embed_tokens.weight', 'model.vision_embed_tokens.wte.weight'}] that are mismatching the transformers base configuration. Try saving using `safe_serialization=False` or remove this tensor sharing.
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
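The RuntimeError comes from safetensors refusing to serialize tied weights that the base config does not declare as shared. As the message itself suggests, one workaround is to pass `safe_serialization=False` where VLM2Vec's custom trainer calls `save_pretrained` (src/trainer.py, line 140 in the traceback). A minimal sketch, assuming the `_save` signature mirrors the upstream Trainer (the `output_dir`/`state_dict` arguments and the surrounding code are assumptions; only the `self.model.encoder.save_pretrained(` call is visible above):

```python
import os

# Sketch of src/trainer.py's _save with the workaround applied.
# Everything except the save_pretrained call itself is assumed.
def _save(self, output_dir=None, state_dict=None):
    output_dir = output_dir if output_dir is not None else self.args.output_dir
    os.makedirs(output_dir, exist_ok=True)
    # Fall back to torch .bin serialization instead of safetensors (the default
    # in recent transformers versions), which rejects undeclared shared tensors.
    self.model.encoder.save_pretrained(
        output_dir,
        safe_serialization=False,
    )
```

Checkpoints saved this way load the same as before via `from_pretrained`; only the on-disk format changes.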
Looks like this related issue:
https://github.com/kazuar/Phi3-Vision-ft/issues/2
I'm using phi3-vision for full training. The error occurs when training reaches 1000 steps and checkpoint-1000 is saved.
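The other option the error message names, removing the tensor sharing, would keep safetensors output. A hedged sketch, with the attribute path only inferred from the tensor names in the RuntimeError (`model.vision_embed_tokens.wte.weight`), could clone the tied vision embedding table before saving:

```python
import torch

def untie_vision_embeddings(encoder):
    # Hypothetical helper: the attribute path is guessed from the state_dict
    # keys in the error and may differ in the actual phi3-vision model.
    wte = encoder.model.vision_embed_tokens.wte
    if wte.weight.data_ptr() == encoder.model.embed_tokens.weight.data_ptr():
        # Give the vision wte its own storage so safetensors no longer sees
        # two state_dict keys pointing at the same underlying tensor.
        wte.weight = torch.nn.Parameter(wte.weight.detach().clone())
    return encoder
```

Cloning duplicates that parameter in memory and breaks the tie for any later training steps, so passing `safe_serialization=False` is usually the simpler fix.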