@chrischoy
I got the following error while trying to train on the ScanNet dataset with your latest commit:
```
Traceback (most recent call last):
  File "main.py", line 156, in <module>
    main()
  File "main.py", line 149, in main
    train(model, train_data_loader, val_data_loader, config)
  File "/home/ubuntu/workspace_pcs/code/SpatioTemporalSegmentation/lib/train.py", line 78, in train
    coords, input, target = data_iter.next()
  File "/home/ubuntu/miniconda3/envs/mink/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 819, in __next__
    return self._process_data(data)
  File "/home/ubuntu/miniconda3/envs/mink/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 846, in _process_data
    data.reraise()
  File "/home/ubuntu/miniconda3/envs/mink/lib/python3.7/site-packages/torch/_utils.py", line 369, in reraise
    raise self.exc_type(msg)
ValueError: Caught ValueError in DataLoader worker process 0.
Original Traceback (most recent call last):
  File "/home/ubuntu/miniconda3/envs/mink/lib/python3.7/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/ubuntu/miniconda3/envs/mink/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ubuntu/miniconda3/envs/mink/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ubuntu/workspace_pcs/code/SpatioTemporalSegmentation/lib/dataset.py", line 265, in __getitem__
    coords, feats, labels, center = self.load_ply(index)
ValueError: not enough values to unpack (expected 4, got 2)
```
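For context, the final line of the traceback is a plain tuple-unpacking mismatch: `__getitem__` unpacks four values from `self.load_ply(index)`, but the `load_ply` being called returns only two. The snippet below is a minimal, hypothetical reproduction (the stub `load_ply` and its return values are assumptions, not the repo's actual implementation) showing why Python raises exactly this `ValueError`:

```python
def load_ply(index):
    # Hypothetical stub: an older load_ply that returns only (coords, feats),
    # while the caller expects (coords, feats, labels, center).
    coords = [[0.0, 0.0, 0.0]]
    feats = [[255, 255, 255]]
    return coords, feats

try:
    # Mirrors the failing line in lib/dataset.py: 4 targets, 2 returned values.
    coords, feats, labels, center = load_ply(0)
except ValueError as e:
    print(e)  # not enough values to unpack (expected 4, got 2)
```

So the dataset class in use and the `load_ply` implementation have diverged; checking that the configured dataset variant returns all four values (or regenerating the preprocessed ScanNet .ply files so labels are included) would be the place to start.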