While trying to convert this code, I get the following error during the third run of the converter, while performing a view on the tensor (Full log):
Traceback (most recent call last):
  File "main.py", line 32, in <module>
    out = verify_torch_and_convert_to_returnn(
  - snip -
  File "/u/soethe/pytorch-to-returnn-cn/pytorch_to_returnn/naming/namespace.py", line 265, in name_in_ctx
    raise KeyError(f"namespace {self!r}: {_src_tensor or possible_sub_names!r} not found")
KeyError: "namespace <RegisteredName 'primary_capsules' <ModuleEntry <CapsuleLayer>> -> ...>: <TensorEntry name:? tensor:(B(100),F'feature:data'(1)(1),'time:data'[B](28),'spatial1:data'[B](28)) returnn_data:'data' [B,F|F'feature:data'(1),T|'time:data'[B],'spatial1:data'[B]] axes id> not found"
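For reference, the view that fails here is just a plain reshape in eager PyTorch. A minimal numpy sketch, using the (100, 1, 28, 28) batch shape from the error message above and assuming the view arguments match the x.view(x.size(0), -1, 1) call from the test case further down:

```python
import numpy

# Shape from the error message: B(100), F'feature'(1), time(28), spatial1(28),
# i.e. an MNIST-like (100, 1, 28, 28) batch.
x = numpy.zeros((100, 1, 28, 28), dtype="float32")

# Eager-PyTorch equivalent of x.view(x.size(0), -1, 1):
# keep the batch dim, flatten everything else, append a trailing axis of size 1.
y = x.reshape(x.shape[0], -1, 1)
print(y.shape)  # (100, 784, 1)
```

So the operation itself is trivial; the converter only fails to resolve the input tensor in the nested module's namespace.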
The code I'm using to convert can be found here and is adapted from the mnist example already in the converter.
Unfortunately I haven't managed to reproduce the error in a separate test case; the current test case can be found here. The torch layers are identical between the test case and the failing code, as are the shapes, including the metadata of which axis is B/T/F. However, in the test case the error cannot be reproduced and the view call converts fine.
However, in the test case another error occurs, related to a torch.cat call. I'm not sure whether this is a separate bug (Full log):
  File "/u/soethe/pythonpackages/returnn/returnn/tf/util/data.py", line 4964, in Data.set_dynamic_size
    line: assert sizes_tag, "%s: assign dyn sizes %s without defined dim tag" % (self, sizes)
    locals:
      sizes_tag = <local> None
      self = <local> Data{'Cat_ReturnnReinterpretSameSizeAs_output', [B,F'Conv2d_2:channel*Conv2d_2:conv:s0*Conv2d_2:conv:s1'[B],F|F'Unflatten_1_split_dims1'(1)]}
      sizes = <local> <tf.Tensor 'Flatten/mul_1:0' shape=(?,) dtype=int32>
AssertionError: Data{'Cat_ReturnnReinterpretSameSizeAs_output', [B,F'Conv2d_2:channel*Conv2d_2:conv:s0*Conv2d_2:conv:s1'[B],F|F'Unflatten_1_split_dims1'(1)]}: assign dyn sizes Tensor("Flatten/mul_1:0", shape=(?,), dtype=int32) without defined dim tag
The error seems to occur when there are nested calls using nn.Module. Here is a small test case:
def test_view_in_nested_module():
  def model_func(wrapped_import, inputs: torch.Tensor):
    if wrapped_import:
      nn = wrapped_import("torch.nn")
      F = wrapped_import("torch.nn.functional")
    else:
      import torch.nn.functional as F
      import torch.nn as nn

    class CapsuleLayer(nn.Module):
      def __init__(self):
        super(CapsuleLayer, self).__init__()

      def forward(self, x):
        return x.view(x.size(0), -1, 1)

    class CapsuleNet(nn.Module):
      def __init__(self):
        super(CapsuleNet, self).__init__()
        self.primary_capsules = CapsuleLayer()

      def forward(self, x):
        x = F.relu(x)
        x = self.primary_capsules(x)
        return x

    net = CapsuleNet()
    return net(inputs)

  rnd = numpy.random.RandomState(42)
  x = rnd.normal(0., 1., (100, 256, 20, 20)).astype("float32")
  verify_torch_and_convert_to_returnn(model_func, inputs=x, returnn_dummy_input_shape=x.shape)
The output during the second run suggests that the converter is not finding the call to view because it does not recurse into the nested module.
However, the F.relu is required to reproduce the error: if x is passed directly into primary_capsules, the error does not occur, which indicates there is more to it than just the nested nn.Module.
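In eager PyTorch the forward pass of this test case is unremarkable. A minimal numpy sketch of the same computation (relu followed by the view), on the same (100, 256, 20, 20) input, shows the expected output shape:

```python
import numpy

# Same input as in the test case above.
x = numpy.random.RandomState(42).normal(0., 1., (100, 256, 20, 20)).astype("float32")

# F.relu(x), then x.view(x.size(0), -1, 1), expressed with numpy.
x = numpy.maximum(x, 0.)          # relu
y = x.reshape(x.shape[0], -1, 1)  # keep batch, flatten 256*20*20, add axis
print(y.shape)  # (100, 102400, 1)
```

So eager execution handles the nesting fine; the failure is specific to how the converter tracks tensor names across the F.relu call and the submodule boundary.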