02 Problem with CudaNdarray in the proposed solution.
Traceback (most recent call last):
File "ex_02_detect_negative.py", line 63, in
f(0.)
File "/home/danielcanelhas/workspace/Theano/theano/compile/function_module.py", line 588, in call
outputs = self.fn()
File "/home/danielcanelhas/workspace/Theano/theano/gof/link.py", line 761, in f
raise_with_op(node, thunk)
File "/home/danielcanelhas/workspace/Theano/theano/gof/link.py", line 759, in f
wrapper(i, node, _thunks)
File "/home/danielcanelhas/workspace/Theano/theano/gof/link.py", line 774, in wrapper
f(_args)
File "ex_02_detect_negative.py", line 41, in neg_check
do_check_on(x, node, fn)
File "ex_02_detect_negative.py", line 32, in do_check_on
if var.min() < 0:
AttributeError: 'CudaNdarray' object has no attribute 'min'
Apply node that caused the error: GpuFromHost(x)
Inputs types: [TensorType(float32, scalar)]
Inputs shapes: ['No shapes']
Inputs strides: ['No strides']
Inputs scalar values: ['not scalar']
HINT: Re-running with most Theano optimization disabled could give you a back-traces when this node was created. This can be done with by setting the Theano flags optimizer=fast_compile
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint of this apply node.
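The error comes from calling `.min()` directly on a `CudaNdarray`, which does not expose NumPy's reduction methods. A minimal sketch of a fix (assuming the check only needs the value on the host, and using `np.asarray` to copy the device buffer back to CPU memory) could look like this; `do_check_on` mirrors the function named in the traceback, but the body here is an illustrative reconstruction, not the original:

```python
import numpy as np

def do_check_on(var, node, fn):
    # CudaNdarray has no .min() attribute; np.asarray copies the
    # device buffer back to host memory as a regular ndarray,
    # where the usual NumPy reductions are available.
    host = np.asarray(var)
    if host.min() < 0:
        raise ValueError("negative value detected in output of %s" % node)
```

This trades a device-to-host transfer for compatibility, which is acceptable for a debugging check that is not on the hot path.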
03 Problem with GpuSoftmax in the proposed solution.
A workaround may be to check whether the string form of the fgraph contains "Softmax", though I'm not sure how much you like it:
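A minimal sketch of that string-based workaround, with a hypothetical helper name (`graph_uses_softmax` is not part of Theano; the only assumption is that both the CPU `Softmax` and GPU `GpuSoftmax` ops contain "Softmax" in their printed form):

```python
def graph_uses_softmax(fgraph):
    # Hypothetical helper: inspect the printed form of the function
    # graph for a Softmax op. Matches both "Softmax" and "GpuSoftmax",
    # since the GPU op name contains the CPU op name as a substring.
    return "Softmax" in str(fgraph)
```

This is brittle (any op whose name happens to contain "Softmax" would also match), which is presumably why the author hedges on whether it is an acceptable approach.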
Alternatively: