---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
/home/ekhongl/.conda/envs/py3/lib/python3.5/site-packages/theano/compile/function_module.py in __call__(self, *args, **kwargs)
858 try:
--> 859 outputs = self.fn()
860 except Exception:
ValueError: dimension mismatch in args to gemm (3,2)x(2,2)->(2,1)
During handling of the above exception, another exception occurred:
ValueError Traceback (most recent call last)
<ipython-input-22-1f8cc4efdce0> in <module>()
1 b = np.array([[2,0,],[0,2],[1,1]], dtype = theano.config.floatX)
----> 2 [pre_sigmoid_h1, h1_mean, h1_sample, softmax_v1, v1_doc_len, v1_mean, v1_sample]= gibbs(b)
3 print(pre_sigmoid_h1)
4 print('-------')
5 print(h1_mean)
/home/ekhongl/.conda/envs/py3/lib/python3.5/site-packages/theano/compile/function_module.py in __call__(self, *args, **kwargs)
869 node=self.fn.nodes[self.fn.position_of_error],
870 thunk=thunk,
--> 871 storage_map=getattr(self.fn, 'storage_map', None))
872 else:
873 # old-style linkers raise their own exceptions
/home/ekhongl/.conda/envs/py3/lib/python3.5/site-packages/theano/gof/link.py in raise_with_op(node, thunk, exc_info, storage_map)
312 # extra long error message in that case.
313 pass
--> 314 reraise(exc_type, exc_value, exc_trace)
315
316
/home/ekhongl/.conda/envs/py3/lib/python3.5/site-packages/six.py in reraise(tp, value, tb)
683 value = tp()
684 if value.__traceback__ is not tb:
--> 685 raise value.with_traceback(tb)
686 raise value
687
/home/ekhongl/.conda/envs/py3/lib/python3.5/site-packages/theano/compile/function_module.py in __call__(self, *args, **kwargs)
857 t0_fn = time.time()
858 try:
--> 859 outputs = self.fn()
860 except Exception:
861 if hasattr(self.fn, 'position_of_error'):
ValueError: dimension mismatch in args to gemm (3,2)x(2,2)->(2,1)
Apply node that caused the error: GpuGemm{no_inplace}(<CudaNdarrayType(float32, matrix)>, TensorConstant{1.0}, GpuFromHost.0, GpuDimShuffle{1,0}.0, TensorConstant{1.0})
Toposort index: 8
Inputs types: [CudaNdarrayType(float32, matrix), TensorType(float32, scalar), CudaNdarrayType(float32, matrix), CudaNdarrayType(float32, matrix), TensorType(float32, scalar)]
Inputs shapes: [(2, 1), (), (3, 2), (2, 2), ()]
Inputs strides: [(1, 0), (), (2, 1), (1, 2), ()]
Inputs values: [b'CudaNdarray([[ 0.]\n [ 0.]])', array(1.0, dtype=float32), 'not shown', b'CudaNdarray([[ 1. 1.]\n [ 1. 1.]])', array(1.0, dtype=float32)]
Outputs clients: [[GpuDimShuffle{0,1,x,x}(GpuGemm{no_inplace}.0), GpuDimShuffle{x,0,1}(GpuGemm{no_inplace}.0)]]
HINT: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
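The shape arithmetic in the error is the key clue: the graph multiplies a `(3, 2)` matrix by a `(2, 2)` matrix, which yields a `(3, 2)` result, but gemm is asked to accumulate that into a `(2, 1)` buffer (the first input shown above, likely a bias or state variable sized for a different batch). A minimal NumPy sketch of the same mismatch, assuming the Gibbs step computes something like `dot(v, W.T) + b` (the actual graph is not shown in the traceback):

```python
import numpy as np

# The visible input from the traceback: 3 samples of dimension 2.
v = np.array([[2, 0], [0, 2], [1, 1]], dtype=np.float32)  # shape (3, 2)

# Hypothetical (2, 2) weight factor, standing in for GpuDimShuffle{1,0}.0.
W_T = np.ones((2, 2), dtype=np.float32)                   # shape (2, 2)

product = v @ W_T
print(product.shape)  # (3, 2): one row per input sample

# The compiled graph tries to accumulate this into a (2, 1) buffer --
# the first gemm argument above -- which cannot hold a (3, 2) result.
out = np.zeros((2, 1), dtype=np.float32)
try:
    out += product  # (2, 1) vs (3, 2): shapes do not broadcast
except ValueError as e:
    print("ValueError:", e)
```

This suggests the compiled function (or one of its shared variables, such as a persistent chain state or bias) was built expecting a different number of rows than the 3-row `b` passed in; following the `optimizer=fast_compile` hint should reveal where that `(2, 1)` node was created.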