If you receive the following error, it is because the Python special method __len__ cannot be implemented for Theano variables:
TypeError: object of type 'TensorVariable' has no len()
Python requires __len__ to return an integer, which is impossible because Theano's variables are symbolic. However, var.shape[0] can be used as a workaround.
This error message cannot be made more explicit because the relevant aspects of Python’s internals cannot be modified.
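For example, a minimal sketch of the workaround:

    import theano.tensor as T

    x = T.vector('x')
    # len(x) would raise: TypeError: object of type 'TensorVariable' has no len()
    length = x.shape[0]  # symbolic scalar representing the length of x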
You can enable faster gcc optimizations with the cxxflags Theano flag. The following list of flags was suggested on the mailing list:
cxxflags=-march=native -O3 -ffast-math -ftree-loop-distribution -funroll-loops -ftracer
Use these flags at your own risk: the -ftree-loop-distribution optimization has been reported to produce wrong results in the past. Also, the -march=native flag must be used with care if you have NFS, because the compiled code is specific to the CPU that produced it. In that case, you MUST set the compiledir to a path local to the computer.
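As a sketch, these flags can be set in the .theanorc configuration file; the compiledir path below is only a placeholder for a directory local to the machine:

    [gcc]
    cxxflags = -march=native -O3 -ffast-math -ftree-loop-distribution -funroll-loops -ftracer

    [global]
    # Only needed when the default compilation directory is on NFS.
    compiledir = /local/scratch/theano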
You can set the Theano flag allow_gc to False to get a speed-up at the cost of more memory. By default, Theano frees intermediate results as soon as they are no longer needed, which prevents that memory from being reused. Disabling the garbage collection keeps the memory of all intermediate results so that it can be reused on the next call to the same Theano function, provided the shapes still match. The shapes could change if the shapes of the inputs change.
Note
For Theano 0.6 and up.
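A minimal sketch of disabling the garbage collection (the compiled function is only an illustration):

    import theano
    import theano.tensor as T

    # Trade memory for speed: keep intermediate buffers alive between calls.
    # This can also be set from the environment with THEANO_FLAGS=allow_gc=False.
    theano.config.allow_gc = False

    x = T.matrix('x')
    f = theano.function([x], T.exp(x).sum())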
For Theano functions that don’t do much work, like a regular logistic regression, the overhead of checking the input can be significant. You can disable it by setting f.trust_input to True. Make sure the types of arguments you provide match those defined when the function was compiled.
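A minimal sketch, with a sigmoid expression standing in for a real model:

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.matrix('x')
    f = theano.function([x], T.nnet.sigmoid(x).sum())

    # Skip the per-call input checking; the caller must now pass arrays with
    # exactly the dtype and number of dimensions the function was compiled for.
    f.trust_input = True
    f(np.ones((3, 4), dtype=theano.config.floatX))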
Also, for small Theano functions, you can remove more Python overhead by making a Theano function that takes no inputs at all; shared variables can be used to achieve this. You can then call it as f.fn() or f.fn(n_calls=N) to speed it up. In the latter case, only the output of the last of the N calls is returned.
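A minimal sketch of this pattern (the shared data is purely illustrative):

    import numpy as np
    import theano
    import theano.tensor as T

    # Put the data in shared variables so that the compiled function
    # takes no inputs at all.
    x = theano.shared(np.ones((100, 10), dtype=theano.config.floatX), name='x')
    w = theano.shared(np.ones(10, dtype=theano.config.floatX), name='w')
    f = theano.function([], T.dot(x, w).sum())

    f.fn()            # call the underlying virtual machine directly
    f.fn(n_calls=10)  # run 10 calls back to back; only the last output is returned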
Theano offers a good amount of flexibility, but has some limitations too. You must answer for yourself the following question: How can my algorithm be cleverly written so as to make the most of what Theano can do?
Here is a list of some of the known limitations: