API changes
- Spiking LIF neuron models now accept an additional argument, ``min_voltage``. Voltages are clipped such that they do not drop below this value (previously, this was fixed at 0). (#666)
- ``Process`` objects can now be passed directly as node outputs, making them easier to use. The ``Process`` interface is also improved and is currently the same as the ``Synapse`` interface. However, further improvements are pending, and the current implementation SHOULD NOT BE RELEASED! (#652)
- The ``PES`` learning rule no longer accepts a connection as an argument. Instead, error information is transmitted by making a connection to the learning rule object (e.g., ``nengo.Connection(error_ensemble, connection.learning_rule)``); see the sketch after this list. (#344, #642)
- The ``modulatory`` attribute has been removed from ``nengo.Connection``. It was only used for learning rules, and has been removed in favor of connecting directly to the learning rule. (#642)
- Connection weights can now be probed with ``nengo.Probe(conn, 'weights')``, and these are always the weights that will change with learning, regardless of the type of connection. Previously, either ``decoders`` or ``transform`` may have changed depending on the type of connection; it is no longer possible to probe ``decoders`` or ``transform``. (#729)
- A version of the AssociativeMemory SPA module is now available as a stand-alone network in ``nengo.networks``. The AssociativeMemory SPA module also has an updated argument list. (#702)
- The ``Product`` and ``InputGatedMemory`` networks no longer accept a ``config`` argument. (#814)
- The ``EnsembleArray`` network's ``neuron_nodes`` argument is deprecated. Instead, call the new ``add_neuron_input`` or ``add_neuron_output`` methods. (#868)
- The ``nengo.log`` utility function now takes a string ``level`` parameter to specify any logging level, instead of the old binary ``debug`` parameter. Cache messages are logged at DEBUG instead of INFO level. (#883)
- Reorganised the Associative Memory code, including removing many extra parameters from ``nengo.networks.assoc_mem.AssociativeMemory`` and modifying the defaults of others. (#797)
- Added a ``close`` method to ``Simulator``. ``Simulator`` can now be used as a context manager. (`#857 <https://github.com/nengo/nengo/issues/857>`_, #739, #859)
- Most exceptions that Nengo can raise are now custom exception classes that can be found in the ``nengo.exceptions`` module. (#781)
- All Nengo objects (``Connection``, ``Ensemble``, ``Node``, and ``Probe``) now accept a ``label`` and ``seed`` argument if they didn't previously. (#958)
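A rough sketch of how several of these changes fit together (the new error-to-learning-rule connection, the unified ``'weights'`` probe, and the ``Simulator`` context manager); the ensemble sizes, probe synapse, and run time are arbitrary placeholders.

.. code-block:: python

    import nengo

    with nengo.Network() as model:
        pre = nengo.Ensemble(60, dimensions=1)
        post = nengo.Ensemble(60, dimensions=1)
        error = nengo.Ensemble(60, dimensions=1)

        # Error is now transmitted by connecting *to* the learning rule object,
        # rather than passing a connection to PES.
        conn = nengo.Connection(pre, post, learning_rule_type=nengo.PES())
        nengo.Connection(error, conn.learning_rule)

        # Weights are probed uniformly with 'weights', regardless of whether
        # the connection is decoded or uses a full weight matrix.
        weights_p = nengo.Probe(conn, 'weights', synapse=0.01)

    # Simulator now has a close method and can be used as a context manager.
    with nengo.Simulator(model) as sim:
        sim.run(0.5)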
Behavioural changes
- The sign on the ``PES`` learning rule's error has been flipped to conform with most learning rules, in which error is minimized. The error should be ``actual - target``. (#642)
- The ``PES`` rule's learning rate is invariant to the number of neurons in the presynaptic population. The effective speed of learning should now be unaffected by changes in the size of the presynaptic population. Existing learning networks may need to be updated; to achieve identical behavior, scale the learning rate by ``pre.n_neurons / 100`` (see the sketch after this list). (#643)
- The ``probeable`` attribute of all Nengo objects is now implemented as a property, rather than a configurable parameter. (#671)
- Node functions receive ``x`` as a copied NumPy array (instead of a readonly view). (#716, #722)
- The SPA ``Compare`` module produces a scalar output (instead of a specific vector). (#775, #782)
- Bias nodes in ``spa.Cortical``, and gate ensembles and connections in ``spa.Thalamus``, are now stored in the target modules. (#894, #906)
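A minimal sketch of updating an existing learning network for the flipped error sign and the size-invariant learning rate; the base rate of ``1e-4`` and the population sizes are placeholders, and the error computation assumes ``pre`` carries the target value.

.. code-block:: python

    import nengo

    with nengo.Network() as model:
        pre = nengo.Ensemble(200, dimensions=1)
        post = nengo.Ensemble(50, dimensions=1)
        error = nengo.Ensemble(50, dimensions=1)

        # To reproduce pre-change behaviour, scale the previous learning rate
        # by pre.n_neurons / 100 now that the rule is size-invariant.
        old_rate = 1e-4  # placeholder for a pre-existing learning rate
        conn = nengo.Connection(
            pre, post,
            learning_rule_type=nengo.PES(
                learning_rate=old_rate * pre.n_neurons / 100))

        # Error is minimized, so it is computed as actual - target.
        nengo.Connection(post, error)               # actual
        nengo.Connection(pre, error, transform=-1)  # minus target
        nengo.Connection(error, conn.learning_rule)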
Improvements
- Added a ``randomized_svd`` subsolver for the L2 solvers. This can be much quicker for large numbers of neurons or evaluation points. (#803)
- Added a ``PES.pre_tau`` attribute, which sets the time constant on a lowpass filter of the presynaptic activity. (#643)
- ``EnsembleArray.add_output`` now accepts a list of functions to be computed by each ensemble. (#562, #580)
- ``LinearFilter`` now has an ``analog`` argument which can be set through its constructor. Linear filters with digital coefficients can be specified by setting ``analog`` to ``False`` (see the sketch after this list). (#819)
- Added the ``SqrtBeta`` distribution, which describes the distribution of semantic pointer elements. (#414, #430)
- Added the ``Triangle`` synapse, which filters with a triangular FIR filter. (#660)
- Added the ``utils.connection.eval_point_decoding`` function, which provides a connection's static decoding of a list of evaluation points. (#700)
- Resetting the Simulator now resets all Processes, meaning the injected random signals and noise are identical between runs, unless the seed is changed (which can be done through ``Simulator.reset``). (#582, #616, #652)
- An exception is raised if SPA modules are not properly assigned to an SPA attribute. (#730, #791)
- The ``Product`` network is now more accurate. (#651)
- NumPy arrays can now be used as indices for slicing objects. (#754)
- ``Config.configures`` now accepts multiple classes rather than just one. (#842)
- Added an ``add`` method to ``spa.Actions``, which allows actions to be added after the module has been initialized. (#861, #862)
- Added a SPA wrapper for circular convolution networks, ``spa.Bind``. (#849)
- Added the ``Voja`` (Vector Oja) learning rule type, which updates an ensemble's encoders to fire selectively for its inputs (see ``examples/learning/learn_associations.ipynb``). (#727)
- Added a clipped exponential distribution useful for thresholding, in particular in the AssociativeMemory. (#582)
- Added a cosine similarity distribution, which is the distribution of the cosine of the angle between two random vectors. It is useful for setting intercepts, in particular when using the ``Voja`` learning rule. (#768)
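A small sketch of the new filtering options, assuming the ``Triangle`` synapse lives in ``nengo.synapses``; the coefficients and time constants are arbitrary examples rather than recommended values.

.. code-block:: python

    import nengo

    with nengo.Network() as model:
        a = nengo.Ensemble(50, dimensions=1)
        b = nengo.Ensemble(50, dimensions=1)

        # Triangle synapse: filters with a triangular FIR filter of the
        # given width.
        nengo.Connection(a, b, synapse=nengo.synapses.Triangle(0.01))

        # With analog=False, the coefficients are interpreted as a digital
        # (discrete-time) filter rather than an analog transfer function.
        digital = nengo.LinearFilter([1.], [1., -0.5], analog=False)
        p = nengo.Probe(b, synapse=digital)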
Bug fixes
- Fixed an issue where setting ``Connection.seed`` through the constructor had no effect. (#724)
- Fixed an issue in which learning connections could not be sliced. (#632)
- Fixed an issue when probing scalar transforms. (#667, #671)
- Fixed SPA actions that route to a module with multiple inputs. (#714)
- Corrected the ``rmses`` values in ``BuiltConnection.solver_info`` when using ``NNls`` and ``Nnl2sL2`` solvers, and the ``reg`` argument for ``Nnl2sL2``. (#839)
- ``spa.Vocabulary.create_pointer`` now respects the specified number of creation attempts, and returns the most dissimilar pointer if none can be found below the similarity threshold. (#817)
- Probing a Connection's output now returns the output of that individual Connection, rather than the input to the Connection's post Ensemble (see the sketch after this list). (#973, #974)
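A tiny sketch of the probing behaviour described in the last item; the ensembles and transforms are arbitrary.

.. code-block:: python

    import nengo

    with nengo.Network() as model:
        a = nengo.Ensemble(50, dimensions=1)
        b = nengo.Ensemble(50, dimensions=1)
        conn1 = nengo.Connection(a, b)
        conn2 = nengo.Connection(a, b, transform=0.5)

        # Each probe now reports the output of its own connection,
        # not the summed input arriving at ensemble b.
        p1 = nengo.Probe(conn1)
        p2 = nengo.Probe(conn2)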
API changes
- The ``spa.State`` object replaces the old ``spa.Memory`` and ``spa.Buffer``. These old modules are deprecated and will be removed in 2.2 (see the sketch below). (#796)
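A minimal sketch of the replacement, assuming the usual SPA module-assignment pattern; the dimensionality, the attribute name, and the use of a ``feedback`` argument to mimic the old ``spa.Memory`` are assumptions here.

.. code-block:: python

    import nengo.spa as spa

    with spa.SPA() as model:
        # spa.State replaces the deprecated spa.Memory and spa.Buffer modules.
        model.working_memory = spa.State(dimensions=32, feedback=1.0)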
2.0.2 is a bug fix release to ensure that Nengo continues to work with more recent versions of Jupyter (formerly known as the IPython notebook).
Behavioural changes
- The IPython notebook progress bar has to be activated with ``%load_ext nengo.ipynb``. (#693)
Improvements
- Added a ``[progress]`` section to ``nengorc``, which allows setting ``progress_bar`` and ``updater`` (see the sketch after this list). (#693)
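The new section in a user's ``nengorc`` file might look something like the following; the values shown are illustrative assumptions, not documented defaults.

.. code-block:: ini

    [progress]
    progress_bar = True
    updater = auto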
Bug fixes
- Fixed compatibility issues with newer versions of IPython and Jupyter. (#693)
Behavioural changes
- Node functions receive ``t`` as a float (instead of a NumPy scalar) and ``x`` as a readonly NumPy array (instead of a writeable array); see the sketch after this list. (#626, #628)
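A small sketch of a node function under this behaviour; the input signal and scaling are arbitrary.

.. code-block:: python

    import numpy as np
    import nengo

    with nengo.Network() as model:
        stim = nengo.Node(np.sin)

        # t arrives as a plain float; x is a readonly NumPy array, so avoid
        # in-place writes (2.0 * x allocates a new array).
        def double(t, x):
            return 2.0 * x

        node = nengo.Node(double, size_in=1, size_out=1)
        nengo.Connection(stim, node)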
Improvements
- ``rasterplot`` works with 0 neurons, and generates much smaller PDFs. (#601)
Bug fixes
- Fixed compatibility with NumPy 1.6. (#627)
Initial release of Nengo 2.0! Supports Python 2.6+ and 3.3+. Thanks to all of the contributors for making this possible!