[nengo-user] Achieving ensemble activity sparsity
Daniel Rasmussen
dhrsmss at gmail.com
Wed Nov 5 12:46:48 EST 2014
Hi Claus,
There are two main things to think about when adjusting the sparsity of
neural activity. The first is the distribution of encoding vectors for the
population, and the second is the distribution of intercepts. The
particular approach you take will depend on the underlying function you are
trying to capture.
For example, suppose your model's inputs are coming from some large
n-dimensional space, but you think that it is only representing certain
points in that space. Then you could cluster all your encoding vectors
around those points. This would lead to sparse activity, as you will get
very little activation for any inputs outside of those clusters, and you
could adjust the level of sparsity by changing the size of the clusters.
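To make that concrete, here is a rough sketch of what that could look like in
Nengo (2.x style); the cluster centres and the 0.1 noise scale are made-up
values just for illustration:

    import numpy as np
    import nengo

    n_neurons, dimensions = 100, 2

    # Hypothetical points you believe the population actually represents.
    cluster_centres = np.array([[1.0, 0.0], [0.0, 1.0]])

    # Draw each encoder near one of the centres; the noise scale controls
    # how tight the clusters are, and hence how sparse the activity is.
    rng = np.random.RandomState(0)
    centres = cluster_centres[rng.randint(len(cluster_centres), size=n_neurons)]
    encoders = centres + 0.1 * rng.randn(n_neurons, dimensions)
    encoders /= np.linalg.norm(encoders, axis=1)[:, None]  # unit length

    with nengo.Network():
        ens = nengo.Ensemble(n_neurons, dimensions, encoders=encoders)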
Or, your hypothesis could be that you have an even representation of the
space, but each neuron only responds to very particular inputs. Then you
could change your intercepts to lie in, e.g., (0.5, 1) instead of the
default (-1,1). This will also give you sparser activity, because now the
minimum firing threshold of the neurons will be increased. But then you
will not be able to represent inputs with small magnitude, since they fall
outside the range of your intercepts. Note that this interacts with the
encoding vector manipulation, since the range you are specifying with the
intercepts is a range on the value of dot(input, encoder).
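In recent Nengo (2.x, where the distribution classes live in nengo.dists),
that is just a matter of passing an intercept distribution, something like:

    import nengo

    with nengo.Network():
        # All intercepts drawn from (0.5, 1): a neuron only starts firing
        # once dot(input, encoder) exceeds its (now higher) threshold.
        ens = nengo.Ensemble(n_neurons=100, dimensions=2,
                             intercepts=nengo.dists.Uniform(0.5, 1.0))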
It is also possible to implement something like WTA with a recurrent,
mutually inhibitory connection, as you say. That's definitely feasible in
the NEF, and can be used to give you sparse activity. But it will have
more temporal dynamics, and represents a very different hypothesis about
the underlying mechanism.
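For completeness, a rough sketch of what that mutual inhibition could look
like (the inhibitory weight and synapse are made-up values you would need to
tune for your model):

    import numpy as np
    import nengo

    n_neurons = 100

    with nengo.Network():
        ens = nengo.Ensemble(n_neurons, dimensions=1)

        # Every neuron inhibits every other neuron (but not itself).
        # Strengthening the weight leaves fewer neurons active at a time.
        inhibit = -0.001 * (np.ones((n_neurons, n_neurons)) - np.eye(n_neurons))
        nengo.Connection(ens.neurons, ens.neurons,
                         transform=inhibit, synapse=0.01)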
So, to sum up, roughly speaking you could just keep increasing the minimum
intercept until you get the desired level of sparsity. But in practice, the
method you go with will probably depend more on the function you are trying
to model.
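One way to check where you are while doing that is to sweep the minimum
intercept and look at the fraction of neurons that respond anywhere in the
represented space, e.g. (again just a sketch, assuming Nengo 2.x):

    import numpy as np
    import nengo
    from nengo.utils.ensemble import tuning_curves

    def fraction_active(intercept_low, n_neurons=100, dimensions=1):
        # Average fraction of neurons active across the evaluation points.
        with nengo.Network() as model:
            ens = nengo.Ensemble(
                n_neurons, dimensions,
                intercepts=nengo.dists.Uniform(intercept_low, 1.0))
        sim = nengo.Simulator(model)
        x, activities = tuning_curves(ens, sim)
        return np.mean(activities > 0)

    for low in [0.0, 0.25, 0.5, 0.75]:
        print(low, fraction_active(low))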
Daniel
On 5 November 2014 08:48, Claus Agerskov <clausagerskov at hotmail.com> wrote:
> Hi
> Inspired by the high sparsity of active neurons in the hippocampal
> formation, especially the dentate gyrus, I want to create a neural network
> that has ensembles with neural activity limited to some percentage of the
> present neurons. Of course this can be done by making a self-inhibiting
> connection, and adjusting the weight until the desired number of neurons
> are active, given a certain input. I am wondering however if there is a
> better way to do this.
> The classic solution is k-WTA, but Eliasmith argues against it in "How to
> Build a Brain", so I am wondering what kind of solution is more
> biologically realistic.
> Regards
> Claus Agerskov, Technical University of Denmark, Biophysics and Fluids
> group