We just put up a new preprint https://www.biorxiv.org/content/10.1101/2020.06.29.176925v1 in which we take a careful look at what makes surrogate gradients work. Spiking neural networks are notoriously hard to train using gradient-based methods due to their binary spiking nonlinearity. To deal …

Preprint: The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks
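For readers who have not come across the trick before, here is a minimal sketch of how a surrogate gradient is typically implemented in PyTorch: the forward pass keeps the hard spike threshold, while the backward pass substitutes a smooth derivative (a fast sigmoid in this example). The class name and the steepness constant are my own illustrative choices, not taken from the preprint.

```python
import torch

class SurrGradSpike(torch.autograd.Function):
    """Heaviside spike nonlinearity with a surrogate gradient.

    Forward: binary threshold (spike / no spike).
    Backward: the ill-behaved derivative of the step function (zero almost
    everywhere) is replaced by the derivative of a fast sigmoid, so that
    gradients can flow through the spiking nonlinearity.
    """
    beta = 10.0  # steepness of the surrogate (illustrative value)

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        surrogate = 1.0 / (SurrGradSpike.beta * membrane_potential.abs() + 1.0) ** 2
        return grad_output * surrogate

spike_fn = SurrGradSpike.apply  # use like any other differentiable activation
```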


I am happy to announce that the SuperSpike paper and code are finally published. Here is an example of a network with one hidden layer that learns to produce a Radcliffe Camera spike train from frozen Poisson input spike …

SuperSpike: Supervised learning in spiking neural networks — paper and code published
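Below is a rough sketch of that setup: frozen Poisson input spikes, one hidden layer of leaky integrate-and-fire units, and a van-Rossum-like loss between exponentially filtered output and target spike trains. It uses backpropagation through time with a surrogate gradient rather than the online three-factor updates of the actual SuperSpike rule, and all constants, sizes, and names are illustrative rather than taken from the paper.

```python
import math
import torch

torch.manual_seed(0)

# Toy dimensions and time constants (illustrative values)
n_in, n_hidden, n_out, n_steps = 100, 32, 1, 500
dt, tau_mem = 1e-3, 20e-3
alpha = math.exp(-dt / tau_mem)                       # membrane decay per time step

inputs = (torch.rand(n_steps, n_in) < 0.02).float()   # frozen Poisson input raster
target = (torch.rand(n_steps, n_out) < 0.05).float()  # stand-in for the target spike train

w1 = torch.empty(n_in, n_hidden).normal_(0.0, 0.3).requires_grad_()
w2 = torch.empty(n_hidden, n_out).normal_(0.0, 0.3).requires_grad_()

def spike_fn(v):
    # Heaviside in the forward pass, sigmoid-derivative surrogate in the backward pass
    soft = torch.sigmoid(5.0 * v)
    return (v > 0).float().detach() + soft - soft.detach()

def low_pass(spikes):
    # Exponential filter used for the van-Rossum-like distance
    acc, filtered = torch.zeros(spikes.shape[1]), []
    for t in range(spikes.shape[0]):
        acc = alpha * acc + spikes[t]
        filtered.append(acc)
    return torch.stack(filtered)

v_h, v_o, out_spikes = torch.zeros(n_hidden), torch.zeros(n_out), []
for t in range(n_steps):
    v_h = alpha * v_h + inputs[t] @ w1
    s_h = spike_fn(v_h - 1.0)           # hidden spikes (threshold at 1.0)
    v_h = v_h - s_h                     # soft reset
    v_o = alpha * v_o + s_h @ w2
    s_o = spike_fn(v_o - 1.0)
    v_o = v_o - s_o
    out_spikes.append(s_o)

loss = ((low_pass(torch.stack(out_spikes)) - low_pass(target)) ** 2).mean()
loss.backward()   # gradients w.r.t. w1 and w2 flow back through the surrogate
```

From here one would wrap the simulation in an ordinary optimizer loop and rerun it on the same frozen input until the filtered output matches the target.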


I am happy that our article on “The temporal paradox of Hebbian learning and homeostatic plasticity” was just published in Current Opinion in Neurobiology (full text). This article concisely presents the main arguments for the existence of rapid compensatory …

The temporal paradox of Hebbian learning and homeostatic plasticity
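To give a flavour of the timescale argument, here is a deliberately simple rate model (my own toy construction with illustrative numbers, not a model from the article): a Hebbian term drives runaway growth of a single synaptic weight, while a multiplicative scaling term pulls the postsynaptic rate back toward a target. The equilibrium rate works out to r_target + eta * r_pre**2 * tau_homeo, so the slower the compensatory process, the further activity settles above its target.

```python
# Hebbian growth plus multiplicative homeostatic scaling of a single weight
dt, t_end = 0.1, 600.0           # seconds
r_pre, r_target = 5.0, 5.0       # presynaptic and target rates (Hz)
eta = 0.01                       # Hebbian learning rate (fast)

for tau_homeo in (2.0, 2000.0):  # rapid vs. slow compensation (seconds)
    w, t = 0.5, 0.0
    while t < t_end:
        r_post = w * r_pre
        dw = eta * r_pre * r_post + (w / tau_homeo) * (r_target - r_post)
        w += dt * dw
        t += dt
    print(f"tau_homeo = {tau_homeo:6.0f} s  ->  final rate ~ {w * r_pre:.1f} Hz")
```

With the rapid compensatory process the rate stays close to the 5 Hz target, whereas with the slow one it settles around 500 Hz, which is the kind of mismatch the article's argument turns on.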


Excited that our preprint “Improved multitask learning through synaptic intelligence” just went live on the arXiv (https://arxiv.org/abs/1703.04200). This article, by Ben Poole, Surya and myself, illustrates the benefits of complex synaptic dynamics for continual learning in neural networks. Here a …

Role of complex synapses in continual learning
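For a feel for the mechanics, here is a condensed sketch of the synaptic-intelligence idea in PyTorch: while a task is being trained, each parameter accumulates a path integral of gradient times parameter update; at a task boundary this is converted into a per-parameter importance; and subsequent tasks are trained with an added quadratic penalty that keeps important parameters close to their old values. The class, the constants c and xi, and the method names are my own shorthand, not the reference implementation that accompanies the paper.

```python
import torch

class SynapticIntelligence:
    """Track per-parameter importances and provide the quadratic surrogate loss."""

    def __init__(self, params, c=0.1, xi=1e-3):
        self.params = list(params)
        self.c, self.xi = c, xi
        self.path = [torch.zeros_like(p) for p in self.params]   # running path integrals
        self.omega = [torch.zeros_like(p) for p in self.params]  # accumulated importances
        self.ref = [p.detach().clone() for p in self.params]     # params at last task boundary
        self.prev = [p.detach().clone() for p in self.params]    # params before the last step

    def accumulate(self):
        """Call after every optimizer step, while p.grad still holds the step's gradient."""
        for path, p, prev in zip(self.path, self.params, self.prev):
            if p.grad is not None:
                path -= p.grad * (p.detach() - prev)   # -gradient * parameter update
            prev.copy_(p.detach())

    def end_task(self):
        """Call at a task boundary: turn path integrals into importances."""
        for path, omega, p, ref in zip(self.path, self.omega, self.params, self.ref):
            omega += path / ((p.detach() - ref) ** 2 + self.xi)
            ref.copy_(p.detach())
            path.zero_()

    def penalty(self):
        """Quadratic term added to the loss of later tasks to protect important parameters."""
        return self.c * sum(
            (omega * (p - ref) ** 2).sum()
            for omega, p, ref in zip(self.omega, self.params, self.ref)
        )
```

In a training loop one would add penalty() to the task loss before calling backward(), call accumulate() right after each optimizer step, and call end_task() whenever switching to a new task.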


I recommend taking a look at the special issue on ‘Integrating Hebbian and Homeostatic plasticity’, which was just published in Phil Trans of the Royal Society B. You can find the table of contents at http://rstb.royalsocietypublishing.org/content/372/1715. The issue is based …

Special Issue: “Integrating Hebbian and Homeostatic plasticity”
