Talk at “Event-based Asynchronous Neuro-Cognitive Control” Symposium in Amsterdam

On Wednesday, August 28th 2019, I am lucky to be able to give a talk at the “Event-based Asynchronous Neuro-Cognitive Control” Symposium in Amsterdam, which is kindly organized by Sander Bohte, Aditya Gilra, Qinghai Guo, and J. Camilo Vasquez Tieck. This workshop promises three stimulating days of discussion on recent advances in event-based computing and spiking neural networks. Thanks heaps to the organizers for making this happen!

Details: https://www.cwi.nl/research/groups/machine-learning/events/workshop-neurocognitivecontrol/event-based-asynchronous-neuro-cognitive-control


Talk at “Functional network dynamics: Recent mathematical perspectives” workshop CNS 2019, Barcelona

Thanks to the organizers for inviting me to the “Functional network dynamics: Recent mathematical perspectives” workshop at CNS this year. I am very much looking forward to our discussions.

https://matthieugilson.eu/events/workshop_CNS2019.html

Update: Apologies again that my slides were cut off during the presentation. Here is a PDF of the slides: fzenke_slides_cns2019

Short tutorial on how to implement surrogate gradients in PyTorch: https://github.com/fzenke/spytorch
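For the impatient, the core trick fits on one screen: spiking neurons apply a hard threshold on the forward pass, and the backward pass swaps the ill-defined derivative of that step for a smooth surrogate. Below is a minimal PyTorch sketch in the spirit of the tutorial; the class name and the fast-sigmoid surrogate with scale 100 are illustrative choices, not a verbatim excerpt from the repository:

    import torch

    class SurrGradSpike(torch.autograd.Function):
        """Heaviside spike nonlinearity with a fast-sigmoid surrogate gradient."""

        scale = 100.0  # steepness of the surrogate (illustrative value)

        @staticmethod
        def forward(ctx, u):
            # Forward pass: a hard threshold on the membrane potential u.
            ctx.save_for_backward(u)
            return (u > 0).float()

        @staticmethod
        def backward(ctx, grad_output):
            # Backward pass: pretend the step was a fast sigmoid and use its
            # derivative 1/(scale*|u| + 1)^2 instead of the true derivative of
            # the Heaviside function, which is zero almost everywhere.
            (u,) = ctx.saved_tensors
            return grad_output / (SurrGradSpike.scale * u.abs() + 1.0) ** 2

    spike_fn = SurrGradSpike.apply

Because only the backward pass is approximated, the network still emits binary spikes at run time while remaining trainable end to end with standard autograd.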


Talk at “Neuroscience meets Deep Learning” Symposium at EPF Lausanne

I am stoked for the EPFL Neurosymposium “Neuroscience meets Deep Learning” next week.
July 8th to July 9th 2019 in Lausanne, Switzerland

I will talk about: “Credit assignment in space and time — Training spiking neural networks with surrogate gradients”

You are welcome to join the talks via Zoom using the following link: https://epfl.zoom.us/j/980568331

zenkelab.org online

My lab’s website zenkelab.org is online! Moreover, I am looking for PhD students. If you are interested, apply by the May 1st 2019 deadline.


Tutorial on surrogate gradient learning in spiking networks online

Please try this at home!

I just put up a beta version of a tutorial showing how to train spiking neural networks with surrogate gradients using PyTorch:
https://github.com/fzenke/spytorch
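To give a flavor of what the notebooks build up to, here is a rough sketch of how such a surrogate spike function slots into a discrete-time leaky integrate-and-fire simulation loop. All sizes, time constants, and the detached-reset convention below are illustrative assumptions rather than a verbatim excerpt from the repository:

    import torch

    # Compact surrogate spike function (same fast-sigmoid trick as above)
    class SurrGradSpike(torch.autograd.Function):
        @staticmethod
        def forward(ctx, u):
            ctx.save_for_backward(u)
            return (u > 0).float()

        @staticmethod
        def backward(ctx, grad_output):
            (u,) = ctx.saved_tensors
            return grad_output / (100.0 * u.abs() + 1.0) ** 2

    spike_fn = SurrGradSpike.apply

    # Hypothetical network size and time constants, for illustration only
    batch_size, nb_inputs, nb_hidden, nb_steps = 32, 100, 50, 200
    dt, tau_syn, tau_mem = 1e-3, 5e-3, 10e-3
    alpha = float(torch.exp(torch.tensor(-dt / tau_syn)))  # synaptic decay
    beta = float(torch.exp(torch.tensor(-dt / tau_mem)))   # membrane decay

    w1 = torch.empty((nb_inputs, nb_hidden), requires_grad=True)
    torch.nn.init.normal_(w1, std=0.1)

    # Random Poisson-like input spike trains (batch, time, inputs)
    inputs = (torch.rand((batch_size, nb_steps, nb_inputs)) < 0.05).float()
    h1 = torch.einsum("bti,ih->bth", inputs, w1)  # input current to hidden layer

    syn = torch.zeros((batch_size, nb_hidden))
    mem = torch.zeros((batch_size, nb_hidden))
    spk_list = []

    for t in range(nb_steps):
        out = spike_fn(mem - 1.0)  # spike where membrane crosses threshold 1.0
        rst = out.detach()         # detach reset: gradients flow only via spike_fn
        syn = alpha * syn + h1[:, t]
        mem = (beta * mem + syn) * (1.0 - rst)  # leaky integration with reset
        spk_list.append(out)

    spk_rec = torch.stack(spk_list, dim=1)  # hidden spikes: (batch, time, neurons)

Because w1 requires gradients and the loop is built from differentiable operations (apart from the surrogate step), calling backward() on any loss computed from spk_rec yields weight gradients.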

Emre, Hesham, and I are planning to release a more comprehensive collection of code in the near future to accompany our tutorial paper. Stay tuned!


Auryn 0.8.2 released

I am happy to announce a long-overdue maintenance release, Auryn v0.8.2, with plenty of fixes and improvements. The most notable improvement is added support for non-x86 architectures such as ARM and PowerPC. Special thanks to Ankur Sinha for his support on this one.

Up next is improved support and fixes for large-scale deployments (>1000 cores), an effort mainly spearheaded by Anders Lansner. We are still testing this, but you can find the corresponding code in the develop branch.


Cosyne 2019 — Talks & poster resources

I am eagerly anticipating fun discussions at Cosyne 2019. We have a poster at the main meeting and I will give two talks at the workshops. If biological learning and spiking neural networks tickle your fancy, come along. I will post some details and supplementary material below.

Poster

Thursday, 28 February 2019, 8:30 pm — Poster Session 1

I-98 Rapid spatiotemporal coding in trained multi-layer and recurrent spiking neural networks. Friedemann Zenke, Tim P Vogels
PDF download

Code: Surrogate gradients in PyTorch

Please try this at home! I have put the early beginnings of a tutorial on how to train simple spiking networks with surrogate gradients using PyTorch here:
https://github.com/fzenke/spytorch
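As a taste of the training side, here is a hedged sketch of a single optimization step. Note that run_snn is a hypothetical stand-in for a function that simulates the spiking network (as sketched in the posts above) and returns the readout membrane traces with shape (batch, time, classes):

    import torch
    import torch.nn.functional as F

    def train_step(run_snn, inputs, targets, optimizer):
        """One surrogate-gradient training step (illustrative sketch)."""
        mem_rec = run_snn(inputs)         # readout traces: (batch, time, classes)
        m, _ = torch.max(mem_rec, dim=1)  # max readout membrane over time
        log_probs = F.log_softmax(m, dim=1)
        loss = F.nll_loss(log_probs, targets)
        optimizer.zero_grad()
        loss.backward()  # gradients flow through the surrogate spike function
        optimizer.step()
        return loss.item()

Taking the maximum membrane value of the readout units over time is one common readout choice; spike counts or low-pass filtered spike trains work similarly.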

Emre, Hesham, and I are planning to release a more comprehensive collection of code in the near future to accompany our tutorial paper. Stay tuned!

Talks at the workshops

9:25-9:50 on Monday 4th of March 2019 in the workshop “Continual learning in biological and artificial neural networks” (more)
Title: “Continual learning through synaptic intelligence”
PDF slides download
The talk will be largely based on:

  • Zenke, F.*, Poole, B.*, and Ganguli, S. (2017).
    Continual Learning Through Synaptic Intelligence.
    Proceedings of the 34th International Conference on Machine Learning (ICML), pp. 3987–3995.
    fulltext | preprint | code | talk
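For context, the gist of the method: during training, each synapse accumulates an online estimate of how much it contributed to decreasing the loss on past tasks, and a quadratic penalty then protects important parameters when learning new tasks. Below is a rough PyTorch sketch of that bookkeeping, assuming the update rules from the paper; the class name, hyperparameter defaults, and method split are illustrative:

    import torch

    class SynapticIntelligence:
        """Rough sketch of the SI regularizer; names and defaults are illustrative."""

        def __init__(self, params, c=0.1, xi=1e-3):
            self.params = list(params)
            self.c, self.xi = c, xi
            self.w = [torch.zeros_like(p) for p in self.params]           # path integrals omega
            self.importance = [torch.zeros_like(p) for p in self.params]  # consolidated Omega
            self.theta_star = [p.detach().clone() for p in self.params]   # reference weights
            self.prev = [p.detach().clone() for p in self.params]

        def accumulate(self):
            """Call after every optimizer step: omega_k += -g_k * delta(theta_k)."""
            for p, w, prev in zip(self.params, self.w, self.prev):
                if p.grad is not None:
                    w -= p.grad * (p.detach() - prev)
                prev.copy_(p.detach())

        def penalty(self):
            """Surrogate loss term c * sum_k Omega_k * (theta_k - theta*_k)^2."""
            return self.c * sum(
                (imp * (p - ts) ** 2).sum()
                for p, imp, ts in zip(self.params, self.importance, self.theta_star)
            )

        def consolidate(self):
            """At a task boundary: Omega_k += omega_k / (delta_k^2 + xi), then reset."""
            for p, w, imp, ts in zip(self.params, self.w, self.importance, self.theta_star):
                delta = p.detach() - ts
                imp += w / (delta ** 2 + self.xi)
                ts.copy_(p.detach())
                w.zero_()

In use, one would add si.penalty() to the task loss, call si.accumulate() after every optimizer step, and call si.consolidate() at each task boundary.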

9:40-10:10 on Tuesday, 5th of March 2019 in the workshop “Why spikes? – Understanding the power and constraints of spiking based computation in biological and artificial neuronal networks” (more)
Title: “Computation in spiking neural networks — Opportunities and challenges”
I will talk about unpublished results and:

  • Neftci, E.O., Mostafa, H., and Zenke, F. (2019).
    Surrogate Gradient Learning in Spiking Neural Networks.
    ArXiv:1901.09948 [Cs, q-Bio].
    preprint
  • Zenke, F. and Ganguli, S. (2018).
    SuperSpike: Supervised learning in multi-layer spiking neural networks.
    Neural Comput 30, 1514–1541. doi: 10.1162/neco_a_01086
    fulltext | preprint | code


Computational neuroscience at the FMI (now hiring)

I am very excited to start my research group at the FMI in Basel, Switzerland in June 2019. My group will conduct research on learning and memory at the intersection of computational neuroscience and machine learning. The lab will be embedded in the exciting, collaborative environment of the existing neurobiology groups at the FMI.

More information on our lab website zenkelab.org.

The team

I am currently looking for potential candidates who are passionate about neuroscience and computation. To tackle the problems we are interested in, we often have to be creative and apply analytical and computational tools from other areas. This creative mix includes, but is by no means limited to: dynamical systems, control theory, and machine learning. Ideal candidates should be curious about the neural underpinnings of computation and learning, but should also enjoy taking on difficult math and coding problems.

Topics of interest include:

  • Neural networks (both spiking and non-spiking)
  • Synaptic plasticity and homeostasis
  • Biologically plausible deep credit assignment
  • Complex synapses for continual learning

PhD opportunities

There are currently several PhD positions with competitive salaries available in my lab starting from June 2019. PhD students will work on projects centered around plastic neural networks and will typically be involved in at least one experimental collaboration. If the above applies to you, and you would like to work and learn in an international and interdisciplinary environment, please consider applying.

Applicants will go through the FMI PhD program selection process. Application deadlines are on Nov 16th and May 1st with the associated hiring days in late January and June, respectively.

Note for applicants: My name currently does not show up in the drop-down menu under “Specific Scientific Interests”. Instead, select “Keller, G.” twice and indicate your preference in the field below.

Other opportunities

I am always looking for capable candidates who are interested in an internship, a master’s thesis, or a postdoc. If that sounds like you, I am happy to discuss.


Bernstein Satellite Workshop on “Emergent function in non-random neural networks”

Mark the dates September 25th-26th for our Bernstein Satellite Workshop on “Emergent function in non-random neural networks” (a.k.a. “networks which do stuff”), which Guillaume Hennequin, Tim Vogels, and I are organizing at this year’s Bernstein meeting in Berlin.

Abstract

Computation in the brain occurs through complex interactions in highly structured, non-random networks. Moving beyond traditional approaches based on statistical physics, engineering-based approaches are opening new vistas on circuit computation by providing novel ways of i) building artificial yet fully functional model circuits, ii) dissecting their dynamics to identify new circuit mechanisms, and iii) reasoning about population recordings made in diverse brain areas across a range of sensory, motor, and cognitive tasks. Thus, the same “science of real-world problems” that is behind the accumulation of increasingly rich neural datasets is now also being recognized as a vast and useful set of tools for their analysis.

This workshop aims at bringing together researchers who build and study structured network models, spiking or otherwise, that serve specific functions. Our speakers will present their neuroscientific work at the confluence of machine learning, optimization, control theory, dynamical systems, and other engineering fields, to help us understand these recent developments, critically evaluate their scope and limitations, and discuss their use for elucidating the neural basis of intelligent behaviour.

Date and venue

September 25, 2018, 2:00 – 6:30 pm
September 26, 2018, 8:30 am – 12:30 pm

Marchstrasse 23
10587 Berlin
Germany

Schedule

Tue, Sept 25, 2018
14:00 Nataliya Kraynyukova, MPI for Brain Research, Frankfurt a.M., Germany
Stabilized supralinear network can give rise to bistable, oscillatory, and persistent activity
14:40 Jake Stroud, University of Oxford, UK
Spatio-temporal control of recurrent cortical activity through gain modulation
15:20 Jorge Mejias, University of Amsterdam, The Netherlands
Balanced amplification of signals propagating across large-scale brain networks
16:00 Coffee Break
16:30 Srdjan Ostojic, Ecole normale supérieure, Paris, France
Reverse-engineering computations in recurrent neural networks
17:10 Chris Stock, Stanford University, USA
Reverse engineering transient computations in nonlinear recurrent neural networks through model reduction
17:50 Guillaume Hennequin, University of Cambridge, UK
Flexible, optimal motor control in a thalamo-cortical circuit model
Wed, Sept 26, 2018
08:30 Aditya Gilra, University of Bonn, Germany
Local stable learning of forward and inverse dynamics in spiking neural networks
09:10 Robert Gütig, MPI for Experimental Medicine Göttingen, Germany
Margin learning in spiking neurons
09:50 Claudia Clopath, Imperial College London, UK
Training spiking recurrent networks
10:30 Coffee Break
11:00 Friedemann Zenke, University of Oxford, UK
Training deep spiking neural networks with surrogate gradients
11:40 Christian Marton, Imperial College London, UK
Task representation & learning in prefrontal cortex & striatum as a dynamical system
12:20 Wrap up

More details here and general information here.


Talk at MPI Göttingen on June 28th

Thrilled to share the latest results on learning in multi-layer spiking networks using biologically plausible surrogate gradients at the “Third workshop on advanced methods in theoretical neuroscience” at the Max Planck Institute for Dynamics and Self-Organization, Göttingen, Germany. Thanks to the organizers for inviting me!

More info under: http://www.wamtn.info/
