I am very excited to be invited to present at the CRC 1080 International Symposium on homeostatic plasticity at the MPI for Brain Research in Frankfurt, Germany, on Friday, the 25th of October 2019.

More details here

 

I am thrilled to present our new data sets (www.compneuro.net) for the systematic evaluation of spiking neural networks at Intel’s Neuromorphic Community Fall Workshop in Graz, Austria, on Friday the 18th of October, together with a sizeable fraction of the spiking neural network community.


On Wednesday, August 28th, 2019, I am lucky to be able to give a talk at the “Event-based Asynchronous Neuro-Cognitive Control” symposium in Amsterdam, which is kindly organized by Sander Bohte, Aditya Gilra, Qinghai Guo, and J. Camilo Vasquez Tieck. The workshop promises three stimulating days of discussion on recent advances in event-based computing and spiking neural networks. Thanks heaps to the organizers for making this happen!

Details: https://www.cwi.nl/research/groups/machine-learning/events/workshop-neurocognitivecontrol/event-based-asynchronous-neuro-cognitive-control


Thanks to the organizers for inviting me to the “Functional network dynamics: Recent mathematical perspectives” workshop at CNS this year. I am very much looking forward to our discussions.

https://matthieugilson.eu/events/workshop_CNS2019.html

Update: Apologies again that my slides were cut off during the presentation. Here is a PDF of the slides: fzenke_slides_cns2019

Short tutorial on how to implement surrogate gradients in PyTorch: https://github.com/fzenke/spytorch


I am stoked for the EPFL Neurosymposium “Neuroscience meets Deep Learning” next week.
July 8th to July 9th 2019 in Lausanne, Switzerland

I will talk about: “Credit assignment in space and time — Training spiking neural networks with surrogate gradients”

You are welcome to join the talks with Zoom using the following link: https://epfl.zoom.us/j/980568331

 


My lab’s website zenkelab.org is online! Moreover, I am looking for PhD students. If you are interested, please apply by the May 1st, 2019 deadline.


Please try this at home!

I just put up a beta version of a tutorial showing how to train spiking neural networks with surrogate gradients using PyTorch:
https://github.com/fzenke/spytorch
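To give a flavor of what the tutorial covers, here is a minimal sketch of the core trick: a spiking threshold whose forward pass is a hard Heaviside step, but whose backward pass substitutes a smooth surrogate derivative. This sketch uses a SuperSpike-style fast-sigmoid surrogate with a steepness `beta = 10.0` chosen purely for illustration; the class name and constants are my own and not taken from the repository, so see the spytorch notebooks for the actual implementation.

```python
import torch

class SurrGradSpike(torch.autograd.Function):
    """Spike nonlinearity with a surrogate gradient.

    Forward: a hard threshold (spike = 1 if input > 0, else 0).
    Backward: instead of the true derivative (zero almost everywhere),
    use the derivative of a fast sigmoid, 1 / (beta * |x| + 1)^2.
    """
    beta = 10.0  # surrogate steepness; a free hyperparameter

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)          # stash input for the backward pass
        return (x > 0).float()            # binary spike output

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        surrogate = 1.0 / (SurrGradSpike.beta * x.abs() + 1.0) ** 2
        return grad_output * surrogate    # smooth pseudo-derivative

spike_fn = SurrGradSpike.apply

# Usage: gradients now flow "through" the spiking nonlinearity
v = torch.tensor([-0.5, 0.2, 1.0], requires_grad=True)
s = spike_fn(v)         # spikes: 0., 1., 1.
s.sum().backward()
print(v.grad)           # nonzero everywhere, unlike the true gradient
```

In a full network, `spike_fn` is applied to the membrane potential minus threshold at every time step, and standard PyTorch autograd then backpropagates through the unrolled dynamics.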

Emre, Hesham, and I are planning to release a more comprehensive collection of code in the near future to accompany our tutorial paper. Stay tuned!


I am happy to announce a long-overdue maintenance release of Auryn, v0.8.2m, with plenty of fixes and improvements. The most notable improvement is the added support for non-x86 architectures such as ARM and PowerPC. Special thanks to Ankur Sinha for his support on this one.

Up next is improved support and fixes for large-scale deployments (>1000 cores), which was mainly spearheaded by Anders Lansner. We are currently still testing this, but you can find the corresponding code in the develop branch.


I am eagerly anticipating fun discussions at Cosyne 2019. We have a poster at the main meeting and I will give two talks at the workshops. If biological learning and spiking neural networks tickle your fancy, come along. I will post some details and supplementary material below.

Poster

Thursday, 28 February 2019, 8:30 pm — Poster Session 1

I-98 Rapid spatiotemporal coding in trained multi-layer and recurrent spiking neural networks. Friedemann Zenke, Tim P Vogels
PDF download

Code: Surrogate gradients in PyTorch

Please try this at home! I have put the early beginnings of a tutorial on how to train simple spiking networks with surrogate gradients using PyTorch here:
https://github.com/fzenke/spytorch

Emre, Hesham, and I are planning to release a more comprehensive collection of code in the near future to accompany our tutorial paper. Stay tuned!

Talks at the workshops

9:25-9:50 on Monday 4th of March 2019 in the workshop “Continual learning in biological and artificial neural networks” (more)
Title: “Continual learning through synaptic intelligence”
PDF slides download
The talk will be largely based on:

  • Zenke, F.*, Poole, B.*, and Ganguli, S. (2017).
    Continual Learning Through Synaptic Intelligence.
    Proceedings of the 34th International Conference on Machine Learning (ICML), pp. 3987–3995.
    fulltext | preprint | code | talk

9:40-10:10 on Tuesday, 5th of March 2019 in the workshop “Why spikes? – Understanding the power and constraints of spiking based computation in biological and artificial neuronal networks” (more)
Title: “Computation in spiking neural networks — Opportunities and challenges”
I will talk about unpublished results and:

  • Neftci, E.O., Mostafa, H., and Zenke, F. (2019).
    Surrogate Gradient Learning in Spiking Neural Networks.
    ArXiv:1901.09948 [Cs, q-Bio].
    preprint
  • Zenke, F. and Ganguli, S. (2018).
    SuperSpike: Supervised learning in multi-layer spiking neural networks.
    Neural Comput 30, 1514–1541. doi: 10.1162/neco_a_01086
    fulltext | preprint | code


I am very excited to start my research group at the FMI in Basel, Switzerland in June 2019. My group will conduct research on learning and memory at the intersection of computational neuroscience and machine learning. The lab will be embedded in the exciting, collaborative environment of the existing neurobiology groups at the FMI.

More information on our lab website zenkelab.org.

The team

I am currently looking for potential candidates who are passionate about neuroscience and computation. To tackle the problems we are interested in, we often have to be creative and apply analytical and computational tools from other areas. This creative mix includes, but is by no means limited to: dynamical systems, control theory, and machine learning. Ideal candidates should be curious about the neural underpinnings of computation and learning, but should also enjoy taking on difficult math and coding problems.

Topics of interest include:

  • Neural networks (both spiking and non-spiking)
  • Synaptic plasticity and homeostasis
  • Biologically plausible deep credit assignment
  • Complex synapses for continual learning

PhD opportunities

There are currently several PhD positions with competitive salaries available in my lab starting from June 2019. PhD students will work on projects centered around plastic neural networks and will typically be involved in at least one experimental collaboration. If the above applies to you, and you would like to work and learn in an international and interdisciplinary environment, please consider applying.

Applicants go through the FMI PhD program selection process. Application deadlines are on November 16th and May 1st, with the associated hiring days in late January and June, respectively.

Note for applicants: my name currently does not show up in the “Specific Scientific Interests” drop-down menu. Instead, select “Keller, G.” twice and indicate your preference in the field below.

Other opportunities

I am always looking for capable candidates who are interested in an internship, a master’s thesis, or a postdoc. If you are interested, I am happy to discuss.
