====== Auryn simulator ======

Simulator for spiking neural networks with synaptic plasticity.

Starting from Auryn v0.7.0, examples are compiled automatically when building the simulator. See [[manual:CompileAuryn]] to learn how to build Auryn and its examples using ''cmake'' on diverse platforms.
  
===== Example code included with Auryn =====

The following examples can be found in Auryn's /examples directory.
  
==== Basic examples ====
  
These are very simple single-neuron models which can be easily understood and modified to get a first impression of how Auryn simulations are built.
  
  * [[sim_poisson]] This example is //Hello world// in Auryn. It shows you how to create a simple [[manual:PoissonGroup]] that fires at a given rate and writes the output to a [[manual:ras]] file (a minimal sketch of this pattern is shown below).
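For orientation, here is a minimal sketch of what such a simulation looks like. It follows the pattern of [[sim_poisson]], but it is written as an illustration against recent Auryn API conventions (''auryn_init''/''auryn_free'', [[manual:PoissonGroup]], ''SpikeMonitor'') rather than copied verbatim from the shipped example; the group size, firing rate, and run time are arbitrary placeholders.

<code cpp>
#include "auryn.h"

using namespace auryn;

int main(int ac, char* av[])
{
	// Initialize Auryn; this sets up the simulation environment and the global System object 'sys'
	auryn_init(ac, av);

	// A group of 1000 Poisson neurons firing at 5 Hz
	PoissonGroup * poisson = new PoissonGroup(1000, 5.0);

	// Record all spikes of the group to a ras file
	SpikeMonitor * smon = new SpikeMonitor(poisson, sys->fn("poisson", "ras"));

	// Simulate 10 seconds of biological time
	sys->run(10.0);

	// Free resources and finalize
	auryn_free();

	return 0;
}
</code>

All of the examples below follow the same basic skeleton: initialize, build groups, connections and monitors, run, and free.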
  
  
==== Network simulations ====
  
Here are a few more common network simulation examples.
  
  * [[sim_coba_benchmark]] The Vogels and Abbott network [1] in its 4000-neuron version with conductance-based synapses, as used in [7,8] (a wiring sketch in this spirit is shown after this list).
  * [[sim_isp_orig]] This simulation illustrates inhibitory plasticity in the Vogels and Abbott network. It is the parallelized version of our network used in Figure 4 in [2]. ([[sim_isp_big]] is this network scaled up to 200,000 neurons.)
  * [[sim_background]] A simulation implementing homeostatic triplet STDP at excitatory synapses. It was used in [3].
  * [[sim_dense]] simulates a 25,000-neuron network with 10% non-plastic connectivity which receives modulated external Poisson input. Similar to what we used in [4].
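To give an idea of how such a network is wired up in code, the following fragment sketches a Vogels-and-Abbott-style setup with sparse, conductance-based excitatory and inhibitory connections. This is an illustrative sketch rather than code taken from [[sim_coba_benchmark]]: the classes used (''TIFGroup'', ''SparseConnection'', ''SpikeMonitor'') are standard Auryn, but the population sizes, weights, and sparseness values are placeholders.

<code cpp>
#include "auryn.h"

using namespace auryn;

int main(int ac, char* av[])
{
	auryn_init(ac, av);

	// Excitatory and inhibitory populations of conductance-based integrate-and-fire neurons
	TIFGroup * neurons_e = new TIFGroup(3200);
	TIFGroup * neurons_i = new TIFGroup(800);

	// Randomize initial membrane potentials
	neurons_e->random_mem(-60e-3, 5e-3);
	neurons_i->random_mem(-60e-3, 5e-3);

	// Sparse random connectivity (2%); GLUT marks excitatory and GABA inhibitory transmission
	SparseConnection * con_ee = new SparseConnection(neurons_e, neurons_e, 0.4, 0.02, GLUT);
	SparseConnection * con_ei = new SparseConnection(neurons_e, neurons_i, 0.4, 0.02, GLUT);
	SparseConnection * con_ie = new SparseConnection(neurons_i, neurons_e, 5.1, 0.02, GABA);
	SparseConnection * con_ii = new SparseConnection(neurons_i, neurons_i, 5.1, 0.02, GABA);

	// Record spikes of the excitatory population
	SpikeMonitor * smon = new SpikeMonitor(neurons_e, sys->fn("exc", "ras"));

	// Run for 10 seconds of biological time
	sys->run(10.0);

	auryn_free();
	return 0;
}
</code>

The plastic examples listed above follow the same pattern, but replace one or more of these static connections with a plastic connection class.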
The code for these works can be found in separate repositories, although in some cases it may not be publicly available.
  
  * Zenke, F., and Ganguli, S. (2018). SuperSpike: Supervised learning in multi-layer spiking neural networks. Neural Computation 30, 1514–1541. [[https://doi.org/10.1162/neco_a_01086]] | code: https://github.com/fzenke/pub2018superspike
  * Zenke, F., and Gerstner, W. (2017). Hebbian plasticity requires compensatory processes on multiple timescales. Phil. Trans. R. Soc. B 372, 20160259. [[http://rstb.royalsocietypublishing.org/content/372/1715/20160259]]
  * Neftci, E., Augustine, C., Paul, S., and Detorakis, G. (2016). Neuromorphic Deep Learning Machines. arXiv:1612.05596 [cs]. [[https://arxiv.org/abs/1612.05596]]
    
  
===== Bibliography =====
  
[1] Vogels, T.P., Abbott, L.F., 2005. Signal propagation and logic gating in networks of integrate-and-fire neurons. J Neurosci 25, 10786.