The main perks: a further performance increase, and class-based state vectors for neuronal and synaptic states that make code easier to write and to read.
The main changes from Auryn v0.7 to Auryn v0.8.0-alpha happened under the hood. Auryn’s core vector class for state updates and its core class for MPI communication between nodes were both completely rewritten. This increases Auryn’s performance even further, by about 10%. Here are the results of a series of benchmarks showing how execution speed improved over the course of development:
Ease of writing code
By refactoring Auryn’s state vector class, which is the heart of neuronal and synaptic updates, not only was performance increased, but the code has also become more readable and easier to write. Previously, vector operations were based on a functional framework inherited from older versions which still used the GSL. To implement, for instance, an exponential decay of an AMPA conductance stored in a state vector g_ampa, you had to write
auryn_vector_float_scale( mul, g_ampa);
where mul is a float and g_ampa is the vector containing all AMPA conductances of the NeuronGroup. Now state vectors are classes with their own member functions. The above expression reduces to:
g_ampa->scale( mul );
Similarly, computing the current caused by an inhibitory conductance until now required a sequence of legacy calls: first compute the distance from the inhibitory reversal potential (e_rev), store it in the state vector t_inh, and then multiply it with the conductances in g_gaba. In Auryn v0.8 the same computation is expressed directly through the state vector's member functions.
Don’t worry, though. All the legacy functions still work.
New devices, models and perks
In addition, Auryn 0.8 comes with a bunch of nice new tools. For instance, there is now a BinaryStateMonitor. Both BinaryStateMonitor and StateMonitor can now compress their output if desired. Moreover, I laid the groundwork for supporting AVX instructions in the future. There are new neuron models available, such as the Izhikevich model, and plenty more …
Go take a look! I hope you like it.