BRIAN Simulation of spiking neural networks

Dougal Sutherland, 9/25/13.

The spirit of Brian: “A simulator should not only save the time of processors, but also the time of scientists.” Writing code often takes more time than running it. Goals:
– Quick model coding
– Flexibility: models are defined by equations, rather than chosen from a pre-defined set

An example from systems neuroscience
Stürzl, W., R. Kempter, and J. L. van Hemmen (2000, June). Theory of arachnid prey localization. Physical Review Letters 84 (24), 5668–71. PMID:

Learning Brian
– One hour is not long enough for me to say everything, so I will include detailed slides but talk in a more general way
– You can download the slides and go through them in more detail later (from the Telluride website or the Brian website)
– Also look at the documentation page on the Brian website
– Use the Brian mailing list for questions

Installing Brian
– Instructions for installing Brian are on our webpage
– Windows users: we recommend Python(x,y), which includes all required packages, various IDEs, etc.
– Brian is available as a Python(x,y) plugin, or can be downloaded separately

Python
– No time to introduce the Python programming language, but there is an excellent tutorial online
– For scientific work, use the NumPy (numerical), SciPy (scientific) and Pylab/Matplotlib (plotting) libraries; documentation (including tutorials) is available online

Anatomy of a Brian script
– Import the Brian package
– Define some parameters; you can use physical units
– Define neuron equations; the “volt” term says that V has units of volts, and Brian checks the consistency of your equations
– Create neurons with the given equations and a fixed threshold and reset value
– Create synapses – here recurrent random connectivity with probability 0.1 for each pair of neurons and a given synaptic weight; a spike causes an instantaneous increase of the variable V of the target neuron by the weight
– Record spiking activity
– Initialise state variables to random values
– Run the simulation
– Analyse the results – do a raster plot
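The actual Brian code for this slide was shown as an image; as a stand-in, here is a plain-NumPy sketch that follows the same steps (import, parameters, equations, neurons, random connectivity, monitoring, random initialisation, run). All parameter values are illustrative, not the ones from the talk.

```python
import numpy as np

# Illustrative parameters (SI units)
N, tau = 40, 20e-3                      # neurons, membrane time constant (s)
Vt, Vr, El = -50e-3, -60e-3, -49e-3     # threshold, reset, rest (V)
dt, T = 0.1e-3, 0.5                     # time step, duration (s)
rng = np.random.default_rng(0)

# Recurrent random connectivity: probability 0.1, fixed synaptic weight
weight = 0.5e-3
W = np.where(rng.random((N, N)) < 0.1, weight, 0.0)

# Initialise state variables to random values between reset and threshold
V = Vr + rng.random(N) * (Vt - Vr)

spikes = []                             # (neuron, time) pairs, like a SpikeMonitor
for step in range(int(T / dt)):
    V += dt * (El - V) / tau            # Euler step of dV/dt = (El - V)/tau
    fired = np.nonzero(V >= Vt)[0]
    for i in fired:
        spikes.append((i, step * dt))
        V += W[i]                       # spike: instantaneous jump of target V
    V[fired] = Vr                       # reset fired neurons

print(f"{len(spikes)} spikes recorded")
```

The pairs in `spikes` are exactly what you would feed to a raster plot (neuron index against spike time).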

Units system
– Standard SI names: volt, amp, etc.
– Standard SI prefixes: mvolt, namp, etc.
– Some short names: mV, ms, nA, etc.
– Consistency checking:
– 3*mV+2*nA will raise an exception
– “dV/dt = -V” will raise an exception (the RHS should have units of volt/second)
– Useful for catching hard-to-find bugs
– Can be switched off globally: import brian_no_units before importing Brian
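A minimal sketch of how dimensional consistency checking can work (this is the idea, not Brian's actual implementation): each quantity carries its SI base-unit exponents and refuses to be added to a quantity with different dimensions.

```python
# A quantity is a value plus SI base-unit exponents (m, kg, s, A).
class Quantity:
    def __init__(self, value, dims):
        self.value, self.dims = value, dims

    def __add__(self, other):
        if self.dims != other.dims:         # like 3*mV + 2*nA in Brian
            raise TypeError("inconsistent units")
        return Quantity(self.value + other.value, self.dims)

    def __mul__(self, scalar):              # scaling by a plain number
        return Quantity(self.value * scalar, self.dims)

volt = Quantity(1.0, (2, 1, -3, -1))        # V = m^2 kg s^-3 A^-1
amp = Quantity(1.0, (0, 0, 0, 1))

v = volt * 3e-3 + volt * 2e-3               # ok: same dimensions, 5 mV
try:
    volt * 3e-3 + amp * 2e-9                # mixing volts and amps
except TypeError as e:
    print("caught:", e)
```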

Defining a neuron model: Equations
Equations, each of one of the following forms:
– dx/dt = -x/tau : volt (differential equation)
– y = x*x : volt**2 (equation)
– z = y (alias)
– w : unit (parameter)
Non-autonomous equations: use the reserved symbol “t” for time
Stochastic DEs: use the term “xi” for noise
– It has units of s^(-1/2) – typically use it as “sigma*xi*(2/tau)**0.5”
You can also get equations from brian.library, but we won’t cover this here
Solvers:
– Exact for linear DEs (detected automatically)
– Euler for nonlinear DEs (method=‘Euler’)
– 2nd-order Runge-Kutta (method=‘RK’)
– Exponential Euler (method=‘exponential_Euler’)
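For the linear equation dx/dt = -x/tau, the exact per-step update is x(t+dt) = x(t)·exp(-dt/tau), whereas Euler uses x(t+dt) = x(t) + dt·(-x(t)/tau). A quick sketch comparing the two over 100 ms:

```python
import math

tau, dt = 10e-3, 0.1e-3                 # time constant and step (s)
x_exact = x_euler = 1.0

for _ in range(1000):                   # integrate dx/dt = -x/tau for 100 ms
    x_exact *= math.exp(-dt / tau)      # exact update (linear DEs in Brian)
    x_euler += dt * (-x_euler / tau)    # Euler update (nonlinear DEs)

# After t = 10*tau both should be near exp(-10) ≈ 4.5e-5
print(x_exact, x_euler)
```

The exact scheme is error-free regardless of dt, which is why detecting linearity and integrating exactly is worthwhile.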

NeuronGroup creation
group = NeuronGroup(N, eqs, reset=…, threshold=…, refractory=…, method=…)
Resets:
– Single value
– String with Python statement(s), e.g.
– ‘V=Vr’ where Vr could be a constant or another state variable
– ‘Vt += dVt; V=Vr’ where Vt is another state variable
– ‘V=Vr+rand()*sigma’ (rand and randn are known)
– Arbitrary Python function f(group, spikes)
Refractory:
– Single value; note that this only works with single-valued resets, otherwise it is ambiguous. In other cases, use CustomRefractoriness (see docs)
Threshold:
– Single value
– String with Python expression, e.g.
– ‘V>=Vt’ where Vt could be a constant or another state variable
– ‘V+sigma*rand()>=Vt’ (will be in the next release of Brian)
– Arbitrary Python function f(var1, var2, …) that returns an array of spike indices
– EmpiricalThreshold(thresh, refrac) fires a spike if V>thresh but won’t fire another spike for a period refrac; used for models without an instantaneous reset (e.g. HH)

Example: adaptive threshold
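The slide's code was shown as an image; here is a plain-Python sketch of the same idea, using the reset string from the previous slide (‘Vt += dVt; V=Vr’): each spike raises the threshold, which then decays back to its resting value, so inter-spike intervals lengthen. All parameter values are illustrative.

```python
import numpy as np

dt, T = 0.1e-3, 0.3                  # time step, duration (s)
tau, tau_t = 20e-3, 50e-3            # membrane / threshold time constants
El, Vr = 15e-3, 0.0                  # drive (above resting threshold), reset
Vt0, dVt = 10e-3, 2e-3               # resting threshold, per-spike increment

V, Vt = Vr, Vt0
spike_times = []
for step in range(int(T / dt)):
    V += dt * (El - V) / tau         # dV/dt = (El - V)/tau
    Vt += dt * (Vt0 - Vt) / tau_t    # threshold relaxes back toward Vt0
    if V >= Vt:
        spike_times.append(step * dt)
        Vt += dVt                    # the 'Vt += dVt; V=Vr' reset
        V = Vr

isis = np.diff(spike_times)
print(isis)                          # intervals lengthen: adaptation
```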

More on groups
Subgroups:
– subgp = group.subgroup(N)
– subgp = group[i:j]
– Can be used whenever a group is used
– Use subgroups to structure a network (e.g. into layers)
– It is more computationally efficient to use one large group with several subgroups than several small groups
State variables:
– Access by name, e.g. group.V, group.ge, group.I
Standard groups:
– PoissonGroup(N, rates) – N neurons firing with the given rates (scalar or vector, can be changed during a run)
– SpikeGeneratorGroup(N, spikes) – N neurons that fire spikes specified by the user in the form spikes=[(i1, t1), (i2, t2), …]
– PulsePacket(t, N, sigma) – N neurons firing at Gaussian-distributed times

Connections
C = Connection(source, target, var)
– Synapses from the source group to the target group, acting on state variable var
– Weight matrix C.W
– When neuron i in source fires: target.var[j] += C.W[i, j] for each j
Features (next slides):
– Building connectivity (full, random, custom)
– Delays (homogeneous, heterogeneous)
– Weight matrix structures (dense, sparse)

Building connectivity
C.connect_full(source, target, weight)
– source and target can be whole groups or subgroups
– weight can be a single value, a matrix, or a function w(i, j)
C.connect_random(source, target, p, weight)
– p is the probability of a synapse between i and j
– weight can be a single value or a function w(i, j)
C.connect_one_to_one(source, target, weight)
– weight has to be a single value
Multiple connect_* calls are allowed, for different subgroups, etc.
You can also build or modify connectivity by hand, e.g.:
– for i in range(N): C.W[i, :] = N-abs(i-arange(N))
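The semantics of random connectivity and spike propagation can be sketched in plain NumPy (this mirrors what the slide describes, not Brian's internal code; the sizes and weights are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
N_src, N_tgt = 100, 50
p, w = 0.1, 0.25e-3

# connect_random: each (i, j) pair gets a synapse with probability p
W = np.where(rng.random((N_src, N_tgt)) < p, w, 0.0)

# connect_full with a weight function w(i, j), e.g. distance-dependent
i, j = np.meshgrid(np.arange(N_src), np.arange(N_tgt), indexing="ij")
W_full = 1.0e-3 / (1.0 + abs(i - j))

# Propagation: when source neuron i fires, target.var[j] += W[i, j]
V_tgt = np.zeros(N_tgt)
for i_fired in [3, 17]:
    V_tgt += W[i_fired]

print(W.shape, (W > 0).mean())      # fraction of nonzero entries ≈ p
```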

Delays
Connection(…, delay=…, max_delay=…)
– Fixed constant: all synapses have the same (axonal) delay – no need to specify max_delay
– delay=True: each synapse has its own delay; you have to specify the maximum delay (larger uses more memory)
Specifying delays:
– Each C.connect_* method has a new delay keyword, which can take different values:
– Scalar: all delays equal
– Pair (min, max): uniformly distributed
– Function f(): called for each synapse
– Function f(i, j): called for each synapse
– Matrix
– Specify by hand: C.delay[i, j] = …
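Heterogeneous delays are commonly implemented with a circular buffer of scheduled inputs, one slot per time step up to max_delay — which is why a larger max_delay uses more memory. A small sketch of that idea (not Brian's internals; delays and weight are made up):

```python
import numpy as np

dt, max_delay = 1e-3, 5e-3
n_slots = int(round(max_delay / dt)) + 1     # buffer length grows with max_delay
N = 4                                        # number of target neurons
buf = np.zeros((n_slots, N))                 # buf[s, j]: input due s steps ahead
cursor = 0

delays = np.array([1e-3, 2e-3, 3e-3, 5e-3])  # per-synapse delays from one source
w = 0.5

def deliver_spike(cur):
    # schedule weight w onto each target j, after that synapse's own delay
    for j in range(N):
        s = (cur + int(round(delays[j] / dt))) % n_slots
        buf[s, j] += w

arrivals = []
deliver_spike(cursor)                        # the source neuron fires at step 0
for step in range(n_slots):
    due = np.nonzero(buf[cursor])[0].tolist()  # inputs arriving this step
    buf[cursor] = 0.0
    arrivals.append(due)
    cursor = (cursor + 1) % n_slots

print(arrivals)   # each target receives its input after its own delay
```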

Matrix structures
C = Connection(…, structure=…)
‘dense’:
– NumPy 2D array
– Uses 8NM bytes for a matrix of shape (N, M)
‘sparse’:
– Sparse matrix class with fast row access and reasonable column access
– Uses 20 bytes per nonzero entry (12 bytes if column access is not required)
– Cannot be changed at runtime
‘dynamic’:
– Sparse matrix class with reasonable row and column access speeds
– Uses 24 bytes per nonzero entry
– Can be changed at runtime
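The byte counts above imply a crossover density: dense costs 8·N·M bytes, sparse about 20 bytes per nonzero entry, i.e. 20·p·N·M bytes at connection probability p — so sparse wins whenever p < 8/20 = 0.4. A quick check:

```python
def dense_bytes(N, M):
    return 8 * N * M                        # 'dense': float64 NumPy array

def sparse_bytes(N, M, p, per_entry=20):
    return per_entry * round(p * N * M)     # 'sparse': cost per nonzero entry

N = M = 10_000
print(dense_bytes(N, M) // 2**20, "MiB dense")
print(sparse_bytes(N, M, 0.1) // 2**20, "MiB sparse at p=0.1")
```

For typical cortical-style connectivity (p around 0.01–0.1) sparse storage is an order of magnitude cheaper.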

Example: synfire chain

Plasticity
STDP:
– stdp = ExponentialSTDP(C, taupre, taupost, deltapre, deltapost, …)
– stdp = STDP(C, …) – see docs
STP:
– stp = STP(C, taud, tauf, U) – Tsodyks-Markram model (see docs)
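Pair-based exponential STDP is the textbook rule behind an interface like ExponentialSTDP: a pre-before-post spike pair potentiates, post-before-pre depresses, each with an exponential window. A sketch of the weight change for one spike pair (the parameter names mirror the call above; the magnitudes are made up):

```python
import math

taupre, taupost = 20e-3, 20e-3          # window time constants (s)
deltapre, deltapost = -0.005, 0.006     # illustrative depression/potentiation

def dw(t_pre, t_post):
    # weight change for a single (pre, post) spike pair
    if t_post >= t_pre:                 # pre before post: potentiation
        return deltapost * math.exp(-(t_post - t_pre) / taupost)
    return deltapre * math.exp(-(t_pre - t_post) / taupre)   # depression

w = 0.5
w += dw(10e-3, 15e-3)    # post fires 5 ms after pre: w increases
w += dw(40e-3, 32e-3)    # post fires 8 ms before pre: w decreases
print(w)
```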

Recording activity
Spikes:
– M = SpikeMonitor(group)
– M.spikes = [(i1, t1), (i2, t2), …]
– M = SpikeCounter(group)
– M.count = [count0, count1, …]
– M.nspikes
– PopulationSpikeCounter, StateSpikeMonitor – see docs
Other monitors:
– ISIHistogramMonitor
– PopulationRateMonitor
– See docs
State variables:
– M = StateMonitor(group, var, record=…) records values of variable var in group
– record=True: record all neurons (lots of memory)
– record=[i1, i2, …]: record the given neurons
– record=False (default): only record summary stats
– M.plot()
– M[i] – array of recorded values for neuron i
– M.times – array of recording times
– M.var, M.std – summary stats

Network operations
Generic mechanism for online control of the simulation: declare a function
  def f(): …
Function f is called every time step and can do anything.
Example: stop a network from running if too many spikes are being produced:
  M = PopulationSpikeCounter(G)
  def check_too_many_spikes():
      if M.nspikes > 1000:
          stop()
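The mechanism can be sketched in plain Python (this is the idea, not Brian's implementation): the simulation loop calls each registered operation once per time step, and an operation can halt the run.

```python
class Sim:
    # Minimal stand-in for a simulation with per-step callbacks.
    def __init__(self):
        self.nspikes = 0
        self.running = True
        self.ops = []                  # network operations, called every step

    def run(self, steps):
        for step in range(steps):
            self.nspikes += 5          # stand-in for the network spiking
            for f in self.ops:
                f()
            if not self.running:
                return step + 1        # steps actually executed
        return steps

sim = Sim()

def check_too_many_spikes():
    if sim.nspikes > 1000:
        sim.running = False            # plays the role of stop()

sim.ops.append(check_too_many_spikes)
steps_done = sim.run(10_000)
print(steps_done, sim.nspikes)         # halts long before 10,000 steps
```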

Simulation control
– run(duration) creates a network of all created objects and runs it
– net = Network(obj1, obj2, …) for finer control; then call net.run(duration)
– stop() and net.stop()
– reinit() and net.reinit()
– forget(obj) stops run() from using obj
– recall(obj) lets run() use obj again
– clear() forgets all objects; clear(True) forgets and deletes the memory associated with all objects (useful for doing multiple runs)

Clocks
By default, Brian uses ‘defaultclock’:
– defaultclock.t = 0*ms
– defaultclock.dt = 0.1*ms
You can use multiple clocks if some objects need to be simulated with a smaller dt than others:
– Create a clock with clk = Clock(dt=1*ms)
– Use the clock for an object with obj = Obj(…, clock=clk)
If you use only one clock, you don’t need to specify it; if you use multiple clocks, you have to specify one for every object
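The scheduling idea behind multiple clocks can be sketched as: each object updates only when its own clock ticks, and the scheduler always advances the clock that is furthest behind. A minimal sketch (not Brian's scheduler; times are in integer units of the smallest dt to keep the arithmetic exact):

```python
class Clock:
    def __init__(self, dt):
        self.dt, self.t = dt, 0

def run(clocks, duration):
    ticks = {c: 0 for c in clocks}
    while True:
        clk = min(clocks, key=lambda c: c.t)   # furthest-behind clock
        if clk.t >= duration:
            break                              # every clock reached duration
        ticks[clk] += 1                        # update this clock's objects
        clk.t += clk.dt
    return ticks

fast, slow = Clock(dt=1), Clock(dt=8)          # e.g. 0.1 ms and 0.8 ms steps
ticks = run([fast, slow], duration=64)
print(ticks[fast], ticks[slow])                # fast ticks 8x as often
```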

Analysis packages
– Brian has some statistical tools for spike trains: CVs, correlograms, etc.
– NeuroTools is a set of Python packages for analysing neuroscientific data (disclaimer: I haven’t used it myself)
– The INCF maintains a list of relevant software (disclaimer: likewise)
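As an example of the kind of statistic mentioned above, the coefficient of variation (CV) of the inter-spike intervals is std(ISI)/mean(ISI): about 1 for a Poisson spike train and 0 for a perfectly regular one. A self-contained sketch:

```python
import math
import random

def cv(spike_times):
    # coefficient of variation of the inter-spike intervals
    isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
    mean = sum(isis) / len(isis)
    var = sum((x - mean) ** 2 for x in isis) / len(isis)
    return math.sqrt(var) / mean

regular = [0.01 * i for i in range(100)]   # perfectly regular 100 Hz train

random.seed(0)
t, poisson = 0.0, []
for _ in range(5000):
    t += random.expovariate(100.0)         # Poisson train at ~100 Hz
    poisson.append(t)

print(cv(regular), cv(poisson))            # ~0 and ~1 respectively
```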

Optimisations
– Use C code:
– Build the optional C modules
– Set the global preference useweave=True for more C code if you have gcc or Visual Studio installed
– Vectorise your code
– Soon: use ‘code generation’ (not yet documented)

Model fitting toolbox
– Automatically fits spiking neuron models to electrophysiological data
– Uses particle swarm optimisation; genetic algorithms and other methods are being added
– Can use the GPU for a 60x speed improvement
Rossant et al., Automatic fitting of spiking neuron models to electrophysiological recordings, Front. Neuroinformatics (2010)

“Brian hears” (soon)
Library for auditory modelling, all in the Brian framework
Filtering:
– High pass, low pass, band pass, etc.
– Gammatone, gammachirp, etc.
– Head-related transfer functions (HRTFs)
Standard neuron models:
– Meddis, Lyons, etc.

GPUs
Graphics processing units:
– Originally made for computer games (and therefore cheap)
– Now being used for high-performance scientific computing
– Many cores (up to 240 at the moment), but each must do similar computations (like SIMD, but slightly more flexible)
Huge potential speed improvements:
– Existing papers (early work) show 8-80x improvements
– In the model fitting library we have a 60x improvement
– Brian currently shows a 30x improvement in the general case

Thanks!