
1 CSE511 Brain & Memory Modeling Lect05-6: Large-Scale Neuronal Structure Modeling
Larry Wittie, Computer Science, Stony Brook University. Adapted from the Research Proficiency Exam of Heraldo Memelli, 8/31/2010.

2 Lect05-6 Large-Scale Neuronal Modeling
Outline
- Intro to neuroscience
- Modeling a neuron
- Modeling large-scale networks of neurons
- Examples of large-scale models
- Our work: BOSS
- Future directions
Let's quickly start with a few neuroscience definitions. 9/13,18/12 Lect05-6 Large-Scale Neuronal Modeling

3 Lect05-6 Large-Scale Neuronal Modeling
What is a neuron?
- Basic building block of the brain and nervous system
- Electrically excitable cell
- Forms synapses (connections) with other neurons
- Receives thousands of inputs (electrical signals) through its dendrites and sends output "spikes" through its axon
- Information is transmitted by synaptic communication of electro-chemical signals
You're going to hear the word neuron a few dozen times in this talk. The neuron is the basic building block of the brain and the nervous system. It is an electrically excitable cell specialized for inter-cellular communication. Neurons communicate with each other through connections called synapses, as you can see in the picture. The main function of a neuron is to receive input "information" from thousands of other neurons through its numerous dendrites, to process that information, and then to send "information" as output to post-synaptic neurons through electro-chemical signals.

4 Neuronal cell membrane
Channels in the semi-permeable membrane control ion movements in and out of the cell. Ion concentration gradients generate a voltage difference across the membrane. At rest, there is excess Na+ outside the cell and excess K+ inside it. [Figure: ion distribution, with Na+ and Cl- concentrated outside the membrane and K+ concentrated inside.] Part of the neuron's specialization for electro-chemical signaling comes from the properties of the neuronal cell membrane. Every neuron is surrounded by a semi-permeable membrane that controls the movement of ions in and out of the cell. The concentrations inside and outside the cell are not kept the same: the concentration of K+ inside the cell is much higher than outside, while Na+ and Cl- are present in higher concentrations outside the cell, so the system is not at equilibrium. This ion concentration difference generates a voltage across the membrane. Keep in mind that electrical signaling is basically the movement of ions.

5 Action potential (output spike)
An action potential is an all-or-nothing positive spike in voltage across the axon's cell membrane. Action potentials propagate constant-strength signals between neurons. The up-slope comes from in-rushing Na+ ions and the drop from out-rushing K+ ions. A constant negative voltage called the resting membrane potential is maintained across the membrane when the neuron is at rest. Since neurons are not good electrical conductors, they have evolved a way to amplify their ability to conduct electrical signals over long distances. The electrical signals produced by neurons are called action potentials. As a result of input from other cells, the membrane voltage can become more positive than the resting membrane potential (depolarization). If the potential becomes positive enough (it crosses a threshold), an action potential fires; its shape is due to this ion movement. For a few milliseconds after a spike, the neuron cannot fire again because of a refractory period. Neuroscience, 26

6 Neuron: passive & active electrical signals
Injecting current through the current-passing microelectrode alters the neuronal membrane potential. Hyperpolarizing current pulses produce only passive changes in potential. Small depolarizing currents also elicit only passive responses, but depolarizations that cause the membrane potential to meet or exceed threshold evoke action potentials. Action potentials are active responses in the sense that they are generated by changes in the permeability of the neuronal membrane.

7 Lect05-6 Large-Scale Neuronal Modeling
Outline
- Intro to neuroscience
- Modeling a neuron: Hodgkin-Huxley, Integrate-and-Fire, Izhikevich
- Modeling large-scale networks of neurons
- Examples of large-scale models
- Our work: BOSS
- Future directions
Now that we have a feeling for roughly how a neuron operates, and we have talked about the ions and the membrane, let's go into how to model a neuron mathematically.

8 Lect05-6 Large-Scale Neuronal Modeling
Hodgkin-Huxley model
- Models a neuron as an electrical circuit
- Models three individual ion channels
- More biologically realistic
The most famous model in neuroscience is the Hodgkin-Huxley model. Hodgkin and Huxley simply applied basic electric-circuit laws to the neuron. The membrane is impermeable to ions, so it acts as an electrical capacitor that accumulates charge by blocking their diffusion. Specific ion channels can open and let a particular ion pass through, thus acting as resistors. If an input current is injected into the cell, it can either charge the capacitor or pass through a K, Na, or leak (L) channel, where the leak channel accounts for Cl- and all other ions.

9 Hodgkin-Huxley equations
Non-constant conductances (g) for Na+ and K+ ions. Non-linear gating variables (m, n, h) for each ion channel, plus a fixed-conductance L channel for slow "leaks". Computationally expensive: four coupled differential equations with fourth-power gating coefficients. Let's look at the complete HH equations. The important part of the HH model is the conductances (conductance is just the reciprocal of resistance): g represents conductance, and three equations show how each conductance changes; the g-bar terms are constant maximum conductances. The m, n and h terms are non-linear gating variables that control the state of the ion channels, in other words what fraction of the channels is open and thus lets the specific ionic current pass through. The system has four differential equations: dV/dt, dm/dt, dh/dt, and dn/dt. The likelihood of any given gate x being open is governed by the steady-state equation x_inf(V). However, gates do not open and close instantaneously, so we introduce a time-"constant" equation tau_x(V). The model is more biologically plausible because neuroscientists can play with its parameters to selectively block channels and control ion flow at a molecular level (open, closed, and inactivated channel states). You can even add more channels, which increases the cost further.
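As a concrete illustration of the four coupled equations, here is a minimal forward-Euler integration of the Hodgkin-Huxley system in Python, using the standard squid-axon parameter values; the step size, spike-detection scheme, and function names are illustrative choices, not part of the original model.

```python
import math

# Standard Hodgkin-Huxley constants (squid giant axon values).
C = 1.0                               # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3    # maximum ("g-bar") conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.4  # reversal potentials, mV

# Voltage-dependent opening (alpha) and closing (beta) rates for m, h, n.
def a_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def a_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

def hh_spike_count(i_ext, t_total=50.0, dt=0.01):
    """Integrate dV/dt, dm/dt, dh/dt, dn/dt with forward Euler; count spikes."""
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # approximate resting-state values
    spikes, above = 0, False
    for _ in range(int(t_total / dt)):
        # Ionic currents; note the m^3*h and fourth-power n^4 gating terms.
        i_na = G_NA * m**3 * h * (v - E_NA)
        i_k = G_K * n**4 * (v - E_K)
        i_l = G_L * (v - E_L)
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        v += dt * (i_ext - i_na - i_k - i_l) / C
        if v > 0.0 and not above:   # count each upward zero-crossing as a spike
            spikes += 1
        above = v > 0.0
    return spikes
```

With no injected current the model stays at rest, while a sustained supra-threshold current (for example 10 uA/cm^2) produces repetitive firing; the several exponentials evaluated at every small time step are exactly why this model is so much costlier than Integrate-and-Fire.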

10 Leaky Integrate-and-Fire
Much simpler model of a neuron. The -v/τ voltage-decay term models ion leakage. Spikes are generated artificially: when the cell voltage exceeds the threshold, a spike is recorded and the voltage "resets". The model lacks biophysical detail and cannot display the different complex spiking behaviors of real neurons. Because of the computational complexity of the Hodgkin-Huxley model, many modelers have focused their attention on a computationally simpler class of neuron models, the Leaky Integrate-and-Fire family. There is only one differential equation, which keeps track of the overall voltage (potential) of the cell; it can be thought of as a single-channel system. The cell accumulates external input, and a spike is generated artificially when the summed potential exceeds threshold; the voltage is then reset to a predefined parameter value. Computationally simple: it takes only about 5 to 10 FLOPs per time step of the integration.
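The whole Leaky Integrate-and-Fire scheme fits in a few lines. This sketch uses illustrative parameter values (tau, threshold, reset); only the leaky one-equation structure and the artificial threshold/reset rule are the model itself.

```python
def lif_spike_times(i_ext, t_total=100.0, dt=0.1,
                    tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """One differential equation: tau * dv/dt = -(v - v_rest) + i_ext.
    The -(v - v_rest)/tau decay term models ion leakage; spikes are artificial."""
    v, spike_times = v_rest, []
    for step in range(int(t_total / dt)):
        v += (dt / tau) * (-(v - v_rest) + i_ext)  # a handful of FLOPs per step
        if v >= v_thresh:                          # threshold crossed:
            spike_times.append(step * dt)          # record a stylized spike
            v = v_reset                            # reset; no spike shape at all
    return spike_times
```

A sub-threshold input charges the cell toward a steady value and never fires; a supra-threshold input fires regularly, but always with the same featureless spike, which is exactly the lack of behavioral richness noted above.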

11 Lect05-6 Large-Scale Neuronal Modeling
Izhikevich model: combines the simplicity of Leaky Integrate-and-Fire with many easily achievable dynamic spiking patterns. Various mathematical modelers have tried to improve the standard Integrate-and-Fire model to get more complex spike shapes and behaviors. One of these models was introduced by Izhikevich; it consists of two simple differential equations and one firing/reset condition. On the right is a comparison of actual experimental rat-cortex firing patterns with the corresponding firing patterns produced by the Izhikevich model. Izhikevich, 2003

12 Other Izhikevich firing patterns
Here is a further illustration of firing patterns that can be produced by the Izhikevich neuron model. Just by changing the four parameters a, b, c and d, we can get all of these complex patterns, which are also observed experimentally. Izhikevich, 2003
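The published model is compact enough to transcribe directly: two differential equations plus a reset rule, with the parameter quadruple (a, b, c, d) selecting the firing pattern. The integration step size below is an illustrative choice.

```python
def izhikevich_spikes(a, b, c, d, i_ext, t_total=500.0, dt=0.25):
    """Izhikevich (2003): v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u);
    when v >= 30 mV, a spike is recorded and v <- c, u <- u + d."""
    v, u = -65.0, b * -65.0          # start at rest
    spike_times = []
    for step in range(int(t_total / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:                # firing condition and reset
            spike_times.append(step * dt)
            v, u = c, u + d
    return spike_times

# Two of the published parameter sets:
rs = izhikevich_spikes(0.02, 0.2, -65.0, 8.0, i_ext=10.0)  # regular spiking (RS)
fs = izhikevich_spikes(0.10, 0.2, -65.0, 2.0, i_ext=10.0)  # fast spiking (FS)
```

Changing only (a, b, c, d) reproduces bursting, chattering, and the other patterns in the figure, while the cost per time step stays close to Integrate-and-Fire.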

13 Lect05-6 Large-Scale Neuronal Modeling
Outline
- Intro to neuroscience
- Modeling a neuron
- Modeling large-scale networks of neurons: motivation and dynamic behaviors; neuroscience challenges & questions; computational methods
- Examples of large-scale models
- Our work: BOSS
- Future directions
Having seen how we can model a single neuron, let's now turn to how we put many of those neurons together: why we do that, and what questions we have to take into consideration.

14 Why large-scale neuronal networks?
Improve understanding of brain functionality involving interactions of billions of neuronal and synaptic processes. Perform experiments (on a computer) that are impossible (experimentally or ethically) to perform on humans or animals. Eventually improve and test hypotheses about complex behaviors: - Perception - Attention - Learning - Memory - Consciousness - Sleep and wakefulness. Memory networks in the brain: our brain contains about 100 billion neurons and almost a quadrillion (10^15) synapses. The purpose of large-scale neural networks is to improve understanding of how the brain works through the interactions of millions and billions of neurons. In the picture we see a rough approximation of the areas that are thought to be involved in executive and perceptual memory.

15 Large-scale neural network dynamics
Large-scale network models can show complex dynamical patterns similar to brain firing activity: - Response to external stimuli - Sustained intrinsic activity - Oscillations - Chaotic activity - Seizures. Modeling this large-scale circuitry to generate complex patterns of activity will help analyze the behaviors mentioned in the previous slide. No one has yet modeled in detail the complex cognitive behaviors mentioned before, but it is nonetheless possible to model and simulate some complex firing-activity patterns. For example, modelers analyze how a large network responds to external stimuli, attempting to model the way our brain reacts to sensory perceptions. We also know that a major part of our brain's activity is intrinsically (self-) generated. Modelers have been able to show behaviors such as oscillatory activity, chaotic activity, and seizures.

16 Neuroscience questions for large models
What neuron model to use? How to obtain anatomically accurate neuron counts and connectivity patterns? How to handle synaptic plasticity (learning)? Before building a model, there are a number of questions that need to be asked.

17 Neuroscience questions: What neuron model to use?
Large models need simple neuron models: Integrate-and-Fire types of models are obligatory because of their efficiency. The Izhikevich model is a wise choice because it exhibits a wide range of spiking behaviors and allows about 100 times faster computation runs than Hodgkin-Huxley.

18 Lect05-6 Large-Scale Neuronal Modeling
How to have anatomically accurate neuron counts and connectivity patterns? Difficult to get accurate, detailed anatomical information. Strategies used: fMRI, DTI, in vivo measurements in animals. Neuron types are usually approximated in models as a few simple types. It is difficult to get accurate, detailed anatomical information about neuron types and neuron counts in the various regions of the brain. New strategies include fMRI with Diffusion Tensor Imaging (the image on the right). The resolution is very low, on the order of thousands of neurons at best: the lines show tracts of bundles of axons, attempting to follow connection patterns. Modelers handle this lack of accurate information by making rough approximations with a few neuron types and counts.

19 Lect05-6 Large-Scale Neuronal Modeling
How to have anatomically accurate neuron counts and connectivity patterns? Very difficult to get accurate, detailed neuron-to-neuron connectivity information. Apart from Diffusion Tensor Imaging (DTI), tedious multi-array spike-train recordings are sometimes used to get micro-circuitry information. Approximate or probabilistic approaches are common, often random connections subject to a few constraints. It is even more difficult to get accurate information about which neurons are connected to which other neurons. Another strategy is multi-array spike-train recording, where dozens of electrodes are inserted into specific neurons and statistical correlation analysis is performed on the recordings. The picture on the right illustrates such an experiment on the brain stem of a rat. How do modelers handle this? They approximate connectivity as a percentage of the neurons of one group connected to a percentage of the neurons of another group or type, often introducing some degree of random connectivity based on a few parameters. Nuding, 2009

20 Neuroscience-related questions: How to handle synaptic plasticity?
Synaptic plasticity is the main brain-learning mechanism. Hebb's 1949 hypothesis for automatic learning of repeated stimulus patterns: "fire together, wire together". STDP (Spike-Timing Dependent Plasticity) is a Hebb-style long-term modification of synaptic strength that depends on the timing of pre- and post-synaptic potentials. The main approach is to maintain bounded but dynamically changing synaptic weights, so that only the most repeatedly effective synapses survive. STDP extends Hebb's rule: the long-term change in a synapse's strength depends on the relative timing of the pre- and post-synaptic spikes. Plasticity is important in studying large-scale networks.
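A common way to implement STDP is an exponential pair-based window; the amplitudes and time constants below are illustrative values, and the hard weight bounds implement the "bounded but dynamically changing" requirement.

```python
import math

A_PLUS, A_MINUS = 0.01, 0.012     # potentiation/depression amplitudes (illustrative)
TAU_PLUS, TAU_MINUS = 20.0, 20.0  # window time constants, ms (illustrative)
W_MAX = 1.0                       # weights stay bounded in [0, W_MAX]

def stdp_update(w, t_pre, t_post):
    """Apply one pre/post spike pairing to synaptic weight w."""
    dt = t_post - t_pre
    if dt > 0:    # pre fired before post: causal, "fire together, wire together"
        w += A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:  # post fired before pre: anti-causal, weaken the synapse
        w -= A_MINUS * math.exp(dt / TAU_MINUS)
    return min(max(w, 0.0), W_MAX)   # enforce the weight bounds
```

Repeated causal pairings drive a weight toward W_MAX while anti-causal ones drive it toward zero, which is how only the most repeatedly effective synapses survive.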

21 Computational methods: modeling tools
NEURON: a complete simulation environment for biophysically detailed neurons and networks of neurons. It has a built-in GUI and is widely used by neuroscientists. It is more suitable for small to medium-size networks. Now I'm going to describe some of the computational modeling tools and strategies. NEURON is by far the most commonly used modeling tool among neuroscientists. Its focus is on building biophysically detailed Hodgkin-Huxley types of neurons.

22 Lect05-6 Large-Scale Neuronal Modeling
NEURON - Screenshot This is a screenshot of NEURON running on my Mac, showing the main menu, the run control, the cell-Builder and some graphs. As you can see, you can set many parameters in detail.. 9/13,18/12 Lect05-6 Large-Scale Neuronal Modeling

23 Other modeling systems
GENESIS: similar to NEURON in targeting Hodgkin-Huxley types of models. Size of large models: on the order of 10^4 neurons. NEST: focused on larger-scale networks with quite realistic connectivity. Size of large models: on the order of 10^5 neurons. SPLIT: a C++ library (not a full system) that helps in modeling large-scale networks of HH-type neurons. Size of large models: on the order of 10^6 neurons. GENESIS is similar to NEURON but less widely used. NEST is a joint project among several European universities. SPLIT is a library of classes, templates and methods rather than a full system.

24 Lect05-6 Large-Scale Neuronal Modeling
Super-computing: all large-scale neural simulations need super-computers with thousands of processors. All the modeling tools and platforms are now adding parallelization libraries and mechanisms. The MPI (Message Passing Interface) library is often used for inter-processor communication. Efficient scaling to thousands of processors is not an easy task. Since we are speaking about scales of billions and trillions, it is obvious that we need big, fast computers to build and run simulations. Massively parallel simulations are not easy; efficient scaling and good parallel algorithms are an interesting challenge in this field.

25 Lect05-6 Large-Scale Neuronal Modeling
Outline
- Intro to neuroscience
- Modeling a neuron
- Modeling large-scale networks of neurons
- Examples of large-scale models
- Our work: BOSS
- Future directions
Having described the strategies for modeling large-scale networks of neurons, I will go through some relevant examples from the literature. A couple of them have even become media-famous.

26 Examples of large network simulations
- Blue Brain project (2007)
- Djurfeldt brain cortex model (2008)
- Izhikevich thalamo-cortical model (2007)
- IBM "Cat-Brain" model (2009)

27 Examples of large-scale models
Blue Brain: the most biologically detailed and accurate model, based on thousands of microanatomy experiments; one neo-cortical column of 10,000 neurons. Djurfeldt brain cortex model: Hodgkin-Huxley type of neurons; models a few cortical layers with approximate connectivity detail; 22 million neurons and 11 billion synapses. The sketch shows microcolumns of a few neurons each; they simply put millions of those together. Djurfeldt, 2008

28 Lect05-6 Large-Scale Neuronal Modeling
Izhikevich model: Izhikevich-type neurons with 22 different basic types. Thalamo-cortical anatomy based on human DTI, plus other experimental data. 1 million neurons (tens of millions of compartments), 0.5 billion synapses. This illustration shows how they approached the modeling; we can think of it roughly as a layered graph.

29 Lect05-6 Large-Scale Neuronal Modeling
IBM "Cat-Brain" model: simpler single-compartment I&F neurons; an anatomical approximation of thalamo-cortical brain tissue. Ran on a Blue Gene/P supercomputer with 147,456 CPUs, each with 1 GB of memory. Won the ACM Gordon Bell "Parallel Speedup" Prize in 2009. 1.6 billion (10^9) neurons, 8.87 trillion (10^12) synapses.

30 Lect05-6 Large-Scale Neuronal Modeling
Comparing the models

Model | Neuron type | # of neurons | # of synapses | Runtime (s) | Super-computer | Biophysical accuracy
Blue Brain | Hodgkin-Huxley (+) | 10,000 | 1 x 10^8 | ~100 | BlueGene (8,192 CPUs) | Extremely detailed
Djurfeldt cortex | Hodgkin-Huxley | 22 million | 1.1 x 10^10 | Not reported | (4,096 CPUs) | Good approx.
Izhikevich thalamo-cortical | Izhikevich model | 1 million | 0.5 x 10^9 | 660 | Beowulf (60 CPUs) | "Mixed" approx.
IBM Cat-Brain | Simple I&F | 1.6 billion | 8.9 x 10^12 | 683 | (147,456 CPUs) | Rough approx.

Let's look at a comparison of the sizes. I ordered the models by their biological detail and biophysical accuracy. All of these were hand-coded by super-computer teams over long periods.

31 Lect05-6 Large-Scale Neuronal Modeling
Outline
- Intro to neuroscience
- Modeling a neuron
- Modeling large-scale networks of neurons
- Examples of large-scale models
- Our work: BOSS
- Future directions
All the relevant large networks in recent publications were coded by the researchers themselves, without using any of these tools: partially for efficiency reasons, and partially because the tools are not yet adequate for huge-scale simulations on super-computers. The fact also remains that all these large models were built by teams of super-computer programmers and are not directly available or directly useful to neuroscientists outside those groups. Considering the popularity of this new inter-disciplinary field, we decided to focus our own work on building a large-scale simulation tool.

32 Lect05-6 Large-Scale Neuronal Modeling
BOSS: Intro and goal. Brain Organization Simulation System: an attempt to create a tool for neuroscientists to simulate huge-scale networks of neuronal structures, and to test hypotheses about memory, learning, and other complex emergent behaviors that require simulating networks of millions or billions of neurons.

33 BOSS – Simulator details
Quantized-time discrete-event simulator. A circular header array of unsorted (thus faster) queues of future events, one for each future time cycle. Each firing of a neuron creates an event for every output synapse of that neuron, placed in the appropriate future queue. Summing events that target the same neuron saves memory.
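The queue structure described above can be sketched as follows; the class and method names are mine, not BOSS's, and the real simulator stores far more compact event records.

```python
from collections import defaultdict

class FutureEventQueues:
    """Circular array with one unsorted event bucket per future time cycle.
    Buckets need no sorting: every event in a bucket is due at the same cycle."""
    def __init__(self, horizon):
        self.horizon = horizon                     # must exceed the max synaptic delay
        self.buckets = [[] for _ in range(horizon)]
        self.now = 0

    def schedule(self, delay, target, weight):
        """On a firing, add one event per output synapse of the firing neuron."""
        assert 0 < delay < self.horizon
        self.buckets[(self.now + delay) % self.horizon].append((target, weight))

    def advance(self):
        """Pop all events due this cycle, summing those that target the same
        neuron (the memory-saving trick mentioned above)."""
        slot = self.now % self.horizon
        summed = defaultdict(float)
        for target, weight in self.buckets[slot]:
            summed[target] += weight
        self.buckets[slot] = []
        self.now += 1
        return dict(summed)
```

For example, scheduling two synaptic events onto the same neuron with a 2-cycle delay and advancing three cycles delivers a single summed potential change for that neuron.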

34 BOSS: Discrete event queues
[Figure: the circular array of per-cycle future-event queues, with an arrow labeled "Less-time".]

35 BOSS V1-V6: First simple neuron model
Neuron model: a simple threshold element that sums square-wave pulses propagating along output links (axons) from many inputs. The reason for starting with such a simple model is that we first wanted to create and debug an efficient discrete-event simulator.

36 BOSS : Improvements through V7
V1: coded by Slava Akhmechet for 4 Sun T1000s
V2: ported to BlueGene by Ryan Welsch
V3: summed pre-synaptic potential changes for the same local neuron, to run bigger models
V5: decreased memory bits per synapse, doubling the sizes of the largest achievable models
V6: implemented remote future-event summing of potentials, allowing more synapses/neuron
V7: replaced the threshold element with Izhikevich neuron models, by Heraldo Memelli

37 Neuronal features of first BOSS models
- Threshold-based action potentials
- Refractory period
- Axonal delays
- Balanced excitation and inhibition
- Periodic external stimulation
- Uniform neuron connectivity topologies

38 BOSS V1-V7: Initial network model
Topology: the first BOSS simulator versions implemented a simple one-layer square topology with end-around links (a torus). E-cells at each grid point strongly excited a few nearby cells; I-cells weakly inhibited many surrounding cells. The simple torus topology was chosen to ease supercomputer code development and debugging.
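The end-around connectivity is just modular arithmetic on grid coordinates; this helper is a sketch of the idea, with illustrative names and radii.

```python
def torus_neighbors(x, y, radius, width, height):
    """All grid cells within a square of the given radius around (x, y),
    with end-around (torus) wrapping at every edge."""
    cells = []
    for dx in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue                    # skip the cell itself
            # The modulo implements the end-around links of the torus.
            cells.append(((x + dx) % width, (y + dy) % height))
    return cells
```

An E-cell would excite a small-radius neighborhood while an I-cell inhibits a much larger one; a corner cell like (0, 0) wraps to the opposite edges, so every cell's neighborhood has the same shape, which is what makes the torus easy to debug.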

39 BOSS-First Grid Topology
The picture shows in orange that inhibitory I-cell 477 has end-around links to reach most of the 240 cells that it inhibits. E-cell 476 uses no end-around links to excite its 16 nearest neighboring cells. Many cells are inhibited by both I-cell 205 and I-cell 477.

40 BOSS : Parallel computing
Runs are performed on NY-Blue: an IBM Blue Gene/L supercomputer sited at Brookhaven National Laboratory (BNL) but owned by Stony Brook University, for joint use by BNL and SBU computational scientists. Currently BOSS uses up to 4,096 processor nodes out of the 18,432 processors in total. Inter-processor communication is handled by MPI calls that pass messages about firing events.

41 BOSS V2-7: Maximum Sizes of Grid Model
A temporary maximum of 131 billion synapses. The number of neurons ranges from dozens of millions up to a billion, depending on the average number of synapses per neuron. Uses 1 TeraByte (TB) of memory on 1,024 BlueGene processors. For the size of a human brain, we would need about 8,000 TB of computer main memory (Jaguar, the fastest 2010 super-computer, has 360 TB).

42 BOSS – memory needs of big models
Future-event storage limited model sizes in BOSS V1-5. Since version 6 (V6), the memory needed for synapse data structures determines maximum model sizes. Each synapse needs only a few bytes, allowing billions of synapses per model in the TBs of NY-Blue memory. Runtime is not critical on NY-Blue for BOSS models. Other conclusions: an elegant parallelization is necessary; we reached good scalability results, as increasing the number of processors proportionally increases the size of the model.

43 Lect05-6 Large-Scale Neuronal Modeling
Outline
- Intro to neuroscience
- Modeling a neuron
- Modeling large-scale networks of neurons
- Examples of large-scale models
- Our work: BOSS
- Future directions
BOSS is very much a work in progress, and we plan numerous improvements across the project.

44 Upcoming BOSS improvements
- Completed (2012) a front-end initializer (INIT) for more anatomically accurate models of brain tissues
- Add learning mechanisms: synaptic plasticity
- Let widely separated neurons interact across very distant NY-Blue computing nodes (in process '12)
- Let INIT use all cores in each computing node
- Consistently optimize the BOSS simulator for fast runtimes and efficient use of NY-Blue memory
We plan to complete a front-end initializer called INIT, which creates more anatomically accurate models; I will describe INIT in more detail in the next slide. Eventually we will add learning mechanisms that model synaptic plasticity. Part of the challenge is that with anatomically complex connectivity, parallelization is not straightforward: neurons with long axons can span long distances and may be many processors away.

45 Lect05-6 Large-Scale Neuronal Modeling
INIT: a front-end initializer to create realistic brain tissue models. INIT takes dozens of parameters for: number of neuron types; density and placement of neurons in the tissue; definitions of axonal and dendritic fields; density and placement of synapses; other connection details. It automatically places all neurons to match distributions, finds all synapses with an efficient staggered-walk O(N log N) algorithm (the first INIT implementations were O(N^2) and O(N^3/2)), and creates the details of specific network models that can run fast. INIT's parameters describe a brain tissue structure in detail: neuron types, density and placement of neurons in the tissue, definitions of axonal and dendritic fields, density and placement of synapses, and many other connection details. The program automates the geometrical placement of all neurons to match tissue distributions, with the exact coordinates of their axonal and dendritic fields (approximated as boxes). In the O(N log N) cost, N is the number of neuritic fields.

46 Lect05-6 Large-Scale Neuronal Modeling
Many neuron types: there are dozens of neuronal types in our nervous systems, differing in size, shape and electrical behavior. Golgi stain of neurons.

47 INIT: Sample details from cerebellar model
The drawing in Figure 6 illustrates the functionality of INIT in constructing neurons' axonal and dendritic fields for three types of cerebellar neurons (in this case a Golgi cell, a Purkinje cell, and the parallel fibers of Granule cells). The complex dendritic trees are approximated as boxes. Wherever axonal and dendritic boxes overlap for types of neurons known to form synapses, INIT checks for possible synapses based on various connectivity parameters.
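The geometric core of that check is a plain axis-aligned box intersection. This sketch uses illustrative names and a brute-force pairing; the real INIT instead walks the boxes in a staggered O(N log N) order so it never has to test all pairs.

```python
def boxes_overlap(a, b):
    """Axis-aligned 3-D overlap test; boxes are (xmin, ymin, zmin, xmax, ymax, zmax)."""
    return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

def candidate_synapses(axon_boxes, dendrite_boxes):
    """Pairs whose axonal and dendritic boxes overlap; only these candidate
    pairs are then checked against the connectivity parameters."""
    return [(i, j)
            for i, a in enumerate(axon_boxes)
            for j, d in enumerate(dendrite_boxes)
            if boxes_overlap(a, d)]
```

Approximating dendritic trees as boxes makes the overlap test a handful of comparisons, which is what keeps synapse-finding tractable for millions of neuritic fields.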

48 Lect05-6 Large-Scale Neuronal Modeling
Future Directions
- Finish building a full BOSS system: a flexible tool for creating large-scale brain-structure models
- Use models created by BOSS to tackle questions related to many complex brain behaviors
- Show formation, interaction, and regeneration of Hebb-style distributed memories: demonstrate "memories in motion"
- Collaborate with the group at the Dept. of Physiology & Biophysics to address their large-scale modeling needs
As we complete the previously mentioned tasks, we hope to have a complete and flexible tool for creating large-scale brain-structure models, and we hope that models created using BOSS will help tackle questions related to many complex brain behaviors.

49 Lect05-6 Large-Scale Neuronal Modeling
Thank you. RPE committee members: Dr. Scott Smolka, Dr. Irene Solomon and Dr. Larry Wittie. Other students who collaborated with Heraldo Memelli: Ryan Welsch, Jack Zito, Slava Akhmechet, Tabitha Shen, and Kyle Horn.

