
1 Neural Coding and Decoding. Albert E. Parker. Center for Computational Biology, Department of Mathematical Sciences, Montana State University. Collaborators: Alexander Dimitrov, John P. Miller, Zane Aldworth, Tomas Gedeon, Brendan Mumey.

2 Problem: How does neural ensemble activity represent information about sensory stimuli? Our approach: build a model using information theory (my research: probability theory), then use the model to determine a coding scheme (my research: numerical optimization techniques).

3 Neural Coding and Decoding. Goal: determine a coding scheme: how does neural ensemble activity represent information about sensory stimuli? Demands: an animal needs to recognize the same object on repeated exposures, so coding has to be deterministic at this level. The code must also deal with uncertainties introduced by the environment and the neural architecture, so coding is by necessity stochastic at this finer scale. Major problem: the search for a coding scheme requires large amounts of data.

4 How to determine a coding scheme? Idea: model a part of a neural system as a communication channel using information theory. This model enables us to meet the demands of a coding scheme: define a coding scheme as a relation between stimulus and neural response classes, and construct a coding scheme that is stochastic on the finer scale yet almost deterministic on the classes. It also lets us deal with the major problem: use whatever quantity of data is available to construct coarse but optimally informative approximations of the coding scheme, and refine the coding scheme as more data becomes available. We investigate the cricket cercal sensory system.

5 Stimulus and Response Classes. [diagram: stimulus sequences X and response sequences Y form stimulus/response sequence pairs, which fall into distinguishable classes of stimulus/response pairs]

6 Information Theoretic Quantities. A quantizer, or encoder, Q relates the environmental stimulus X to the neural response Y through a process called quantization; in general, Q is a stochastic map. The reproduction space Y is a quantization of X. This can be repeated: let Y_f be a reproduction of Y, so there is a quantizer from Y to Y_f. Use mutual information I(X;Y_f) to measure the degree of dependence between X and Y_f, and conditional entropy H(Y_f|Y) to measure the self-information of Y_f given Y.
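As a minimal sketch of these two quantities (in Python with NumPy; the function names and joint-table representation are my own, not from the talk), both can be computed directly from a joint probability table:

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in nats from a joint probability table p_xy[x, y]."""
    p_x = p_xy.sum(axis=1, keepdims=True)          # marginal p(x)
    p_y = p_xy.sum(axis=0, keepdims=True)          # marginal p(y)
    mask = p_xy > 0                                # 0 * log 0 := 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

def conditional_entropy(p_yyf):
    """H(Yf | Y) in nats from a joint table p_yyf[y, yf]."""
    p_y = p_yyf.sum(axis=1, keepdims=True)
    cond = np.divide(p_yyf, p_y, out=np.zeros_like(p_yyf), where=p_y > 0)
    mask = cond > 0
    return float(-np.sum(p_yyf[mask] * np.log(cond[mask])))

# Example: a perfectly informative relation gives I = log 2 and H(Yf|Y) = 0.
p = np.array([[0.5, 0.0],
              [0.0, 0.5]])
```

With a deterministic one-to-one relation, the mutual information equals the full entropy and the conditional entropy vanishes, matching the "almost deterministic on classes" regime described above.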

7 The Model. Problem: determining a coding scheme between X and Y requires large amounts of data. Idea: determine the coding scheme between X and Y_f, a coarser reproduction of Y, such that Y_f preserves as much mutual information with X as possible while the conditional entropy H(Y_f|Y) is maximized. That is, we search for an optimal quantizer q(y_f|y) that satisfies these conditions. Justification: Jaynes' maximum entropy principle, which states that of all the quantizers satisfying a given set of constraints, one should choose the one that maximizes the entropy.

8 Equivalent Optimization Problems.
- Maximum entropy: maximize F(q(y_f|y)) = H(Y_f|Y) constrained by I(X;Y_f) >= I_0, where I_0 determines the informativeness of the reproduction.
- Deterministic annealing (Rose, '98): maximize F(q(y_f|y)) = H(Y_f|Y) - beta * D_I(Y,Y_f). Small beta favors maximum entropy; large beta favors minimum D_I.
- Simplex algorithm: maximize I(X;Y_f) over the vertices of the constraint space.
- Implicit solution: [equation not transcribed from the slide].
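The annealed objective admits a self-consistent iteration. A toy numerical sketch, under my own derivation (not transcribed from the slide): stationarity of H(Y_f|Y) + beta * I(X;Y_f) gives q(y_f|y) proportional to exp(-beta * KL(p(x|y) || p(x|y_f))), analogous to the information-bottleneck equations. All names and the toy distribution below are illustrative assumptions:

```python
import numpy as np

def anneal_quantizer(p_xy, n_classes, beta, n_iter=100, seed=0):
    """Self-consistent iteration for maximizing H(Yf|Y) + beta * I(X;Yf)
    over stochastic quantizers q(yf|y). Returns q with shape (|Yf|, |Y|)."""
    rng = np.random.default_rng(seed)
    p_x_given_y = p_xy / p_xy.sum(axis=0)            # column-stochastic p(x|y)
    q = rng.random((n_classes, p_xy.shape[1]))       # random initial q(yf|y)
    q /= q.sum(axis=0)
    log_pxy = np.log(np.maximum(p_x_given_y, 1e-300))
    for _ in range(n_iter):
        p_xt = p_xy @ q.T                            # joint p(x, yf)
        p_x_given_t = p_xt / np.maximum(p_xt.sum(axis=0), 1e-300)
        # kl[t, y] = KL( p(x|y) || p(x|yf=t) )
        kl = (p_x_given_y * log_pxy).sum(axis=0)[None, :] \
             - np.log(np.maximum(p_x_given_t, 1e-300)).T @ p_x_given_y
        q = np.exp(-beta * (kl - kl.min(axis=0)))    # shift for stability
        q /= q.sum(axis=0)
    return q
```

Run over an increasing schedule of beta values, this mirrors the annealing picture: near beta = 0 the quantizer stays near uniform (maximum entropy), and large beta drives it toward a nearly deterministic, minimum-distortion assignment.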


10 Modeling the cricket cercal sensory system as a communication channel. [diagram: signal -> nervous system, viewed as a communication channel]

11 Wind Stimulus and Neural Response in the cricket cercal system. [figures: neural responses Y over a 30-minute recording evoked by a white-noise wind stimulus X; the responses shown are all doublets within a 12 ms window; some of the air-current stimuli preceding one of the neural responses, with time in ms and the first spike occurring at T=0]

12 Quantization. A quantizer is any map f: Y -> Y_f from Y to a reproduction space Y_f with finitely many elements. Quantizers can be deterministic or probabilistic. [figure: Y mapped to Y_f, and a refined reproduction of Y]
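For concreteness, a minimal deterministic quantizer in this spirit, binning a doublet's interspike interval into a small reproduction space (the equal-width bins over a 12 ms window are my own illustrative choice; the talk optimizes the quantizer rather than fixing it, and a probabilistic quantizer would return a distribution q(y_f|y) instead of a single class):

```python
import numpy as np

def isi_quantizer(isis_ms, n_classes, t_max_ms=12.0):
    """Deterministic quantizer f: Y -> Yf mapping a doublet's interspike
    interval (in ms) to one of n_classes equal-width classes on [0, t_max_ms]."""
    edges = np.linspace(0.0, t_max_ms, n_classes + 1)
    # digitize against the interior edges gives class labels 0 .. n_classes-1
    return np.clip(np.digitize(isis_ms, edges[1:-1]), 0, n_classes - 1)

# Example: three classes over a 12 ms window -> edges at 4 ms and 8 ms.
labels = isi_quantizer(np.array([1.0, 5.0, 11.0]), 3)
```

Refining the reproduction simply means increasing `n_classes`, which matches the strategy of starting coarse and refining as more data becomes available.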

13 Applying the algorithm to cricket sensory data. [figure: responses Y quantized into reproduction classes Y_f]

14 Conclusions. We model a part of the neural system as a communication channel and define a coding scheme through relations between classes of stimulus/response pairs:
-Coding is probabilistic on the individual elements of X and Y.
-Coding is almost deterministic on the stimulus/response classes.
To recover such a coding scheme, we propose a new method to quantize neural spike trains:
-Quantize the response patterns to a small finite space (Y_f).
-Use information theoretic measures to determine the optimal quantizer for a fixed reproduction size.
-Refine the coding scheme by increasing the reproduction size.
We present preliminary results with cricket sensory data.


18 Application to synthetic data. [figures: recovered classes for synthetic data with random clusters and with a linear stimulus/response relation]

19 The cricket case: dealing with complex stimuli. D_I = I(X,Y) - I(X,Y_N) cannot be estimated directly for rich stimulus sets, so we use an upper bound. Start from D_I = H(X) - H(X|Y) - (H(X) - H(X|Y_N)). Only H(X|Y_N) depends on the quantizer, so use D_eff = H(X|Y_N). H(X|y_N) is bounded by the entropy of a Gaussian with the conditional covariance of the stimulus. This gives an upper bound on D_eff that is explicit as a function of the quantizer. Minimize the upper bound; a better model gives a tighter bound.
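The Gaussian bound can be made concrete: the maximum-entropy distribution with a given covariance is Gaussian, so H(X|y_N) <= (1/2) log((2 pi e)^d det Sigma_{X|y_N}) in nats. A sketch of the resulting bound on D_eff (function names are my own):

```python
import numpy as np

def gaussian_entropy(cov):
    """Entropy (nats) of a d-dimensional Gaussian with covariance cov:
    (1/2) * log((2 pi e)^d * det(cov)). Upper-bounds H(X|y_N)."""
    d = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)    # stable log-determinant
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + logdet)

def d_eff_upper_bound(p_classes, class_covs):
    """D_eff = H(X|Y_N) <= sum over classes of p(y_N) * gaussian_entropy(cov),
    where class_covs[k] is the conditional stimulus covariance in class k."""
    return sum(p * gaussian_entropy(c) for p, c in zip(p_classes, class_covs))
```

Since each conditional covariance is a function of how the quantizer groups responses, minimizing this bound over quantizers is a tractable surrogate for minimizing D_eff itself.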

20 Directions.
Properties of the information distortion function:
–Bifurcation structure (conjecture: there are only pitchfork bifurcations).
–Classes of problems with exact solution.
–Structure of the set of optima (conjecture: global minima can be found at vertices of the domain).
Algorithmic development:
–Improved numeric optimization schemes.
–Convergence properties.
Neural coding model:
–Better models for rich stimulus sets; better upper bounds.
–Explicit dependence on the size of the dataset (possibly for each model).
Applications to non-neural problems.

