# Neural Coding and Decoding
Albert E. Parker, Center for Computational Biology, Department of Mathematical Sciences, Montana State University. Collaborators: Alexander Dimitrov, Tomas Gedeon, John P. Miller, Zane Aldworth.


Neural Coding and Decoding. Albert E. Parker, Center for Computational Biology, Department of Mathematical Sciences, Montana State University. Collaborators: Alexander Dimitrov, Tomas Gedeon, John P. Miller, Zane Aldworth.

Problem: Determine a coding scheme. How does neural ensemble activity represent information about sensory stimuli?

Our approach:
- Construct a model using probability and information theory.
- Optimize the model to cluster the neural responses, which gives an approximation of a coding scheme from the available data.
- Apply the method to the cricket cercal system.

Neural Coding and Decoding. Goal: What conditions must a coding scheme satisfy?

Demands:
- An animal needs to recognize the same object on repeated exposures, so coding has to be deterministic at this level.
- The code must deal with uncertainties introduced by the environment and neural architecture, so coding is by necessity stochastic at this finer scale.

Major problem: the search for a coding scheme requires large amounts of data.

How to determine a coding scheme? Idea: model a part of a neural system as a communication channel using information theory. This model enables us to:

Meet the demands of a coding scheme:
- Define a coding scheme as a relation between stimulus and neural response classes.
- Construct a coding scheme that is stochastic on the finer scale yet almost deterministic on the classes.

Deal with the major problem:
- Use whatever quantity of data is available to construct coarse but optimally informative approximations of the coding scheme.
- Refine the coding scheme as more data becomes available.

Investigate the cricket cercal sensory system.

A Stochastic Map. The input X and the output Y are related by a stochastic map: the relationship between X and Y is completely described by the conditional probability Q(Y|X). In neural coding, the realizations of X and Y are a stimulus sequence X = x and a response sequence Y = y, related by Q(Y = y | X = x).
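As a minimal sketch (the matrix and its numbers are illustrative, not from the talk), a stochastic map over finite stimulus and response sets is just a row-stochastic matrix, where row i holds the conditional distribution Q(Y | X = x_i):

```python
import numpy as np

# Hypothetical stochastic map over 3 stimuli and 2 responses:
# row i is the conditional distribution Q(Y | X = x_i).
Q = np.array([
    [0.9, 0.1],   # stimulus x_0 almost always evokes response y_0
    [0.2, 0.8],   # stimulus x_1 usually evokes response y_1
    [0.5, 0.5],   # stimulus x_2 is ambiguous
])

def sample_response(Q, i, rng):
    """Draw a response index j with probability Q(Y = y_j | X = x_i)."""
    return rng.choice(Q.shape[1], p=Q[i])

# Every row of a conditional probability must sum to 1.
assert np.allclose(Q.sum(axis=1), 1.0)

rng = np.random.default_rng(0)
samples = [sample_response(Q, 0, rng) for _ in range(1000)]
# Empirically, response y_0 occurs about 90% of the time for stimulus x_0.
```

A deterministic code is the degenerate case where every row of Q is a one-hot vector.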

Determining Stimulus/Response Classes. Given a joint probability p(X,Y): [figure: joint probability over stimulus sequences X and response sequences Y, with candidate classes 1–4 along each axis]

Stimulus and Response Classes. [figure: the same joint probability reorganized into distinguishable stimulus/response classes 1–4 along X and Y]

Information Theoretic Quantities. A quantizer or encoder, Q, relates the environmental stimulus, X, to the neural response, Y, through a process called quantization; in general, Q is a stochastic map, and the reproduction space Y is a quantization of X. This can be repeated: let Y_f be a reproduction of Y, so there is a quantizer q(Y_f|Y). Use mutual information to measure the degree of dependence between X and Y_f. Use conditional entropy to measure the self-information of Y_f given Y.
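Both quantities can be computed directly from a joint distribution. A self-contained sketch (function names and the toy distributions are mine, not from the talk), written generically for any pair of discrete variables:

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits, from a joint distribution pxy[i, j] = p(x_i, y_j)."""
    px = pxy.sum(axis=1, keepdims=True)            # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)            # marginal p(y)
    mask = pxy > 0                                 # 0 log 0 = 0 convention
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

def conditional_entropy(pxy):
    """H(Y|X) in bits, from the joint distribution pxy."""
    px = pxy.sum(axis=1, keepdims=True)
    cond = np.divide(pxy, px, out=np.zeros_like(pxy), where=px > 0)
    mask = cond > 0
    return float(-np.sum(pxy[mask] * np.log2(cond[mask])))

# Independent X and Y carry no mutual information.
p_indep = np.outer([0.5, 0.5], [0.25, 0.75])
# A deterministic relation Y = X gives H(Y|X) = 0 and I(X;Y) = H(X) = 1 bit.
p_det = np.array([[0.5, 0.0], [0.0, 0.5]])
```

In the talk's notation, the same functions applied to p(X, Y_f) and p(Y, Y_f) give I(X;Y_f) and H(Y_f|Y).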

The Model. Problem: determining a coding scheme between X and Y requires large amounts of data. Idea: determine the coding scheme between X and Y_f, a clustering (reproduction) of Y, such that Y_f preserves as much mutual information with X as possible, while the self-information (entropy) of Y_f given Y is maximized. That is, we search for an optimal quantizer q(Y_f|Y) that satisfies these conditions. Justification: Jaynes' maximum entropy principle, which states that of all the quantizers satisfying a given set of constraints, one should choose the one that maximizes the entropy.

Equivalent Optimization Problems.
- Maximum entropy: maximize F(q(y_f|y)) = H(Y_f|Y) constrained by I(X;Y_f) ≥ I_0, where I_0 determines the informativeness of the reproduction.
- Deterministic annealing (Rose, '98): maximize F(q(y_f|y)) = H(Y_f|Y) + β I(X;Y_f). Small β favors maximum entropy; large β favors maximum I(X;Y_f).
- Augmented Lagrangian with Newton conjugate-gradient line search.
- Implicit solution (a fixed-point equation for the optimal q).
- Simplex algorithm: maximize I(X;Y_f) over the vertices of the constraint space.
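The annealing formulation can be sketched as a fixed-point iteration: setting the gradient of H(Y_f|Y) + β I(X;Y_f) to zero (with rows of q constrained to sum to 1) gives q(y_f|y) proportional to exp(−β D_KL(p(x|y) ‖ p(x|y_f))). The toy implementation below is my own simplified sketch of that update with a β schedule, not the authors' Augmented Lagrangian or simplex code:

```python
import numpy as np

def anneal_quantizer(pxy, n_classes, betas, n_iter=200, seed=0):
    """Sketch of deterministic annealing for max H(Yf|Y) + beta * I(X;Yf).

    pxy[i, j] = p(x_i, y_j); returns q with q[j, k] = q(y_f = k | y_j).
    """
    rng = np.random.default_rng(seed)
    py = pxy.sum(axis=0)                 # p(y)
    px_y = pxy / py                      # p(x|y), one column per y
    ny = pxy.shape[1]
    q = np.full((ny, n_classes), 1.0 / n_classes)   # maximum-entropy start
    eps = 1e-12
    for beta in betas:
        # Tiny jitter so the symmetric (uniform) fixed point can break.
        q = q + 1e-6 * rng.random(q.shape)
        q = q / q.sum(axis=1, keepdims=True)
        for _ in range(n_iter):
            pxyf = pxy @ q                           # p(x, y_f)
            px_yf = pxyf / pxyf.sum(axis=0)          # p(x | y_f)
            # kl[y, k] = D_KL( p(x|y) || p(x | y_f = k) )
            logratio = (np.log(px_y[:, :, None] + eps)
                        - np.log(px_yf[:, None, :] + eps))
            kl = np.einsum('iy,iyk->yk', px_y, logratio)
            q_new = np.exp(-beta * kl)
            q_new = q_new / q_new.sum(axis=1, keepdims=True)
            if np.allclose(q_new, q, atol=1e-10):
                q = q_new
                break
            q = q_new
    return q

# Toy joint distribution: responses y0, y1 mostly follow stimulus x0,
# responses y2, y3 mostly follow stimulus x1 (numbers are made up).
pxy = np.array([[0.20, 0.20, 0.05, 0.05],
                [0.05, 0.05, 0.20, 0.20]])
q = anneal_quantizer(pxy, n_classes=2, betas=np.linspace(1.0, 50.0, 10))
labels = q.argmax(axis=1)   # y0, y1 share one class; y2, y3 the other
```

At small β the quantizer stays near uniform (maximum entropy); as β grows, the uniform solution becomes unstable and the responses split into informative classes, which is the annealing behavior the slide describes.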

Application to synthetic data, where p(X,Y) is known. [figure: random initial clusters]

The Optimization Problem for Real Data. Maximum entropy: maximize F(q(y_f|y)) = H(Y_f|Y) constrained by H(X) − H_G(X|Y_f) ≥ I_0, where I_0 determines the informativeness of the reproduction.


Modeling the cricket cercal sensory system as a communication channel. [figure: a signal passing through the nervous system, viewed as a communication channel]

Wind Stimulus and Neural Response in the cricket cercal system. Neural responses (over a 30 minute recording) caused by a white noise wind stimulus. [figure: Y, the neural responses (all doublets) within a 12 ms window, T in ms, with the first spike occurring at T = 0; X, some of the air current stimuli preceding one of the neural responses, time in ms]

Quantization. A quantizer is any map f: Y → Y_f from Y to a reproduction space Y_f with finitely many elements. Quantizers can be deterministic or probabilistic, and the reproduction Y_f can be refined.
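A deterministic quantizer is the special case where each response maps to exactly one reproduction element. As a toy sketch (the pattern names and class labels are mine, not from the cricket data), it is just a lookup table, equivalently a stochastic map whose rows are one-hot:

```python
import numpy as np

# Hypothetical deterministic quantizer f: Y -> Yf over 4 response patterns
# and 2 reproduction classes.
f = {"pattern_a": 0, "pattern_b": 0, "pattern_c": 1, "pattern_d": 1}

# The same quantizer written as a stochastic map q(y_f | y):
# each row is a degenerate (one-hot) distribution.
patterns = sorted(f)
q = np.zeros((len(patterns), 2))
for j, y in enumerate(patterns):
    q[j, f[y]] = 1.0

# Because every row is one-hot, H(Yf|Y) = 0: given y, the class is certain.
safe_q = np.clip(q, 1e-12, 1.0)          # avoid log(0); 0 * log(eps) -> 0
row_entropy = -(q * np.log2(safe_q)).sum(axis=1)
```

A probabilistic quantizer simply relaxes the one-hot rows to arbitrary distributions over Y_f, which is exactly the q(y_f|y) optimized in the preceding slides.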

Applying the algorithm to cricket sensory data. [figure: responses Y quantized into a reproduction Y_f with two classes (Class 1, Class 2), and into a finer reproduction with three classes (Class 1, Class 2, Class 3)]

Conclusions
- We model a part of the neural system as a communication channel and define a coding scheme through relations between classes of stimulus/response pairs.
  - Coding is probabilistic on the individual elements of X and Y.
  - Coding is almost deterministic on the stimulus/response classes.
- To recover such a coding scheme, we propose a new method to quantize neural spike trains.
  - Quantize the response patterns to a small finite space (Y_f).
  - Use information theoretic measures to determine the optimal quantizer for a fixed reproduction size.
  - Refine the coding scheme by increasing the reproduction size.
- We present preliminary results with cricket sensory data.

