Nonlinear estimators and time embedding. Raul Vicente, FIAS Frankfurt, 7th August 2008.


OUTLINE
- Introduction to nonlinear systems
- Phase space methods
- Exponents and dimensions
- Interdependence measures
- Take-home messages

OUTLINE
→ Introduction to nonlinear systems
  - Definition
  - Why nonlinear methods?
  - Linear techniques
- Phase space methods
- Exponents and dimensions
- Interdependence measures
- Take-home messages

INTRODUCTION: Definition

"Nonlinear" is a very popular word in (neuro)science, but what does it really mean? A nonlinear system is one whose behavior cannot be expressed as a sum of the behaviors of its parts. In technical terms, the behavior of nonlinear systems is not subject to the principle of superposition.

Individual neurons are nonlinear: they are excitable cells with all-or-none responses, doubling the input does not mean doubling the output, and their frequency response is nonlinear.

The brain as a whole is also a nonlinear "device". Ex: our perception can be more than the sum of the responses to individual stimuli (surface completion).

INTRODUCTION: Why nonlinear methods?

"The study of non-linear physics is like the study of non-elephant biology." (unknown)

Neuronal activity is highly nonlinear, so nonlinear features will be present in recorded neurophysiological data, from neuronal action potentials (spikes) to integrated activity (EEG, MEG, fMRI). Linear techniques might fail to capture key information. Nonlinear indices are used to measure the complexity of the EEG, monitor the depth of anaesthesia, study epilepsy, detect interdependence, etc.

INTRODUCTION: Linear techniques

Linear systems always need irregular inputs to produce bounded irregular signals. The simplest system that produces nonperiodic (interesting) signals is therefore a linear stochastic process ..., s_{n-1}, s_n, s_{n+1}, ... A measurement of the state s_n at time n of such a process is a random variable with probability distribution p(s), and information about p(s) can be inferred from the time series.

INTRODUCTION: Linear techniques

Linear methods interpret all regular structure in a data set, such as a dominant frequency, as linear correlations (in the time or frequency domain). The autocorrelation at lag τ measures how s_n covaries with s_{n-τ}:

c(τ) = <(s_n - <s>)(s_{n-τ} - <s>)> / <(s_n - <s>)^2>

- Periodic signal → periodic autocorrelation
- Stochastic process → decaying autocorrelation
- Chaotic system → exponentially decaying autocorrelation
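As an illustration (not from the slides), here is a minimal NumPy sketch of the sample autocorrelation, showing the periodic-versus-decaying behavior just described; `autocorr` is a hypothetical helper name:

```python
import numpy as np

def autocorr(s, max_lag):
    """Normalized sample autocorrelation c(tau) for lags 0..max_lag."""
    s = s - s.mean()
    var = np.sum(s * s)
    return np.array([np.sum(s[tau:] * s[:len(s) - tau]) / var
                     for tau in range(max_lag + 1)])

t = np.arange(1000)
c_periodic = autocorr(np.sin(2 * np.pi * t / 50), 100)  # periodic autocorrelation
rng = np.random.default_rng(0)
c_noise = autocorr(rng.standard_normal(1000), 100)      # near zero for lag > 0
```

For the sinusoid the autocorrelation stays near +1 at multiples of the period and near -1 at half periods, while for white noise it collapses for any nonzero lag.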

INTRODUCTION: Linear techniques

Cross-correlation function: measures the linear correlation between two variables X and Y as a function of their delay time τ. The cross-correlation at lag τ:

- near +1: tendency to have similar values with the same sign
- near -1: tendency to have similar values with opposite sign
- near 0: suggests lack of linear interdependence

The τ that maximizes this function is an estimator of the delay between the signals. Ex: EEG time series recorded from the two hemispheres of a rat.
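A small sketch (assumed helper name `xcorr_delay`, synthetic data, not the rat EEG from the slide) of using the lag of the cross-correlation maximum as a delay estimator:

```python
import numpy as np

def xcorr_delay(x, y, max_lag):
    """Lag of y relative to x that maximizes the normalized cross-correlation."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = np.arange(-max_lag, max_lag + 1)
    # trim the edges so the circular shift from np.roll never wraps into view
    c = [np.mean(x[max_lag:-max_lag] * np.roll(y, -lag)[max_lag:-max_lag])
         for lag in lags]
    return int(lags[int(np.argmax(c))])

rng = np.random.default_rng(1)
t = np.arange(2000)
x = np.sin(2 * np.pi * t / 40) + 0.1 * rng.standard_normal(2000)
y = np.sin(2 * np.pi * (t - 7) / 40) + 0.1 * rng.standard_normal(2000)  # x delayed by 7
delay = xcorr_delay(x, y, 15)
```

The estimator recovers the 7-sample delay that was built into the second signal.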

INTRODUCTION: Linear techniques

The cross-correlogram (a histogram of spike-time differences) is also used to reveal the temporal coherence in the firing of neurons. Ex: MT neurons in the visual cortex of a macaque monkey.

INTRODUCTION: Linear techniques

Coherence: measures the linear correlation between two signals as a function of frequency. Coherence near 0 at frequency f means the activities of the signals at that frequency are linearly independent; coherence near 1 means maximum linear correlation at that frequency. In forming an estimate of coherence it is essential to simulate ensemble averaging: EEG and MEG signals are subdivided into epochs, or, for event-related data, spectra are averaged over trials.

INTRODUCTION: Linear techniques

Prediction: we have a sequence of measurements s_n, n = 1, ..., N, and we want to predict the outcome of the following measurement, s_{N+1}. A linear predictor forecasts the next value as a linear combination of the m preceding values,

ŝ_{N+1} = Σ_{j=1}^{m} a_j s_{N+1-j},

with the coefficients a_j chosen by minimising the prediction error. The error can be evaluated in-sample (on the data used for the fit) or out-of-sample (on new data).
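A minimal sketch of such a linear predictor, fitting the coefficients by least squares; `ar_predict` is a hypothetical name and the test signal is an exact linear recurrence, so the out-of-sample forecast should recover the held-out value:

```python
import numpy as np

def ar_predict(s, order):
    """Fit linear-prediction coefficients by least squares and
    forecast the next value from the last `order` observations."""
    X = np.array([s[i:i + order] for i in range(len(s) - order)])  # past windows
    y = s[order:]                                                  # next values
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return s[-order:] @ a

# Exact linear recurrence s[n+1] = 0.05 s[n] + 0.855 s[n-1]
s = [1.0, 0.8]
for _ in range(60):
    s.append(0.05 * s[-1] + 0.855 * s[-2])
s = np.array(s)
forecast = ar_predict(s[:-1], 2)   # out-of-sample forecast of the last value
```

Because the data obey a linear rule exactly, the forecast matches the held-out sample to machine precision; on real data one would compare in-sample and out-of-sample errors.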

INTRODUCTION: Linear techniques

Causality (Norbert Wiener, 1956): for two simultaneously measured signals, if one can predict the first signal better by incorporating past information from the second signal than by using only information from the first one, then the second signal can be called causal to the first one. If predicting the future of X improves when incorporating information about the past of Y, then Y is causal to X. In neurophysiology, a question of great interest is whether there exists a causal relation between two brain regions; inferring causality from the time delay in the cross-correlation is not always straightforward.

INTRODUCTION: Linear techniques

Granger causality: Granger applied Wiener's definition in the context of linear stochastic models. If X is influencing Y, then adding the past values of X to the regression of Y will improve its prediction error. One compares a univariate fit (Y regressed on its own past) against a bivariate fit (Y regressed on the past of both X and Y); prediction performance is assessed by the variances of the prediction errors.
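A toy sketch of the univariate-versus-bivariate comparison (order-1 regressions on a synthetic pair of coupled AR processes; `resid_var` and the Granger index names are assumptions, not a standard library API):

```python
import numpy as np

def resid_var(target, regressors):
    """Residual variance of an ordinary least-squares fit."""
    X = np.column_stack(regressors)
    a, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ a)

# Toy coupled processes: x drives y, but not the other way round
rng = np.random.default_rng(2)
n = 5000
x = np.zeros(n)
y = np.zeros(n)
ex, ey = rng.standard_normal(n), rng.standard_normal(n)
for t in range(n - 1):
    x[t + 1] = 0.6 * x[t] + ex[t]
    y[t + 1] = 0.4 * y[t] + 0.5 * x[t] + ey[t]

# Granger index: log ratio of univariate to bivariate prediction-error variance
gc_x_to_y = np.log(resid_var(y[1:], [y[:-1]]) / resid_var(y[1:], [y[:-1], x[:-1]]))
gc_y_to_x = np.log(resid_var(x[1:], [x[:-1]]) / resid_var(x[1:], [x[:-1], y[:-1]]))
```

Adding the past of x clearly improves the prediction of y (positive index), while the reverse index stays near zero, matching the asymmetric coupling built into the data.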

OUTLINE
- Introduction to nonlinear systems
→ Phase space methods
  - The concept of phase space
  - Attractor reconstruction
  - Time embedding
  - Application: nonlinear predictor
- Exponents and dimensions
- Interdependence measures
- Take-home messages

Phase space: The concept

Phase space: a space in which all possible states of a system are represented, with each possible state corresponding to one unique point. Every degree of freedom or parameter of the system is represented as an axis of a multidimensional space. For deterministic systems (no noise), the system state at time t consists of all information needed to uniquely determine the future system states for times > t.

Ex: the FitzHugh-Nagumo model is a two-dimensional simplification of the Hodgkin-Huxley model of spike generation (dim(HH) = 4); its state (V, W) consists of the membrane potential V(t) and a recovery variable W(t).

Phase space: Attractor reconstruction

Attractor: a set of points in phase space such that for "many" choices of initial point the system will evolve towards them; it is the set to which the system evolves after a long enough time.

Attractor set | Behavior
point         | constant
curve         | periodic
manifold      | possibly chaotic

Ex: the Van der Pol limit cycle attractor; the Lorenz attractor. Strange attractors produce chaotic behavior. Nonlinear systems can show irregular dynamics without invoking noise!

Phase space: Attractor reconstruction

In general (and especially in biological systems) it is impossible to access all relevant variables of a system. Ex: usually in electrophysiology we just measure the membrane voltage. How can one reconstruct the original attractor from a single measured quantity? The answer is to build delay vectors from the single variable; these vectors play a role similar to the full state [x(t), y(t), z(t)].

Phase space: Time embedding

Time delay embedding: from the scalar series s_n build the vectors

x_n = (s_n, s_{n-T}, ..., s_{n-(m-1)T}),

where the embedding dimension should satisfy m > 2 D_F (D_F being the fractal dimension of the attractor) and the delay T is often chosen near the first zero of the autocorrelation function.
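A minimal sketch of delay embedding with the first-zero-of-the-autocorrelation heuristic for the delay; the helper names are hypothetical, and for simplicity the vectors are built forward in time, (s_n, s_{n+T}, ...), which is equivalent up to relabeling:

```python
import numpy as np

def delay_embed(s, m, tau):
    """Delay vectors (s[n], s[n+tau], ..., s[n+(m-1)tau]), one per row."""
    n = len(s) - (m - 1) * tau
    return np.column_stack([s[i * tau : i * tau + n] for i in range(m)])

def first_zero_autocorr(s):
    """Heuristic delay: the first lag where the autocorrelation crosses zero."""
    s = s - s.mean()
    c0 = np.sum(s * s)
    for tau in range(1, len(s) // 2):
        if np.sum(s[tau:] * s[:-tau]) / c0 <= 0:
            return tau
    return 1

s = np.sin(2 * np.pi * np.arange(400) / 40)
tau = first_zero_autocorr(s)   # roughly a quarter of the 40-sample period
emb = delay_embed(s, 3, tau)
```

For a sinusoid the heuristic picks a delay near a quarter period, which unfolds the signal into a circle rather than a degenerate line.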

Phase space: Application: nonlinear predictor

Depending on the type of signal, the power of predictability and the best strategy change:
- A signal that does not change is easy to predict: take the last observation as the forecast for the next one.
- A periodic signal is also easy once it has been observed for a full cycle.
- For independent random numbers, the best prediction is the mean value.
- Interesting signals are not periodic but contain some kind of structure which can be exploited to obtain better predictions.

If the source of predictability is linear correlations in time, the next observation will be given approximately by a linear combination of the preceding observations. But what if I know that my series is nonlinear?

Phase space: Application: nonlinear predictor

For nonlinear deterministic systems, all future states are unambiguously determined by the present state, so nonlinear correlations can be exploited with new techniques. Lorenz's method of analogues predicts the future state x_{N+1} as follows: look for recorded states close to the one we want to predict (the neighbors of the current state), and predict the average of the next values of those past neighbors. For short time scales this gives better predictions than linear predictors.
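A sketch of the method of analogues on a chaotic logistic-map series (the function name and the choice of map are illustrative assumptions, not from the slides):

```python
import numpy as np

def analogue_forecast(s, m, tau, k=3):
    """Lorenz-style analogue forecast of the value following s[-1]:
    average the successors of the k nearest past delay vectors."""
    n = len(s) - (m - 1) * tau
    vecs = np.column_stack([s[i * tau : i * tau + n] for i in range(m)])
    current, past = vecs[-1], vecs[:-1]
    dist = np.linalg.norm(past - current, axis=1)
    nearest = np.argsort(dist)[:k]
    return s[nearest + (m - 1) * tau + 1].mean()  # successors of the neighbors

# Chaotic logistic map x -> 4x(1-x); hold out the last value and forecast it
x = [0.3]
for _ in range(1999):
    x.append(4 * x[-1] * (1 - x[-1]))
x = np.array(x)
pred = analogue_forecast(x[:-1], 2, 1)
err = abs(pred - x[-1])
```

Even though the series is aperiodic, the short-term forecast error is small because the deterministic structure in phase space is reused.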

OUTLINE
- Introduction to nonlinear systems
- Phase space methods
→ Exponents and dimensions
  - Sensitivity to initial conditions: Lyapunov exponents
  - Self-similarity: correlation dimension
- Interdependence measures
- Take-home messages

Exponents and dimensions: Sensitivity to initial conditions

The most striking feature of chaos is the long-term unpredictability of the future despite a deterministic time evolution. The cause is the inherent instability of the solutions, reflected in their sensitive dependence on initial conditions: errors are amplified because nearby trajectories separate exponentially fast. How fast is measured by the Lyapunov exponent λ.

type of motion     | maximal Lyapunov exp.
stable fixed point | λ < 0
limit cycle        | λ = 0
chaos              | 0 < λ < ∞
noise              | λ = ∞

The inverse of the maximal Lyapunov exponent defines the time beyond which predictability is impossible.

Exponents and dimensions: Sensitivity to initial conditions

Maximal Lyapunov exponent from a time series:
- Delay embedding of the series.
- Compute the average divergence rate S(Δn) of initially nearby trajectories, e.g.

S(Δn) = < ln ( (1/|U(x_n)|) Σ_{x_j ∈ U(x_n)} |s_{n+Δn} - s_{j+Δn}| ) >_n ,

where U(x_n) is a small neighborhood of the delay vector x_n.
- The slope of S(Δn) over its initial linear regime is an estimate of the maximal Lyapunov exponent.

Exponents and dimensions: Correlation dimension

Strange attractors with fractal dimension are typical of chaotic systems. Non-integer dimensions are assigned to geometrical objects which exhibit self-similarity and structure on all length scales (cf. the box-counting dimension). For a time series:
- Delay embedding of the series.
- Compute the correlation sum, the fraction of pairs of delay vectors closer than ε:

C(ε) = (2 / (N(N-1))) Σ_{i<j} Θ(ε - ||x_i - x_j||),

and estimate the correlation dimension D_2 as the slope of ln C(ε) versus ln ε in the scaling regime. The Kaplan-Yorke conjecture relates the attractor dimension to the Lyapunov spectrum.
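A direct sketch of the correlation sum (O(N²), fine for small N; `correlation_sum` is a hypothetical name), sanity-checked on a set whose dimension is known to be 1:

```python
import numpy as np

def correlation_sum(vecs, eps):
    """C(eps): fraction of pairs of vectors closer than eps."""
    n = len(vecs)
    count = 0
    for i in range(n - 1):
        count += np.sum(np.linalg.norm(vecs[i + 1:] - vecs[i], axis=1) < eps)
    return 2.0 * count / (n * (n - 1))

# Points on a line segment in 2-D: the slope of log C(eps) vs log eps
# should be close to the true dimension, 1
u = np.linspace(0.0, 1.0, 500)
line = np.column_stack([u, u])
slope = (np.log(correlation_sum(line, 0.2)) -
         np.log(correlation_sum(line, 0.1))) / np.log(2.0)
```

In practice D_2 is read off from the scaling regime of the full log-log curve, over a range of ε and embedding dimensions, rather than from two points as in this two-point finite difference.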

Exponents and dimensions: Applications

Nonlinear statistics such as exponents, dimensions and prediction errors can be computed to characterize non-trivial differences in signals (EEG) between different stages (brain states: sleep/rest, eyes open/closed). Word of caution: such quantities should only be used to compare data from similar situations! Ex: ECG series taken during exercise are noisier due to sweat on the patient's skin. The different noise levels at rest and during exercise can affect the nonlinear estimators, and one could erroneously conclude a higher complexity of the heart during exercise just because of the sweat on the skin.

OUTLINE
- Introduction to nonlinear systems
- Phase space methods
- Exponents and dimensions
→ Interdependence measures
  - Synchronization
  - Causality: transfer entropy
- Take-home messages

Interdependence measures: Synchronization

Synchronization is the dynamical process by which two or more oscillators adjust their rhythms due to their weak interaction. It is a universal phenomenon found everywhere:
- Mechanical systems (pendula, London's Millennium Bridge, ...)
- Electrical generators (power grids, Josephson junctions, ...)
- Life sciences (biological clocks, firing neurons, pacemaker cells, ...)
- Chemical reactions (Belousov-Zhabotinsky), ...

Synchronization refers to the way in which coupled elements, due to their dynamics, communicate and exhibit collective behavior. In large populations of oscillators synchronization can be understood as a self-organization process: without a master or leader, the individuals spontaneously tend to oscillate in synchrony. Neural synchronization is one of the most promising mechanisms to underlie the flexible formation of cell assemblies and thus bind the information processed in different areas.

Interdependence measures: Synchronization hallmarks

Frequency locking: before coupling, the natural frequencies differ, Δf = f_2 - f_1 ≠ 0; after coupling, the observed frequencies adjust so that ΔF = F_2 - F_1 = 0. Higher-order locking is also possible: F_2/F_1 = q/p.

Phase locking: the phase shift Δφ between the oscillators is fixed; Δφ = 0 corresponds to in-phase and Δφ = π to anti-phase synchronization.

The phase can be extracted from data by several techniques: the Hilbert transform, the wavelet transform, or a Poincaré map.
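A sketch of Hilbert-transform phase extraction and a phase-locking check, using an FFT-based analytic signal so that only NumPy is needed (the function name and the phase-locking-value summary are illustrative choices, not from the slides):

```python
import numpy as np

def instantaneous_phase(s):
    """Phase of the analytic signal via an FFT-based Hilbert transform."""
    n = len(s)
    spec = np.fft.fft(s - np.mean(s))
    h = np.zeros(n)                 # one-sided spectrum weights
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.angle(np.fft.ifft(spec * h))

# Two oscillators with a constant phase shift: their phase difference is locked
t = np.arange(1000)
p1 = instantaneous_phase(np.sin(2 * np.pi * t / 50))
p2 = instantaneous_phase(np.sin(2 * np.pi * t / 50 + 1.0))
dphi = np.angle(np.exp(1j * (p1 - p2)))[50:-50]   # wrapped, edges trimmed
locking = np.abs(np.mean(np.exp(1j * dphi)))      # ~1 when phase-locked
```

A locking value near 1 with a narrowly distributed Δφ is the phase-locking hallmark described above; for unrelated signals the value drops towards 0.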

Interdependence measures: Types of synchronization

Different types of synchronization capture different relationships between the signals x_1(t) and x_2(t) of two interacting systems:
- Classical: adjustment of rhythms in periodic oscillators.
- Identical: coincidence of outputs due to the coupling, x_1(t) = x_2(t).
- Generalized: captures a more general relationship, x_1(t) = F(x_2(t)).
- Phase: the regime where the phase difference between two irregular oscillators is bounded but their amplitudes are uncorrelated.
- Lag: a relation between the two systems when compared at different times, x_1(t) = x_2(t - τ).
- Noise-induced: synchronization induced by a common noise source.

These relationships can also be quantified with tools from information theory: mutual information, entropies, ...

Interdependence measures: Correlation does not imply causality

Ex: a correlational finding published in Nature in March 1999 was contradicted one year later, also in Nature. (Figures omitted.)

Interdependence measures: Nonlinear causality measures

Information theory provides a family of related quantities:

        | univariate         | bivariate
static  | Shannon entropy    | mutual information
dynamic | Kolmogorov entropy | transfer entropy

Transfer entropy measures how much the transition probabilities of a process I change when the past of a second process J is taken into account; it can be written as a Kullback-Leibler divergence between the conditional distributions p(i_{n+1} | i_n, ..., i_{n-k+1}) and p(i_{n+1} | i_n, ..., i_{n-k+1}, j_n, ..., j_{n-l+1}).
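A minimal sketch of transfer entropy for symbolic (discrete) series with history length one; this plug-in estimator and the helper names are simplifying assumptions, not Schreiber's full method or the kernel estimators mentioned on the next slide:

```python
import numpy as np
from collections import Counter

def transfer_entropy(j, i):
    """TE(J -> I) in bits, history length 1:
    sum over states of p(i1, i0, j0) * log2[ p(i1|i0,j0) / p(i1|i0) ]."""
    triples = Counter(zip(i[1:], i[:-1], j[:-1]))
    pair_ij = Counter(zip(i[:-1], j[:-1]))
    pair_ii = Counter(zip(i[1:], i[:-1]))
    single = Counter(i[:-1])
    n = len(i) - 1
    te = 0.0
    for (i1, i0, j0), c in triples.items():
        p_joint_cond = c / pair_ij[(i0, j0)]           # p(i1 | i0, j0)
        p_self_cond = pair_ii[(i1, i0)] / single[i0]   # p(i1 | i0)
        te += (c / n) * np.log2(p_joint_cond / p_self_cond)
    return te

# i copies j with one step of delay: information flows only from J to I
rng = np.random.default_rng(3)
j = list(rng.integers(0, 2, 5000))
i = [0] + j[:-1]
te_ji = transfer_entropy(j, i)   # close to 1 bit
te_ij = transfer_entropy(i, j)   # close to 0 bits
```

Unlike the symmetric cross-correlation or mutual information, the measure is directional: it detects that J drives I and not the reverse.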

Interdependence measures: Nonlinear causality measures

Application: apply the transfer entropy concept to determine the causal/effective connectivity from ERFs (MEG):
- Embedding of the time series from each channel (Cao's criterion).
- Kozachenko-Leonenko estimator of the entropies from kernel-based probability densities.
Joint work with M. Wibral, J. Triesch and G. Pipa.

OUTLINE
- Introduction to nonlinear systems
- Phase space methods
- Exponents and dimensions
- Interdependence measures
→ Take-home messages

Take-home messages: Linear vs nonlinear

- Linear techniques are much better understood and more rigorous.
- Linear and nonlinear estimates may assess different characteristics of the signals.
- They are complementary approaches to the analysis of time series.
Even though most systems in Nature are nonlinear, do not underestimate linear methods.

Take-home messages: Phase space methods

- Attractor reconstruction is a powerful technique to recover the topological structure of an attractor from a scalar time series.
- Useful complexity quantifiers of the signal can be computed after the reconstruction.
- A word of caution applies to their use: compare only data recorded under similar conditions.

Take-home messages: Do it yourself... with a little help

- TISEAN is a very complete software package for nonlinear time series analysis.
- "Nonlinear Time Series Analysis" by Holger Kantz and Thomas Schreiber, Cambridge University Press.