
1 Nonlinear estimators and time embedding Raul Vicente raulvicente@mpih-frankfurt.mpg.de FIAS Frankfurt, 08-08-2007

2 OUTLINE
- Introduction to nonlinear systems
- Phase space methods
- Exponents and dimensions
- Interdependence measures
- Take-home messages

3 OUTLINE
→ Introduction to nonlinear systems
   - Definition
   - Why nonlinear methods?
   - Linear techniques
- Phase space methods
- Exponents and dimensions
- Interdependence measures
- Take-home messages

4 INTRODUCTION Definition
"Nonlinear" is a very popular word in (neuro)science, but what does it really mean?
A nonlinear system is one whose behavior cannot be expressed as a sum of the behaviors of its parts. In technical terms, the behavior of nonlinear systems is not subject to the principle of superposition.
The brain as a whole is a nonlinear "device". Ex: our perception can be more than the sum of the responses to individual stimuli (surface completion).

5 INTRODUCTION Definition
A nonlinear system is one whose behavior cannot be expressed as a sum of the behaviors of its parts. In technical terms, the behavior of nonlinear systems is not subject to the principle of superposition.
"Nonlinear" is a very popular word in (neuro)science, but what does it really mean?
Individual neurons are also nonlinear: excitable cells with all-or-none responses. Doubling the input does not mean doubling the output (nonlinear frequency response).

6 INTRODUCTION Definition
A nonlinear system is one whose behavior cannot be expressed as a sum of the behaviors of its parts. In technical terms, the behavior of nonlinear systems is not subject to the principle of superposition.
"Nonlinear" is a very popular word in (neuro)science, but what does it really mean?
Individual neurons are also nonlinear: doubling the input does not mean doubling the output. The brain as a whole is a nonlinear "device". Ex: our perception can be more than the sum of the responses to individual stimuli.

7 INTRODUCTION Why nonlinear methods?
"The study of non-linear physics is like the study of non-elephant biology." (Unknown)
Neuronal activity is highly nonlinear, so nonlinear features will be present in the recorded neurophysiological data, from neuronal action potentials (spikes) to integrated activity (EEG, MEG, fMRI). Linear techniques might fail to capture key information.
Nonlinear indices: measuring the complexity of the EEG, monitoring depth of anaesthesia, studies of epilepsy, detection of interdependence, etc.

8 INTRODUCTION Linear techniques
Linear systems always need irregular inputs to produce bounded irregular signals. The simplest system that produces nonperiodic (interesting) signals is therefore a linear stochastic process.
..., s_{n-1}, s_n, s_{n+1}, ...: here s_n is the measurement of the state of such a process at time n, drawn from a probability distribution p(s). Information about p(s) can be inferred from the time series.

9 INTRODUCTION Linear techniques
Linear methods interpret all regular structure in a data set, such as a dominant frequency, as linear correlations (in the time or frequency domain).
Autocorrelation at lag tau: c(tau) = <(s_n - <s>)(s_{n-tau} - <s>)> / var(s), i.e. the correlation of the series with a copy of itself shifted by tau.
[Figure: example signals and their autocorrelations - periodic signal: periodic autocorrelation; stochastic process: decaying autocorrelation; chaotic system: exponential decay(?)]
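A minimal sketch of the autocorrelation estimate above, using NumPy; the test signals (a sine with a 50-sample period and white noise) are made up for illustration and are not data from the slides:

```python
import numpy as np

def autocorrelation(s, max_lag):
    """Normalized autocorrelation c(lag) for lags 0..max_lag."""
    s = np.asarray(s, dtype=float)
    s = s - s.mean()
    var = s.var()
    return np.array([np.mean(s[lag:] * s[:len(s) - lag]) / var
                     for lag in range(max_lag + 1)])

n = np.arange(2000)
periodic = np.sin(2 * np.pi * n / 50)                 # period of 50 samples
noise = np.random.default_rng(0).standard_normal(2000)

print(autocorrelation(periodic, 60)[50])   # ~ +1 : periodic autocorrelation
print(autocorrelation(noise, 60)[50])      # ~ 0  : correlations die out at once
```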

10 INTRODUCTION Linear techniques
Cross-correlation function: measures the linear correlation between two variables X and Y as a function of their delay time (tau).
Cross-correlation at lag tau: c_xy(tau) = <(x_{n+tau} - <x>)(y_n - <y>)> / (sigma_x sigma_y). Values close to +1 indicate a tendency to have similar values with the same sign, values close to -1 a tendency to have similar values with opposite sign, and values close to 0 suggest lack of linear interdependence.
The tau that maximizes this function is an estimator of the delay between the signals.
[Figure: EEG time series recorded from the two hemispheres of a rat, with cross-correlation values r_xy = 0.63 and r_xy = 0.25 for the original signals and the nonlinearly transformed signals X' = X^4, Y' = Y^4.]
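A rough sketch of the normalized cross-correlation and of estimating the delay as the lag that maximizes it; the 7-sample delay and the noise level in the toy signals are arbitrary choices for illustration:

```python
import numpy as np

def cross_correlation(x, y, lags):
    """c_xy(tau): correlation between x(t) and y(t + tau) for each lag tau."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    out = []
    for tau in lags:
        if tau >= 0:
            out.append(np.mean(x[:n - tau] * y[tau:]))
        else:
            out.append(np.mean(x[-tau:] * y[:n + tau]))
    return np.array(out)

# Toy test: y is a delayed, noisy copy of x (delay = 7 samples)
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
y = np.roll(x, 7) + 0.5 * rng.standard_normal(5000)

lags = np.arange(-20, 21)
c = cross_correlation(x, y, lags)
print("estimated delay:", lags[np.argmax(np.abs(c))])   # should be close to 7
```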

11 INTRODUCTION Linear techniques
Cross-correlation function: measures the linear correlation between two variables X and Y as a function of their delay time (tau). Values close to +1 indicate a tendency to have similar values with the same sign, values close to -1 a tendency to have similar values with opposite sign, and values close to 0 suggest lack of linear interdependence. The tau that maximizes this function is an estimator of the delay between the signals.
The cross-correlogram histogram is also used to reveal the temporal coherence in the firing of neurons. [Figure: cross-correlogram of MT neurons in the visual cortex of a macaque monkey.]

12 INTRODUCTION Linear techniques
Coherence: measures the linear correlation between two signals as a function of frequency.
Coherence at frequency f: values close to 0 mean that the activities of the signals at this frequency are linearly independent; values close to 1 mean maximum linear correlation at this frequency.
In forming an estimate of coherence it is essential to simulate ensemble averaging: EEG and MEG signals are subdivided into epochs, or, for event-related data, spectra are averaged over trials.
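A sketch using SciPy's Welch-based coherence estimator, which performs the epoch averaging internally; the sampling rate, the shared 10 Hz component and the noise levels of the synthetic "EEG-like" signals are invented for illustration:

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0                        # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)      # 60 s of synthetic data
rng = np.random.default_rng(2)

shared = np.sin(2 * np.pi * 10 * t)            # common 10 Hz rhythm
x = shared + rng.standard_normal(t.size)
y = 0.7 * shared + rng.standard_normal(t.size)

# Welch's method: the signals are split into overlapping epochs and the
# spectra averaged over them -- the ensemble averaging the slide refers to.
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
print("coherence near 10 Hz:", Cxy[np.argmin(np.abs(f - 10))])   # high
print("coherence near 40 Hz:", Cxy[np.argmin(np.abs(f - 40))])   # near 0
```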

13 INTRODUCTION Linear techniques
Prediction: we have a sequence of measurements s_n, n = 1, ..., N, and we want to predict the outcome of the following measurement, s_{N+1}.
Linear prediction: s_{N+1} is approximated by a linear combination of the preceding observations, s_{N+1} ~ a_1 s_N + ... + a_p s_{N-p+1}, with the coefficients chosen by minimising the prediction error. The error can be evaluated in-sample (on the data used for the fit) or out-of-sample (on new data).
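A minimal sketch of such a linear (autoregressive) predictor fitted by least squares; the model order, the AR(2) test process and its coefficients are assumptions made for the example:

```python
import numpy as np

def fit_ar(s, order):
    """Least-squares fit of s_n ~ a_1 s_{n-1} + ... + a_p s_{n-p}."""
    X = np.array([s[i:i + order] for i in range(len(s) - order)])  # past windows
    target = s[order:]                                             # value that follows each window
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coeffs

# Synthetic AR(2) process: first 2000 samples for fitting (in-sample),
# the rest held out for the out-of-sample error.
rng = np.random.default_rng(3)
s = np.zeros(3000)
for n in range(2, 3000):
    s[n] = 1.2 * s[n - 1] - 0.5 * s[n - 2] + 0.3 * rng.standard_normal()

train, test = s[:2000], s[2000:]
a = fit_ar(train, order=2)
pred = [np.dot(a, test[i:i + 2]) for i in range(len(test) - 2)]
err = test[2:] - np.array(pred)
print("out-of-sample RMS error:", np.sqrt(np.mean(err ** 2)))  # close to the 0.3 noise level
```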

14 INTRODUCTION Linear techniques
Causality (Norbert Wiener, 1956): for two simultaneously measured signals, if one can predict the first signal better by incorporating the past information from the second signal than by using only information from the first one, then the second signal can be called causal to the first one.
Predicting the future of X improves when incorporating information about the past of Y → Y is causal to X.
In neurophysiology, a question of great interest is whether there exists a causal relation between two brain regions. Inferring causality from the time delay in the cross-correlation is not always straightforward.

15 INTRODUCTION Linear techniques
Causality (Norbert Wiener, 1956): for two simultaneously measured signals, if one can predict the first signal better by incorporating the past information from the second signal than by using only information from the first one, then the second signal can be called causal to the first one.
Granger applied this definition in the context of linear stochastic models: if X is influencing Y, then adding the past values of the first variable (X) to the regression of the second one (Y) will improve its prediction error.
Univariate fitting: a signal is regressed on its own past values only. Bivariate fitting: it is regressed on the past values of both signals. Prediction performance is assessed by the variances of the prediction errors; the reduction of the error variance in the bivariate fit quantifies the Granger causality.
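A rough sketch of linear Granger causality along these lines, comparing the residual variances of the univariate and bivariate least-squares fits; the model order and the toy driving scheme (y drives x with a one-step delay) are assumptions for illustration, not the speaker's implementation:

```python
import numpy as np

def granger_index(x, y, order=5):
    """Does y Granger-cause x?  Compare the residual variance of the univariate
    fit (x on its own past) with the bivariate fit (x on the past of x and y).
    Returns ln(var_uni / var_biv); values > 0 indicate an improvement."""
    rows_uni, rows_biv, target = [], [], []
    for i in range(order, len(x)):
        rows_uni.append(x[i - order:i])
        rows_biv.append(np.concatenate([x[i - order:i], y[i - order:i]]))
        target.append(x[i])
    X_uni, X_biv, target = np.array(rows_uni), np.array(rows_biv), np.array(target)
    res_uni = target - X_uni @ np.linalg.lstsq(X_uni, target, rcond=None)[0]
    res_biv = target - X_biv @ np.linalg.lstsq(X_biv, target, rcond=None)[0]
    return np.log(np.var(res_uni) / np.var(res_biv))

# Toy example: y drives x with a one-step delay, but not the other way round
rng = np.random.default_rng(4)
y = rng.standard_normal(5000)
x = np.zeros(5000)
for n in range(1, 5000):
    x[n] = 0.6 * x[n - 1] + 0.8 * y[n - 1] + 0.2 * rng.standard_normal()

print("y -> x:", granger_index(x, y))   # clearly > 0
print("x -> y:", granger_index(y, x))   # close to 0
```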

16 OUTLINE
- Introduction to nonlinear systems
→ Phase space methods
   - The concept of phase space
   - Attractor reconstruction
   - Time embedding
   - Application: nonlinear predictor
- Exponents and dimensions
- Interdependence measures
- Take-home messages

17 Phase space The concept
Phase space: a space in which all possible states of a system are represented, with each possible state corresponding to one unique point in the phase space. Every degree of freedom or parameter of the system is represented as an axis of this multidimensional space.
Ex: the FitzHugh-Nagumo model is a two-dimensional simplification of the Hodgkin-Huxley model of spike generation (dim(HH) = 4); its phase space is spanned by the membrane potential V(t) and the recovery variable W(t).
For deterministic systems (no noise), the system state at time t contains all information needed to uniquely determine the future system states for times > t.
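A minimal sketch of this two-dimensional phase space, integrating the FitzHugh-Nagumo model with SciPy; the parameter values (a = 0.7, b = 0.8, tau = 12.5, I = 0.5) are a common textbook choice and are not taken from the slide:

```python
import numpy as np
from scipy.integrate import solve_ivp

a, b, tau, I = 0.7, 0.8, 12.5, 0.5   # assumed textbook parameters

def fhn(t, state):
    """FitzHugh-Nagumo: 2-D reduction of Hodgkin-Huxley (V = voltage, W = recovery)."""
    v, w = state
    dv = v - v**3 / 3 - w + I
    dw = (v + a - b * w) / tau
    return [dv, dw]

sol = solve_ivp(fhn, (0, 200), [-1.0, 1.0], max_step=0.1)
v, w = sol.y                    # each column (v(t), w(t)) is one point in phase space
print(v[-1], w[-1])             # for these parameters the trajectory should settle onto a limit cycle
```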

18 Phase space Attractor reconstruction
Attractor: a set of points in phase space such that for "many" choices of initial point the system will evolve towards them. It is the set to which the system evolves after a long enough time.

Attractor set   Behavior
a point         constant
a curve         periodic
a manifold      possibly chaotic

[Figures: the Van der Pol limit-cycle attractor and the Lorenz attractor.]
Strange attractors produce chaotic behavior. Nonlinear systems: irregular dynamics without invoking noise!

19 Phase space Attractor reconstruction
In general (and especially in biological systems) it is impossible to access all relevant variables of a system. Ex: usually in electrophysiology we just measure the membrane voltage.
How can one reconstruct the original attractor from a single measured quantity? Delay vectors constructed from that single variable play a role similar to the full state [x(t), y(t), z(t)].

20 Phase space Time embedding
In general (and especially in biological systems) it is impossible to access all relevant variables of a system. Ex: usually in electrophysiology we just measure the membrane voltage.
Time-delay embedding: from the scalar series s(t), build vectors s_n = (s(t_n), s(t_n - T), ..., s(t_n - (m-1)T)). The embedding dimension should satisfy m > 2 D_F (with D_F the attractor's fractal dimension), and the delay T is commonly chosen near the first zero of the autocorrelation.
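A minimal sketch of the delay embedding and of the first-zero-of-autocorrelation heuristic for the delay; the embedding dimension m = 3 and the synthetic two-tone test signal are assumptions for the example:

```python
import numpy as np

def delay_embed(s, m, tau):
    """Rows are delay vectors (s_n, s_{n-tau}, ..., s_{n-(m-1)tau})."""
    s = np.asarray(s, dtype=float)
    n_vec = len(s) - (m - 1) * tau
    return np.column_stack([s[(m - 1 - k) * tau:(m - 1 - k) * tau + n_vec]
                            for k in range(m)])

def first_zero_crossing(s, max_lag=500):
    """Heuristic delay: first lag at which the autocorrelation crosses zero."""
    s = s - s.mean()
    for lag in range(1, max_lag):
        if np.mean(s[lag:] * s[:-lag]) <= 0:
            return lag
    return max_lag

# Usage sketch on a synthetic observable
t = np.linspace(0, 100, 10000)
x = np.sin(t) + 0.5 * np.sin(2.3 * t)
tau = first_zero_crossing(x)
vectors = delay_embed(x, m=3, tau=tau)
print(vectors.shape)            # (number of reconstructed states, 3)
```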

21 Phase space Application: nonlinear predictor
Depending on the type of signal, the power of predictability and the best strategy change:
- A signal that does not change is easy to predict: take the last observation as a forecast for the next one.
- A periodic signal is also easy once it has been observed for a full cycle.
- For independent random numbers the best prediction is the mean value.
- Interesting signals are not periodic but contain some kind of structure which can be exploited to obtain better predictions.
If the source of predictability is linear correlations in time, the next observations will be given approximately by a linear combination of the preceding observations. What if I know that my series is nonlinear?

22 Phase space Application: nonlinear predictor
For nonlinear deterministic systems all future states are unambiguously determined by specifying the present state. Nonlinear correlations can be exploited with new techniques.
Lorenz's method of analogues: to predict the future state x_{N+1}, look for recorded states (neighbors) close to the current one and predict the average of the next values that followed those past neighbors. This gives better predictions on short time scales than linear predictors.
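A rough sketch of the method of analogues in a delay-embedded space; the embedding parameters, the number of neighbors k, and the noisy-sine test signal are assumptions made for illustration:

```python
import numpy as np

def analogue_forecast(s, m, tau, k=10):
    """Method of analogues: embed the series, find the k recorded states closest
    to the current one, and predict the average of the values that followed them."""
    s = np.asarray(s, dtype=float)
    offset = (m - 1) * tau
    n_vec = len(s) - offset
    vectors = np.column_stack([s[offset - j * tau:offset - j * tau + n_vec]
                               for j in range(m)])
    current, past = vectors[-1], vectors[:-1]
    dists = np.linalg.norm(past - current, axis=1)
    neighbors = np.argsort(dists)[:k]        # indices of the closest past states
    successors = s[offset + neighbors + 1]   # the value that followed each neighbor
    return successors.mean()

# Toy usage: forecast the next value of a noisy sine wave
rng = np.random.default_rng(5)
t = np.arange(0, 200, 0.05)
s = np.sin(t) + 0.05 * rng.standard_normal(t.size)
print("forecast:        ", analogue_forecast(s, m=3, tau=30))
print("true next value: ", np.sin(t[-1] + 0.05))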

23 OUTLINE
- Introduction to nonlinear systems
- Phase space methods
→ Exponents and dimensions
   - Sensitivity to initial conditions: Lyapunov exponents
   - Self-similarity: correlation dimension
- Interdependence measures
- Take-home messages

24 Exponents and dimensions Sensitivity to initial conditions
The most striking feature of chaos is the long-term unpredictability of the future despite a deterministic time evolution. The cause is the inherent instability of the solutions, reflected in their sensitive dependence on initial conditions: errors are amplified because nearby trajectories separate exponentially fast. How fast they separate is measured by the maximal Lyapunov exponent λ.

Type of motion       Maximal Lyapunov exponent
stable fixed point   λ < 0
limit cycle          λ = 0
chaos                0 < λ < ∞
noise                λ = ∞

The inverse of the maximal Lyapunov exponent defines the time beyond which predictability is impossible.

25 Exponents and dimensions Sensitivity to initial conditions
Maximal Lyapunov exponent from a time series:
- Delay embedding of the series.
- Compute the average divergence rate S(Δn): for each reference point, average the logarithm of the distance between its trajectory and those of its neighbors after Δn steps.
- The slope of S(Δn) over its linear range is an estimate of the maximal Lyapunov exponent.
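A rough, Kantz-style sketch of this divergence curve, assuming a scalar series and arbitrary default choices for the embedding, neighborhood size and Theiler window; a production implementation (e.g. TISEAN's) handles these choices far more carefully:

```python
import numpy as np

def lyapunov_curve(s, m=3, tau=10, horizon=30, eps=None, theiler=100):
    """S(dn): for each reference state, find neighbors in the embedded space,
    follow the scalar series dn steps ahead and average the log distance.
    The slope of the initial linear part of S(dn) estimates the maximal
    Lyapunov exponent (per sampling step)."""
    s = np.asarray(s, dtype=float)
    offset = (m - 1) * tau
    n_vec = len(s) - offset - horizon
    vectors = np.column_stack([s[offset - k * tau:offset - k * tau + n_vec]
                               for k in range(m)])
    if eps is None:
        eps = 0.1 * s.std()
    S = np.zeros(horizon)
    counts = np.zeros(horizon)
    for i in range(n_vec):
        d0 = np.linalg.norm(vectors - vectors[i], axis=1)
        mask = (d0 < eps) & (np.abs(np.arange(n_vec) - i) > theiler)
        nbrs = np.where(mask)[0]
        if nbrs.size == 0:
            continue
        for dn in range(1, horizon + 1):
            avg = np.abs(s[offset + nbrs + dn] - s[offset + i + dn]).mean()
            if avg > 0:
                S[dn - 1] += np.log(avg)
                counts[dn - 1] += 1
    return S / np.maximum(counts, 1)
```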

26 Exponents and dimensions Correlation dimension
Strange attractors with fractal dimension are typical of chaotic systems. Non-integer dimensions are assigned to geometrical objects which exhibit self-similarity and structure on all length scales.
Box-counting dimension: D_0 = lim_{eps → 0} log N(eps) / log(1/eps), where N(eps) is the number of boxes of size eps needed to cover the set.
For time series:
- Delay embedding of the series.
- Compute the correlation sum C(eps), the fraction of pairs of reconstructed states closer than eps; the correlation dimension D_2 is the slope of log C(eps) versus log eps in the scaling region.
The Kaplan-Yorke conjecture relates the dimension D to the Lyapunov spectrum.
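A minimal sketch of the correlation sum for a set of delay vectors; the Theiler window and the commented D_2 fit over an assumed scaling range are illustrative choices, not prescriptions from the slides:

```python
import numpy as np

def correlation_sum(vectors, eps, theiler=10):
    """C(eps): fraction of pairs of reconstructed states closer than eps,
    skipping temporally close pairs (Theiler window)."""
    n = len(vectors)
    count, total = 0, 0
    for i in range(n):
        d = np.linalg.norm(vectors[i + theiler:] - vectors[i], axis=1)
        count += np.sum(d < eps)
        total += len(d)
    return count / total

# Sketch of a D2 estimate: slope of log C(eps) vs log eps over the scaling region
# (choose the eps range within the spread of pairwise distances so C(eps) > 0):
# epsilons = np.logspace(-2, 0, 10)
# C = [correlation_sum(vectors, e) for e in epsilons]
# D2 = np.polyfit(np.log(epsilons), np.log(C), 1)[0]
```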

27 Exponents and dimensions Applications
Nonlinear statistics such as exponents, dimensions, prediction errors, etc., can be computed to characterize non-trivial differences between signals (EEG) from different stages (brain states: sleep/rest, eyes open/closed).
Word of caution: such quantities should only be used to compare data from similar situations! Ex: ECG series taken during exercise are noisier due to sweat on the patient's skin. The different noise levels at rest and during exercise can affect the above nonlinear estimators and lead one to erroneously conclude a higher complexity of the heart during exercise just because of sweat on the skin.

28 OUTLINE
- Introduction to nonlinear systems
- Phase space methods
- Exponents and dimensions
→ Interdependence measures
   - Synchronization
- Take-home messages

29 Interdependence measures Synchronization
Synchronization is the dynamical process by which two or more oscillators adjust their rhythms due to their weak interaction. It is a universal phenomenon found everywhere:
- Mechanical systems (pendula, London's Millennium Bridge, ...)
- Electrical generators (power grids, Josephson junctions, ...)
- Life sciences (biological clocks, firing neurons, pacemaker cells, ...)
- Chemical reactions (Belousov-Zhabotinsky), ...
Synchronization refers to the way in which coupled elements, due to their dynamics, communicate and exhibit collective behavior. In large populations of oscillators synchronization can be understood as a self-organization process: without a master or leader, the individuals spontaneously tend to oscillate in synchrony.
Neural synchronization is one of the most promising mechanisms to underlie the flexible formation of cell assemblies and thus bind the information processed in different areas.

30 Interdependence measures Synchronization hallmarks
Frequency locking: before coupling the natural frequencies differ, Δf = f_2 - f_1 ≠ 0; after coupling the observed frequencies adjust, ΔF = F_2 - F_1 = 0. Higher-order locking: F_2/F_1 = q/p.
Phase locking: the phase shift Δφ is fixed (bounded), e.g. Δφ ≈ 0 for in-phase and Δφ ≈ π for anti-phase locking.
The phase can be extracted from data by several techniques: Hilbert transform, wavelet transform, Poincaré map.
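A minimal sketch of phase extraction with the Hilbert transform and of a phase-locking index; the two noisy 8 Hz oscillators, the fixed phase shift and the independent 8.7 Hz signal are synthetic examples invented for illustration:

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Extract instantaneous phases via the Hilbert transform and measure how
    constant their difference is: 1 = perfect phase locking, ~0 = no locking."""
    phi_x = np.angle(hilbert(x))
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

rng = np.random.default_rng(6)
t = np.linspace(0, 10, 5000)
common = 2 * np.pi * 8 * t
locked_a = np.sin(common) + 0.3 * rng.standard_normal(t.size)
locked_b = np.sin(common + 1.0) + 0.3 * rng.standard_normal(t.size)   # fixed phase shift
independent = np.sin(2 * np.pi * 8.7 * t) + 0.3 * rng.standard_normal(t.size)

print("locked pair:     ", phase_locking_value(locked_a, locked_b))     # high (near 1)
print("independent pair:", phase_locking_value(locked_a, independent))  # clearly lower
```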

31 Interdependence measures Synchronization solutions
Different types of synchronization capture different relationships between the signals x_1(t) and x_2(t) of two interacting systems:
- Classical: adjustment of rhythms in periodic oscillators.
- Identical: coincidence of the outputs due to their coupling, x_1(t) = x_2(t).
- Generalized: captures a more general relationship like x_1(t) = F(x_2(t)).
- Phase: the regime where the phase difference between two irregular oscillators is bounded but their amplitudes are uncorrelated.
- Lag: accounts for the relation between two systems when compared at different times, such as x_1(t) = x_2(t - τ).
- Noise-induced: synchronization induced by a common noise source.
Interdependence can also be quantified with information theory: mutual information, entropies, ...

32 OUTLINE
- Introduction to nonlinear systems
- Phase space methods
- Exponents and dimensions
- Interdependence measures
→ Take-home messages

33 Take-home messages Linear vs nonlinear
- Linear techniques are much better understood and more rigorous.
- Linear and nonlinear estimates may assess different characteristics of the signals.
- They are complementary approaches to the analysis of time series.
Even though most systems in Nature are nonlinear, do not underestimate linear methods.

34 Take-home messages Phase space methods
- Attractor reconstruction is a powerful technique to recover the topological structure of an attractor from a scalar time series.
- Useful complexity quantifiers of the signal can be computed after the reconstruction.
- A word of caution applies to their use: compare only data from similar situations.

35 Take-home messages Do it yourself... with a little help
- TISEAN is a very complete software package for nonlinear time series analysis.
- "Nonlinear Time Series Analysis" by Holger Kantz and Thomas Schreiber, Cambridge University Press.

