Random processes

Matlab
What is a random process?

A random process
Is defined by its finite-dimensional distributions
–The probability of events at a finite number of time points
The finite-dimensional distributions have to be 'consistent'
–Integrating over one time point gives the finite-dimensional distribution for the other time points
Given a consistent family of finite-dimensional distributions on 'good enough' spaces, there is a unique process with those distributions (Kolmogorov)
–'Good enough' means Borel
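
In density notation (not used on the slide, but a standard way to write the consistency requirement): marginalizing the n-point density over one time point must reproduce the (n-1)-point density,

\[
\int p_{t_1,\dots,t_n}(x_1,\dots,x_n)\,dx_n \;=\; p_{t_1,\dots,t_{n-1}}(x_1,\dots,x_{n-1}).
\]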

Stationarity and ergodicity
How to measure the resting membrane potential of a neuron?

Stationarity and ergodicity
I arrive at the lab this morning, prepare a neuron for recording, and measure its membrane potential at 10 am sharp. The value is … mV. Is this the resting potential of the neuron?

Stationarity and ergodicity
The measurement is noisy
We want a number of repeats of the same measurement
How do we get repeated measurements?

Stationarity and ergodicity
Repeated measurement:
–I arrive at the lab a second time this morning, prepare a neuron for recording, and measure its membrane potential at 10 am sharp. The value is … mV.
What is the problem?

Stationarity and ergodicity
Repeated measurement 1:
–I arrive at the lab this morning 600 times, prepare a neuron for recording, and measure its membrane potential at 10 am sharp.
Repeated measurement 2:
–I measure the membrane potential of the same neuron as before once a second from 10:00 to 10:10 (I get 600 measurements)

Go to Matlab
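
A minimal Matlab sketch of the two kinds of repeats, assuming a made-up stationary model of the membrane potential (an AR(1) fluctuation around -70 mV; the model, parameter values, and variable names are illustrative, not the ones used in class):

% Illustrative AR(1) model of a noisy membrane potential (assumed, not the class model)
nTrials = 600;          % 'repeated measurement 1': 600 independent preparations
nSteps  = 600;          % 'repeated measurement 2': 600 samples, one per second
vRest   = -70;          % assumed resting potential, mV
a       = 0.95;         % AR(1) coefficient (sets the correlation time)
sigma   = 1;            % noise standard deviation, mV

V = zeros(nTrials, nSteps);
for trial = 1:nTrials
    v = vRest + sigma*randn/sqrt(1 - a^2);        % start in the stationary distribution
    for t = 1:nSteps
        v = vRest + a*(v - vRest) + sigma*randn;  % AR(1) update around vRest
        V(trial, t) = v;
    end
end

ensembleAverage = mean(V(:, end));   % average over trials at one time point
timeAverage     = mean(V(1, :));     % average over time for a single trial
disp([ensembleAverage, timeAverage]) % for an ergodic process these should be close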

Theoretically,
Repeated measurement 1:
–I arrive at the lab this morning 600 times, prepare a neuron for recording, and measure its membrane potential at 10 am sharp.
Repeated measurement 2:
–I measure the membrane potential of the same neuron as before once a second from 10:00 to 10:10 (I get 600 measurements)

Practically,
Repeated measurement 1:
–I arrive at the lab this morning 600 times, prepare a neuron for recording, and measure its membrane potential at 10 am sharp.
Repeated measurement 2:
–I measure the membrane potential of the same neuron as before once a second from 10:00 to 10:10 (I get 600 measurements)

What to do?

Ergodicity
For an ergodic process,
–averaging across many repeated trials (repeated measurement 1)
–and averaging across time for a single trial (repeated measurement 2)
–are equal
An ergodic process is always stationary; the reverse may not be true
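
In symbols, for the mean of the process (a standard statement, not taken from the slides): the time average of a single realization converges to the ensemble average,

\[
\lim_{T\to\infty}\frac{1}{T}\int_{0}^{T} x(t)\,dt \;=\; E\bigl[X(t)\bigr].
\]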

What makes a stationary process ergodic?
Asymptotic independence
–Samples that are far enough apart in time are independent
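
A standard sufficient condition for ergodicity of the mean, stated here for reference (not taken from the slides): the autocovariance of the stationary process decays to zero at long delays,

\[
C(\tau) = \mathrm{Cov}\bigl(X(t),\,X(t+\tau)\bigr) \;\to\; 0 \quad \text{as } \tau \to \infty .
\]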

Correlation, independence, Gaussian and non-Gaussian processes

Independence vs. lack of correlation
Two variables are independent if knowing anything about one of them does not allow you to make any deductions about the other that you could not already make
Two variables are uncorrelated if their covariance is 0
Independence implies lack of correlation
Lack of correlation in general does not imply independence

Go to Matlab
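
A minimal Matlab illustration of uncorrelated but dependent variables (a standard example, not necessarily the demo used in class): x is standard normal and y = x.^2; their covariance is essentially zero, yet y is completely determined by x.

% Uncorrelated but dependent: y is a deterministic function of x
n = 1e5;
x = randn(n, 1);
y = x.^2;

c = cov(x, y);              % 2x2 sample covariance matrix
disp(c(1, 2))               % off-diagonal entry is close to 0 (E[x^3] = 0)

% Yet knowing x tells you y exactly, so they are clearly not independent
plot(x(1:2000), y(1:2000), '.')
xlabel('x'), ylabel('y = x^2')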

Independence vs. lack of correlation
For variables that are jointly Gaussian, lack of correlation implies independence
What are jointly Gaussian variables?

Jointly Gaussian variables
The distribution of each by itself is Gaussian
The joint distribution of each pair is Gaussian
The joint distribution of each triplet is Gaussian
… (allowing for degeneracy)

Go to Matlab
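
A minimal Matlab sketch of why 'each variable Gaussian' is not enough (a standard counterexample, not necessarily the demo used in class): x is standard normal and y = s.*x with a random sign s. Both x and y are Gaussian and they are uncorrelated, but they are not independent, and the pair is not jointly Gaussian (x + y equals 0 half the time).

% Marginally Gaussian, uncorrelated, but not jointly Gaussian
n = 1e5;
x = randn(n, 1);
s = sign(randn(n, 1));   % random +/-1 signs, independent of x
y = s .* x;              % y is also standard normal

c = cov(x, y);
disp(c(1, 2))            % close to 0: uncorrelated

% x + y is not Gaussian: it equals 0 whenever s = -1
histogram(x + y, 100)    % a spike at 0 plus a Gaussian-shaped part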

Jointly Gaussian variables
Because of the issue of degeneracy, the formal definition is indirect
For example: random variables are jointly Gaussian if all their linear combinations are Gaussian (allowing the degenerate case of an identically 0 variable)
Or the definition can be given using characteristic functions
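
The characteristic-function version, for reference (a standard definition, with the usual notation rather than notation from the slides): a random vector X in R^n with mean vector μ and covariance matrix Σ is jointly Gaussian if and only if

\[
E\!\left[e^{\,i\,t^{\mathsf T} X}\right]
\;=\;
\exp\!\left(i\,t^{\mathsf T}\mu - \tfrac{1}{2}\,t^{\mathsf T}\Sigma\,t\right)
\qquad \text{for all } t \in \mathbb{R}^n .
\]

Note that this covers the degenerate case, since Σ is allowed to be singular.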

Characterizing jointly Gaussian variables
A 1-d Gaussian variable is fully characterized by its mean and variance
–These determine its probability density function and therefore all other quantifiers
An n-d Gaussian variable is fully characterized by the mean of each component and their covariances
–These determine the joint probability density and therefore all other quantifiers
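
For reference, the densities in question (standard formulas, with the usual notation): in 1-d with mean μ and variance σ², and in n-d with mean vector μ and non-degenerate covariance matrix Σ,

\[
f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),
\qquad
f(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}}\,\exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{\mathsf T}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right).
\]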

Gaussian process
A random process is Gaussian if all finite-dimensional distributions are jointly Gaussian
A Gaussian process is determined by specifying the mean at each moment in time and a matrix of covariances between the values at different moments in time
–All finite-dimensional distributions are Gaussian, and are therefore determined by the above data
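
A minimal Matlab sketch of this, for the process observed at a finite set of times, using a made-up mean function and covariance function (a squared exponential; all names and parameter values are illustrative):

% Sample a Gaussian process at discrete times from its mean and covariance
t   = (0:0.01:1)';                      % observation times (illustrative)
m   = zeros(size(t));                   % assumed mean function: zero
ell = 0.1;                              % assumed correlation length
C   = exp(-(t - t').^2 / (2*ell^2));    % assumed covariance matrix (uses implicit expansion, R2016b+)

L = chol(C + 1e-10*eye(numel(t)), 'lower');  % small jitter for numerical stability
x = m + L*randn(numel(t), 1);                % one sample path
plot(t, x)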

Stationary Gaussian processes
If the process is in addition stationary
–the mean and variance are constant as a function of time
–the 2-d distributions do not depend on the absolute time
In that case, the covariance matrix is constant along the diagonals
–'Toeplitz matrices'
The covariance is specified by a function of the delay between samples
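
Continuing the sketch above with a stationary example (again an assumed, illustrative autocovariance, here that of an AR(1) process): the whole covariance matrix is built from a single function of the delay using toeplitz.

% Stationary case: the covariance matrix is Toeplitz, built from the autocovariance
nSamples = 200;
a      = 0.95;                      % assumed AR(1) coefficient
sigma2 = 1;                         % assumed variance of the process
lags   = 0:nSamples-1;
c      = sigma2 * a.^lags;          % autocovariance as a function of delay

C = toeplitz(c);                    % constant along the diagonals
L = chol(C + 1e-10*eye(nSamples), 'lower');
x = L * randn(nSamples, 1);         % one sample of the stationary Gaussian process
plot(x)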

Stationary Gaussian processes
The autocovariance function is also called
–autocorrelation function
–covariance function
–correlation function
–…
Make sure you know the normalization (what is the value of the function at 0? For the autocovariance it is the variance; for a correlation function normalized by the variance it is 1)
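
A minimal Matlab check of the normalization question, computed by hand so it does not depend on any toolbox (replace x with any long stationary record, e.g. a sample from the sketch above):

% Autocovariance at a few lags, computed directly
x    = randn(1e5, 1);          % placeholder record; substitute your own data
m    = mean(x);
v    = var(x);
lags = 0:5;
c    = zeros(size(lags));
for k = lags
    c(k+1) = mean((x(1:end-k) - m) .* (x(1+k:end) - m));
end

disp([c(1), v])                % at lag 0 the autocovariance is (essentially) the variance
disp(c / c(1))                 % normalized version equals 1 at lag 0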