SYSTEMS Identification


SYSTEMS Identification. Ali Karimpour, Assistant Professor, Ferdowsi University of Mashhad. Reference: "System Identification: Theory for the User," Lennart Ljung (1999).

Lecture 2: Introduction. Topics to be covered include: impulse responses and transfer functions; frequency-domain expressions; stochastic processes; signal spectra; disturbances; ergodicity.

Impulse responses. It is well known that a linear, time-invariant, causal system can be described by the convolution of its input with its impulse response. Sampling: most often, the input signal u(t) is kept constant between the sampling instants, so the sampled output is given by a discrete-time convolution (see the sketch below).
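A hedged reconstruction of the relations the slide refers to, following the standard treatment in Ljung (1999); g(τ) is the impulse response, T the sampling interval, and u_k the input value held over the k-th interval:

y(t) = \int_{0}^{\infty} g(\tau)\, u(t-\tau)\, d\tau
u(t) = u_k, \qquad kT \le t < (k+1)T
y(kT) = \sum_{l=1}^{\infty} g_T(l)\, u_{(k-l)T}, \qquad g_T(l) = \int_{(l-1)T}^{lT} g(\tau)\, d\tau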

Impulse responses. For ease of notation, assume that T is one time unit and use t to enumerate the sampling instants.
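With this convention the sampled system takes the familiar discrete-time form (a sketch consistent with the notation used later in the lecture; v(t) denotes the additive disturbance introduced below):

y(t) = \sum_{k=1}^{\infty} g(k)\, u(t-k) + v(t), \qquad t = 0, 1, 2, \dots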

Transfer functions. Define the forward and backward shift operators q and q^{-1}. Now we can write the output in operator form: G(q) is the transfer operator or transfer function. Similarly, for the disturbance we have the filter H(q), so the basic description of a linear system with additive disturbance follows (see the sketch below).
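The operator relations the slide refers to, written out (a reconstruction in Ljung's notation; e(t) is the white-noise source used in the following slides):

q\,u(t) = u(t+1), \qquad q^{-1}u(t) = u(t-1)
y(t) = \sum_{k=1}^{\infty} g(k)\, u(t-k) = \Big(\sum_{k=1}^{\infty} g(k)\, q^{-k}\Big) u(t) = G(q)\, u(t)
v(t) = H(q)\, e(t)
y(t) = G(q)\, u(t) + H(q)\, e(t)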

Transfer functions. Some terminology: G(q) is the transfer operator or transfer function. We shall say that the transfer function G(q) is stable if its impulse-response coefficients are absolutely summable; this means that G(z) is analytic on and outside the unit circle. We shall say the filter H(q) is monic if h(0) = 1.
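In symbols (a restatement of the two definitions; g(k) and h(k) are the expansion coefficients introduced above):

G(q) stable:  \sum_{k=1}^{\infty} |g(k)| < \infty
H(q) monic:  H(q) = 1 + \sum_{k=1}^{\infty} h(k)\, q^{-k}, \quad \text{i.e. } h(0) = 1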

Frequency-domain expressions. Let the input be a sinusoid; it will be convenient to write it as the real part of a complex exponential. Now we can write the output in the same form, and so we obtain the frequency response of the system.

Frequency-domain expressions. Now suppose the input is a pure cosine. For a stable system the output settles to a scaled and phase-shifted cosine at the same frequency (see the sketch below).
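The steady-state sinusoidal-response result these two slides build toward, stated as a sketch (standard for a stable G; the transient term decays because of stability):

u(t) = \cos(\omega t) \;\Rightarrow\; y(t) = |G(e^{i\omega})|\cos(\omega t + \varphi) + v(t) + \text{transient}, \qquad \varphi = \arg G(e^{i\omega})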

Periodograms of signals over finite intervals. Fourier transform (FT) and discrete Fourier transform (DFT). Exercise 1: Show that u(t) can be recovered by inserting U_N(ω) into the inversion formula.
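The finite-interval transform referred to here, in the normalization used by Ljung (1999) (a reconstruction; the inversion formula is what Exercise 1 asks you to verify):

U_N(\omega) = \frac{1}{\sqrt{N}} \sum_{t=1}^{N} u(t)\, e^{-i\omega t}
u(t) = \frac{1}{\sqrt{N}} \sum_{k=1}^{N} U_N\Big(\frac{2\pi k}{N}\Big)\, e^{i 2\pi k t / N}, \qquad t = 1, \dots, N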

Periodograms of signals over finite intervals. Some properties of U_N(ω): the function U_N(ω) is periodic and therefore uniquely defined by its values over the interval [0, 2π]; it is, however, customary to consider U_N(ω) over the interval [-π, π]. So u(t) can be reconstructed from these values. The number |U_N(ω)|² tells us the weight that the frequency ω carries in the decomposition, and it is known as the periodogram of the signal u(t), t = 1, 2, 3, .... Parseval's relationship connects the signal energy to the sum of the periodogram values.
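In symbols (a sketch of the properties just listed; u is assumed real-valued):

U_N(\omega + 2\pi) = U_N(\omega), \qquad U_N(-\omega) = \overline{U_N(\omega)}
Periodogram:  |U_N(\omega)|^2
Parseval:  \sum_{t=1}^{N} u^2(t) = \sum_{k=1}^{N} \Big|U_N\Big(\frac{2\pi k}{N}\Big)\Big|^2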

Periodograms of signals over finite intervals. Example: periodogram of a sinusoid (a numerical sketch follows).
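A small numerical sketch of this example (not from the slides): it computes U_N(ω) for a cosine whose frequency lies on the DFT grid and checks Parseval's relationship; the 1/√N scaling matches the definition above.

import numpy as np

N = 256
t = np.arange(1, N + 1)
omega0 = 2 * np.pi * 10 / N                 # a frequency that lies on the DFT grid
u = np.cos(omega0 * t)

# U_N(2*pi*k/N) = (1/sqrt(N)) * sum_t u(t) * exp(-i*2*pi*k*t/N), for k = 1..N
k = np.arange(1, N + 1)
U = np.exp(-1j * 2 * np.pi * np.outer(k, t) / N) @ u / np.sqrt(N)
periodogram = np.abs(U) ** 2

# Parseval: the signal energy equals the sum of the periodogram over the grid
print(np.allclose(u @ u, periodogram.sum()))        # expect True
# The periodogram concentrates its mass at +/- omega0, i.e. k = 10 and k = N - 10
print(np.sort(periodogram.argsort()[-2:] + 1))      # expect [ 10 246]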

Periodograms of signals over finite intervals. The periodogram defines, in a sense, the frequency content of a signal over a finite time interval. But we seek a definition of a similar concept for signals over the interval [1, ∞). Simply letting N → ∞ does not work, because these limits fail to exist for many signals of practical interest.

Transformation of Periodograms. As a signal is filtered through a linear system, its periodogram changes. Let s(t) be the output of a stable linear system driven by w(t); the claim below relates their finite-interval transforms.
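A hedged restatement of the claim (it corresponds to the standard periodogram-transformation result in Ljung (1999); the exact constant in the remainder bound is not reproduced here):

s(t) = G(q)\, w(t), \quad G \text{ stable}, \quad |w(t)| \le C_w
S_N(\omega) = G(e^{i\omega})\, W_N(\omega) + R_N(\omega), \qquad |R_N(\omega)| \le \frac{C}{\sqrt{N}}

where C depends only on G and the bound C_w.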

Transformation of Periodograms: proof of the claim (the algebraic steps were given as equations on the original slides and are not reproduced here).

Stochastic Processes. A random variable (RV) is a rule (or function) that assigns a real number to every outcome of a random experiment. (Example shown on the slide: the closing price of the Iranian power market observed from Apr. 15 to Sep. 22, 2009.) An RV may be scalar or vector valued and is characterized by its probability density function (PDF) f_e. If e may assume a certain value with nonzero probability, then f_e contains a δ function. Two random variables e1 and e2 are independent if their joint PDF factors into the product of the marginals. Definition: the expectation E[e] of a random variable e is the mean of its distribution. Definition: the variance, Cov[e], of a random variable e is the expected squared deviation from its mean (see the sketch below).
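The definitions written out (a standard reconstruction; f_{e1 e2} denotes the joint PDF):

Independence:  f_{e_1 e_2}(x_1, x_2) = f_{e_1}(x_1)\, f_{e_2}(x_2)
E[e] = \int_{-\infty}^{\infty} x\, f_e(x)\, dx
\mathrm{Cov}[e] = E\big[(e - E[e])(e - E[e])^T\big]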

Stochastic Processes. A stochastic process is a rule (or function) that assigns a time function to every outcome of a random experiment. Consider the random experiment of tossing a die at t = 0 and observing the number on the top face. The sample space of this experiment consists of the outcomes {1, 2, 3, ..., 6}. For each outcome of the experiment, let us arbitrarily assign a function of time t. The set of functions {x_1(t), x_2(t), ..., x_6(t)} represents a stochastic process.

Stochastic Processes. The mean of a random process X(t) is m_X(t); in general, m_X(t) is a function of time. The correlation R_X(t1, t2) of a random process X(t) is a function of t1 and t2. The autocovariance C_X(t1, t2) of a random process X(t) is defined as the covariance of X(t1) and X(t2); in particular, when t1 = t2 = t, it reduces to the variance of X(t) (see the sketch below).
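The corresponding formulas (a standard reconstruction of the missing equations):

m_X(t) = E[X(t)]
R_X(t_1, t_2) = E[X(t_1)\, X(t_2)]
C_X(t_1, t_2) = E\big[(X(t_1) - m_X(t_1))(X(t_2) - m_X(t_2))\big] = R_X(t_1, t_2) - m_X(t_1)\, m_X(t_2)
C_X(t, t) = \mathrm{Var}[X(t)]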

Stochastic Processes. Example: sinusoid with random amplitude.

Stochastic Processes. Example: sinusoid with random phase.

Stochastic Processes. x(t) is (wide-sense) stationary if its mean and autocovariance do not depend on absolute time. Example, sinusoid with random phase: clearly x(t) is stationary (WSS). Example, sinusoid with random amplitude: clearly x(t) is not stationary. This may be a limiting definition (see the sketch below).
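A hedged reconstruction of the two examples, assuming the usual setups (Θ uniform on [0, 2π]; A a random amplitude with nonzero mean):

Random phase:  x(t) = A\cos(\omega t + \Theta), A fixed  \Rightarrow  m_x(t) = 0,\; C_x(t_1, t_2) = \tfrac{A^2}{2}\cos\big(\omega(t_1 - t_2)\big)  \Rightarrow  WSS
Random amplitude:  x(t) = A\cos(\omega t)  \Rightarrow  m_x(t) = E[A]\cos(\omega t) depends on t  \Rightarrow  not stationary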

Signal Spectra: A Common Framework for Deterministic and Stochastic Signals. In general, y(t) is not a stationary process, so requiring stationarity may be a limiting definition. To deal with this problem, we introduce the following definition: quasi-stationary signals.

Quasi-stationary signals: a signal {s(t)} is said to be quasi-stationary if its mean and covariances are uniformly bounded and the time-averaged covariances converge (the formal conditions are given on the next slide). If {s(t)} is a deterministic sequence, this means that {s(t)} is a bounded sequence whose sample covariances converge. If {s(t)} is a stationary stochastic process, it is quasi-stationary, since its mean and covariance function do not depend on t.

Signal Spectra. Notation: Ē[·] denotes expectation averaged over time; the notation means that the indicated limit exists. Quasi-stationary signals: a signal {s(t)} is said to be quasi-stationary if its mean and covariances are bounded and the limit R̄_s(τ) below exists. Sometimes, with some abuse of notation, we call R̄_s(τ) the covariance function of s. Exercise 2: Show that sometimes it is exactly the covariance function of s.
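The definition written out in Ljung's Ē-notation (a reconstruction of the conditions the two preceding slides refer to):

\bar{E}[f(t)] := \lim_{N\to\infty} \frac{1}{N}\sum_{t=1}^{N} E[f(t)]
\{s(t)\} quasi-stationary if:  |E[s(t)]| \le C,\quad |E[s(t)\, s(r)]| \le C \;\; \forall t, r, and
\bar{R}_s(\tau) = \bar{E}[s(t)\, s(t-\tau)] = \lim_{N\to\infty} \frac{1}{N}\sum_{t=1}^{N} E[s(t)\, s(t-\tau)]  exists for every \tau.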

Signal Spectra. Two signals {s(t)} and {w(t)} are jointly quasi-stationary if (1) they both are quasi-stationary and (2) the cross-covariance function exists. They are called uncorrelated if the cross-covariance function is identically zero.
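The cross-covariance function referred to, in the same Ē-notation (a reconstruction):

\bar{R}_{sw}(\tau) = \bar{E}[s(t)\, w(t-\tau)]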

Signal Spectra. The periodogram defines, in a sense, the frequency content of a signal over a finite time interval, but we seek a definition of a similar concept for signals over the interval [1, ∞). These limits fail to exist for many signals of practical interest, so we shall develop a framework for describing signals and their spectra that is applicable to deterministic as well as stochastic signals.

Signal Spectra. Use the Fourier transform of the covariance function (the spectrum, or spectral density). We define the (power) spectrum of {s(t)}, and the cross spectrum between {s(t)} and {w(t)}, whenever the corresponding limits exist (see the sketch below). Exercise 3: Show that the spectrum is always a real function, but that the cross spectrum is in general a complex-valued function.
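The definitions written out (a reconstruction; they require the quasi-stationarity limits above to exist):

\Phi_s(\omega) = \sum_{\tau=-\infty}^{\infty} \bar{R}_s(\tau)\, e^{-i\tau\omega}
\Phi_{sw}(\omega) = \sum_{\tau=-\infty}^{\infty} \bar{R}_{sw}(\tau)\, e^{-i\tau\omega}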

Signal Spectra. Exercise 4, spectrum of a periodic signal: consider a deterministic, periodic signal with period M, i.e. s(t) = s(t+M). Derive its covariance function and Fourier coefficients, and finally show that its spectrum is a sum of δ functions at the harmonic frequencies.

Signal Spectra. Exercise 5, spectrum of a sinusoid: consider a sinusoidal signal and show that its spectrum consists of δ functions at the signal frequency and its negative.

Signal Spectra. Example, stationary stochastic processes: consider v(t), a stationary stochastic process generated by filtering white noise; we will assume that e(t) has zero mean and variance λ. Its covariance function and spectrum then follow from the filter coefficients. Exercise 6: Show (I).

Signal Spectra: Spectrum of Stationary Stochastic Processes. The stochastic process described by v(t) = H(q)e(t), where {e(t)} is a sequence of independent random variables with zero mean and variance λ, has the spectrum given below.
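The spectrum the slide states (the standard result; it also covers the example on the previous slide):

\Phi_v(\omega) = \lambda\, \big|H(e^{i\omega})\big|^2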

Signal Spectra: Spectrum of a Mixed Deterministic and Stochastic Signal. Let s(t) be the sum of a deterministic component and a stochastic component that is stationary with zero mean. Exercise 7: Prove the decomposition given below.
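A hedged statement of the result the exercise asks for (u deterministic and quasi-stationary, v stationary with zero mean; the cross terms vanish because E[v(t)] = 0):

s(t) = u(t) + v(t) \;\Rightarrow\; \bar{R}_s(\tau) = \bar{R}_u(\tau) + R_v(\tau), \qquad \Phi_s(\omega) = \Phi_u(\omega) + \Phi_v(\omega)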

Transformation of Spectra by Linear Systems. Theorem: let {w(t)} be quasi-stationary with spectrum Φ_w(ω), and let G(q) be a stable transfer function. Let s(t) = G(q)w(t). Then {s(t)} is also quasi-stationary, and its spectrum and cross spectrum are given below.
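The conclusion of the theorem, written out (a reconstruction of the standard result, cf. Ljung (1999)):

s(t) = G(q)\, w(t) \;\Rightarrow\; \Phi_s(\omega) = \big|G(e^{i\omega})\big|^2\, \Phi_w(\omega), \qquad \Phi_{sw}(\omega) = G(e^{i\omega})\, \Phi_w(\omega)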

Disturbances. There are always signals beyond our control that also affect the system. We assume that such effects can be lumped into an additive term v(t) at the output, so y(t) = G(q)u(t) + v(t). There are many sources and causes for such a disturbance term: measurement noise, and uncontrollable inputs (e.g., a person in a room produces about 100 W of heat).

Disturbances. Characterization of disturbances: their values are not known beforehand, but making qualified guesses about future values is possible, so it is natural to employ a probabilistic framework to describe future disturbances. We put ourselves at time t and would like to know the disturbance at t + k, k ≥ 1, so we use the approach below, where e(t) is white noise. This description does not allow a completely general characterization of all possible probabilistic disturbances, but it is versatile enough.
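The disturbance model referred to (a reconstruction of the standard form used throughout Ljung (1999), with h(0) = 1):

v(t) = \sum_{k=0}^{\infty} h(k)\, e(t-k) = H(q)\, e(t)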

Disturbances. Consider, for example, a PDF for e(t) parameterized by a probability μ. Small values of μ are suitable to describe classical disturbance patterns: steps, pulses, sinusoids and ramps. (The slide shows a realization of v(t) for the proposed e(t).) Exercise 8: Reproduce the figure for μ = 0.1 and μ = 0.9 and a suitable h(k).
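A small simulation sketch for Exercise 8 (not from the slides). It assumes e(t) is zero with probability 1 − μ and otherwise a standard normal draw, and uses a hypothetical first-order impulse response h(k) = 0.9^k; the slide's actual PDF and h(k) may differ.

import numpy as np

rng = np.random.default_rng(0)

def disturbance_realization(mu, n=200, decay=0.9):
    # Assumed e(t): zero with probability 1 - mu, otherwise a standard normal draw.
    e = rng.standard_normal(n) * (rng.random(n) < mu)
    # Hypothetical impulse response h(k) = decay**k, so v(t) = sum_k h(k) e(t - k).
    h = decay ** np.arange(n)
    return np.convolve(e, h)[:n]

v_sparse = disturbance_realization(mu=0.1)   # few, isolated pulses: step/pulse-like v(t)
v_dense = disturbance_realization(mu=0.9)    # frequent shocks: noise-like v(t)
print(v_sparse[:5])
print(v_dense[:5])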

Disturbances. On the other hand, a different PDF for e(t), parameterized by δ, produces a different character of realizations of v(t). Often we only specify the second-order properties of the sequence {e(t)}, that is, the mean and variance. Exercise 9: What is white noise? Exercise 10: Reproduce the figure for δ = 0.1 and δ = 0.9 and a suitable h(k).

Disturbances. We will assume that e(t) has zero mean and variance λ. Now we want to know the characteristics of v(t): its mean and covariance (see the sketch below). Since the mean and covariance do not depend on t, the process is said to be stationary.
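The computation the slide carries out, written as a sketch (using v(t) = Σ h(k) e(t−k) with E[e(t)] = 0 and E[e(t)e(s)] = λ for t = s, 0 otherwise):

E[v(t)] = \sum_{k=0}^{\infty} h(k)\, E[e(t-k)] = 0
R_v(\tau) = E[v(t)\, v(t-\tau)] = \lambda \sum_{k=0}^{\infty} h(k)\, h(k + |\tau|)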

Ergodicity. Suppose you want to determine which parks in a city are the most visited. One idea is to take a momentary snapshot: see how many people are in park A at this moment, how many are in park B, and so on. Another idea is to pick one individual (or a few) and follow them for a certain period of time, e.g. a year. The first approach may not be representative of a longer period of time, while the second may not be representative of all the people. The idea is that an ensemble is ergodic if the two types of statistics give the same result. Many ensembles, like human populations, are not ergodic.

Ergodicity. Let x(t) be a stochastic process. Most of our computations will depend on a given realization of a quasi-stationary process; ergodicity will allow us to make statements about repeated experiments from that single realization (see the sketch below).
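The kind of statement ergodicity provides in this setting, sketched (cf. the convergence results in Ljung (1999), Chapter 2; v(t) = H(q)e(t) with {e(t)} white noise and H stable):

\frac{1}{N}\sum_{t=1}^{N} v(t)\, v(t-\tau) \;\to\; E[v(t)\, v(t-\tau)] = R_v(\tau) \quad \text{with probability 1 as } N \to \infty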