Complex Probability and Markov Stochastic Process BIJAN BIDABAD WSEAS Post Doctorate Researcher No. 2, 12th St., Mahestan Ave., Shahrak Gharb, Tehran, IRAN BEHROUZ BIDABAD Faculty of mathematics, Polytechnics University, Hafez Ave., Tehran, IRAN NIKOS MASTORAKIS Technical University of Sofia, Department of Industrial Engineering, Sofia, 1000 BULGARIA

Abstract This paper discusses the existence of "complex probability" in sensible real-world problems. By defining a measure more general than the conventional definition of probability, the transition probability matrix of a discrete Markov chain is broken into periods shorter than a complete transition step. In this setting, complex probability is implied.

Introduction Sometimes complex numbers arise in the mathematical modeling of the real world and turn the real analysis of a problem into complex analysis. All the measures in our everyday problems belong to R, and mostly to R+. The probability of occurrence of an event always belongs to the range [0,1]. In this paper, it is shown that in solving a special class of Markov chains which must have a solution in the real world, we are confronted with "analytic probabilities". Though the name probability applies to values between zero and one, we define a special analogous measure called complex probability, of which the conventional probability is a subclass.

Issues and Resolutions Now, suppose that we intend to derive the t-step transition probability matrix P(t), where t≥0, from the above definitions (3) and (4) of the n-step transition probability matrix P; that is, to find the transition probability matrix for incomplete steps. In other words, we are interested in finding the transition matrix P(t) when t lies between two consecutive integers. This case is not just a tatonnement example. To clarify the application of this phenomenon, consider the following example. Example 1. In the population census of a society with N distinct regions, migration information is usually collected in an N×N migration matrix for a period of ten years. Denote this matrix by M. An element mij of M is the number of people who left region i and went to region j during those ten years. By dividing each mij by the sum of the i-th row of M, a value Pij is computed as an estimate of the probability of transition from region i to region j. Thus, the stochastic matrix P gives the probabilities of moving from region i to region j in ten years (the one-step transition probability matrix). The question is: how can we compute the transition probability matrix for one year (one-tenth of a step), and so on?
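The row-normalization step of Example 1 can be sketched in a few lines; the 3-region migration counts below are invented for illustration and are not taken from the paper:

```python
# Sketch of Example 1: estimating the ten-year (one-step) transition
# matrix P from a migration count matrix M by row normalization.
# The 3-region counts are hypothetical.
M = [
    [900, 60, 40],   # people who stayed in / left region 1
    [30, 950, 20],
    [50, 70, 880],
]
P = [[m / sum(row) for m in row] for row in M]
# each row of P sums to 1, so P is a stochastic (Markov) matrix
```

The open question of the section is then: given this one-step P, what matrix plays the role of P raised to the power 1/10?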

Breaking the Time in Discrete Markov Chain
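The slide's derivation is not reproduced in this transcript, but the idea of raising P to a fractional power can be sketched through the spectral decomposition of a hypothetical 2-state chain. The parameters a and b and the closed-form decomposition below are assumptions of this sketch, not formulas quoted from the paper:

```python
# Hypothetical 2-state chain P = [[1-a, a], [b, 1-b]] (a, b invented).
# Its eigenvalues are 1 and lam = 1 - a - b, and P has the spectral
# form P^t = S + lam^t * D, which extends to non-integer t.  When
# lam < 0, lam^t is complex for fractional t: a "complex probability".
a, b = 0.9, 0.8
lam = 1.0 - a - b                      # here lam = -0.7 < 0

def P_t(t):
    """t-step transition matrix via the 2x2 spectral decomposition."""
    s = a + b
    lam_t = complex(lam) ** t          # complex branch of the power
    S = [[b / s, a / s], [b / s, a / s]]       # stationary part
    D = [[a / s, -a / s], [-b / s, b / s]]     # transient part
    return [[S[i][j] + lam_t * D[i][j] for j in range(2)]
            for i in range(2)]

half = P_t(0.5)   # "half-step" matrix: entries have imaginary parts,
                  # yet every row still sums to 1
```

At t = 1 this reproduces P exactly, while at t = 0.5 the entries are genuinely complex, which is the phenomenon the paper names complex probability.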

Discussion on Broken Times The broken-time discrete Markov chain does not always yield a complex probability matrix as defined in Definition 1. The matrix Pt has different properties depending on t and on the eigenvalues of P, which may be real (positive or negative) or complex depending on the characteristic polynomial of P. Since P is a non-negative matrix, the Frobenius theorem (Takayama (1974), Nikaido (1970)) assures that P has a positive dominant eigenvalue. Furthermore, if P is also a Markov matrix, then its Frobenius root is equal to one (Bellman (1970), Takayama (1974)). With the above information, consider the following discussions.
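The Frobenius-root statement can be checked directly on a small example; the 2×2 Markov matrix below is invented for illustration:

```python
# For a 2x2 matrix the characteristic polynomial is
# x^2 - tr*x + det, so both eigenvalues come from the quadratic
# formula.  For a Markov matrix the dominant root should be 1.
def eigenvalues_2x2(P):
    """Roots of x^2 - tr(P)*x + det(P)."""
    tr = P[0][0] + P[1][1]
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    disc = (tr * tr - 4.0 * det) ** 0.5
    return ((tr + disc) / 2.0, (tr - disc) / 2.0)

P = [[0.1, 0.9],
     [0.8, 0.2]]              # rows sum to 1: a Markov matrix
lams = eigenvalues_2x2(P)     # Frobenius root 1; the other root,
                              # 1 - 0.9 - 0.8 = -0.7, is negative
```

A negative second eigenvalue like this one is exactly the case where fractional powers of P acquire imaginary parts.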

Complex Probability Justification Remark 8 above states that although there exist imaginary transition probabilities for moving from state j to state k, the total sum of the "imaginary transitions" is equal to zero. Consequently, after the t-th step transition, the total distribution has no imaginary part.
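A numerical check of this cancellation, for a hypothetical 2-state chain whose values of a and b are invented for this sketch:

```python
# In the half-step matrix of the chain [[1-a, a], [b, 1-b]] the
# imaginary parts of each row cancel, so any real initial
# distribution keeps a total probability of 1 after the broken step.
a, b, t = 0.9, 0.8, 0.5
s = a + b
lam_t = complex(1.0 - a - b) ** t          # complex: base is negative
P_half = [[b / s + lam_t * a / s, a / s - lam_t * a / s],
          [b / s - lam_t * b / s, a / s + lam_t * b / s]]
row_imag = [sum(x.imag for x in row) for row in P_half]  # both zero
pi0 = [0.3, 0.7]                           # real initial distribution
pi_t = [pi0[0] * P_half[0][j] + pi0[1] * P_half[1][j]
        for j in range(2)]
total = sum(pi_t)                          # total mass stays 1, real
```

The individual entries of P_half are complex, but the imaginary contributions along each row sum to zero, matching the remark.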

Summary By examining discrete- and continuous-time Markov stochastic processes, a class of real-world problems was introduced which cannot be solved by either of these procedures. The solutions of these problems coincide with "complex probabilities" of transitions that are inherent in the mathematical formulation of the model. Complex probability was defined, and some of its properties with respect to the cited class were examined. Justification of the idea of complex probability needs more work and is left for further research.
