Fuzzkov 1: Fuzzy Granular Synthesis through Markov Chains. Eduardo Miranda (SoCCE, University of Plymouth); Adolfo Maia Jr.(*) (NICS & IMECC, UNICAMP). (*) Supported by the São Paulo Science Research Foundation (FAPESP), Brazil.

Granular Synthesis and Analysis: A Very Short History
1) D. Gabor (1947): uncertainty principle (Heisenberg) and Fourier transforms
2) I. Xenakis (~1960s): granulation of sounds into "clouds" (tape)
3) C. Roads (1978): automated granular synthesis (computer)
4) B. Truax (1988): real-time granular synthesis (granulation)
5) More recently: R. Bencina (AudioMulch), E. Miranda (ChaosSynth), M. Norris (MagicFX), among others

Microsound: Gabor Cells
Sound is decomposed into characteristic cells on the time-frequency plane, with widths Δt (time) and Δf (frequency) obeying the time-frequency uncertainty relation Δt·Δf ≥ 1.

Fuzzy Sets (Zadeh, 1965)
Fuzzy sets model vagueness and inexact concepts through a membership function u. Let A be a subset of a universe set Ω; then u: A → [0,1], with 0 ≤ u(x) ≤ 1 for all x in A.
Ex 1) Let Ω be an arbitrary discrete set, with a membership value assigned to each element.
Ex 2) Let Ω = B(R), the ball of radius R. Denote r = |x|; then u(x) = 1/r.
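A minimal sketch of a membership function in the spirit of the ball example above; the function name is ours, and the clipping to [0,1] (so that u(x) = 1/r remains a valid membership value near the centre) is our assumption:

```python
# Illustrative membership function for the fuzzy ball example
# (names and the clipping convention are assumptions, not FuzzKov code).

def membership_ball(x, R=1.0):
    """Membership of a point at distance r = |x| from the centre,
    following the slide's u(x) = 1/r, clipped to the unit interval."""
    r = abs(x)
    if r == 0.0:
        return 1.0              # the centre belongs fully
    return min(1.0, 1.0 / r)    # farther points belong less

print(membership_ball(0.5))  # 1.0
print(membership_ball(2.0))  # 0.5
```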

The Fuzzy Grain Matrix
ω_ij = j-th frequency of the i-th grain
a_ij = j-th amplitude of the i-th grain
α_ij = membership value of the j-th Fourier partial of the i-th grain
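These three matrices suggest an additive rendering of each grain. A minimal sketch, with the caveat that the synthesis formula (scaling each partial's amplitude by its membership value) is our assumption, not taken from the FuzzKov implementation:

```python
import math

# Hypothetical grain renderer: omega, a, alpha mirror the slide's
# ω_ij, a_ij, α_ij for one grain (one row i of each matrix).

def synth_grain(omega, a, alpha, dur=0.02, fs=44100):
    """Render one grain as a sum of Fourier partials, each partial j
    weighted by its amplitude a[j] and membership value alpha[j]."""
    n = int(dur * fs)
    return [sum(alpha[j] * a[j] * math.sin(2 * math.pi * omega[j] * t / fs)
                for j in range(len(omega)))
            for t in range(n)]

grain = synth_grain(omega=[440.0, 880.0], a=[1.0, 0.5], alpha=[1.0, 0.3])
print(len(grain))  # 882 samples: a 20 ms grain at 44.1 kHz
```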

Markov Chains
1) Stochastic processes: random variables X(t) taking values in a state space S.
2) Markov process: the current state X_n depends only on the previous state X_{n-1}.
Transition matrix P = (p_ij), with p_ij = P(X_n = j | X_{n-1} = i).
Probability condition: 0 ≤ p_ij ≤ 1 and Σ_j p_ij = 1 for every state i.
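The machinery above can be sketched in a few lines; the transition matrix values below are illustrative, not the probabilities used in FuzzKov:

```python
import random

# A row-stochastic transition matrix P over 3 states (values arbitrary).
P = [[0.10, 0.60, 0.30],
     [0.40, 0.40, 0.20],
     [0.50, 0.25, 0.25]]

# Probability condition: every row of P sums to 1.
assert all(abs(sum(row) - 1.0) < 1e-9 for row in P)

def next_state(current, P, rng=random):
    """Draw X_n given X_{n-1} = current, by inverting the CDF of row P[current]."""
    u, acc = rng.random(), 0.0
    for j, p in enumerate(P[current]):
        acc += p
        if u < acc:
            return j
    return len(P[current]) - 1  # guard against rounding

random.seed(0)
chain = [0]
for _ in range(5):
    chain.append(next_state(chain[-1], P))
print(chain)  # a length-6 state sequence drawn from P
```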

Algorithm: Diagram for FuzzKov 1

Input Parameters for FuzzKov 1
The Markov chain:
N = number of states (grains)
n = number of steps of the Markov chain
v_0 = initial vector
The grain:
fs = sample frequency
dur = duration of the grain
r = number of Fourier partials
grain_type = type of grain (1-3)
Fuzzy parameters:
alpha_type = type of vector (used to generate the membership matrix)
memb_type = type of membership matrix
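Bundled as a structure, the inputs listed above might look as follows; the keys follow the slide's names, while every value is an arbitrary example of ours:

```python
# Illustrative parameter bundle for FuzzKov 1 (values are examples only).
params = {
    # Markov chain
    "N": 8,                       # number of states (grains)
    "n": 64,                      # number of Markov steps
    "v0": [1.0] + [0.0] * 7,      # initial vector (start in state 0)
    # grain
    "fs": 44100,                  # sample frequency (Hz)
    "dur": 0.03,                  # grain duration (s)
    "r": 16,                      # number of Fourier partials
    "grain_type": 1,              # type of grain (1-3)
    # fuzzy parameters
    "alpha_type": 1,              # vector type for the membership matrix
    "memb_type": 1,               # type of membership matrix
}

# Sanity checks: v0 has one entry per state and is a probability vector.
assert len(params["v0"]) == params["N"]
assert abs(sum(params["v0"]) - 1.0) < 1e-9
```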

Walshing the Output
1. Walsh functions are rectangular (±1-valued) functions.
2. They form a complete orthogonal basis for square-integrable functions on [0,1].
3. They can be represented by Hadamard matrices H(n).
4. They can be used to sequence grain streams.
H(8) (Sylvester ordering) =
 1  1  1  1  1  1  1  1
 1 -1  1 -1  1 -1  1 -1
 1  1 -1 -1  1  1 -1 -1
 1 -1 -1  1  1 -1 -1  1
 1  1  1  1 -1 -1 -1 -1
 1 -1  1 -1 -1  1 -1  1
 1  1 -1 -1 -1 -1  1  1
 1 -1 -1  1 -1  1  1 -1
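The Sylvester construction behind H(n) can be sketched recursively; using a row of H(n) as a ±1 gate over a grain stream is the sequencing idea named above (how FuzzKov applies the rows is not detailed on the slide):

```python
# Sylvester construction: H(2n) = [[H(n), H(n)], [H(n), -H(n)]].
# Rows of H(n) are Walsh functions in natural (Hadamard) ordering.

def hadamard(n):
    """Return the n x n Hadamard matrix, for n a power of two."""
    if n == 1:
        return [[1]]
    h = hadamard(n // 2)
    return ([row + row for row in h] +                  # top block [H, H]
            [row + [-x for x in row] for row in h])     # bottom [H, -H]

H8 = hadamard(8)
print(H8[1])  # [1, -1, 1, -1, 1, -1, 1, -1]
```

Distinct rows are orthogonal (their dot product is 0), which is what makes them usable as independent on/off sequencing patterns for parallel grain streams.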

Walshing Crickets (sound example)

Future Research
Asynchronous sequency modulation
Glissando effects
New transition probabilities for the Markov chain
Inclusion of fuzzy metrics
New applications of Walsh functions and Hadamard matrices