
1 Bayesian networks and their application in circuit reliability estimation Erin Taylor

2 What is a Bayesian Network? An example. We want to describe the causal relationships among the following events:
1) The season
2) Whether it is raining outside
3) Whether the sprinkler is on
4) Whether the sidewalk is wet
5) Whether the sidewalk is slippery
We can construct a graph to represent the causal links between these five events.

3 What is a Bayesian Network? [Diagram: a Bayesian network with nodes Season, Sprinkler, Rain, Wet, Slippery]
Each node represents a random variable, in this case the probability of a particular event.
Assumptions:
– "Sprinkler on" and "Rain" are determined by "Season"
– "Sidewalk wet" is determined by "Sprinkler on" and "Rain"
– "Sidewalk slippery" is determined by "Sidewalk wet"

4 Properties of Bayesian Networks
A Bayesian network is
– A Directed Acyclic Graph (DAG)
– A model of probabilistic events
In a Bayesian network
– Nodes represent random variables of interest
– Links represent causal dependencies among variables
Bayesian networks are direct representations of the world: arrows indicate real causal connections, not the flow of information as in neural networks.

5 Properties of Bayesian Networks
Links are not absolute:
– If the sprinkler is on, this does not always mean that the sidewalk is wet
– For example, the sprinkler may be aimed away from the sidewalk
[Diagram: the Season / Sprinkler / Rain / Wet / Slippery network]

6 Properties of Bayesian Networks
Given that the sidewalk is wet, we can calculate the probability that the sprinkler is on: P(sprinkler on | sidewalk wet).
Bayesian networks allow us to calculate such values from a small set of probabilities in a process called reasoning, or Bayesian inference.

7 Reasoning in Bayesian Networks
Reasoning in Bayesian networks operates by propagating information in any direction:
1) If the sprinkler is on, the sidewalk is probably wet (prediction)
2) If the sidewalk is wet, it is more likely that the sprinkler is on or it is raining (abduction)
3) If the sidewalk is wet and the sprinkler is on, the likelihood that it is raining is reduced (explaining away)
Explaining away is a special type of reasoning that is especially difficult to capture in other network models.

8 Specifying a Bayesian Network
A new example, with nodes Family out, Dog dirty, Light on, Dog out, and Hear bark:
1) When a family leaves their house, they often turn the front light on and let the dog out
2) If the dog is dirty, the family often puts him outside
3) If the dog is out, you can sometimes hear him bark

9 Specifying a Bayesian Network
To specify the probability distribution of a Bayesian network we need
– The prior probability of each root node
– The conditional probability of each nonroot node given all possible combinations of its direct predecessors
For the Family out / Dog dirty / Light on / Dog out / Hear bark network:
P(fo) = 0.15
P(dd) = 0.01
P(lo | fo) = 0.6
P(lo | ~fo) = 0.05
P(do | fo, dd) = 0.99
P(do | fo, ~dd) = 0.90
P(do | ~fo, dd) = 0.97
P(do | ~fo, ~dd) = 0.3
P(hb | do) = 0.7
P(hb | ~do) = 0.01
Total specified values: 10
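As a minimal sketch (not part of the original slides), these ten values can be written down directly in Python. The names P_fo, P_dd, P_lo, P_do, and P_hb are illustrative abbreviations from the slide; each table is keyed by its parents' truth values:

```python
# Prior probabilities of the two root nodes (values from the slide).
P_fo = 0.15                                  # P(family out)
P_dd = 0.01                                  # P(dog dirty)

# Conditional probability tables, keyed by the parents' truth values.
P_lo = {True: 0.60, False: 0.05}             # P(light on | fo)
P_do = {(True, True): 0.99, (True, False): 0.90,
        (False, True): 0.97, (False, False): 0.30}   # P(dog out | fo, dd)
P_hb = {True: 0.70, False: 0.01}             # P(hear bark | do)
```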

10 Bayesian Networks and Probability Theory
In traditional probability theory, specifying the previous example would require the joint distribution of all 5 variables: P(fo, dd, lo, do, hb).
The joint distribution of 5 Boolean variables requires 2^5 - 1, or 31, values.

11 Bayesian Networks and Probability Theory
To see where 2^5 - 1 comes from, consider the pair of Boolean variables (a, b). To specify their joint probability distribution we need the following values:
P(a b)
P(~a b)
P(a ~b)
P(~a ~b)
In the general case, this yields a total of 2^n values for a system of n Boolean variables. Since the sum over all possible outcomes must be 1, one value is determined by the others, reducing the number of independent values to 2^n - 1.

12 Bayesian Networks and Joint Probabilities
Using a Bayesian network for this example, we can reduce the number of values that need to be specified from 31 to 10:
– Joint distribution P(fo, dd, lo, do, hb): 31 values
– Bayesian network: 10 values
How is this possible?
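The two counts can be checked with a quick, illustrative snippet: each Boolean node needs one value per combination of its parents, and the num_parents mapping below simply mirrors the network's structure:

```python
# Full joint over n binary variables: 2**n - 1 free values.
n = 5
joint_params = 2 ** n - 1                    # 31

# Bayesian network: each node needs 2**(number of parents) values,
# one P(node = true | parent combination) per combination.
num_parents = {"fo": 0, "dd": 0, "lo": 1, "do": 2, "hb": 1}
bn_params = sum(2 ** k for k in num_parents.values())   # 1+1+2+4+2 = 10

print(joint_params, bn_params)               # -> 31 10
```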

13 Simplifying Joint Distributions
Bayesian networks reduce the complexity of joint distributions by introducing independence assumptions.
Conditional independence:
– If we know whether the dog is out, then the probability of hearing him bark is completely independent of all other events
– Other events only serve to indicate the probability that the dog is out
– P(hb | fo, dd, lo, do) = P(hb | do)
– Likewise, P(dd | fo, lo) = P(dd): Dog dirty is a root node, so its prior probability is unaffected by Family out or Light on (observing its descendants Dog out or Hear bark, however, would change it)

14 Simplifying Joint Distributions
From probability theory (the chain rule):
P(x1, ..., xn) = P(x1) P(x2 | x1) ... P(xn | x1, ..., xn-1)
In our example:
P(fo, dd, lo, do, hb) = P(fo) P(dd | fo) P(lo | fo, dd) P(do | fo, dd, lo) P(hb | fo, dd, lo, do)
Simplify using the network's independence assumptions:
P(dd | fo) = P(dd)
P(lo | fo, dd) = P(lo | fo)
P(do | fo, dd, lo) = P(do | fo, dd)
P(hb | fo, dd, lo, do) = P(hb | do)
Result: P(fo, dd, lo, do, hb) = P(fo) P(dd) P(lo | fo) P(do | fo, dd) P(hb | do)
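A sketch of evaluating the factored joint, reusing the probability tables written down after slide 9; the helper p and the function joint are hypothetical names, not from the slides:

```python
def p(prob_true, value):
    """Probability that a Boolean variable equals `value`,
    given P(variable = True) = prob_true."""
    return prob_true if value else 1.0 - prob_true

def joint(fo, dd, lo, do, hb):
    """P(fo, dd, lo, do, hb) via the factored form above, using
    the tables P_fo, P_dd, P_lo, P_do, P_hb defined earlier."""
    return (p(P_fo, fo) * p(P_dd, dd) * p(P_lo[fo], lo)
            * p(P_do[(fo, dd)], do) * p(P_hb[do], hb))
```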

15 Simplifying Joint Distributions
Only the five terms on the right-hand side are needed to specify the joint distribution of our example:
P(fo, dd, lo, do, hb) = P(fo) P(dd) P(lo | fo) P(do | fo, dd) P(hb | do)
– P(fo), P(dd): prior probabilities of the root nodes
– P(lo | fo), P(do | fo, dd), P(hb | do): conditional probabilities of the nonroot nodes given all combinations of their predecessors
The number of values that need to be specified in a Bayesian network grows linearly with the number of variables, whereas the full joint distribution grows exponentially.

16 Evaluating Probabilities Using BN’s
The basic computation on a Bayesian network is the computation of each node’s belief (conditional probability) given the evidence observed.
For example:
– Evidence: the dog is heard barking
– Compute: the probability that the family is out
– Compute: the probability that the light is on

17 Evaluating Probabilities Using BN’s
Solving a Bayesian network involves Bayesian inference.
Exact solution:
– Involves enumerating all possible probability combinations
– Generally NP-hard
Simple query: P(fo = true | hb = true)
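For a network this small, the query can in fact be answered exactly by brute-force enumeration; a sketch building on the joint function above (the function name p_fo_given_hb is illustrative):

```python
from itertools import product

def p_fo_given_hb():
    """P(fo = True | hb = True), computed exactly by summing the
    joint over the three unobserved variables (dd, lo, do).
    The work grows exponentially with the number of variables."""
    numer = denom = 0.0
    for dd, lo, do in product((True, False), repeat=3):
        for fo in (True, False):
            pr = joint(fo, dd, lo, do, True)   # evidence: hb = True
            denom += pr
            if fo:
                numer += pr
    return numer / denom
```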

18 Evaluating Probabilities Using BN’s
Approximate solutions:
– Logic sampling
– Markov chain Monte Carlo algorithms
– Likelihood weighting
General approach of the approximate methods:
– Select values for a subset of nodes
– Use this ‘evidence’ to pick values for the remaining nodes
– Keep statistics on all the nodes’ values

19 Logic Sampling
Logic sampling algorithm:
1) Guess values for all root nodes according to their prior probabilities. P(fo) = 0.15 -> fo = true 15% of the time
2) Work down the network, guessing a value for each lower node based on its parents' values. Previous values: fo = true and dd = false; P(do | fo, ~dd) = 0.90 -> do = true 90% of the time
3) Repeat many times for the entire network and keep track of how often each node is assigned each value
4) To determine a conditional probability such as P(fo = true | hb = true), consider only the samples where hb = true and count the fraction of them in which fo = true
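A minimal Python sketch of the algorithm, reusing the tables from slide 9 (function names are illustrative); testing random.random() < prior makes a node true with exactly its (conditional) probability:

```python
import random

def sample_network():
    """One forward pass: sample roots from their priors, then each
    child from its CPT row given the parents already sampled."""
    fo = random.random() < P_fo
    dd = random.random() < P_dd
    lo = random.random() < P_lo[fo]
    do = random.random() < P_do[(fo, dd)]
    hb = random.random() < P_hb[do]
    return fo, dd, lo, do, hb

def estimate_fo_given_hb(n_samples=100_000):
    """Logic sampling estimate of P(fo = True | hb = True): keep
    only samples with hb = True, count how often fo = True."""
    hits = kept = 0
    for _ in range(n_samples):
        fo, _dd, _lo, _do, hb = sample_network()
        if hb:
            kept += 1
            hits += fo
    return hits / kept
```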

20 Applications
Bayesian networks were popularized by the Artificial Intelligence community, which used them as a learning algorithm ("If I see a red object, what is the probability that I should stop?").
– Used to model trust in a P2P network: "Bayesian network-based trust model in P2P networks" by Wang and Vassileva
– Used to evaluate circuit reliability: "Scalable probabilistic computing models using Bayesian networks" by Rejimon and Bhanja

21 BN’s for Circuit Reliability
Circuit example:
– Inputs: Z1, Z2, Z3
– Internal signals: X1, X2
– Outputs: Y1, Y2

22 BN’s for Circuit Reliability
Goal: analyze circuit reliability in the face of dynamic errors.
Procedure:
– Construct an error-prone version of the circuit in which each gate has a probability of failure p; its internal signals are Xe1, Xe2 and its outputs are Ye1, Ye2
– Analyze this circuit in relation to the fault-free circuit

23 BN’s for Circuit Reliability
The error at the i-th output can be represented mathematically as the XOR of the error-prone and fault-free outputs:
Ei = Yei XOR Yi
P(Ei = 1) = P(Yei XOR Yi = 1)
The equivalent circuit representation runs the fault-free and error-prone circuits on the same inputs Z1, Z2, Z3 and feeds each output pair (Yei, Yi) into an XOR gate producing Ei.
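A sketch of this error model, under the stated assumption that a faulty gate inverts its output with probability p; the helper names noisy_gate and output_error are illustrative, not from the slides:

```python
import random

def noisy_gate(gate, inputs, p):
    """Evaluate a fault-free `gate` on `inputs`, then invert the
    result with probability p -- the per-gate error model above."""
    out = gate(*inputs)
    return (not out) if random.random() < p else out

def output_error(y_fault_free, y_error_prone):
    """Ei = Yi XOR Yei: 1 exactly when the two outputs disagree."""
    return y_fault_free != y_error_prone
```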

24 BN’s for Circuit Reliability
In a circuit, each gate output has a causal relationship with its inputs, so circuits can be represented as Bayesian networks.
In the Bayesian network representation of a circuit:
– Inputs are root nodes
– Outputs are leaf nodes
– Internal signals are internal nodes
– Each node’s conditional probability is determined by the gate type and the probability of error, p

25 BN’s for Circuit Reliability
[Diagram: the combined circuit (left) and its Bayesian network representation (right), with nodes Z1, Z2, Z3, X1, X2, Xe1, Xe2, Y1, Y2, Ye1, Ye2, E1, E2]

26 BN’s for Circuit Reliability
Specifying the Bayesian network requires
– Prior probabilities for all root nodes (the circuit inputs)
– Conditional probabilities for all nonroot nodes given all possible combinations of their parents
Example: Ye1 is the output of a NAND gate with inputs Z1 and Xe1. A fault-free NAND outputs 1 on inputs (0, 0), so
P(Ye1 = 1 | Z1 = 0, Xe1 = 0) = 1 - p, the probability of no gate error.
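A sketch of how such a node's conditional probability table follows from the gate type and p; noisy_nand_cpt is an illustrative name, assuming (as above) that a gate error inverts the output:

```python
def noisy_nand_cpt(p):
    """P(output = 1 | a, b) for a NAND gate that inverts its
    output with probability p."""
    cpt = {}
    for a in (0, 1):
        for b in (0, 1):
            correct = 1 - (a & b)            # fault-free NAND output
            # Output is 1 either by computing correctly (when the
            # correct value is 1) or via a gate error (when it is 0).
            cpt[(a, b)] = (1 - p) if correct else p
    return cpt

# noisy_nand_cpt(p)[(0, 0)] == 1 - p, matching the slide's
# P(Ye1 = 1 | Z1 = 0, Xe1 = 0).
```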

27 BN’s for Circuit Reliability
Solving for the error probabilities E1 and E2:
– The gate error probability is fixed (p = 0.005, 0.05, or 0.1)
– The logic sampling algorithm is used
– This determines the probability of output error for each input combination
Results:
– Circuits with 2,000-3,000 gates took on average 18 s to simulate
– Average error: 0.7%
– Worst-case error: 3.34%
– Compare this to an exact method, which takes ~1,000 s to simulate simple circuits with only tens of gates

28 Advanced Subjects in BN’s
Dynamic Bayesian networks
– Model variables whose values change over time
– Capture this process by representing each state variable as multiple copies, one per time step
Learning in Bayesian networks
– The conditional probabilities of each node can be updated continuously
– Similar to the way weights are adjusted in neural networks

29 Conclusions
Bayesian networks are a powerful tool for modeling probabilistic systems.
Applications are diverse:
– The medical field
– Image processing
– Speech recognition
– Computer networking
They enable efficient evaluation of nanoscale circuit reliability.

