Plate Models: Template Models (Probabilistic Graphical Models, Representation)

Modeling Repetition: a plate labeled Tosses t drawn around the template variable Outcome denotes repetition over t ∈ {t1, …, tk}; grounding the plate produces one copy per toss, Outcome(t1), …, Outcome(tk).
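As a concrete illustration, here is a minimal Python sketch of what grounding this plate means; the CPD numbers and toss names are made up for illustration, not taken from the lecture:

```python
# Grounding a single plate: one ground variable Outcome(t) per toss,
# all sharing the one template CPD. (Toy numbers, illustrative names.)
theta_outcome = {'heads': 0.5, 'tails': 0.5}   # template CPD P(Outcome)
tosses = ['t1', 't2', 't3']                    # objects the plate ranges over

ground_cpds = {f'Outcome({t})': theta_outcome for t in tosses}

# Every ground copy points at the *same* parameter object:
assert all(cpd is theta_outcome for cpd in ground_cpds.values())
```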

A plate over Students s containing the template variables Intelligence and Grade grounds to one pair per student, I(s1), G(s1), I(s2), G(s2), …, with the dependency of Grade on Intelligence replicated for each of them.

Nested Plates: here the Students plate is nested inside the Courses plate, with Difficulty in the outer plate and Intelligence and Grade in the inner one. Because every variable in the inner plate is indexed by both student and course, grounding yields D(c1), G(s1,c1), I(s1,c1), G(s2,c1), I(s2,c1), D(c2), I(s1,c2), I(s2,c2), and so on: each student has a separate Intelligence for each course.
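A small sketch of the nested grounding (names illustrative); note that Intelligence, like Grade, carries both indices:

```python
# Nested plates: Students inside Courses, so inner variables carry both indices.
students, courses = ['s1', 's2'], ['c1', 'c2']

ground_vars = [f'D({c})' for c in courses]
ground_vars += [f'{v}({s},{c})' for c in courses for s in students for v in ('I', 'G')]
print(ground_vars)
# ['D(c1)', 'D(c2)', 'I(s1,c1)', 'G(s1,c1)', 'I(s2,c1)', 'G(s2,c1)', ...]
```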

Overlapping Plates: alternatively, the Students and Courses plates can overlap without nesting. Difficulty sits in the Courses plate only, Intelligence in the Students plate only, and Grade in their intersection, so grounding yields D(c1), D(c2), I(s1), I(s2), and a grade for every student-course pair: G(s1,c1), G(s1,c2), G(s2,c1), G(s2,c2). Each student now has a single Intelligence shared across all courses.
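By contrast with the nested case, the overlapping grounding gives Intelligence a single index. A sketch (illustrative names), including the edges each ground grade inherits:

```python
# Overlapping plates: D per course, I per student, G per (student, course) pair.
students, courses = ['s1', 's2'], ['c1', 'c2']

ground_vars = ([f'D({c})' for c in courses] +
               [f'I({s})' for s in students] +
               [f'G({s},{c})' for s in students for c in courses])

# Each ground grade depends on its course's difficulty and its student's intelligence.
ground_edges = [(f'D({c})', f'G({s},{c})') for s in students for c in courses]
ground_edges += [(f'I({s})', f'G({s},{c})') for s in students for c in courses]
```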

Explicit Parameter Sharing: the same ground network can be drawn with the template CPDs for D, I, and G made explicit. Every ground variable D(c1), D(c2), I(s1), I(s2), G(s1,c1), G(s1,c2), G(s2,c1), G(s2,c2) reuses the corresponding template CPD.
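To emphasize the sharing, this sketch (toy probabilities) builds one conditional CPD table for Grade and reuses it for every ground grade variable; the parameter count stays fixed no matter how many students and courses we ground over:

```python
# One template CPD P(G = A | I, D) shared by every ground grade variable.
cpd_grade_A = {('high', 'easy'): 0.9, ('high', 'hard'): 0.6,
               ('low', 'easy'): 0.5, ('low', 'hard'): 0.1}   # toy numbers

ground_grade_cpds = {f'G({s},{c})': cpd_grade_A
                     for s in ['s1', 's2'] for c in ['c1', 'c2']}
assert all(t is cpd_grade_A for t in ground_grade_cpds.values())
```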

Collective Inference: [Figure: the ground network over two courses, CS101 and Geo101, and several students, with Difficulty ∈ {easy, hard}, Intelligence ∈ {low, high}, and Grade ∈ {A, C}.] This web of influence has interesting ramifications for the types of reasoning patterns it supports. Consider Forrest Gump. A priori, we believe he is fairly likely to be smart, and evidence about two classes that he took changes our probabilities only slightly. However, we see that most people who took CS101 got A's; in fact, even people who did fairly poorly in other classes got an A in CS101. We therefore conclude that CS101 is probably an easy class. Getting a C in an easy class is unlikely for a smart student, so our probability that Forrest Gump is smart goes down substantially.
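The shift described above can be reproduced by brute-force enumeration on a tiny ground model. This is only a sketch: the CPD values and the three-student roster are made up, and the point is the direction of the change, not the numbers.

```python
# Collective inference by enumeration: classmates' grades inform the shared
# Difficulty variable, which in turn changes what Gump's C says about him.
from itertools import product

P_D = {'easy': 0.6, 'hard': 0.4}                      # P(Difficulty), toy numbers
P_I = {'high': 0.3, 'low': 0.7}                       # P(Intelligence)
P_A = {('high', 'easy'): 0.9, ('high', 'hard'): 0.6,  # P(Grade = A | I, D)
       ('low', 'easy'): 0.5, ('low', 'hard'): 0.1}

students = ['gump', 's2', 's3']                       # all taking the same course

def joint(d, intel, grades):
    """Joint probability of one full assignment, under parameter sharing."""
    p = P_D[d]
    for s in students:
        p *= P_I[intel[s]]
        p_a = P_A[(intel[s], d)]
        p *= p_a if grades[s] == 'A' else 1 - p_a
    return p

def posterior_high(grades):
    """P(I(gump) = high | all observed grades), summing out D and the I's."""
    num = den = 0.0
    for d in P_D:
        for vals in product(['high', 'low'], repeat=len(students)):
            intel = dict(zip(students, vals))
            p = joint(d, intel, grades)
            den += p
            if intel['gump'] == 'high':
                num += p
    return num / den

print(posterior_high({'gump': 'C', 's2': 'C', 's3': 'C'}))  # everyone got a C
print(posterior_high({'gump': 'C', 's2': 'A', 's3': 'A'}))  # classmates got A's
```

The second posterior is lower: the classmates' A's push Difficulty toward easy, and a C in an easy class is stronger evidence of low intelligence. The information flows between students only through the shared Difficulty variable, which is the collective-inference effect the slide describes.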

Plate Dependency Model: for a template variable A(U1, …, Uk), we specify a set of template parents B1(U1), …, Bm(Um) and a template CPD P(A | B1, …, Bm).

Ground Network: let A(U1, …, Uk) have template parents B1(U1), …, Bm(Um). For any instantiation u1, …, uk of U1, …, Uk, the ground network contains the variable A(u1, …, uk) with parents B1(u1), …, Bm(um), where each ui is the sub-tuple of (u1, …, uk) that instantiates the parent's own arguments Ui, and all of these ground variables share the single template CPD P(A | B1, …, Bm).
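Grounding under this definition is mechanical: instantiate the child's argument tuple, then restrict that tuple to each parent's own argument list. A Python sketch with illustrative names and domains:

```python
# Build ground edges for child G(S, C) with template parents I(S) and D(C).
from itertools import product

domains = {'S': ['s1', 's2'], 'C': ['c1', 'c2']}
child, child_args = 'G', ['S', 'C']
parents = [('I', ['S']), ('D', ['C'])]           # (name, argument list Ui)

ground_edges = []
for inst in product(*(domains[a] for a in child_args)):
    assignment = dict(zip(child_args, inst))     # e.g. {'S': 's1', 'C': 'c2'}
    child_ground = f"{child}({','.join(inst)})"
    for name, args in parents:
        # Restrict the child's instantiation to the parent's arguments Ui.
        parent_ground = f"{name}({','.join(assignment[a] for a in args)})"
        ground_edges.append((parent_ground, child_ground))

print(ground_edges)
# [('I(s1)', 'G(s1,c1)'), ('D(c1)', 'G(s1,c1)'), ('I(s1)', 'G(s1,c2)'), ...]
```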

Plate Dependency Model (well-definedness): let A(U1, …, Uk) have parents B1(U1), …, Bm(Um). For each i, we must have Ui ⊆ {U1, …, Uk}; that is, a parent may not mention an index that does not appear in the child.
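The constraint is simple to check mechanically. In this sketch (using the same hypothetical representation as the grounding example above), a per-course child with a parent indexed by student is rejected, since S does not appear among the child's indices:

```python
# Well-definedness check: every index a parent uses must appear in the child.
def well_defined(child_args, parents):
    return all(set(args) <= set(child_args) for _name, args in parents)

assert well_defined(['S', 'C'], [('I', ['S']), ('D', ['C'])])   # legal model
assert not well_defined(['C'], [('G', ['S', 'C'])])             # S missing in child
```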

Summary: a plate model is a template for an infinite set of BNs, each induced by a different set of domain objects. Parameters and structure are reused both within a single BN and across different BNs. The resulting models encode correlations across multiple objects, allowing collective inference. There are multiple such "languages", each with different tradeoffs in expressive power.