Declarative & Robust Junction Tree for Distributed Inference
Ashima Atul, Kuang Chen, in collaboration with Stano Funiak



Benefits

- Declarative: high-level logical rules let you focus on what to compute instead of how.
- Distributed: network awareness is embedded in the language (the @ location sign provides recursive cross-network messaging and data access).
- Dynamic: network layers automatically update on failure and recovery.
- Malleable: variations of inference algorithms can be easily implemented, deployed, tested, and compared.

Results

- Demonstrated an effective approach to implementing distributed inference algorithms.
- Concise implementation, with a two-order-of-magnitude reduction in lines of code compared with Paskin et al.
- Robust to changing network conditions.
- Demonstrated on the RAD Lab cluster running the 54-node dataset.
- Verified the calculation of Gaussian means.

Implementation

3-layer overlay network architecture:
- Layer 1: form a dynamic spanning tree.
- Layer 2: form a dynamic junction tree. It satisfies the running intersection property, each node covers its associated cliques, and the tree is adaptive and optimizable to minimize computational complexity.
- Layer 3: run Shafer-Shenoy message passing.

Experiment

Concise and declarative implementation of the junction tree for distributed inference on graphical models. Based on the architecture of Paskin et al. [IPSN 2005] and implemented (in a simplified version) in P2 (ask me what's P2!). Demonstrated on the Intel Research Berkeley dataset (54 nodes) running on the RAD Lab cluster.

Approach

Spanning-tree / best-path layer, in Overlog. (The rule heads and @ location specifiers were lost in transcription; the predicate names path, link, bestPathCost, bestPath, and pathUpdate below are a plausible reconstruction, not the verbatim program.)

create_path path(@X, Neighbor, Cost, PathList) :-
    link(@X, Neighbor, Cost),
    PathList := f_concat(X, Neighbor).           /* Find the cost to neighbors */

find_best_path bestPathCost(@X, Neighbor, a_MIN<Cost>) :-
    path(@X, Neighbor, Cost, PathList).          /* Select the best path to a destination */

store_best_path bestPath(@X, Neighbor, Cost, PathList) :-
    bestPathCost(@X, Neighbor, Cost),
    path(@X, Neighbor, Cost, PathList).          /* Store the best path */

update_path pathUpdate(@Neighbor, X, Dest, Cost, PathList) :-
    link(@X, Neighbor, _),
    bestPath(@X, Dest, Cost, PathList),
    Neighbor != Dest.                            /* Send my best path to all nodes */

add_path path(@X, Dest, Cost, PathList) :-
    link(@X, Neighbor, ExistingCost),
    pathUpdate(@X, Neighbor, Dest, NCost, NPL),
    Cost := ExistingCost + NCost,
    f_member(NPL, X) == 0,                       /* drop paths that would loop back through X */
    PathList := f_concat(X, NPL).                /* Update when there is a new path */
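The recursion these rules express can also be read as an ordinary best-path computation. Below is a minimal, centralized Python sketch of the same logic (the function name `best_paths` and its data layout are illustrative assumptions, not part of the P2 program): derive one-hop paths from the link table, extend neighbors' paths while rejecting cycles (the role of `f_member(PathList, X) == 0`), and keep the cheapest path per destination.

```python
# Hypothetical Python sketch of the Overlog best-path rules above.
# Assumes a static, centrally known link set; the P2 version runs the
# same recursion distributed across nodes via pathUpdate tuples.

def best_paths(links):
    """links: dict mapping (src, dst) -> cost for each directed link.
    Returns dict mapping (src, dst) -> (cost, path_list)."""
    # create_path: one-hop paths straight from the link table.
    paths = {(x, n): (c, [x, n]) for (x, n), c in links.items()}
    changed = True
    while changed:  # iterate to a fixpoint, like Datalog evaluation
        changed = False
        for (x, n), link_cost in links.items():
            for (src, dst), (cost, plist) in list(paths.items()):
                if src != n or x in plist:  # cycle check (f_member)
                    continue
                # add_path: extend the neighbor's path by the link to it.
                cand = (link_cost + cost, [x] + plist)
                best = paths.get((x, dst))
                # find_best_path / store_best_path: keep the minimum.
                if best is None or cand[0] < best[0]:
                    paths[(x, dst)] = cand
                    changed = True
    return paths
```

For a three-node graph with links a-b (cost 1), b-c (cost 1), and a-c (cost 5), the computed best path from a to c is (2, ["a", "b", "c"]).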
Junction-tree layer, in Overlog. (Again reconstructed: the predicate names clique, localVariable, edge, reachable, and separator are plausible stand-ins for heads lost in transcription, and the original reachable_recur rule also carried a Time field whose binding was lost.)

clique_init clique(@X, Var) :-
    localVariable(@X, Var).                      /* start each clique from the node's own variables */

reachable_update reachable(@X, Nbr, Var) :-
    edge(@X, Nbr),
    clique(@Nbr, Var).                           /* find neighbors' reachable variables */

reachable_recur reachable(@X, Nbr, Var) :-
    edge(@X, Nbr),
    reachable(@Nbr, OtherNbr, Var),
    X != OtherNbr.                               /* track variables reachable deeper in the tree */

clique_update clique(@X, Var) :-
    reachable(@X, Nbr, Var),
    reachable(@X, OtherNbr, Var),
    Nbr != OtherNbr.                             /* running intersection: add a variable if two distinct neighbors reach it */

separator_update separator(@X, Nbr, Var) :-
    edge(@X, Nbr),
    clique(@X, Var),
    clique(@Nbr, Var).                           /* separator = variables shared with the neighbor */
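The running-intersection repair these rules perform can be sketched centrally in Python (hypothetical names throughout; the real system computes reachable sets incrementally by message passing rather than by traversing the whole tree): every node starts its clique with its local variables, and any variable reachable through two different neighbors is added to the node's clique so the variable's occurrences form a connected subtree.

```python
# Hypothetical centralized sketch of the running-intersection rules.

def enforce_running_intersection(tree, local_vars):
    """tree: dict node -> list of neighbor nodes (an undirected tree).
    local_vars: dict node -> set of variables measured at that node.
    Returns dict node -> clique (set of variables)."""
    cliques = {n: set(vs) for n, vs in local_vars.items()}  # clique_init

    def reachable(node, via):
        # Variables in cliques anywhere in the subtree seen through
        # neighbor `via` (the reachable_update / reachable_recur rules).
        seen, stack, vars_ = {node}, [via], set()
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            vars_ |= cliques[cur]
            stack.extend(tree[cur])
        return vars_

    changed = True
    while changed:  # iterate until no clique grows (fixpoint)
        changed = False
        for node, nbrs in tree.items():
            reach = {nbr: reachable(node, nbr) for nbr in nbrs}
            for i, a in enumerate(nbrs):
                for b in nbrs[i + 1:]:
                    # clique_update: a variable reachable via two
                    # distinct neighbors must join this clique.
                    extra = (reach[a] & reach[b]) - cliques[node]
                    if extra:
                        cliques[node] |= extra
                        changed = True
    return cliques
```

For a chain a-b-c where a and c both measure variable 1, node b's clique grows to include 1, so the occurrences of 1 stay connected.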
Running intersection: track reachable variables, find neighbors' reachable variables, and add a variable to a clique if two neighbors have it.

[Figure: a spanning tree and the junction tree derived from it, with cliques such as {X1, X2, X3} and {X2, X4, X5, X6} joined by separators such as {X1, X2} and {X4, X5, X6}, and a node covering {X2, X4, X6}.]

It is easy to write algorithms that are robust to changing network conditions.

Future Work

- Mapping variables to network nodes.
- Loopy belief propagation and other approximate algorithms.

Performance

Junction-tree stabilization for the 54-node experiment occurred in less than 15 seconds (after spanning-tree formation).
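For the third layer, a Shafer-Shenoy message from clique i to neighbor j multiplies i's potential by the messages arriving from i's other neighbors and marginalizes the product onto the separator (the variables i and j share). A minimal centralized Python sketch with binary variables and dict-based factors (all names are illustrative assumptions, not the P2 implementation):

```python
import itertools

# A factor is (vars, table): vars is a tuple of variable names and
# table maps tuples of 0/1 values (in that variable order) to floats.
# Binary variables only, for brevity.

def multiply(f, g):
    """Pointwise product of two factors."""
    fv, ft = f
    gv, gt = g
    vs = tuple(dict.fromkeys(fv + gv))  # union of vars, order-preserving
    table = {}
    for vals in itertools.product((0, 1), repeat=len(vs)):
        a = dict(zip(vs, vals))
        table[vals] = (ft[tuple(a[v] for v in fv)] *
                       gt[tuple(a[v] for v in gv)])
    return vs, table

def marginalize(f, keep):
    """Sum out every variable not in `keep`."""
    fv, ft = f
    kv = tuple(v for v in fv if v in keep)
    table = {}
    for vals, p in ft.items():
        key = tuple(val for v, val in zip(fv, vals) if v in keep)
        table[key] = table.get(key, 0.0) + p
    return kv, table

def message(i, j, potential, tree, memo=None):
    """Shafer-Shenoy message from clique i to neighbor clique j."""
    memo = {} if memo is None else memo
    if (i, j) not in memo:
        f = potential[i]
        for k in tree[i]:
            if k != j:  # combine messages from all *other* neighbors
                f = multiply(f, message(k, i, potential, tree, memo))
        sep = set(potential[i][0]) & set(potential[j][0])  # separator
        memo[(i, j)] = marginalize(f, sep)
    return memo[(i, j)]

def belief(i, potential, tree):
    """Unnormalized marginal over clique i's variables."""
    f = potential[i]
    for k in tree[i]:
        f = multiply(f, message(k, i, potential, tree))
    return f
```

Keeping each incoming message separate (rather than folding it into the clique potential, as Hugin-style propagation does) is what lets a single clique serve messages to all of its neighbors without double-counting.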