
1 Topics
– Combining probability and first-order logic: BLOG and DBLOG
– Learning very complex behaviors: ALisp, hierarchical RL with partial programs
– State estimation for neurotrauma patients: joint w/ Geoff Manley (UCSF), Intel, Omron
– Metareasoning and bounded optimality
– Transfer learning (+ Jordan, Bartlett, MIT, SU, OSU)
– Knowing everything on the web
– Human-level AI

2 State estimation: 3x5 index card
Monitored signals: heart rate, blood pressure, oxygen saturation, pulmonary artery pressure, intracranial pressure, temperature, tissue oxygen, inspired oxygen, tidal volume, peak pressure, end expiratory pressure, respiratory rate, ventilation mode, cardiac output, sedation level, ICP wave
Nursing documentation: medications, treatments, intake, output, vital signs, blood products, IV fluids

3 Patient 2-13 [figure: bedside monitoring data for one patient; chart not preserved in transcript]

4 Dynamic Bayesian Networks

5 DBNs contd:
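
In the standard formulation, a DBN factors the joint distribution over state variables X_t and evidence variables E_t into a prior, a transition model, and a sensor model:

    P(X_0:T, E_1:T) = P(X_0) · ∏_{t=1..T} P(X_t | X_{t-1}) · P(E_t | X_t)

Filtering, i.e. computing P(X_t | E_1:t), then alternates a prediction step through the transition model with an update step on each new sensor reading.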

6 Research plan
DBN model: ~200 core state variables, ~500 sensor-related variables
Learn model parameter distributions from the patient database
Infer patient-specific parameters online
Goals:
– Improved alarms
– Diagnostic state estimation => improved treatment
– Solve the treatment POMDP
– Structure discovery => better understanding of physiology
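
One way the online, patient-specific inference could look in practice is a particle filter over the DBN state. Below is a minimal sketch, assuming a generic transition model and a log-likelihood sensor model; all names and the toy ICP example are hypothetical stand-ins, not the project's actual code.

    import numpy as np

    def particle_filter_step(particles, weights, observation,
                             transition, obs_loglik, rng):
        # One predict/update cycle of a bootstrap particle filter.
        n = len(particles)
        # Resample particles in proportion to their current weights.
        particles = particles[rng.choice(n, size=n, p=weights)]
        # Predict: push each particle through the transition model.
        particles = transition(particles, rng)
        # Update: reweight by the sensor model's likelihood.
        logw = obs_loglik(observation, particles)
        w = np.exp(logw - logw.max())
        return particles, w / w.sum()

    # Hypothetical usage: a single ICP-like state with random-walk
    # dynamics and a Gaussian sensor (toy stand-ins for the real model).
    rng = np.random.default_rng(0)
    particles = rng.normal(10.0, 2.0, size=1000)
    weights = np.full(1000, 1.0 / 1000)
    transition = lambda x, rng: x + rng.normal(0.0, 0.5, size=x.shape)
    obs_loglik = lambda y, x: -0.5 * ((y - x) / 1.0) ** 2
    particles, weights = particle_filter_step(
        particles, weights, 12.3, transition, obs_loglik, rng)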

7 Possible worlds
– Propositional
– First-order + unique names, domain closure
– First-order open-world
[figure: example sets of worlds over objects A, B, C, D for each semantics]

8 Example: Citation Matching [Lashkari et al 94]
Collaborative Interface Agents, Yezdi Lashkari, Max Metral, and Pattie Maes, Proceedings of the Twelfth National Conference on Articial Intelligence, MIT Press, Cambridge, MA, 1994.
Metral M. Lashkari, Y. and P. Maes. Collaborative interface agents. In Conference of the American Association for Artificial Intelligence, Seattle, WA, August 1994.
Are these descriptions of the same object? What authors and papers actually exist, with what attributes? Who wrote which papers?
General problem: raw data -> relational KB
Other examples: multitarget tracking, vision, NLP
Approach: a formal language for specifying first-order open-world probability models

9 BLOG generative process
– Number statements describe steps that add some objects to the world
– Dependency statements describe steps that set the value of a function or relation on a tuple of arguments (including setting the referent of a constant symbol, a 0-ary function)
– Both types may condition on the existence and properties of previously added objects

10 BLOG model (simplified)

    guaranteed Citation Cit1, Cit2, Cit3, Cit4, Cit5, Cit6, Cit7;
    #Researcher ~ NumResearchersPrior();
    Name(r) ~ NamePrior();
    #Paper(FirstAuthor = r) ~ NumPapersPrior(Position(r));
    Title(p) ~ TitlePrior();
    PubCited(c) ~ Uniform({Paper p});
    Text(c) ~ NoisyCitationGrammar(Name(FirstAuthor(PubCited(c))), Title(PubCited(c)));
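
Read as a generative program, the model first samples how many researchers exist, then their names, then each researcher's papers, and finally each citation's cited paper and text. A rough Python paraphrase, with toy stand-ins for every prior and for the noisy citation grammar (this is illustrative only, not the actual BLOG engine):

    import random

    def sample_world(num_citations=7, seed=0):
        rng = random.Random(seed)
        # Number statement: how many researchers exist? (toy prior)
        researchers = list(range(rng.randint(1, 5)))
        names = {r: "name%d" % r for r in researchers}             # NamePrior
        # Number statement: papers generated per first author.
        papers = []
        for r in researchers:
            for _ in range(rng.randint(1, 3)):                     # NumPapersPrior
                papers.append({"first_author": r,
                               "title": "title%d" % len(papers)})  # TitlePrior
        # Dependency statements: each citation picks a paper uniformly
        # and emits (here: noise-free) text from the paper's attributes.
        citations = []
        for _ in range(num_citations):
            p = rng.choice(papers)                                 # PubCited
            text = "%s. %s." % (names[p["first_author"]], p["title"])
            citations.append({"paper": p, "text": text})           # Text
        return names, papers, citations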

11 Basic results
Theorem 1: Every well-formed* BLOG model specifies a unique distribution over possible worlds. The probability of each (finite) world is given by a product of the relevant conditional probabilities from the model.
Theorem 2: For any well-formed BLOG model, there are sampling algorithms (likelihood weighting, MCMC) that converge to the correct probability for any query, using finite time per sampling step.
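
To illustrate Theorem 2's likelihood-weighting claim using the sampler sketched above: draw complete worlds from the generative program, weight each world by the probability its grammar assigns to the observed citation strings, and average the query indicator. Here text_loglik is a hypothetical scorable grammar model, not part of BLOG's actual API:

    import math

    def lw_query(observed_texts, text_loglik, n_samples=10000):
        # Estimates P(Cit1 and Cit2 cite the same paper | observed texts)
        # by likelihood weighting over worlds from sample_world (above).
        num = den = 0.0
        for s in range(n_samples):
            _, _, cits = sample_world(len(observed_texts), seed=s)
            logw = sum(text_loglik(t, c) for t, c in zip(observed_texts, cits))
            w = math.exp(logw)
            den += w
            num += w * (cits[0]["paper"] is cits[1]["paper"])
        return num / den if den else float("nan")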

12 Citation Matching Results
Four data sets of ~300-500 citations, referring to ~150-300 papers

13 DBLOG
BLOG already allows temporal models: time is just a logical variable ranging over an infinite set
Inference works (only finitely many relevant random variables) but is grossly inefficient
DBLOG makes time a distinguished type and predecessor a distinguished function, and implements special-purpose inference:
– Particle filter for temporally varying relations
– Decayed MCMC for atemporal relations
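
The decayed-MCMC idea can be sketched in a few lines: when choosing which past time step to revise, sample its age from a distribution that decays polynomially, so inference effort concentrates on recent steps while still occasionally revisiting the past. The decay exponent below is an assumed placeholder, not the published kernel:

    import random

    def pick_timestep(t, alpha=1.5, rng=random.Random(0)):
        # Sample a step s in 0..t with weight ~ 1/(t - s + 1)^alpha,
        # i.e. probability decays polynomially with the step's age.
        weights = [(t - s + 1) ** -alpha for s in range(t + 1)]
        return rng.choices(range(t + 1), weights=weights)[0]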

14 Open Problems
Inference:
– Applying "lifted" inference to BLOG (like Prolog)
– Approximation algorithms for problems with huge or growing numbers of objects
Knowledge representation:
– Hierarchical activity models
– Undirected submodels
– Nonparametric extensions (cf. de Freitas, 2005)

