
# Simplicity and Truth: An Alternative Explanation of Ockham's Razor


Simplicity and Truth: an Alternative Explanation of Ockham's Razor Kevin T. Kelly Conor Mayo-Wilson Department of Philosophy Joint Program in Logic and Computation Carnegie Mellon University www.hss.cmu.edu/philosophy/faculty-kelly.php

I. The Simplicity Puzzle

Which Theory is Right? ???

Ockham Says: Choose the Simplest!

But Why? Gotcha!

Puzzle: An indicator must be sensitive to what it indicates.

Puzzle: A reliable indicator must be sensitive to what it indicates.

Puzzle: But Ockham’s razor always points at simplicity.

Puzzle: How can a broken compass help you find something unless you already know where it is?

Standard Accounts: 1. Prior simplicity bias: Bayes, BIC, MDL, MML, etc. 2. Risk minimization: SRM, AIC, cross-validation, etc.

1. Prior Simplicity Bias The simple theory is more plausible now because it was more plausible yesterday.

More Subtle Version: Simple data are a miracle in the complex theory P but not in the simple theory C. Regularity: retrograde motion of Venus at solar conjunction. "Has to be!"

However… e would not be a miracle given P(θ). Why not this?

The Real Miracle: Ignorance about model: p(C) ≈ p(P); plus ignorance about parameter setting: p(P(θ) | P) ≈ p(P(θ′) | P); equals knowledge about C vs. P(θ): p(P(θ)) << p(C). Lead into gold. Perpetual motion. Free lunch. "Ignorance is knowledge. War is peace. I love Big Bayes."
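
The slide's arithmetic can be sketched numerically. The number of parameter cells N below is a made-up illustration, not a figure from the talk:

```python
# Illustrative sketch (assumed numbers): indifference between the simple
# theory C and the complex theory P, combined with indifference over P's
# parameter settings, yields a strong bias for C over any particular P(theta).

N = 1000                 # hypothetical number of parameter cells for P

p_C = 0.5                # "ignorance" about the model: p(C) = p(P)
p_P = 0.5
p_P_theta = p_P / N      # "ignorance" over parameter settings within P

# p(P(theta)) << p(C): two layers of indifference amount to knowledge.
print(p_C / p_P_theta)   # 1000.0
```

The point survives any choice of N > 1: the deeper the parameterization, the stronger the hidden bias.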

Standard Paradox of Indifference: Ignorance of red vs. not-red, plus ignorance over not-red, equals knowledge about red vs. white. Knognorance = all the privileges of knowledge with none of the responsibilities. "Yeah!"

The Ellsberg Paradox: an urn with one known proportion (1/3) and two unknown proportions (?, ?).

Human Preference: a > b, yet a∪c < b∪c.

Human View: a > b (knowledge over ignorance), yet a∪c < b∪c (knowledge over ignorance).

Bayesian View: a > b entails a∪c > b∪c. Knognorance.

In Any Event: The coherentist foundations of Bayesianism have nothing to do with short-run truth-conduciveness. "Not so loud!"

Bayesian Convergence: Too-simple theories get shot down… [chart: updated opinion over theories ordered by complexity]

Bayesian Convergence: Plausibility is transferred to the next-simplest theory… "Blam!" "Plink!"

Bayesian Convergence: The true theory is nailed to the fence. "Blam!" "Zing!"

Convergence: But alternative strategies also converge. Anything in the short run is compatible with convergence in the long run.
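
The convergence story can be sketched in a few lines of Bayesian updating. The three coin-bias theories, their "simplicity-biased" prior, and the true bias are all assumptions for illustration, not details from the talk:

```python
import math
import random

random.seed(0)

# Hypothetical theories about a coin's bias, ordered simple -> complex,
# with an assumed simplicity-biased prior.
thetas = [0.5, 0.7, 0.9]
prior = [0.6, 0.3, 0.1]

flips = [random.random() < 0.7 for _ in range(500)]  # truth: middle theory
heads = sum(flips)
tails = len(flips) - heads

# Posterior over theories, computed in log space for stability.
log_post = [math.log(p) + heads * math.log(t) + tails * math.log(1 - t)
            for p, t in zip(prior, thetas)]
m = max(log_post)
post = [math.exp(lp - m) for lp in log_post]
total = sum(post)
post = [p / total for p in post]

# The too-simple theory (0.5) is "shot down"; plausibility transfers to 0.7.
print([round(p, 3) for p in post])
```

Convergence happens here, but the same would be true under many different priors: convergence alone does not single out the simplicity-biased one.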

Summary of Bayesian Approach: Prior-based explanations of Ockham’s razor are circular and based on a faulty model of ignorance. Convergence-based explanations of Ockham’s razor fail to single out Ockham’s razor.

2. Risk Minimization: Ockham’s razor minimizes expected distance of empirical estimates from the true value.

Unconstrained Estimates: centered on truth, but spread around it. "Pop!"

Constrained Estimates: off-center, but less spread. [figure: clamped aim]

Constrained Estimates: off-center but less spread; an overall improvement in expected distance from truth…
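
A Monte Carlo sketch of the trade-off. Clamping the estimate to the simple value 0 beats the unconstrained sample mean in expected squared distance from the truth, despite being off-center. All numbers are assumptions for illustration:

```python
import random

random.seed(1)

# Assumed setup: true parameter near (but not at) the simple value 0.
true_theta, sigma, n, trials = 0.2, 1.0, 10, 20000

sq_err = 0.0
for _ in range(trials):
    xbar = sum(random.gauss(true_theta, sigma) for _ in range(n)) / n
    sq_err += (xbar - true_theta) ** 2
mse_unconstrained = sq_err / trials        # about sigma^2 / n = 0.1

mse_clamped = (0.0 - true_theta) ** 2      # bias^2 = 0.04, zero spread

# The biased, clamped estimate wins on expected squared error.
print(mse_unconstrained, mse_clamped)
```

Note what the next slide insists on: the clamped "theory" wins on risk while being plainly false of the assumed truth.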

Doesn’t Find True Theory: The theory that minimizes estimation risk can be quite false. "Four eyes!"

Makes Sense: …when the loss of an answer is similar in nearby distributions. "Close is good enough!" [plot: loss vs. similarity]

But Truth Matters: …when the loss of an answer is discontinuous with similarity. "Close is no cigar!" [plot: loss vs. similarity]

E.g., Science: If you want true laws, false laws aren’t good enough.

E.g., Science: "You must be a philosopher." "This is a machine learning conference."

E.g., Causal Data Mining: [causal graph: Protein A, Protein B, Protein C, Cancer protein] "Practical enough?" "Now you’re talking! I’m on a cilantro-only diet to get my protein C level under control."

Central Idea: Correlation does imply causation if there are multiple variables, some of which are common effects [Pearl; Spirtes, Glymour, and Scheines]. [causal graph: Protein A, Protein B, Protein C, Cancer protein]

Core Assumptions: A joint distribution p is causally compatible with a directed acyclic graph G iff: 1. Causal Markov Condition: each variable X is independent of its non-effects given its immediate causes. 2. Faithfulness Condition: no other conditional independence relations hold in p.

Tell-tale Dependencies: Given F, H gives some information about C (faithfulness). Given C, F1 gives no further information about F2 (Markov).

Common Applications: Linear causal case: each variable X is a linear function of its parents and a normally distributed hidden variable called an "error term"; the error terms are mutually independent. Discrete multinomial case: each variable X takes on a finite range of values.
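
The linear causal case can be sketched directly. The chain X → Y → Z below, with coefficients 0.8, is a made-up example matching the slide's setup in structure only; it exhibits both tell-tale patterns: X and Z are dependent marginally, but independent given Y (Causal Markov Condition):

```python
import math
import random

random.seed(2)
n = 50000

# Hypothetical linear causal chain X -> Y -> Z with mutually independent
# Gaussian error terms (coefficients are illustrative assumptions).
X = [random.gauss(0, 1) for _ in range(n)]
Y = [0.8 * x + random.gauss(0, 1) for x in X]
Z = [0.8 * y + random.gauss(0, 1) for y in Y]

def corr(a, b):
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / n
    va = sum((u - ma) ** 2 for u in a) / n
    vb = sum((v - mb) ** 2 for v in b) / n
    return cov / math.sqrt(va * vb)

r_xz, r_xy, r_yz = corr(X, Z), corr(X, Y), corr(Y, Z)

# Partial correlation of X and Z given Y: near zero, as Markov requires,
# while the marginal correlation r_xz is clearly nonzero (faithfulness).
r_xz_y = (r_xz - r_xy * r_yz) / math.sqrt((1 - r_xy ** 2) * (1 - r_yz ** 2))

print(r_xz, r_xz_y)
```

In the Gaussian case, vanishing partial correlation is exactly conditional independence, which is why these tests drive linear causal discovery.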

A Very Optimistic Assumption: no unobserved latent confounding causes (e.g., Genetics confounding Smoking and Cancer). "I’ll give you this one." "What’s he up to?"

Current Nutrition Wisdom: [causal graph: Protein A, Protein B, Protein C, Cancer protein] "English Breakfast?" "Are you kidding? It’s dripping with Protein C!"

As the Sample Increases…: a weak new cause, Protein D, appears. "This situation approximates the last one. So who cares?" "I do! Out of my way!"

As the Sample Increases Again…: another weak cause, Protein E, appears. "Wasn’t that last approximation to the truth good enough?" "Aaack! I’m poisoned!"

Causal Flipping Theorem: No matter what a consistent causal discovery procedure has seen so far, there exists a pair (G, p) satisfying the assumptions such that the current sample is arbitrarily likely and the procedure produces arbitrarily many opposite conclusions in p as the sample size increases. "Oops, I meant…"

The Wrong Reaction: The demon undermines the justification of science; he must be defeated to forestall skepticism. Bayesian circularity. Classical instrumentalism. "Grrrr!" "Urk!"

Another View: "Many explanations have been offered to make sense of the here-today-gone-tomorrow nature of medical wisdom — what we are advised with confidence one year is reversed the next — but the simplest one is that it is the natural rhythm of science." ("Do We Really Know What Makes Us Healthy?", NY Times Magazine, Sept. 16, 2007).

Zen Approach: Get to know the demon. Locate the justification of Ockham’s razor in his power.

Connections to the Truth: Short-run reliability: too strong to be feasible when theory matters. Long-run convergence: too weak to single out Ockham’s razor.

Middle Path: Short-run reliability: too strong to be feasible when theory matters. "Straightest" convergence: just right? Long-run convergence: too weak to single out Ockham’s razor.

II. Navigation by Broken Compass

Asking for Directions Where’s …

Asking for Directions Turn around. The freeway ramp is on the left.

Asking for Directions Goal

Best Route Goal

Best Route to Any Goal

Disregarding Advice is Bad: an extra U-turn.

Best Route to Any Goal …so fixed advice can help you reach a hidden goal without circles, evasions, or magic.

In Step with the Demon: Constant, Linear, Quadratic, Cubic. "There yet?" "Maybe."

Ahead of Mother Nature: Constant, Linear, Quadratic, Cubic. "There yet?" "Maybe."

Ahead of Mother Nature: Constant, Linear, Quadratic, Cubic. "I know you’re coming!"

Ahead of Mother Nature: Constant, Linear, Quadratic, Cubic. "Maybe."

Ahead of Mother Nature: Constant, Linear, Quadratic, Cubic. "!!!" "Hmm, it’s quite nice here…"

Ahead of Mother Nature: Constant, Linear, Quadratic, Cubic. "You’re back! Learned your lesson?"

Ockham Violator’s Path: Constant, Linear, Quadratic, Cubic. "See, you shouldn’t run ahead, even if you are right!"

Ockham Path: Constant, Linear, Quadratic, Cubic.

Empirical Problems: a set K of infinite input sequences; a partition of K into alternative theories T1, T2, T3.

Empirical Methods: map finite input sequences to theories or to "?".

Method Choice: At each stage, the scientist can choose a new method (agreeing with past theory choices). [diagram: input history e1, e2, e3, e4 and the corresponding output history]

Aim: Converge to the Truth: the output sequence ?, T2, ?, T1, … eventually stabilizes to the true theory T1.

Retraction: choosing T and then not choosing T next (switching to some T′ or to "?").
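
The definition can be sketched as a small function over output histories; the history below is a made-up example:

```python
def retraction_times(outputs):
    """Stages at which a method retracts: it chose some theory T at the
    previous stage and does not choose T (possibly suspending with '?') now."""
    return [i for i in range(1, len(outputs))
            if outputs[i - 1] != "?" and outputs[i] != outputs[i - 1]]

# Hypothetical output history: suspending with '?' before any choice is not
# a retraction; dropping T1 for T2, and T2 for '?', both are.
print(retraction_times(["?", "T1", "T1", "T2", "?", "T2"]))  # [3, 4]
```

Note the asymmetry: moving from "?" to a theory costs nothing, while abandoning a chosen theory always counts.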

Aim: Eliminate Needless Retractions.

Ancient Roots "Living in the midst of ignorance and considering themselves intelligent and enlightened, the senseless people go round and round, following crooked courses, just like the blind led by the blind." Katha Upanishad, I. ii. 5, c. 600 BCE.

Aim: Eliminate Needless Delays to Retractions.

Aim: Eliminate Needless Delays to Retractions. [diagram: a theory with its accumulated applications and corollaries]

Why Timed Retractions? Retraction minimization = generalized significance level. Retraction time minimization = generalized power.

Easy Retraction Time Comparisons: compare two methods’ output sequences; one retracts at least as many times, at least as late, as the other. [table: output sequences of Method 1 and Method 2]
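
The "at least as many, at least as late" comparison can be sketched over retraction-time lists; encoding retractions as lists of stage numbers is an assumption for illustration:

```python
def at_least_as_bad(times_1, times_2):
    """Sketch of the easy comparison: method 1 retracts at least as many
    times as method 2, and its k-th retraction comes at least as late as
    method 2's k-th retraction."""
    return (len(times_1) >= len(times_2)
            and all(t1 >= t2 for t1, t2 in zip(times_1, times_2)))

print(at_least_as_bad([2, 4, 7], [1, 4]))   # True: more retractions, later
print(at_least_as_bad([2, 4], [1, 4, 9]))   # False: fewer retractions
```

This is only a partial order: many pairs of methods are incomparable, which is why worst-case bounds are introduced next.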

Worst-case Retraction Time Bounds: e.g., output sequences covered by the timed bound (1, 2, ∞).

III. Ockham Without Circles, Evasions, or Magic

Curve Fitting: Data = open intervals around Y at rational values of X.

Curve Fitting: No effects.

Curve Fitting: First-order effect.

Curve Fitting: Second-order effect.

Empirical Effects: may take arbitrarily long to discover.

Empirical Theories: the true theory is determined by which effects appear.

Empirical Complexity: [diagram: theories ordered from simpler to more complex]

Background Constraints: [diagram: the complexity ordering under background constraints]

Ockham’s Razor: Don’t select a theory unless it is uniquely simplest in light of experience.

Weak Ockham’s Razor: Don’t select a theory unless it is among the simplest in light of experience.

Stalwartness: Don’t retract your answer while it is uniquely simplest.


Timed Retraction Bounds: r(M, e, n) = the least timed retraction bound covering the total timed retractions of M along input streams of complexity n that extend e.

Efficiency of Method M at e: M converges to the truth no matter what; and for each convergent M′ that agrees with M up to the end of e, and for each n: r(M, e, n) ≤ r(M′, e, n).

M is Beaten at e: There exists a convergent M′ that agrees with M up to the end of e such that: for each n, r(M, e, n) ≥ r(M′, e, n); and for some n, r(M, e, n) > r(M′, e, n).
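
The beaten relation can be sketched as a dominance check. Representing each bound r(M, e, n) as a single number per complexity class is a simplifying assumption (the talk's bounds are timed vectors), but the dominance logic is the same:

```python
def beaten(r_M, r_M_prime):
    """Sketch of the definition with scalar worst-case bounds indexed by
    complexity class n = 0, 1, 2, ...: M is beaten if M' is nowhere worse
    and somewhere strictly better."""
    return (all(a >= b for a, b in zip(r_M, r_M_prime))
            and any(a > b for a, b in zip(r_M, r_M_prime)))

print(beaten([2, 3, 5], [2, 3, 4]))   # True: tied until class 2, then worse
print(beaten([2, 3, 5], [3, 3, 4]))   # False: M does better in class 0
```

Weak dominance plus one strict improvement is exactly what the Ockham efficiency theorem says never happens to a stalwart Ockham solution.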

Basic Idea: Ockham efficiency: Nature can force an arbitrary convergent M to produce the successive answers down an effect path arbitrarily late, so stalwart, Ockham solutions are efficient.

Basic Idea: Unique Ockham efficiency: A violator of Ockham’s razor or stalwartness can be forced into an extra retraction or a late retraction in complexity class zero at the time of the violation, so the violator is beaten by each stalwart, Ockham solution.

Ockham Efficiency Theorem: Let M be a solution. The following are equivalent: M is always strongly Ockham and stalwart; M is always efficient; M is never weakly beaten.

Example: Causal Inference: Effects are conditional statistical dependence relations, e.g.: X dep Y | {Z}, {W}, {Z,W}; Y dep Z | {X}, {W}, {X,W}; X dep Z | {Y}, {Y,W}; …

Causal Discovery = Ockham’s Razor: variables X, Y, Z, W.

Causal Discovery = Ockham’s Razor: X dep Y | {Z}, {W}, {Z,W}.

Causal Discovery = Ockham’s Razor: X dep Y | {Z}, {W}, {Z,W}; Y dep Z | {X}, {W}, {X,W}; X dep Z | {Y}, {Y,W}.

Causal Discovery = Ockham’s Razor: X dep Y | {Z}, {W}, {Z,W}; Y dep Z | {X}, {W}, {X,W}; X dep Z | {Y}, {W}, {Y,W}.

Causal Discovery = Ockham’s Razor: X dep Y | {Z}, {W}, {Z,W}; Y dep Z | {X}, {W}, {X,W}; X dep Z | {Y}, {W}, {Y,W}; Z dep W | {X}, {Y}, {X,Y}; Y dep W | {Z}, {X,Z}.

Causal Discovery = Ockham’s Razor: X dep Y | {Z}, {W}, {Z,W}; Y dep Z | {X}, {W}, {X,W}; X dep Z | {Y}, {W}, {Y,W}; Z dep W | {X}, {Y}, {X,Y}; Y dep W | {X}, {Z}, {X,Z}.

IV. Simplicity Defined

Approach: Empirical complexity reflects nested problems of induction posed by the problem. Hence, simplicity is problem-relative but topologically invariant.

Empirical Problems: a set K of infinite input sequences; a partition Q of K into alternative theories T1, T2, T3.

Simplicity Concepts: A simplicity concept for K is a well-founded order < on a partition S of K, with ascending chains of order type not exceeding ω, such that: 1. each element of S is included in some answer in Q; 2. each downward union in (S, <) is closed; 3. incomparable sets share no boundary point; 4. each element of S is included in the boundary of its successor.

General Ockham Efficiency Theorem: Let M be a solution. The following are equivalent: M is always strongly Ockham and stalwart; M is always efficient; M is never beaten.

Conclusions: Causal truths are necessary for counterfactual predictions. Ockham’s razor is necessary for staying on the straightest path to the true theory, but it does not point at the true theory. No evasions or circles are required.

Future Directions: extension of the unique efficiency theorem to stochastic model selection; latent variables as Ockham conclusions; degrees of retraction; pooling of marginal Ockham conclusions; retraction efficiency assessment of MDL and SRM.

Suggested Reading: "Ockham’s Razor, Truth, and Information", in Handbook of the Philosophy of Information, J. van Benthem and P. Adriaans, eds., to appear. "Ockham’s Razor, Empirical Complexity, and Truth-finding Efficiency", Theoretical Computer Science, 383: 270-289, 2007. Both available as preprints at: www.hss.cmu.edu/philosophy/faculty-kelly.php
