1 Statistics between Inductive Logic and Empirical Science. Jan Sprenger, University of Bonn / Tilburg Center for Logic and Philosophy of Science. 3rd PROGIC Workshop, Canterbury.

2 I. The Logical Image of Statistics

3 Inductive Logic. Deductive logic discerns valid, truth-preserving inferences: P; P → Q ⊢ Q

4 Inductive Logic. Deductive logic discerns valid, truth-preserving inferences: P; P → Q ⊢ Q. Inductive logic generalizes that idea to non-truth-preserving inferences: P; P supports Q ⊢ (more) probably Q

5 Inductive Logic. Inductive logic: truth of premises indicates truth of conclusions. Main concepts: confirmation, evidential support.

6 Inductive Logic. Inductive logic: truth of premises indicates truth of conclusions. Inductive inference: objective and independent of external factors. Main concepts: confirmation, evidential support.

7 The Logical Image of Statistics. Statistics infers from particular data to general models.

8 The Logical Image of Statistics. Statistics infers from particular data to general models. A formal theory of inductive inference, governed by general, universally applicable principles.

9 The Logical Image of Statistics. Statistics infers from particular data to general models. A formal theory of inductive inference, governed by general, universally applicable principles. Separation of statistics and decision theory (statistics summarizes data in a way that makes a decision-theoretic analysis possible).

10 The Logical Image of Statistics. Statistics contains theoretical elements (mathematics, logic) as well as empirical elements (problem-based engineering of useful methods, interaction with "real science"). Where on that scale is statistics located?

11 The Logical Image of Statistics. Pro: mathematical, "logical" character of theoretical statistics.

12 The Logical Image of Statistics. Pro: mathematical, "logical" character of theoretical statistics. Pro: mechanical character of a lot of statistical practice (SPSS & Co.).

13 The Logical Image of Statistics. Pro: mathematical, "logical" character of theoretical statistics. Pro: mechanical character of a lot of statistical practice (SPSS & Co.). Pro: connection between Bayesian statistics and probabilistic logic.

14 The Logical Image of Statistics. Pro: mathematical, "logical" character of theoretical statistics. Pro: mechanical character of a lot of statistical practice (SPSS & Co.). Pro: connection between Bayesian statistics and probabilistic logic. Cons: presented in this talk...

15 II. Parameter Estimation

16 A Simple Experiment. Five random numbers are drawn from {1, 2, ..., N} (N unknown): 21, 4, 26, 18, 12. What is the optimal estimate of N on the basis of the data?

17 A Simple Experiment. Five random numbers are drawn from {1, 2, ..., N} (N unknown): 21, 4, 26, 18, 12. What is the optimal estimate of N on the basis of the data? That depends on the loss function!
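
A minimal sketch of the ambiguity, assuming the five numbers are drawn without replacement from {1, ..., N} (an illustrative assumption, not stated on the slide): even before a loss function is fixed, two standard estimators already disagree on these data.

```python
# Sketch: two classical point estimates of N from the sample maximum,
# assuming draws without replacement from {1, ..., N}.
data = [21, 4, 26, 18, 12]
k, m = len(data), max(data)

mle = m                # maximum-likelihood estimate: the sample maximum
umvue = m + m / k - 1  # minimum-variance unbiased ("German tank") estimate

print(mle, umvue)      # 26 vs. 30.2: "the" optimal estimate is not unique
```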

18 Estimation and Loss Functions. Aim: estimated parameter value close to true value. A loss function measures the distance between estimated and true value.

19 Estimation and Loss Functions. Aim: estimated parameter value close to true value. A loss function measures the distance between estimated and true value. The choice of loss function is sensitive to external constraints.

20 A Bayesian Approach. Elicit a prior distribution for the parameter N. Use incoming data for updating via conditionalization. Summarize the data in a posterior distribution (credal set, etc.). Perform a decision-theoretic analysis.
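
A minimal sketch of this pipeline for the experiment above, assuming a uniform prior over {1, ..., 1000} and independent draws (both assumptions are illustrative): conditionalization fixes the posterior once and for all, but the point estimate read off from it depends on the loss function.

```python
import numpy as np

data = [21, 4, 26, 18, 12]
k, m = len(data), max(data)

# Assumed prior: uniform over {1, ..., 1000} (an illustrative choice).
Ns = np.arange(1, 1001)
prior = np.full(len(Ns), 1.0 / len(Ns))

# Likelihood of the data given N, assuming independent uniform draws:
# zero if N < max(data), otherwise (1/N)^k.
likelihood = np.where(Ns >= m, 1.0 / Ns.astype(float) ** k, 0.0)

# Conditionalization: posterior proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()

# Decision-theoretic analysis: the loss function picks the estimate.
mean = (Ns * posterior).sum()                          # squared loss
median = Ns[np.searchsorted(posterior.cumsum(), 0.5)]  # absolute loss
mode = Ns[posterior.argmax()]                          # 0-1 loss (here: 26)

print(mean, median, mode)
```

The three decision rules return three different estimates from one and the same posterior, which is the point of the separation on slide 9: the statistical summary is loss-independent, the decision is not.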

21 III. Model Selection

22 Model Selection. The true model is usually "out of reach". Main idea: minimizing the discrepancy between the approximating and the true model. Discrepancy can be measured in various ways (cf. the choice of a loss function): Kullback-Leibler divergence, Gauss distance, etc.
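
For concreteness, a sketch of one such discrepancy measure, the Kullback-Leibler divergence between a true distribution p and an approximating distribution q (discrete case; the example distributions are made up):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Measures the expected log loss from using q to approximate p;
    zero iff p equals q, and asymmetric in its arguments.
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a fair die as the "true model" vs. a biased approximation.
p = [1/6] * 6
q = [0.1, 0.1, 0.2, 0.2, 0.2, 0.2]
print(kl_divergence(p, q))  # > 0: information is lost in the approximation
```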

23 Model Selection. Many model selection procedures focus on estimating the discrepancy between a candidate model and the true model. Choose the model with the lowest estimated discrepancy to the true model. That is easier said than done...

24 Problem-specific Premises. Asymptotic behavior. Small or large candidate model set? Nested vs. non-nested models. Linear vs. non-linear models. Random error structure.

25 Problem-specific Premises. Asymptotic behavior. Small or large candidate model set? Nested vs. non-nested models. Linear vs. non-linear models. Random error structure. Scientific understanding required to fix the premises!

26 Bayesian Model Selection. Idea: search for the most probable model (or the model that has the highest Bayes factor). Variety of Bayesian methods (BIC, intrinsic and fractional Bayes factors, ...).

27 Bayesian Model Selection. Idea: search for the most probable model (or the model that has the highest Bayes factor). Variety of Bayesian methods (BIC, intrinsic and fractional Bayes factors, ...). Does Bayes show a way out of the problems?
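
A minimal sketch of one of the methods just listed, the BIC: each candidate model is scored by its maximized log-likelihood plus a complexity penalty, and (for large samples and equal prior model probabilities) choosing the lowest BIC approximates choosing the most probable model. The numbers below are hypothetical.

```python
import math

def bic(max_log_likelihood, n_params, n_obs):
    # Bayesian Information Criterion: lower values are preferred.
    return n_params * math.log(n_obs) - 2.0 * max_log_likelihood

# Hypothetical maximized log-likelihoods of two candidate models
# fitted to the same data set of n = 50 observations.
n = 50
simple = bic(max_log_likelihood=-62.0, n_params=2, n_obs=n)    # ~131.8
complex_ = bic(max_log_likelihood=-60.5, n_params=4, n_obs=n)  # ~136.6

# The complex model fits better but pays a larger penalty,
# so BIC prefers the simpler model here.
print(simple, complex_)
```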

28 Bayesian Model Selection. If the true model is not contained in the set of candidate models: must Bayesian methods be justified by their distance-minimizing properties?

29 Bayesian Model Selection. If the true model is not contained in the set of candidate models: must Bayesian methods be justified by their distance-minimizing properties? It is not trivial that a particular distance function (e.g. K-L divergence) is indeed minimized by the model with the highest posterior! Bayesian probabilities = probabilities of being close to the true model?

30 Model Selection and Parameter Estimation. In the elementary parameter estimation case, posterior distributions were independent of decision-theoretic elements (utilities/loss functions). The reasonableness of a posterior distribution in Bayesian model selection is itself relative to the choice of a distance/loss function.

31 IV. Conclusions

32 Conclusions (I). The quality of a model selection method is subject to a plethora of problem-specific premises. Model selection methods must be adapted to the specific problem ("engineering").

33 Conclusions (I). The quality of a model selection method is subject to a plethora of problem-specific premises. Model selection methods must be adapted to the specific problem ("engineering"). Bayesian methods in model selection should be given an instrumental interpretation. It is difficult to separate statistics proper from decision theory.

34 Conclusions (II). The optimality of an estimator is a highly ambiguous notion. Is statistics more akin to scientific modelling than to a branch of mathematics? More empirical science than inductive logic?

35 Thanks a lot for your attention! © Jan Sprenger, Tilburg, September 2007

