1 Stefan Arnborg, KTH http://www.nada.kth.se/~stefan Statistical Methods in Applied Computer Science DD2447, DD3342, spring 2010

2 SYLLABUS Common statistical models and their use:
- Bayesian, testing, and fiducial statistical philosophy
- Hypothesis choice
- Parametric inference
- Non-parametric inference
- Elements of regression
- Clustering
- Graphical statistical models
- Prediction and retrodiction
- Chapman-Kolmogorov formulation
- Elements of Vapnik-Chervonenkis learning theory
- Evidence theory, estimation and combination of evidence
- Support Vector Machines and kernel methods
- Vovk/Gammerman hedged prediction technology
- Stochastic simulation, Markov Chain Monte Carlo

3 LEARNING GOALS After successfully taking this course, you will be able to:
- motivate the use of uncertainty management and statistical methodology in computer science applications, as well as the main methods in use,
- account for algorithms used in the area and use the standard tools,
- critically evaluate the applicability of these methods in new contexts, and design new applications of uncertainty management,
- follow research and development in the area.

4 GRADING DD2447: Bologna grades (E-A) are used from 2009. Turning in 70% of the homeworks, together with a very short oral discussion of them, gives grade C; less gives D-F. For higher grades, essentially all homeworks should be turned in on time; alternative assignments will be substituted for homeworks you miss. For grade B you must pass one Master's test; for grade A, two Master's tests or a project with some research content. DD3342: Pass/Fail, based on a research-level project or a deeper study of part of the course.

9 Applications of Uncertainty everywhere
- Medical Imaging/Research (Schizophrenia)
- Land Use Planning
- Environmental Surveillance and Prediction
- Finance and Stock Markets
- Marketing
- Google
- Robot Navigation and Tracking
- Security and Military
- Performance Tuning
- ...

10 Some Master's Projects using this syllabus (subset)
- Recommender system for Spotify
- Behavior of mobile phone users
- Recommender system for a book club
- Recommender for a job search site
- Computations in evolutionary genetics
- Gene hunting
- Psychiatry: genes, anatomy, personality
- Command and control: situation awareness
- Diagnosing drilling problems
- Speech, Music, ...

12 Aristotle: Logic Logic as a semi-formal system was created by Aristotle, probably inspired by contemporary practice in mathematical argument. There is no record of Aristotle himself applying logic, but the Elements of Euclid probably derives from Aristotle's illustrations of the logical method. What role does logic have in Computer Science??

13 Nicomachean Ethics Every action is thought to aim at some good Should we not, like archers, aim at what is right? We must be content to indicate the truth roughly and in outline, and with premises of the same kind, to reach conclusions that are no better. It is equally foolish to expect probable reasoning from a mathematician as it is to demand, from a rhetorician, scientific proofs.

14 Visualization Visualize data in such a way that the important aspects are obvious - a good visualization strikes you like a punch between the eyes (Tukey, 1970). Pioneered by Florence Nightingale, the first female member of the Royal Statistical Society, inventor of the polar-area chart (a pie-chart variant) and of performance metrics.

15 Probabilistic approaches
- Bayes: probability conditioned on observation
- Cournot: an event with very small probability will not happen
- Vapnik-Chervonenkis: VC-dimension and PAC, distribution independence
- Kolmogorov/Vovk: a sequence is random if it cannot be compressed

16 Peirce: Abduction and uncertainty Aristotle's induction, generalizing from particulars, is considered invalid by strict deductionists. Peirce made the concept clear, or at least confused on a higher level. Abduction is verification by finding a plausible explanation, and it is a key process in scientific progress.

17 Sherlock Holmes: common sense inference The techniques used by Sherlock Holmes are modeled on Conan Doyle's professor in medical school, who followed the methodological tradition of Hippocrates and Galen. Abductive reasoning, first spelled out by Peirce, is found in 217 instances in the Sherlock Holmes adventures - 30 of them in the first novel, 'A Study in Scarlet'.

18 Thomas Bayes, amateur mathematician If we have a probability model of the world, we know how to compute the probabilities of events. But is it possible to learn about the world from the events we see? Bayes' proposal was forgotten, but it was rediscovered by Laplace.

19 An alternative to Bayes' method - hypothesis testing - is based on 'Cournot's Bridge': an event with very small probability will not happen. Antoine Augustin Cournot (1801-1877) was a pioneer in stochastic processes, market theory and structural post-modernism. He predicted the demise of the academic system due to discourses of administration and excellence (cf. Readings).
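In symbols, a test in this spirit fixes a small significance level alpha and rejects the null hypothesis when the observed data would have been that improbable under it (a minimal sketch; the notation is generic, not taken from the slide):

    \text{reject } H_0 \text{ at level } \alpha \quad \text{iff} \quad p = \Pr\big(T(X) \ge T(x_{\mathrm{obs}}) \mid H_0\big) \le \alpha .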

20 Kolmogorov and randomness Andrei Kolmogorov (1903-1987) is the mathematician best known for shaping probability theory into a modern axiomatized theory. His axioms of probability tell how probability measures are defined, also on infinite and infinite-dimensional event spaces and complex product spaces. Kolmogorov complexity characterizes a random string by the smallest size of a description of it. It is used to explain the Vovk/Gammerman scheme of hedged prediction, and also in MDL (Minimum Description Length) inference.
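For reference, the standard definition behind this (not spelled out on the slide): relative to a universal machine U, the Kolmogorov complexity of a string x is the length of its shortest program,

    K_U(x) = \min\{\, |p| : U(p) = x \,\},

and a string counts as random when K_U(x) is close to |x|, i.e. it admits no shorter description than itself.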

21 Normative claim of Bayesianism EVERY type of uncertainty should be treated as probability. This claim is controversial and not universally accepted: Fisher (1922), Cramér, Zadeh, Dempster, Shafer, Walley (1999), ... Students encounter many approaches to uncertainty management and identify weaknesses in foundational arguments.

22 Foundations for Bayesian Inference Bayes' method is the first documented method based on probability: the plausibility of an event depends on observation, via Bayes' rule. Bayes' rule is the organizing principle for uncertainty. Parameter and observation spaces can be extremely complex, and so can priors and likelihoods. MCMC is the current approach -- often but not always applicable (difficult when the posterior has many local maxima separated by low-density regions). Better than Numerics??
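Written out (the slide's formula did not survive the transcript, so this is the generic form), Bayes' rule for a parameter theta and data D is

    p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)}, \qquad p(D) = \int p(D \mid \theta)\, p(\theta)\, d\theta ,

and MCMC is typically brought in exactly because the normalizing integral p(D), or expectations under the posterior, cannot be computed in closed form.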

23 Showcase application: PET camera (figure labels from the slide: camera geometry & noise, film, scene regularity; the same setup applies to any other camera or imaging device ...)

24 PET camera D: film, count by detector j. X: radioactivity in voxel i. a: camera geometry (the slide shows the likelihood and the prior as formulas). Inference about X gives a posterior; its mean is often a good picture.
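A standard way to write this model (a reconstruction, since the slide's formulas are missing from the transcript; a_{ji} is the geometry factor coupling voxel i to detector j) is

    D_j \mid X \sim \mathrm{Poisson}\Big(\sum_i a_{ji} X_i\Big), \qquad p(X \mid D) \propto p(X)\, \prod_j \frac{\big(\sum_i a_{ji} X_i\big)^{D_j}}{D_j!}\, e^{-\sum_i a_{ji} X_i},

with p(X) a regularity (smoothness) prior on the radioactivity image.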

25 Sinogram and reconstruction (image captions: tumour; fruit fly Drosophila; family (X-ray))

26 Project Aims
- Support transformation of tasks and solutions in a generic fashion
- Integrate different command levels and services in a dynamic organization
- Facilitate consistent situation awareness

27 WIRED on Total Information Awareness The WIRED article "Total Info System Totally Touchy" (Dec 2, 2002) discusses the Total Information Awareness system. Quotes: "People have to move and plan before committing a terrorist act. Our hypothesis is their planning process has a signature." (Jan Walker, Pentagon spokeswoman) "What's alarming is the danger of false positives based on incorrect data." (Herb Edelstein)

28 Combination of evidence In Bayes' method, evidence enters as the likelihood of the observation.
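For conditionally independent observations this means the pieces of evidence simply multiply (a standard identity, stated here for completeness):

    p(\theta \mid D_1, \dots, D_n) \propto p(\theta)\, \prod_{k=1}^{n} p(D_k \mid \theta).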

29 Particle filter - general tracking
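The slide's figure is not in the transcript; purely as an illustration, a minimal bootstrap particle filter for a scalar state might look like the sketch below (the random-walk motion model and Gaussian measurement noise are placeholder assumptions, not the course's tracking model).

import numpy as np

def bootstrap_particle_filter(observations, n_particles=1000,
                              process_std=1.0, obs_std=1.0):
    """Track a scalar state with the predict / weight / resample loop."""
    rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 1.0, n_particles)        # sample from the prior
    estimates = []
    for y in observations:
        # Predict: propagate particles through the (assumed) motion model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight: Gaussian likelihood of the observation for each particle.
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # Posterior-mean estimate at this time step.
        estimates.append(np.sum(weights * particles))
        # Resample particles in proportion to their weights.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)

# Toy usage: noisy observations of a random walk.
rng = np.random.default_rng(1)
true_path = np.cumsum(rng.normal(size=50))
observations = true_path + rng.normal(scale=1.0, size=50)
print(bootstrap_particle_filter(observations)[:5])

Each step predicts by sampling the motion model, weights the particles by the observation likelihood, and resamples, which is the generic predict/update/resample loop the slide title refers to.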

30 Chapman-Kolmogorov version of Bayes' rule
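The recursion the title refers to (reconstructed here, since the slide's formula is not in the transcript; x_t is the hidden state, y_{1:t} the observations up to time t) is the prediction/update pair

    p(x_t \mid y_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1} \qquad \text{(Chapman-Kolmogorov prediction)}
    p(x_t \mid y_{1:t}) \propto p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1}) \qquad \text{(Bayes update)},

which the particle filter sketched above approximates with weighted samples.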

31 Berry and Linoff have eloquently stated their preferences with the often quoted sentence: "Neural networks are a good choice for most classification problems when the results of the model are more important than understanding how the model works". “Neural networks typically give the right answer”

34 1950-1980: The age of rationality. Let us describe the world with a mathematical model and compute the best way to manage it!! The slide shows a large Bayesian network, a popular statistical model.
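For reference, the defining property (standard, not from the slide text): a Bayesian network over variables x_1, ..., x_n with a directed acyclic graph factorizes the joint distribution over the parents pa(x_i) of each node,

    p(x_1, \dots, x_n) = \prod_{i=1}^{n} p\big(x_i \mid \mathrm{pa}(x_i)\big).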

35 Ed Jaynes devoted a large part of his career to promoting Bayesian inference. He also championed the use of Maximum Entropy in physics. Outside physics, he met resistance from people who had already invented other methods. Why should statistical mechanics say anything about our daily human world??

36 Robust Bayes Priors and likelihoods are convex sets of probability distributions (Berger, de Finetti, Walley, ...): imprecise probability. Every member of the posterior is a 'parallel combination' of one member of the likelihood set and one member of the prior set. For decision making, Jaynes recommends using the member of the posterior with maximum entropy (Maxent estimate).
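In symbols (a paraphrase of the recommendation, not the slide's own formula): if \mathcal{P} is the posterior set of distributions, the Maxent estimate is

    p^{*} = \arg\max_{p \in \mathcal{P}} H(p), \qquad H(p) = -\int p(x)\,\log p(x)\, dx .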

37 SVM and kernel methods Based on Vapnik-Chervonenkis learning theory. Separate classes by a wide-margin hyperplane classifier, or enclose data points between close parallel hyperplanes for regression, possibly after a non-linear mapping to a high-dimensional space. The only assumption is point exchangeability.
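The wide-margin separation is usually stated as the hard-margin optimization problem (shown here as a sketch; the course material may use the soft-margin or kernelized form)

    \min_{w, b}\ \tfrac{1}{2}\,\lVert w \rVert^{2} \quad \text{subject to} \quad y_i\,(w^{\top} x_i + b) \ge 1, \quad i = 1, \dots, n,

where the margin between the two parallel supporting hyperplanes is 2 / \lVert w \rVert; replacing the inner products x_i^{\top} x_j by a kernel k(x_i, x_j) gives the non-linear version.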

38 Classify with hyperplanes Frank Rosenblatt (1928-1971) did pioneering work on classifying by hyperplanes in high-dimensional spaces. He was criticized by Minsky and Papert, since real classes are normally not linearly separable. ANN research was taken up again in the 1980s, with non-linear mappings to get improved separation. A predecessor of SVM/kernel methods.
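As an illustration of classification by hyperplanes in Rosenblatt's spirit (a minimal sketch with toy data, not code from the course), the perceptron update rule looks like this:

import numpy as np

def perceptron_train(X, y, epochs=100, lr=1.0):
    """Learn a separating hyperplane w.x + b = 0 for labels y in {-1, +1}."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:      # misclassified point
                w += lr * yi * xi           # move the hyperplane toward it
                b += lr * yi
                errors += 1
        if errors == 0:                     # converged: data separated
            break
    return w, b

# Toy linearly separable data.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))   # should reproduce y

The update nudges the hyperplane toward every misclassified point and only converges when the classes are linearly separable, which is exactly the limitation Minsky and Papert pointed out.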

39 Find parallel hyperplanes Classification. Red: the true separating plane. Blue: the wide-margin separation found from the sample. Classify by the plane midway between the blue planes.

40 SVM and Kernel method

41 Vovk/Gammerman hedged predictions Based on Kolmogorov complexity or a non-conformity measure. In classification, each prediction comes with a confidence. Asymptotically, misclassifications appear independently and with probability 1 - confidence. The only assumption is exchangeability.
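To make the hedging concrete, here is a minimal transductive conformal classifier in Python (the non-conformity score, distance to the own-class mean, and the toy data are illustrative assumptions, not the course's choices):

import numpy as np

def conformal_p_values(X, y, x_new, labels):
    """p-value for each candidate label of x_new, using as non-conformity
    score the distance of an example to the mean of its own class."""
    p = {}
    for label in labels:
        Xa = np.vstack([X, x_new])           # augment the data with the new point
        ya = np.append(y, label)             # ...tentatively given this label
        scores = np.array([
            np.linalg.norm(xi - Xa[ya == yi].mean(axis=0))
            for xi, yi in zip(Xa, ya)
        ])
        # Fraction of examples at least as non-conforming as the new one.
        p[label] = np.mean(scores >= scores[-1])
    return p

# Toy usage: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
p = conformal_p_values(X, y, np.array([3.8, 4.2]), labels=[0, 1])
print(p)  # predict the label with the largest p-value;
          # confidence = 1 - the second largest p-value

Under the exchangeability assumption, predictions made at confidence level 1 - epsilon err with frequency at most about epsilon, which is the guarantee stated on the slide.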

