Learning to Decode Cognitive States from Brain Images Tom Mitchell et al. Carnegie Mellon University Presented by Bob Stark.



Outline ● Experiment Setup ● Machine Learning ● 3 Case Studies ● Results ● Analysis

Experiment Setup ● Functional Magnetic Resonance Imaging (fMRI) used – Regions of Interest (ROIs) ● fMRI scans taken once per second for 20 minutes: – each image ≈ 15,000 voxels – tens of millions of observations in total – each voxel is a machine-learning feature
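The bullets above can be sketched in code: each 3-D fMRI image is flattened so that every voxel becomes one column of a design matrix. This is a minimal numpy sketch; the voxel-grid dimensions below are made up and much smaller than the real ~15,000-voxel images.

```python
import numpy as np

# Toy stand-in for the real data: the slide's numbers would give
# 20 min x 1 scan/s = 1200 images; the 8 x 8 x 4 voxel grid below
# is an illustrative assumption, not the actual scanner geometry.
n_scans = 1200
scans = np.zeros((n_scans, 8, 8, 4))   # time x (x, y, z) voxel grid

# Each voxel becomes one machine-learning feature: flatten every
# 3-D image into one row of the design matrix.
X = scans.reshape(n_scans, -1)
print(X.shape)   # (1200, 256): one row per scan, one column per voxel
```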

Experiment Setup (2)

Machine Learning ● Find f: fMRI-sequence(t1, t2) -> CognitiveState – Supervised – Cross-validation ● Gaussian Naive Bayes (GNB) ● Support Vector Machine (SVM) ● k Nearest Neighbor (kNN)
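The supervised, cross-validated setup above can be sketched generically: split the labeled examples into folds, train on all but one fold, and report the mean held-out error. The fold count and the toy "nearest class mean" learner are illustrative assumptions, standing in for GNB/SVM/kNN.

```python
import numpy as np

def cross_validate(fit, predict, X, y, n_folds=6):
    """Plain k-fold cross-validation: train on k-1 folds, test on the
    held-out fold, return the mean classification error."""
    folds = np.array_split(np.arange(len(X)), n_folds)
    errors = []
    for i, test_idx in enumerate(folds):
        train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
        model = fit(X[train_idx], y[train_idx])
        errors.append(np.mean(predict(model, X[test_idx]) != y[test_idx]))
    return float(np.mean(errors))

# Toy learner: nearest class mean (a stand-in, not one of the paper's classifiers).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(4, 1, (30, 5))])
y = np.array([0] * 30 + [1] * 30)
fit = lambda X, y: {c: X[y == c].mean(axis=0) for c in np.unique(y)}
predict = lambda m, Xt: np.array([min(m, key=lambda c: np.linalg.norm(x - m[c]))
                                  for x in Xt])
err = cross_validate(fit, predict, X, y)
```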

Gaussian Naive Bayes ● Compute P(c|x) via Bayes' rule from P(x|c) and P(c) ● P(x|c) := Gaussian (per voxel) ● P(c) := Bernoulli ● Naive: voxels assumed conditionally independent given the class ● Two variants: SharedVariance (one variance pooled across classes), DistinctVariance (per-class variance) ● Parameters estimated with MLE
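A minimal sketch of the classifier described above, including a pooled-variance option in the spirit of the SharedVariance variant (the exact pooling used in the paper is an assumption here):

```python
import numpy as np

def gnb_fit(X, y, shared_variance=False):
    """Gaussian Naive Bayes: one Gaussian per (class, feature), MLE estimates.
    shared_variance=True pools one variance per feature across classes."""
    classes = np.unique(y)
    prior = {c: np.mean(y == c) for c in classes}
    mu = {c: X[y == c].mean(axis=0) for c in classes}
    if shared_variance:
        pooled = np.mean([X[y == c].var(axis=0) for c in classes], axis=0)
        var = {c: pooled for c in classes}
    else:
        var = {c: X[y == c].var(axis=0) for c in classes}
    return classes, prior, mu, var

def gnb_predict(model, X):
    classes, prior, mu, var = model
    def log_post(x, c):
        # log P(c) + sum_j log N(x_j; mu_cj, var_cj): the "naive"
        # independence assumption makes the joint a product over voxels.
        return np.log(prior[c]) - 0.5 * np.sum(
            np.log(2 * np.pi * var[c]) + (x - mu[c]) ** 2 / var[c])
    return np.array([max(classes, key=lambda c: log_post(x, c)) for x in X])

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (40, 3)), rng.normal(3, 1, (40, 3))])
y = np.array([0] * 40 + [1] * 40)
model = gnb_fit(X, y, shared_variance=True)
acc = np.mean(gnb_predict(model, X) == y)
```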

Support Vector Machine ● (n−1)-dimensional separating hyperplane in an n-dimensional feature space ● Maximum-margin hyperplane ● Linear-kernel SVM used
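A minimal linear SVM sketch, trained by subgradient descent on the regularized hinge loss; this illustrates the maximum-margin idea but is not the solver actually used in the paper.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=300):
    """Linear SVM via subgradient descent on the hinge loss; y in {-1, +1}.
    Hyperparameters (lam, lr, epochs) are illustrative assumptions."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1                 # samples inside or violating the margin
        grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / n
        grad_b = -y[mask].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(-2, 0.5, (30, 2)), rng.normal(2, 0.5, (30, 2))])
y = np.array([-1] * 30 + [1] * 30)
w, b = train_linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```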

k Nearest Neighbor ● A new sample is classified by its nearest “neighbors” in the training set ● Euclidean distance between voxel feature vectors ● Most frequent class among the k nearest training samples – k ∈ {1, 3, 5, 7, 9}
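The voting rule above in a few lines of numpy; the toy data are an assumption, and evaluating on the training set (so each point is its own nearest neighbor at k = 1) is only for illustration.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k):
    """Majority vote among the k nearest training samples, using plain
    Euclidean distance between feature vectors."""
    dist = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dist)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 1, (25, 4)), rng.normal(3, 1, (25, 4))])
y = np.array([0] * 25 + [1] * 25)
# Odd k values, as on the slide, avoid tied two-class votes.
acc = {k: np.mean([knn_predict(X, y, x, k) == y[i] for i, x in enumerate(X)])
       for k in (1, 3, 5, 7, 9)}
```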

3 Case Studies ● Picture vs. Sentence ● Syntactic Ambiguity ● Semantic Categories

Picture vs. Sentence ● “Does this sentence correctly describe this picture?” ● Classify (differentiate between) examining a picture vs. examining a sentence ● Picture (or sentence) shown for 4 seconds, then blank for 4 seconds: an 8-second window ● f: fMRI-sequence(t0, t0 + 8) -> {Picture, Sentence}
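The input to f here is the whole 8-second window, not a single image: with one scan per second, the 8 scans of a trial are concatenated into one feature vector. A small sketch with made-up toy dimensions:

```python
import numpy as np

def window_features(scans, t0, length=8):
    """Concatenate the scans in [t0, t0 + length) into one feature vector,
    as in f: fMRI-sequence(t0, t0 + 8). One scan per second, so an
    8-second trial spans 8 images."""
    return scans[t0:t0 + length].reshape(-1)

# Toy series: 20 scans of 10 voxels each (illustrative numbers only).
scans = np.arange(20 * 10).reshape(20, 10).astype(float)
x = window_features(scans, 4)
print(x.shape)   # (80,) = 8 scans x 10 voxels
```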

Syntactic Ambiguity ● “The experienced soldiers warned about the dangers conducted the midnight raid” ● “The experienced soldiers spoke about the dangers before the midnight raid” ● f: fMRI-sequence(t1, t2) -> {Ambiguous, Unambiguous}

Semantic Categories ● Subjects were shown a category, then a sequence of words, and judged whether each word belonged to the category – Fish, four-legged animals, trees, flowers, fruits, vegetables, family members, occupations, tools, kitchen items, dwellings, building parts ● Classify the word category from a single fMRI image ● f: fMRI(t) -> WordCategory – Notice: no time interval, one image ● Image on slide 4

Results ● Listed: cross-validation classification error ● GNB and SVM outperformed kNN ● kNN did best with larger k

Analysis ● It's feasible! ● But only for specific cognitive states evoked by specific tasks ● Could not decode true/false, correct/incorrect, or negative/positive states – Maybe fixable with more training data? ● Which other uses and state types work = open question ● Single-subject classifiers were used – Implications of multi-subject classifiers?

References ● T. M. Mitchell, R. Hutchinson, R. S. Niculescu, F. Pereira, X. Wang, M. Just, and S. Newman, “Learning to Decode Cognitive States from Brain Images,” Machine Learning, Vol. 57, Issue 1–2, pp. 145–175, October 2004.