MRI preprocessing and segmentation.

Presentation transcript:


Bias References

Segmentation References

Segmentation pipeline and validation (Clarke, 1995)

1. Preprocessing
1.1. Brain extraction
1.2. Removal of field inhomogeneities (bias field)

1.1. Brain extraction (figure: MRI of head, intracranial volume, extracted brain)

1.1. Brain extraction. FSL: initiate a mesh inside the skull and expand it to wrap onto the brain surface. Huh (2002) method: go to the mid-sagittal slice, find the brain, copy the mask onto adjacent slices, and correct each copied mask.

1.1. Brain extraction (figure, Huh 2002: initial mask; adjacent slice j; mask of slice j; a challenge case)

1.1. Brain extraction: restoring a truncated boundary

Let a voxel have the value 1 if its intensity is higher than a threshold t (choose t empirically; increase it when needed).
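As a minimal sketch of this thresholding rule (the volume and threshold below are made-up toy values):

```python
import numpy as np

def threshold_mask(volume, t):
    """Binary mask: 1 where the voxel intensity is higher than t, else 0."""
    return (volume > t).astype(np.uint8)

# Toy 2x2x2 "volume": only the three intensities above t=100 survive.
vol = np.array([[[50, 120], [90, 200]],
                [[10, 101], [100, 99]]])
mask = threshold_mask(vol, 100)
```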

1.2. Removal of field inhomogeneities: the bias field. Phantom studies: the typical signal falloff in the superior-inferior (S-I) direction is 20% (figure: intensity vs. position x).

1.2. Removal of field inhomogeneities. Statistical methods: probabilistic, Gaussian, and mixture models of the bias field. Polynomial methods: a smooth polynomial fit to the bias field.
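A minimal 1-D sketch of the polynomial approach, assuming a hypothetical smooth multiplicative bias over a flat true signal (real methods fit in 3-D, e.g. Milchenko, 2006):

```python
import numpy as np

# Simulate a smooth multiplicative bias over a flat "true" signal,
# with up to 20% falloff as in the phantom studies above.
x = np.linspace(-1, 1, 101)
true_signal = np.full_like(x, 100.0)
bias = 1.0 - 0.2 * x ** 2
observed = true_signal * bias

# Fit a low-order polynomial to the observed profile and divide it out,
# normalized so the mean intensity is preserved.
coeffs = np.polyfit(x, observed, deg=2)
bias_estimate = np.polyval(coeffs, x)
corrected = observed / (bias_estimate / bias_estimate.mean())
```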

1.2. Removal of field inhomogeneities Polynomial method example: Milchenko, 2006

Milchenko, 2006

1.2. Removal of field inhomogeneities (figure, Shattuck 2001: original image, bias-field model, corrected result)

2. Feature extraction. Features:
- Intensities in a single MRI: univariate classification.
- A feature vector from a single MRI: multivariate classification, e.g. [I(x,y,z), f(N(x,y,z)), g(N(x,y,z))], where N is a neighbourhood around (x,y,z), f is a statistic of the intensity distribution in the neighbourhood (e.g. entropy), and g is the average intensity in the neighbourhood; f and g may instead encode edge or boundary information.
- Intensities in multiple MRIs with different contrasts: multivariate (multi-spectral) classification.
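A small sketch of building such a feature vector, using the neighbourhood mean and standard deviation as stand-ins for g and f (the slide also mentions entropy and edge information; the image below is a toy 2-D array):

```python
import numpy as np

def feature_vector(img, x, y, r=1):
    """Feature vector for pixel (x, y): its intensity, plus the mean and
    standard deviation of intensities in the (2r+1)x(2r+1) neighbourhood N."""
    n = img[x - r:x + r + 1, y - r:y + r + 1]
    return np.array([img[x, y], n.mean(), n.std()])

# Toy 5x5 "image" with intensities 0..24.
img = np.arange(25, dtype=float).reshape(5, 5)
fv = feature_vector(img, 2, 2)
```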

3. Segmentation
3 tissue types: CSF, GM, WM
4 regions:
R1: air, scalp, fat, skull (background, removed)
R2: subarachnoid space (CSF)
R3: parenchyma (GM, WM)
R4: ventricles (CSF)

3. Segmentation (figure, Clarke 1995: pipelines for dual-echo T2/PD or T1/T2/PD-weighted data, and for T1-weighted data)

3. Segmentation methods
T1-weighted, single intensity:
  3.1. Histogram-based thresholding
  3.2. Bayesian
Dual echo (T2, PD), T1/T2/PD-weighted, or T1-weighted with a feature vector:
  Supervised:
    Parametric: 3.3. Maximum likelihood
    Non-parametric: 3.4. k-NN
    ANN: 3.5. MLP
  Unsupervised:
    3.6. k-means
    3.7. fuzzy c-means

3.1. Histogram-based thresholding (Schnack, 2001). Take the histogram of the extracted, bias-corrected brain in a T1-weighted MRI; GM and WM form two peaks. Let Lcp be the crossing point of the tangents to the two peaks, and set the threshold L = g * Lcp (g was set manually on 80 images). If I(x,y,z) < L then GM, else WM.
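A hedged sketch of histogram-based GM/WM separation: instead of Schnack's tangent-crossing point Lcp, this toy version approximates the threshold by the histogram valley between two synthetic GM and WM intensity peaks (all intensity values are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
gm = rng.normal(80, 5, 5000)    # hypothetical GM intensities
wm = rng.normal(120, 5, 5000)   # hypothetical WM intensities
voxels = np.concatenate([gm, wm])

counts, edges = np.histogram(voxels, bins=100)
centers = 0.5 * (edges[:-1] + edges[1:])

# Threshold L: the histogram valley searched between the two peak locations.
in_between = (centers > 80) & (centers < 120)
L = centers[in_between][np.argmin(counts[in_between])]

# If I < L then GM, else WM.
labels = np.where(voxels < L, "GM", "WM")
```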

3.2. Bayesian segmentation (figure: hypothetical intensity distributions of three populations, e.g. the WM peak; y-axis: number of voxels / number of all voxels in the brain; x-axis: intensity)

3.2. Bayes' classifier (Kovacevic, 2002). For each voxel (x,y,z), assume K tissue types (e.g. T1, T2, ..., TK) are possible for one observed intensity I. The class-conditional distributions P(I | Tj) are set up from regional data; the priors P(Tj) are the GM, WM, CSF ratios from volumetric studies.

P(Tj | I) = P(I | Tj) * P(Tj) / sum_k [ P(I | Tk) * P(Tk) ],  j, k = 1, 2, 3 (1: CSF, 2: GM, 3: WM)

Decide on tissue type m if P(Tm | I) > P(Tj | I) for all j.
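A minimal sketch of this Bayes rule, assuming hypothetical Gaussian class-conditionals P(I | Tj) and made-up priors P(Tj) for CSF, GM, and WM:

```python
import numpy as np

def gaussian(i, mu, sigma):
    """Gaussian class-conditional density P(I | Tj)."""
    return np.exp(-0.5 * ((i - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

means = {"CSF": 40.0, "GM": 80.0, "WM": 120.0}   # assumed T1 intensities
sigmas = {"CSF": 8.0, "GM": 8.0, "WM": 8.0}
priors = {"CSF": 0.1, "GM": 0.5, "WM": 0.4}      # assumed volumetric ratios

def classify(i):
    """Tissue with the largest posterior P(Tj | I) under the Bayes rule."""
    evidence = sum(gaussian(i, means[t], sigmas[t]) * priors[t] for t in means)
    posteriors = {t: gaussian(i, means[t], sigmas[t]) * priors[t] / evidence
                  for t in means}
    return max(posteriors, key=posteriors.get)
```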

Methods based on feature vectors or multi-spectral data: supervised vs. unsupervised methods.
Supervised:
- Colour indicates known classes.
- The separation contour is found during the training phase.
- The separation contour is used for classification during the recall phase.
Unsupervised:
- No colour; classes are unknown.
- Clusters are found during the training phase.
- Associations with the clusters are made during the recall phase.

(Figure, Kovacevic 2001: PD-weighted and T2-weighted images; each voxel (x,y,z) plotted by its two intensities in PD-T2 intensity space.)

Suckling, 1999

3.3. Maximum likelihood classifier
- Assume the distribution P(I | Tj) in the Bayes rule can be modelled by a mixture of Gaussian (normal) distributions.
- Estimate the means and covariance matrices.
- For better results, use hidden Markov fields within neighbourhoods.
(Figure, Zavaljevski 2000: 15 classes.)
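A small sketch of the estimation step: fit a mean and covariance per class from hypothetical labelled [T1, T2] training voxels, then classify by the larger Gaussian log-likelihood (no mixture components or Markov fields in this toy version):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical labelled training voxels with [T1, T2] intensity features.
train = {"GM": rng.normal([80, 60], 5, size=(200, 2)),
         "WM": rng.normal([120, 40], 5, size=(200, 2))}

# Estimate mean vector and covariance matrix per class.
params = {c: (s.mean(axis=0), np.cov(s.T)) for c, s in train.items()}

def log_likelihood(x, mu, cov):
    """Gaussian log-likelihood up to an additive constant."""
    d = x - mu
    return -0.5 * (d @ np.linalg.solve(cov, d) + np.log(np.linalg.det(cov)))

def ml_classify(x):
    """Class with the maximum likelihood for feature vector x."""
    return max(params, key=lambda c: log_likelihood(np.asarray(x, float), *params[c]))
```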

3.3. Maximum likelihood classifier (figure, Zavaljevski 2000: normal subject vs. stroke patient)

3.4. k-NN, k-nearest-neighbour classifier (figure: hypothetical distribution in T1-T2 intensity space)
- k is always odd, 1 < k < 15 (computation time grows as k increases).
- Given a point p, find the k closest samples known from before.
- Decide on class m, where m is the class occurring most often among these k samples.
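The k-NN rule above can be sketched as follows, with hypothetical (T1, T2) training intensities:

```python
import numpy as np

def knn_classify(p, samples, labels, k=3):
    """Assign p the majority class among its k nearest training samples."""
    d = np.linalg.norm(samples - p, axis=1)
    nearest = labels[np.argsort(d)[:k]]
    classes, counts = np.unique(nearest, return_counts=True)
    return classes[np.argmax(counts)]

# Hypothetical (T1, T2) training intensities with known classes.
samples = np.array([[80, 60], [82, 58], [78, 61],
                    [120, 40], [118, 42], [121, 39]], float)
labels = np.array(["GM", "GM", "GM", "WM", "WM", "WM"])
```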

3.4. k-NN classifier using 5 MRIs with different contrasts (figure, Vrooman 2007: manual labels; atlas labels with linear registration; atlas labels with non-linear registration; k-NN results for k = 1 and k = 45)

3.5. ANN (MLP) classifier for segmentation: M = 3, i.e. 3 classes. MLP architecture: 1 layer gives a linear contour; more than 1 layer gives complex contours; the contours are used for class separation. The input is the feature vector; the transfer function is a sigmoid (figure shows weight matrices W1, W3).

3.5. ANN, MLP classifier: results (this page is intentionally left empty)

3.6. k-means classifier. This classifier is not used much in segmentation, but it is explained here as an introduction to fuzzy c-means.
Algorithm (k equals the number of classes):
- Choose k arbitrary initial seed points and treat them as the class centroids.
1. For each sample point j, find the distance to all k centroids; let j belong to class m if j is closest to centroid m.
2. For each of the k classes, recalculate the centroid.
Repeat steps 1 and 2 until the centroids no longer change; note how the class assignments change at each iteration.
Minimized measure: J = sum_j sum_{xi in class j} ||xi - cj||^2
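The two alternating steps can be sketched on 1-D toy intensities (the seed points and data below are made up):

```python
import numpy as np

def kmeans(x, k, seeds, n_iter=50):
    """Steps 1-2 above: assign each sample to its nearest centroid,
    then recompute the centroids, until they stop changing."""
    centroids = np.asarray(seeds, float)
    for _ in range(n_iter):
        d = np.abs(x[:, None] - centroids[None, :])     # 1-D distances
        assign = np.argmin(d, axis=1)                   # step 1
        new = np.array([x[assign == j].mean() for j in range(k)])  # step 2
        if np.allclose(new, centroids):
            break
        centroids = new
    return centroids, assign

# Two well-separated 1-D intensity clusters; arbitrary initial seeds.
x = np.array([10.0, 12.0, 11.0, 50.0, 52.0, 51.0])
centroids, assign = kmeans(x, 2, seeds=[0.0, 100.0])
```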

3.7. Fuzzy c-means (FCM) classifier, compared with the k-means classifier. FCM keeps a membership matrix U (rows: one per sample x; columns: one per class) and minimizes the cost Jm = sum_i sum_j uij^m ||xi - cj||^2.

3.7. Fuzzy c-means (FCM) classifier. Algorithm:
1. Initialize the membership matrix U = [uij] as U(0).
2. At step k, calculate the centre vectors C(k) = [cj] from U(k).
3. Update U(k) to U(k+1).
4. If ||U(k+1) - U(k)|| < epsilon then stop; otherwise return to step 2.
(Figure: initial state, iteration 8, iteration 37.)
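A compact sketch of this FCM loop on 1-D toy data, using the standard centre and membership updates for fuzziness m = 2 (the data, random initialization, and stopping tolerance are assumptions):

```python
import numpy as np

def fcm(x, c, m=2.0, n_iter=100, eps=1e-5, seed=0):
    """Fuzzy c-means on 1-D data: alternate the centre update
    cj = sum_i uij^m xi / sum_i uij^m and the membership update
    uij = 1 / sum_k (dij/dik)^(2/(m-1)) until ||U(k+1)-U(k)|| < eps."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)            # each row sums to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)    # step 2: centre vectors
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        new_u = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))).sum(axis=2)
        done = np.linalg.norm(new_u - u) < eps   # step 4: stopping test
        u = new_u                                # step 3: update U
        if done:
            break
    return centers, u

x = np.array([10.0, 11.0, 12.0, 50.0, 51.0, 52.0])
centers, u = fcm(x, 2)
```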

3.7. fuzzy c-means classifier Results

4. Validation. Important issues:
- The partial volume effect; visualization.
- Validation against a manually segmented image.
- Performance comparison with other methods on a simulated image, e.g. BrainWeb from McGill.

4. Validation (figures: partial volume effect for boundary separation, Clark 2006; segmented vs. gold standard, correct WM vs. misclassified, coloured by subject number, 10 subjects in total, Shattuck 2001)