MultiModality Registration Using Hilbert-Schmidt Estimators By: Srinivas Peddi Computer Integrated Surgery II April 27th, 2001 Final Presentation

Outline
- Brief refresher of my project
- Things accomplished in the project
- Things left to improve upon
- Future directions
- Conclusions

The Project [Figure: T1, PD, and T2 MR images] I want to be able to register different modalities of MR images accurately. This means coming up with a new registration algorithm and getting around the intensity-difference problem.

Original Maximal Deliverables To register the three different modalities accurately by first using Bayesian Segmentation to circumvent the intensity-difference problem. To compare this approach with other multimodality registration algorithms such as the Maximization of Mutual Information algorithm. To examine the feasibility of using the Hilbert-Schmidt algorithm in real-life applications.

What Has Been Achieved? [Figure: PD, T1, and T2 original images; their Bayesian segmentations; and their switched-intensity segmentations.] The processing pipeline is: Original images → Bayesian Segmentation → Switching of Intensities.
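The slides do not spell out the segmentation step, so the sketch below is only a minimal illustration of the idea, assuming each tissue class is modeled by a Gaussian intensity distribution with a class prior: each pixel receives its maximum a posteriori class label, and the labels are then switched to a common intensity code shared by all modalities. The class means, standard deviations, priors, and function names are hypothetical.

```python
import numpy as np

def bayesian_segment(image, class_means, class_stds, class_priors):
    """MAP-classify each pixel into one of K tissue classes, assuming a
    Gaussian intensity model and a prior probability for each class."""
    image = image.astype(float)
    log_post = []
    for mu, sigma, prior in zip(class_means, class_stds, class_priors):
        # log p(intensity | class) + log p(class), dropping constants
        log_post.append(-0.5 * ((image - mu) / sigma) ** 2
                        - np.log(sigma) + np.log(prior))
    return np.argmax(np.stack(log_post, axis=0), axis=0)  # label image

def switch_intensities(label_image, label_to_code):
    """Map tissue-class labels to a common intensity code so that the
    segmentations of different modalities share the same intensities."""
    out = np.zeros_like(label_image)
    for label, code in label_to_code.items():
        out[label_image == label] = code
    return out

# Hypothetical class models (background, CSF, gray matter, white matter):
# labels = bayesian_segment(pd_image, [10, 60, 110, 160],
#                           [15, 20, 20, 20], [0.4, 0.2, 0.2, 0.2])
# common = switch_intensities(labels, {0: 0, 1: 85, 2: 170, 3: 255})
```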

Registration Once the segmentation process is done and the intensities have been switched, we can actually do the registration. We apply the Hilbert-Schmidt algorithm, which uses a minimum mean-squared error (MMSE) estimator. Registration is achieved by finding the element of the special Euclidean group SE(n) that minimizes the error.
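The exact form of the estimator is not given on this slide, so the following is only a sketch of the 2D case under stated assumptions: an exhaustive search over rotation angle and translation that keeps the element of SE(2) minimizing the mean squared difference between the two switched segmentations. The search grid and the image names t1_switched and pd_switched are assumptions.

```python
import numpy as np
from scipy.ndimage import rotate, shift

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def register_se2(fixed, moving, angles, offsets):
    """Exhaustive search over a grid of SE(2) elements: return the rotation
    angle (degrees) and translation minimizing the mean squared error."""
    best_params, best_err = None, np.inf
    for theta in angles:
        rotated = rotate(moving, theta, reshape=False, order=1)
        for dy in offsets:
            for dx in offsets:
                err = mse(fixed, shift(rotated, (dy, dx), order=1))
                if err < best_err:
                    best_params, best_err = (theta, dy, dx), err
    return best_params, best_err

# Hypothetical usage on the switched segmentations:
# params, err = register_se2(t1_switched, pd_switched,
#                            angles=range(0, 360, 5), offsets=range(-10, 11, 2))
```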

Registering different modalities The maximal aim of the project has been achieved. At this point, we are able to register PD with T1 (which you saw in the checkpoint presentation), but we can now also register PD with T2 and T1 with T2. These registrations are possible at different noise levels as long as the segmentation is reasonable.

How accurate is this method? To examine how accurate something is, we must first define an error measure. The one that I will be using is called the Hilbert-Schmidt bound. A second thing that can help in getting an intuitive feel for the accuracy of the algorithm is having another algorithm to compare it to. In this presentation, I will be using the Maximization of Mutual Information algorithm.

What is the Hilbert-Schmidt bound? The Hilbert-Schmidt norm of a matrix is the square root of the sum of the squares of its entries (the Frobenius norm): ||A|| = ( sum_ij a_ij^2 )^(1/2). Example: for a matrix A whose entries include -1, -4, and -2, ||A|| = [ (-1)^2 + (-4)^2 + (-2)^2 + ... ]^(1/2) = 5. The Hilbert-Schmidt bound (HSB) is defined as the matrix norm of the difference between the true matrix transformation and the calculated matrix transformation.
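A minimal sketch of computing the HSB as just defined, assuming both the true and estimated transformations are expressed as 3x3 homogeneous matrices for 2D rigid motion; the helper se2_matrix is introduced here for illustration only.

```python
import numpy as np

def se2_matrix(theta_deg, tx, ty):
    """Homogeneous 3x3 matrix for a 2D rotation (degrees) plus translation."""
    t = np.deg2rad(theta_deg)
    return np.array([[np.cos(t), -np.sin(t), tx],
                     [np.sin(t),  np.cos(t), ty],
                     [0.0,        0.0,       1.0]])

def hilbert_schmidt_bound(true_tf, estimated_tf):
    """Hilbert-Schmidt (Frobenius) norm of the difference between the
    true and estimated transformation matrices."""
    return float(np.linalg.norm(true_tf - estimated_tf, ord='fro'))

# Example: a registration that is off by 2 degrees and 1 pixel
# hsb = hilbert_schmidt_bound(se2_matrix(30, 5, 0), se2_matrix(32, 6, 0))
```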

Maximization of Mutual Information This algorithm was implemented by Wells et al. at the SPL in 1996. Since then, it has become the registration tool of choice when doing multimodality registration. The algorithm attempts to find the registration by maximizing the information that one volumetric image provides about the other.
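For reference, mutual information between two images can be estimated from their joint intensity histogram, as in the sketch below. This is a plain histogram estimate for illustration, not the SPL implementation (which uses Parzen windowing and stochastic optimization); the bin count is an assumption.

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Estimate I(A; B) from the joint intensity histogram of two
    images of the same size."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)   # marginal of image A
    p_b = p_ab.sum(axis=0, keepdims=True)   # marginal of image B
    nonzero = p_ab > 0
    return float(np.sum(p_ab[nonzero] *
                        np.log(p_ab[nonzero] / (p_a @ p_b)[nonzero])))

# A registration driven by this measure would search over transformations
# and keep the one with the largest mutual_information(fixed, moved).
```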

Comparison of the two algorithms As one can see, the Hilbert-Schmidt algorithm outperforms the mutual information algorithm at high noise, while at low noise both register the images accurately.

Pros and Cons Accuracy: As mentioned, it seems that the Hilbert-Schmidt algorithm outperforms the mutual information algorithm in this category, especially at high noise levels. Speed: The Mutual Information algorithm runs much faster, at least for now, especially because it does not do much preprocessing. Generality: The Mutual Information algorithm assumes no a priori relationship between the two modalities and hence all modalities can be registered using the same algorithm. The Hilbert-Schmidt algorithm is striving to do the same.

Pros and Cons Cont’d Ease of use: Since the Mutual Information algorithm has fewer steps, or at least is better integrated, it is easier to use. The hope is that later the segmentation and the registration can be coupled in one program, in which case the Hilbert-Schmidt algorithm would also become easier to use. Robustness: Since the Mutual Information algorithm is essentially a simpler algorithm with fewer steps, it is very robust. With time, I hope to make the Hilbert-Schmidt algorithm as robust.

Improvements to be made The gradient descent algorithm has been implemented but can be improved upon, especially by using a ‘blurring’ algorithm and by selecting random points more wisely. The algorithm also needs to be extended so that one can register 3D volumes rather than just the 2D images we handle now.

Gradient Descent Algorithm What I presently do is pick a series of random points from 0 to 360 degrees and then march in the direction of increasing probability. It would be nice to add two layers of random points so that there would be a higher density of random points near the maximum. Being able to blur the image without shifting the maximum would also be helpful.
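A minimal sketch of the random-restart hill climb just described, assuming a generic score(theta) function stands in for the match probability; the number of starting points, step size, and iteration cap are assumptions.

```python
import numpy as np

def hill_climb(score, n_starts=10, step=1.0, max_iters=360, rng=None):
    """Random-restart hill climbing over a rotation angle in degrees.
    score(theta) is the objective to maximize (a stand-in for the
    match probability used in the project)."""
    rng = np.random.default_rng() if rng is None else rng
    best_theta, best_val = None, -np.inf
    for theta in rng.uniform(0.0, 360.0, size=n_starts):
        val = score(theta)
        for _ in range(max_iters):
            up, down = score(theta + step), score(theta - step)
            if up <= val and down <= val:
                break                      # local maximum reached
            theta, val = (theta + step, up) if up >= down else (theta - step, down)
        if val > best_val:
            best_theta, best_val = theta, val
    return best_theta, best_val

# theta, value = hill_climb(lambda t: -((t % 360.0) - 123.0) ** 2)
```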

Future Directions From the analysis so far, it seems that the Hilbert-Schmidt algorithm outperforms other algorithms at higher noise levels. Therefore, the HS algorithm might be very useful in functional modalities such as fMRI and PET, where noise levels are considerably higher than in MR or CT. The segmentation algorithm can also be improved, for example by incorporating windowing, so as to improve the accuracy of the overall algorithm.

Feasibility of HS algorithm in real-life circumstances The algorithm developed in this project definitely has potential for real-life applications. However, some key improvements have to be made first: the various steps of the protocol need to be integrated better, and the algorithm needs to be extended to 3D volumes. As far as speed goes, computers are becoming more powerful as we speak, and it does not seem that this will be a limiting factor in the application of this algorithm.

What have I learned? Projects are a lot of work, and many times things do not go according to plan; always have contingencies. Presenting ideas well is very important, almost as important as the project itself, and the presentation is also a lot more work than I thought. There are many tools out there that can make one’s life easier; it is much better to use these tools than to try to reinvent the wheel.

What would I do differently? Have a more detailed plan at the beginning of the project; there were points when I wasn’t sure if I was ahead of or behind schedule. Have a thorough understanding of the theory of the topic before starting to code; it can save a lot of time and effort! Efficiency and ease of use are two very important features of an algorithm, and I wish I had started working earlier on those aspects.

Conclusions I have learned a lot from this class, and there are some things that I would do differently if I were to start from the beginning again. However, all in all, I think that the project went relatively smoothly and that the Hilbert-Schmidt algorithm developed here has potential for success in the future. The most important conclusion to draw from this presentation, though, is that …

I’m DONE!!!