Aircraft Satellite Image Identification Using Bayesian Decision Theory and Moment Invariants Feature Extraction. Dickson Gichaga Wambaa.


Aircraft Satellite Image Identification Using Bayesian Decision Theory and Moment Invariants Feature Extraction. Dickson Gichaga Wambaa, supervised by Professor Elijah Mwangi. University of Nairobi, Electrical and Information Engineering Dept. 9th May 2012, IEK Presentation.

OUTLINE: Introduction, Statistical Classification, Satellite Images, Denoising, Results, Conclusion, References

All aircraft are built with the same basic elements: wings, engine(s), fuselage, mechanical controls, and tail assembly. Differences in these elements distinguish one aircraft type from another and therefore enable its identification.

STAGES OF STATISTICAL PATTERN RECOGNITION Problem formulation → Data collection and examination → Feature selection or extraction → Clustering → Discrimination → Assessment of results → Interpretation

Classification There are two main divisions of classification: supervised and unsupervised.

SUPERVISED CLASSIFICATION Bayes classification is selected since its optimisation can achieve extremely high posterior probability values, and hence reliable identification.

A decision rule partitions the measurement space into C regions, one for each of the C classes.

Preprocessing

PREPROCESSING Image acquisition → Image enhancement → Image binarization and thresholding → Feature extraction

NOISE Images are contaminated by noise through: imperfect instruments; problems with the data acquisition process; interference from natural phenomena; transmission errors.

SPECKLE NOISE (SPKN) The type of noise found in satellite images is speckle noise, and this determines the algorithm used in denoising.

Speckle Noise (SPKN) 2 This is a multiplicative noise. The noise model can be expressed as: J = I + n*I, where J is the observed speckle-noise image, I is the input image and n is a uniform noise image.
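The multiplicative model above can be sketched in a few lines of numpy. The function name `add_speckle`, and treating `noise_prob` as the amplitude of the uniform noise, are illustrative assumptions, not details from the presentation:

```python
import numpy as np

def add_speckle(image, noise_prob=0.1, rng=None):
    """Simulate multiplicative speckle: J = I + n*I.

    n is drawn uniformly in [-noise_prob, noise_prob]; using
    noise_prob as the noise amplitude is an assumption of this sketch.
    """
    rng = np.random.default_rng(rng)
    n = rng.uniform(-noise_prob, noise_prob, size=image.shape)
    return image + n * image  # equivalently I * (1 + n)

I = np.full((4, 4), 100.0)                 # flat test patch
J = add_speckle(I, noise_prob=0.2, rng=0)  # speckled version
```

Because the noise multiplies the signal, dark regions are perturbed less than bright ones, which is why speckle needs different treatment from additive Gaussian noise.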

CHOICE OF FILTER Filtering consists of moving a window over each pixel of an image and applying a mathematical function to the window to achieve a smoothing effect.

CHOICE OF FILTER II The mathematical function determines the filter type: the mean filter averages the window pixels; the median filter takes the median of the window pixels.
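One windowed-filter sketch covers both cases: pass np.mean for the mean filter or np.median for the median filter. The helper name `window_filter` and the reflected border handling are assumptions of this sketch:

```python
import numpy as np

def window_filter(img, size=3, stat=np.median):
    """Move a size x size window over each pixel and replace the pixel
    with a statistic of the window: np.mean gives the mean filter,
    np.median the median filter. Borders are reflected so the output
    keeps the input shape."""
    pad = size // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = stat(padded[i:i + size, j:j + size])
    return out
```

On an image with a single impulse pixel, the median filter removes the spike entirely while the mean filter only spreads it out, which is why the median is preferred for impulsive noise.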

CHOICE OF FILTER III The Lee-Sigma and Lee filters use the statistical distribution of pixels in the window; the local region filter compares the variances of window regions; the Frost filter replaces the pixel of interest with a weighted sum of the values within an N×N moving window, assuming multiplicative noise with stationary statistics.

LEE FILTER The adaptive Lee filter converts the multiplicative noise model into an additive one. It preserves edges and detail.
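A minimal sketch of the Lee idea: each pixel is pulled toward its local window mean by a gain k = var/(var + noise_var), so flat (noise-only) regions are averaged strongly while high-variance edges pass through almost unchanged. The `noise_var` parameter and the border handling are assumptions of this sketch, not the presentation's exact filter:

```python
import numpy as np

def lee_filter(img, size=3, noise_var=0.05):
    """Sketch of an adaptive Lee filter: out = m + k*(x - m) with
    k = var/(var + noise_var) computed per window, so smoothing
    adapts to local variance (strong in flat areas, weak at edges)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = padded[i:i + size, j:j + size]
            m, v = w.mean(), w.var()
            k = v / (v + noise_var)
            out[i, j] = m + k * (img[i, j] - m)
    return out
```

In a perfectly flat region the local variance is zero, so k = 0 and the output is the window mean; at a sharp step the variance dominates noise_var, k approaches 1, and the edge is kept.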

BINARIZATION AND THRESHOLDING
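As a sketch of this step, a simple global binarization. Using the image mean as the default threshold is an illustrative choice, not necessarily the presentation's method (which could instead use, e.g., Otsu's threshold):

```python
import numpy as np

def binarize(img, threshold=None):
    """Return a 0/1 image: pixels strictly above the threshold become 1.
    If no threshold is given, the image mean is used as a simple
    automatic default (an assumption of this sketch)."""
    if threshold is None:
        threshold = img.mean()
    return (img > threshold).astype(np.uint8)
```

The resulting binary silhouette is what the moment-invariant features are computed from.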

TRAINING DATA SET

RESULTS: FEATURE EXTRACTION, ORIGINAL IMAGES Table: the seven moment invariants Ø1–Ø7 for each aircraft class: B2 (Class 1), AH64 (Class 2), C5 (Class 3). (The numeric entries were lost in the transcript.)
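The features Ø1–Ø7 are Hu's seven moment invariants, computed from scale-normalized central moments. A sketch computing the first two (Ø1, Ø2) with numpy; extending to all seven follows the same pattern. The function name and the test shape are assumptions of this sketch:

```python
import numpy as np

def hu_invariants_2(img):
    """First two Hu moment invariants from normalized central moments:
    phi1 = eta20 + eta02, phi2 = (eta20 - eta02)^2 + 4*eta11^2.
    Both are invariant to translation, scale and rotation."""
    img = np.asarray(img, dtype=float)
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):                # central moment mu_pq
        return ((x - xc) ** p * (y - yc) ** q * img).sum()

    def eta(p, q):               # scale-normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2
```

Rotating a binary shape by 90 degrees leaves both invariants unchanged, which is exactly why they suit aircraft silhouettes seen at arbitrary headings.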

NOISE ADDITION Noise with probabilities of 0.1, 0.2, 0.3 and 0.4 was used for simulation.

FEATURE EXTRACTION: SAMPLE IMAGES Table: moment invariants Ø1–Ø7 for the B2-class test image after noise filtering and for test images with noise probabilities 0.1, 0.2, 0.3 and 0.4. (The numeric entries were lost in the transcript.)

WHY BAYES CLASSIFICATION 1 The Bayes statistical method is the classifier of choice because of its minimum error rate.

WHY BAYES CLASSIFICATION 2 Probabilistic learning: among the most practical approaches to certain types of learning problems Incremental: Each training example can incrementally increase/decrease the probability that a hypothesis is correct

WHY BAYES CLASSIFICATION 3 Probabilistic prediction: Predict multiple hypotheses Benchmark: Provide a benchmark for other algorithms

Bayesian Classification For a minimum-error-rate classifier, the choice is the class with the maximum posterior probability.

Probabilities Let λ be the set of 3 classes C1, C2, C3, and let x be an unknown feature vector of dimension 7. Calculate the conditional posterior probability of each class Ci and choose the class with the maximum posterior probability.

Prior Probabilities There are 3 classes of data, all equally likely, therefore P(Ci) = 1/3 ≈ 0.333.

Posterior Probability 1 posterior = (likelihood × prior) / evidence, i.e. P(Ci|x) = P(x|Ci) P(Ci) / P(x).
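In code, the evidence P(x) is just the sum of likelihood × prior over the classes. A sketch of the rule; the likelihood values in the example are made up for illustration, not taken from the presentation's results:

```python
import numpy as np

def posteriors(likelihoods, priors):
    """Bayes' rule per class: P(C_i|x) = P(x|C_i) P(C_i) / P(x),
    where the evidence P(x) = sum_j P(x|C_j) P(C_j)."""
    joint = np.asarray(likelihoods, dtype=float) * np.asarray(priors, dtype=float)
    return joint / joint.sum()

# Three equally likely classes, as in the presentation: P(C_i) = 1/3.
# The likelihood values below are illustrative only.
post = posteriors([0.02, 0.05, 0.01], [1 / 3, 1 / 3, 1 / 3])
best = int(np.argmax(post))   # minimum-error-rate decision
```

With equal priors the decision reduces to picking the maximum likelihood; here class index 1 wins with posterior 0.05/0.08 = 0.625.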

POSTERIOR PROBABILITY 2 Posterior(AH64) = P(AH64) P(x|AH64) / P(evidence); Posterior(C5) = P(C5) P(x|C5) / P(evidence); Posterior(B2) = P(B2) P(x|B2) / P(evidence).

POSTERIOR PROBABILITY 3 Table: posterior probabilities (scaled by 10^-2) for the AH64, C5 and B2 classes on the noise-filtered test image and on test images with noise probabilities 0.1, 0.2, 0.3 and 0.4. (The numeric entries were lost in the transcript.)

CONCLUSION Combining moment-invariant feature extraction with Bayesian classification, while using Lee filters in preprocessing, increases the chances of correct identification compared with not using filters or using other filter types. This is seen in the increase of the posterior probability values.

References
[1] R. O. Duda, P. E. Hart and D. G. Stork, Pattern Classification, 2nd edition, John Wiley and Sons, US, 2007.
[2] R. C. Gonzalez, R. E. Woods and S. L. Eddins, Digital Image Processing Using MATLAB, 2nd edition, Pearson/Prentice Hall, US, 2004.
[3] W. K. Pratt, Digital Image Processing, 4th edition, John Wiley, US, 2007.
[4] A. K. Jain, Fundamentals of Digital Image Processing, Prentice Hall, US, 1989.

References
[5] Wei Cao and Shaoliang Meng, "Imaging Systems and Techniques", IEEE International Workshop (IST), Shenzhen, 2009.
[6] N. Bouguila and T. Elguebaly, "A Bayesian approach for texture images classification and retrieval", International Conference on Multimedia Computing and Systems (ICMS), pp. 1-6, Canada, 2011.

References
[7] M. Dixit, N. Rasiwasia and N. Vasconcelos, "Adapted Gaussian models for image classification", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), USA, 2011.
[8] M. C. Motwani, M. C. Gadiya, R. C. Motwani and F. C. Harris Jr., "Survey of Image Denoising Techniques", University of Nevada Reno, US, 2001.