
1
Aircraft Satellite Image Identification Using Bayesian Decision Theory and Moment Invariants Feature Extraction
Dickson Gichaga Wambaa
Supervised by Professor Elijah Mwangi
University of Nairobi, Electrical and Information Engineering Dept.
IEK Presentation, 9th May 2012

2
OUTLINE Introduction Statistical Classification Satellite images Denoising Results Conclusion References

3
All aircraft are built with the same basic elements: wings, engine(s), fuselage, mechanical controls, and tail assembly. Differences in these elements distinguish one aircraft type from another and therefore enable its identification.

4
STAGES OF STATISTICAL PATTERN RECOGNITION
1. Problem formulation
2. Data collection and examination
3. Feature selection or extraction
4. Clustering
5. Discrimination
6. Assessment of results
7. Interpretation

5
Classification 1 There are two main divisions of classification: supervised and unsupervised.

6
SUPERVISED CLASSIFICATION Bayes classification is selected since its optimisation can achieve the minimum error rate.

7
A decision rule partitions the measurement space into C regions.

8
Preprocessing

9
PREPROCESSING
1. Image acquisition
2. Image enhancement
3. Image binarization and thresholding
4. Feature extraction

10
NOISE Images are contaminated by noise through:
– imperfect instruments
– problems with the data acquisition process
– natural phenomena interference
– transmission errors

11
SPECKLE NOISE (SPKN) The type of noise found in satellite images is speckle noise, and this determines the algorithm used in denoising.

12
Speckle Noise (SPKN) 2 This is a multiplicative noise. The noise distribution can be expressed by: J = I + n*I, where J is the speckle-noise image, I is the input image, and n is the uniform noise image.
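The multiplicative model above can be simulated as follows. This is a minimal numpy sketch; parameterising n as zero-mean uniform noise whose variance equals the chosen noise level is an assumption, mirroring common speckle simulators rather than anything stated in the presentation.

```python
import numpy as np

def add_speckle(image, noise_var=0.1, rng=None):
    """Simulate multiplicative speckle noise: J = I + n*I,
    where n is a zero-mean uniform noise image with the given variance."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Uniform noise on [-a, a] has variance a**2 / 3
    a = np.sqrt(3.0 * noise_var)
    n = rng.uniform(-a, a, size=image.shape)
    return image + n * image

I = np.full((64, 64), 0.5)        # flat grey test image
J = add_speckle(I, noise_var=0.1)  # noisy observation
```

Because n has zero mean, the noisy image keeps the original mean intensity while its local variance grows with the signal, which is the defining property of multiplicative noise.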

13
CHOICE OF FILTER Filtering consists of moving a window over each pixel of an image and applying a mathematical function to achieve a smoothing effect.

14
CHOICE OF FILTER II The mathematical function determines the filter type. Mean filter: averages the window pixels. Median filter: takes the median of the window pixels.
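The mean and median filters can both be sketched with one sliding-window helper, where the function applied to the window determines the filter type. A minimal numpy illustration; the 3×3 window size and edge padding are assumptions.

```python
import numpy as np

def window_filter(image, size=3, func=np.mean):
    """Slide a size x size window over each pixel and replace the pixel
    with func of the window (np.mean -> mean filter, np.median -> median)."""
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = func(padded[i:i + size, j:j + size])
    return out

# Single bright outlier: the mean spreads it, the median removes it.
img = np.array([[0, 0, 0], [0, 9, 0], [0, 0, 0]], dtype=float)
mean_out = window_filter(img, func=np.mean)      # centre becomes 1.0
median_out = window_filter(img, func=np.median)  # centre becomes 0.0
```

The outlier example shows why the median is preferred for impulse-like noise: it discards extreme window values instead of averaging them in.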

15
CHOICE OF FILTER III Lee-Sigma and Lee filters: use the statistical distribution of pixels in the window. Local region filter: compares the variances of window regions. Frost filter: replaces the pixel of interest with a weighted sum of the values within an N×N moving window, assuming multiplicative noise and stationary noise statistics.

16
LEE FILTER Adaptive Lee filter converts the multiplicative model into an additive one. It preserves edges and detail.
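A minimal sketch of the adaptive Lee filter idea: in each window, blend the local mean with the centre pixel according to local variance, so flat regions are smoothed while high-variance regions (edges, detail) pass through. The gain formula k = var/(var + noise_var) and the noise_var value are assumptions; the presentation does not give the exact formulation used.

```python
import numpy as np

def lee_filter(image, size=3, noise_var=0.05):
    """Adaptive Lee-style filter sketch: output = mean + k*(pixel - mean),
    with gain k near 0 in flat regions (strong smoothing) and near 1
    where local variance is high (edges and detail preserved)."""
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')
    out = np.empty_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            w = padded[i:i + size, j:j + size]
            local_mean = w.mean()
            local_var = w.var()
            k = local_var / (local_var + noise_var)
            out[i, j] = local_mean + k * (image[i, j] - local_mean)
    return out

flat = lee_filter(np.full((5, 5), 2.0))  # flat input is returned unchanged
```

On a perfectly flat region the local variance is zero, so k = 0 and the filter returns the local mean, which is the additive-model behaviour the slide describes.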

17
BINARIZATION AND THRESHOLDING
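A minimal global-threshold sketch of this step. Using the image mean as the default threshold is an assumption for illustration; the presentation does not state which thresholding rule was used.

```python
import numpy as np

def binarize(image, threshold=None):
    """Global binarization: pixels above the threshold map to 1, the rest
    to 0. The default threshold (image mean) is a simple illustrative choice."""
    if threshold is None:
        threshold = image.mean()
    return (image > threshold).astype(np.uint8)

binary = binarize(np.array([[0.1, 0.9], [0.2, 0.8]]))  # -> [[0, 1], [0, 1]]
```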

18
TRAINING DATA SET

19
RESULTS: FEATURE EXTRACTION, ORIGINAL IMAGES

Aircraft (Class)   Ø1       Ø2       Ø3       Ø4       Ø5       Ø6       Ø7
B2   (Class 1)     6.6132   14.0538  15.2462  17.4521  33.9469  24.6798  39.2648
AH64 (Class 2)     7.1729   16.6723  19.7413  21.8784  42.8038  30.2146  47.1336
C5   (Class 3)     7.1487   20.2793  22.4129  24.4962  48.0614  34.6401  50.1980
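The seven features Ø1–Ø7 are presumably the seven Hu moment invariants; a self-contained numpy sketch of their computation follows, using the standard Hu definitions (the log scaling often applied to bring the values to magnitudes like those in the table is omitted here).

```python
import numpy as np

def hu_moments(image):
    """Compute the seven Hu moment invariants of a 2-D intensity image,
    invariant to translation, scale, and rotation."""
    y, x = np.mgrid[:image.shape[0], :image.shape[1]].astype(float)
    m00 = image.sum()
    xc, yc = (x * image).sum() / m00, (y * image).sum() / m00

    def mu(p, q):  # central moment
        return (((x - xc) ** p) * ((y - yc) ** q) * image).sum()

    def eta(p, q):  # scale-normalised central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    return np.array([
        n20 + n02,
        (n20 - n02) ** 2 + 4 * n11 ** 2,
        (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2,
        (n30 + n12) ** 2 + (n21 + n03) ** 2,
        (n30 - 3 * n12) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
        + (3 * n21 - n03) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2),
        (n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
        + 4 * n11 * (n30 + n12) * (n21 + n03),
        (3 * n21 - n03) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
        - (n30 - 3 * n12) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2),
    ])
```

Rotating an aircraft silhouette leaves these seven values (nearly) unchanged, which is what makes them usable as class features regardless of the aircraft's heading in the satellite image.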

20
NOISE ADDITION Noise with probabilities of 0.1, 0.2, 0.3 and 0.4 was used for simulation.

21
FEATURE EXTRACTION: SAMPLE IMAGES (B2, Class 1)

Image                         Ø1       Ø2       Ø3       Ø4       Ø5       Ø6       Ø7
Original                      6.6132   14.0538  15.2462  17.4521  33.9469  24.6798  39.2648
Test image, noise filtered    6.6001   13.9810  15.1678  17.4434  33.8456  24.6578  40.9765
Test image (0.1 noise prob)   6.5579   13.9115  15.0382  17.2442  33.5329  24.4031  41.0169
Test image (0.2 noise prob)   6.5406   13.8898  14.9673  17.1737  33.3923  24.3223  38.6145
Test image (0.3 noise prob)   6.4703   13.7136  14.6351  16.8403  32.7292  23.9045  38.4642
Test image (0.4 noise prob)   6.4124   13.5763  14.2593  16.4614  31.9765  23.4609  36.7216

22
WHY BAYES CLASSIFICATION 1 The Bayes statistical method is the classifier of choice because of its minimum error rate.

23
WHY BAYES CLASSIFICATION 2 Probabilistic learning: among the most practical approaches to certain types of learning problems Incremental: Each training example can incrementally increase/decrease the probability that a hypothesis is correct

24
WHY BAYES CLASSIFICATION 3 Probabilistic prediction: Predict multiple hypotheses Benchmark: Provide a benchmark for other algorithms

25
Bayesian Classification For a minimum error rate classifier the choice is on the class with maximum posterior probability.

26
Probabilities Let λ be the set of 3 classes C1, C2, C3, and let x be an unknown feature vector of dimension 7. Calculate the conditional posterior probability of every class Ci and choose the class with the maximum posterior probability.

27
Prior Probabilities There are 3 classes of data, all equally likely to occur, therefore P(Ci) = 1/3 ≈ 0.333.

28
Posterior Probability 1
Posterior = (likelihood × prior) / evidence
P(Ci|x) = P(x|Ci) P(Ci) / P(x)
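The rule above can be sketched for the three classes by modelling each class likelihood P(x|Ci) as a diagonal Gaussian over the moment features. That likelihood form is a modelling assumption, and the means and variances below are illustrative values, not the presentation's.

```python
import numpy as np

def posteriors(x, class_means, class_vars, priors):
    """Bayes rule: P(Ci|x) = P(x|Ci) P(Ci) / P(x), with P(x|Ci) modelled
    as a product of per-feature Gaussian densities (diagonal covariance)."""
    likelihoods = []
    for mu, var in zip(class_means, class_vars):
        l = np.prod(np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var))
        likelihoods.append(l)
    joint = np.array(likelihoods) * np.array(priors)  # P(x|Ci) P(Ci)
    return joint / joint.sum()                        # divide by evidence P(x)

# Illustrative 2-feature class models with equal priors P(Ci) = 1/3
means = [np.array([6.6, 14.0]), np.array([7.2, 16.7]), np.array([7.1, 20.3])]
vars_ = [np.ones(2) * 0.5] * 3
p = posteriors(np.array([6.6, 14.1]), means, vars_, [1 / 3, 1 / 3, 1 / 3])
best = int(np.argmax(p))  # minimum-error-rate decision: maximum posterior
```

Choosing the arg-max posterior implements the minimum-error-rate classifier of the previous slide; the evidence term only normalises and does not change the decision.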

29
POSTERIOR PROBABILITY 2
Posterior(AH64) = P(AH64) P(x|AH64) / P(x)
Posterior(C5) = P(C5) P(x|C5) / P(x)
Posterior(B2) = P(B2) P(x|B2) / P(x)

30
POSTERIOR PROBABILITY 3

Posterior      Noise filtered  0.1 noise prob  0.2 noise prob  0.3 noise prob  0.4 noise prob
AH64           1.6954×10⁻²     1.6789×10⁻²     1.6034×10⁻²     1.5674×10⁻²     1.5045×10⁻²
C5             1.9653×10⁻²     1.8965×10⁻²     1.8463×10⁻²     1.8062×10⁻²     1.7453×10⁻²
B2             2.4239×10⁻²     2.2346×10⁻²     2.21567×10⁻²    2.1866×10⁻²     1.9889×10⁻²

31
CONCLUSION Combining moment-features extraction with Bayesian classification, while using Lee filters in preprocessing, increases the chances of correct identification compared to using no filters or other types of filters. This is seen in the increase of the posterior probability values.

32
References [1] Richard O. Duda, Peter E. Hart, and David G. Stork, Pattern Classification, 2nd edition, John Wiley and Sons, US, 2007. [2] Rafael C. Gonzalez, Richard E. Woods, and Steven L. Eddins, Digital Image Processing Using MATLAB, 2nd edition, Pearson/Prentice Hall, US, 2004. [3] William K. Pratt, Digital Image Processing, 4th edition, John Wiley, US, 2007. [4] Anil K. Jain, Fundamentals of Digital Image Processing, Prentice Hall, US, 1989.

33
References [5] Wei Cao and Shaoliang Meng, "Imaging Systems and Techniques", IEEE International Workshop, IST.2009.5071625, pp. 164-167, Shenzhen, 2009. [6] N. Bouguila and T. Elguebaly, "A Bayesian approach for texture images classification and retrieval", International Conference on Multimedia Computing and Systems, ICMS.2011.5945719, pp. 1-6, Canada, 2011.

34
References [7] M. Dixit, N. Rasiwasia, and N. Vasconcelos, "Adapted Gaussian models for image classification", 2011 IEEE Conference on Computer Vision and Pattern Recognition, CVPR.2011.5995674, pp. 937-943, USA, 2011. [8] Mukesh C. Motwani, Mukesh C. Gadiya, Rakhi C. Motwani, and Frederick C. Harris Jr., "Survey of Image Denoising Techniques", University of Nevada Reno, US, 2001.
