
Aircraft Satellite Image Identification Using Bayesian Decision Theory And Moment Invariants Feature Extraction

Dickson Gichaga Wambaa, supervised by Professor Elijah Mwangi, University of Nairobi, Electrical and Information Engineering Dept.

9th May 2012 IEK Presentation


Presentation Transcript


  1. Presentation in Aircraft Satellite Image Identification Using Bayesian Decision Theory And Moment Invariants Feature Extraction • Dickson Gichaga Wambaa • Supervised By Professor Elijah Mwangi • University Of Nairobi, Electrical And Information Engineering Dept. • 9th May 2012 IEK Presentation

  2. All aircraft are built with the same basic elements: • Wings • Engine(s) • Fuselage • Mechanical Controls • Tail assembly. • Differences in these elements distinguish one aircraft type from another and therefore enable its identification.

  3. STAGES OF STATISTICAL PATTERN RECOGNITION • PROBLEM FORMULATION • DATA COLLECTION AND EXAMINATION • FEATURE SELECTION OR EXTRACTION • CLUSTERING • DISCRIMINATION • ASSESSMENT OF RESULTS • INTERPRETATION

  4. Classification ONE • There are two main divisions of classification: • Supervised • Unsupervised

  5. SUPERVISED CLASSIFICATION • BAYES CLASSIFICATION IS SELECTED BECAUSE IT MINIMISES THE PROBABILITY OF CLASSIFICATION ERROR BY CHOOSING THE CLASS WITH THE MAXIMUM POSTERIOR PROBABILITY.

  6. A decision rule partitions the measurement space into C regions.
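
A brief worked form of this statement (not on the original slide), assuming the maximum-posterior decision rule used later in the presentation: the region of measurement space assigned to class Ci is

```latex
% Region R_i assigned to class C_i: a measurement x lands in R_i when
% C_i has the largest posterior probability among the C classes.
R_i = \{\, x \;:\; P(C_i \mid x) > P(C_j \mid x) \ \text{for all } j \neq i \,\}, \qquad i = 1, \dots, C
```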

  7. Preprocessing

  8. PREPROCESSING • IMAGE ACQUISITION • IMAGE ENHANCEMENT • IMAGE BINARIZATION AND THRESHOLDING • FEATURE EXTRACTION

  9. NOISE IMAGES ARE CONTAMINATED BY NOISE THROUGH • IMPERFECT INSTRUMENTS • PROBLEMS WITH DATA ACQUISITION PROCESS • NATURAL PHENOMENA INTERFERENCE • TRANSMISSION ERRORS

  10. SPECKLE NOISE(SPKN) • THE TYPE OF NOISE FOUND IN SATELLITE IMAGES IS SPECKLE NOISE AND THIS DETERMINES THE ALGORITHM USED IN DENOISING.

  11. Speckle Noise (SPKN) 2 • This is a multiplicative noise. The speckle noise distribution can be expressed by: J = I + n*I • where J is the speckle-noise image, I is the input image and n is the uniform noise image.
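
As an illustrative sketch (not part of the original slides), the multiplicative model J = I + n*I can be simulated in NumPy; the function name add_speckle and the use of zero-mean uniform noise scaled by a level parameter are assumptions made here for demonstration:

```python
import numpy as np

def add_speckle(image: np.ndarray, level: float) -> np.ndarray:
    """Simulate speckle noise with the multiplicative model J = I + n*I.

    `image` is assumed to lie in [0, 1]; `n` is zero-mean uniform noise whose
    spread is controlled by `level` (an assumed parameterisation).
    """
    n = np.random.uniform(-1.0, 1.0, size=image.shape) * level
    return np.clip(image + n * image, 0.0, 1.0)
```

Calling add_speckle(image, level) for level values of 0.1, 0.2, 0.3 and 0.4 produces the kind of simulated noise levels mentioned later on slide 19.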

  12. CHOICE OF FILTER • FILTERING CONSISTS OF MOVING A WINDOW OVER EACH PIXEL OF AN IMAGE AND APPLYING A MATHEMATICAL FUNCTION TO ACHIEVE A SMOOTHING EFFECT.

  13. CHOICE OF FILTER II • THE MATHEMATICAL FUNCTION DETERMINES THE FILTER TYPE. • MEAN FILTER - REPLACES THE CENTRE PIXEL WITH THE AVERAGE OF THE WINDOW PIXELS • MEDIAN FILTER - REPLACES THE CENTRE PIXEL WITH THE MEDIAN OF THE WINDOW PIXELS
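
A minimal sketch of the window-based mean and median filters using SciPy; the 3x3 window size is an assumption, not a value taken from the slides:

```python
import numpy as np
from scipy import ndimage

def mean_filter(image: np.ndarray, size: int = 3) -> np.ndarray:
    # Replace each pixel with the average of its size x size window.
    return ndimage.uniform_filter(image, size=size)

def median_filter(image: np.ndarray, size: int = 3) -> np.ndarray:
    # Replace each pixel with the median of its size x size window.
    return ndimage.median_filter(image, size=size)
```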

  14. CHOICE OF FILTER III • LEE-SIGMA AND LEE FILTERS - USE THE STATISTICAL DISTRIBUTION OF PIXELS IN THE WINDOW • LOCAL REGION FILTER - COMPARES THE VARIANCES OF WINDOW REGIONS • FROST FILTER - REPLACES THE PIXEL OF INTEREST WITH A WEIGHTED SUM OF THE VALUES WITHIN THE n x n MOVING WINDOW, ASSUMING MULTIPLICATIVE AND STATIONARY NOISE STATISTICS.

  15. LEE FILTER Adaptive Lee filter converts the multiplicative model into an additive one. It preserves edges and detail.
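
A minimal sketch of an adaptive Lee filter, assuming the common local-statistics formulation (local mean and variance in a square window plus a user-supplied noise variance); this is an illustration under those assumptions, not the exact implementation used in the project:

```python
import numpy as np
from scipy import ndimage

def lee_filter(image: np.ndarray, size: int = 7, noise_var: float = 0.01) -> np.ndarray:
    """Adaptive Lee filter: smooth flat regions while preserving edges and detail."""
    local_mean = ndimage.uniform_filter(image, size=size)
    local_sq_mean = ndimage.uniform_filter(image * image, size=size)
    local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)

    # Gain is close to 1 in high-variance (edge/detail) regions, so the pixel is
    # kept; close to 0 in flat regions, so the local mean dominates.
    gain = local_var / (local_var + noise_var)
    return local_mean + gain * (image - local_mean)
```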

  16. BINARIZATION AND THRESHOLDING
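
The slides do not state which thresholding method was used; as one possibility, Otsu's global threshold is a common way to obtain a binary aircraft silhouette, sketched here under that assumption:

```python
import numpy as np
from skimage.filters import threshold_otsu

def binarize(image: np.ndarray) -> np.ndarray:
    # Choose a global threshold with Otsu's method (an assumed choice) and map
    # pixels above it to 1 (object) and the rest to 0 (background).
    t = threshold_otsu(image)
    return (image > t).astype(np.uint8)
```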

  17. TRAINING DATA SET: AH64, C5, B2 (sample images).

  18. RESULTS: FEATURE EXTRACTION (ORIGINAL IMAGES)

  19. NOISE ADDITION • Noise with Probabilities of 0.1, 0.2, 0.3 and 0.4 was used for simulation.

  20. FEATURE EXTRACTION: SAMPLE IMAGES
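
The presentation title names moment invariants and slide 25 uses a 7-dimensional feature vector, which is consistent with Hu's seven moment invariants. A minimal OpenCV-based sketch, assuming that feature set and an assumed log-scaling step:

```python
import cv2
import numpy as np

def moment_invariant_features(binary_image: np.ndarray) -> np.ndarray:
    """Return a 7-element feature vector of Hu moment invariants."""
    m = cv2.moments(binary_image.astype(np.uint8), binaryImage=True)
    hu = cv2.HuMoments(m).flatten()  # seven invariants, shape (7,)
    # Log-scaling compresses the large dynamic range of the raw invariants
    # (a common post-processing step, assumed here).
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
```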

  21. WHY BAYES CLASSIFICATION 1 • The Bayes statistical method is the classification of choice because of its minimum error rate.

  22. WHY BAYES CLASSIFICATION 2 • Probabilistic learning: among the most practical approaches to certain types of learning problems • Incremental: Each training example can incrementally increase/decrease the probability that a hypothesis is correct

  23. WHY BAYES CLASSIFICATION 3 • Probabilistic prediction: Predict multiple hypotheses • Benchmark: Provide a benchmark for other algorithms

  24. Bayesian Classification • For a minimum error rate classifier, the choice is the class with the maximum posterior probability.

  25. Probabilities • Let λ be a set of 3 classes C1, C2, C3. • Let x be an unknown feature vector of dimension 7. • Calculate the conditional posterior probability of every class Ci and choose the class with the maximum a posteriori probability.

  26. Prior Probabilities • 3 classes of data which are all equally likely, therefore P(Ci) = 1/3 ≈ 0.333

  27. Posterior Probability 1 • Posterior = (likelihood × prior) / evidence • P(Ci|x) = P(x|Ci) P(Ci) / P(x)

  28. POSTERIOR PROBABILITY 2 • Posterior(AH64) = P(AH64) P(x|AH64) / P(x) • Posterior(C5) = P(C5) P(x|C5) / P(x) • Posterior(B2) = P(B2) P(x|B2) / P(x)
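
A minimal sketch of the three-class decision, assuming Gaussian class-conditional densities fitted to the 7-dimensional training feature vectors; the Gaussian likelihood model and the helper names are assumptions, while the equal priors and the maximum-posterior rule come from the slides:

```python
import numpy as np
from scipy.stats import multivariate_normal

CLASSES = ["AH64", "C5", "B2"]
PRIOR = 1.0 / 3.0  # equal prior probability for each class (slide 26)

def fit_class_models(features_per_class):
    """features_per_class: dict mapping class name -> (N, 7) array of features."""
    models = {}
    for name, feats in features_per_class.items():
        mean = feats.mean(axis=0)
        # Small diagonal term keeps the covariance invertible for tiny training sets.
        cov = np.cov(feats, rowvar=False) + 1e-6 * np.eye(feats.shape[1])
        models[name] = multivariate_normal(mean=mean, cov=cov)
    return models

def classify(x, models):
    """Pick the class with maximum posterior P(Ci|x) = P(x|Ci) P(Ci) / P(x)."""
    scores = {name: PRIOR * models[name].pdf(x) for name in CLASSES}
    evidence = sum(scores.values())  # P(x), common to all classes
    posteriors = {name: s / evidence for name, s in scores.items()}
    return max(posteriors, key=posteriors.get), posteriors
```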

  29. POSTERIOR PROBABILITY 3

  30. CONCLUSION • COMBINING MOMENT INVARIANTS FEATURE EXTRACTION WITH BAYESIAN CLASSIFICATION, WHILE USING LEE FILTERS IN PREPROCESSING, INCREASES THE CHANCES OF CORRECT IDENTIFICATION COMPARED TO NOT USING THE FILTERS OR USING OTHER TYPES OF FILTERS. • THIS IS SEEN IN THE INCREASE OF THE POSTERIOR PROBABILITY VALUES.

  31. References • [1] Richard O. Duda, Peter E. Hart and David G. Stork, Pattern Classification, 2nd edition, John Wiley and Sons, US, 2007. • [2] Rafael C. Gonzalez, Richard E. Woods and Steven L. Eddins, Digital Image Processing Using MATLAB, 2nd edition, Pearson/Prentice Hall, US, 2004. • [3] William K. Pratt, Digital Image Processing, 4th edition, John Wiley, US, 2007. • [4] Anil K. Jain, Fundamentals of Digital Image Processing, Prentice Hall, US, 1989.

  32. References • [5] Wei Cao and Shaoliang Meng, “Imaging Systems and Techniques”, IEEE International Workshop, IST.2009.5071625, pp. 164-167, Shenzhen, 2009. • [6] N. Bouguila and T. Elguebaly, “A Bayesian approach for texture images classification and retrieval”, International Conference on Multimedia Computing and Systems, ICMS.2011.5945719, pp. 1-6, Canada, 2011.

  33. References • [7] M. Dixit, N. Rasiwasia and N. Vasconcelos, “Adapted Gaussian models for image classification”, 2011 IEEE Conference on Computer Vision and Pattern Recognition, CVPR.2011.5995674, pp. 937-943, USA, 2011. • [8] Mukesh C. Motwani, Mukesh C. Gadiya, Rakhi C. Motwani and Frederick C. Harris Jr., “Survey of Image Denoising Techniques”, University of Nevada Reno, US, 2001.
