
1 Performance Measurement of Image Processing Algorithms By Dr. Rajeev Srivastava ITBHU, Varanasi

2 Performance Measurement: Image Reconstruction In various imaging applications where an image is to be reconstructed from its degraded version, the performance of the image processing algorithm needs to be evaluated quantitatively. For evaluation purposes, the original image must be available. Some examples of such image processing algorithms include:
- Image Restoration (where the original image is available for comparison)
- Image Enhancement
- Image Compression
- Image Reconstruction (tomographic reconstruction etc.)
- Image Interpolation / Zooming
- Image Inpainting

3 Performance Measures: Image Reconstruction Some performance measures for evaluating the previously mentioned image processing algorithms, where both the original image and the image reconstructed from its degraded version are available for evaluation purposes, are listed below:
- Mean Square Error (MSE)
- Root Mean Square Error (RMSE)
- Peak Signal-to-Noise Ratio (PSNR)
- Mean Absolute Error (MAE)
- Cross Correlation Parameter (CP)
- Mean Structure Similarity Index Map (MSSIM)
- Histogram Analysis

4 Performance Measures: Image Reconstruction For cases where the original image is not available for comparison, such as blind restoration of images and image enhancement, the following performance measure can be used:
- Blurred Signal-to-Noise Ratio (BSNR)
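A commonly used definition of BSNR in the restoration literature (assuming the degradation model g = h*f + n, where b = h*f is the noise-free blurred image, $\bar{b}$ is its mean and $\sigma_n^2$ is the noise variance) is

$\mathrm{BSNR} = 10\log_{10}\left(\frac{\frac{1}{MN}\sum_{i,j}\left[b(i,j) - \bar{b}\right]^2}{\sigma_n^2}\right)\ \mathrm{dB}$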

5 Performance measurement metrics

Mean square error (MSE):

$\mathrm{MSE} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\left[I(i,j) - I'(i,j)\right]^2$

where I is the original image, I' is the reconstructed image and M x N is the image size.

Root mean square error (RMSE):

$\mathrm{RMSE} = \sqrt{\mathrm{MSE}}$

Peak signal-to-noise ratio (PSNR):

$\mathrm{PSNR} = 10\log_{10}\left(\frac{255^2}{\mathrm{MSE}}\right)\ \mathrm{dB}$

For optimal performance, the measured values of MSE and RMSE should be small and the PSNR should be large.
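A minimal NumPy sketch of these three metrics, assuming 8-bit grayscale images supplied as equally sized arrays (the function names are illustrative):

```python
import numpy as np

def mse(original, reconstructed):
    """Mean square error between two equally sized grayscale images."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return np.mean(diff ** 2)

def rmse(original, reconstructed):
    """Root mean square error."""
    return np.sqrt(mse(original, reconstructed))

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB (peak = 255 for 8-bit images)."""
    m = mse(original, reconstructed)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)
```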

6 Performance measurement metrics

Correlation parameter (CP): CP is a quantitative measure of edge preservation. If one is interested in suppressing noise or artefacts while at the same time preserving the edges of the original image, then this parameter, proposed by Salinas and Fernández (2007), can be used. To evaluate edge preservation or sharpness, the correlation parameter is defined as

$\mathrm{CP} = \frac{\sum\left(\Delta I - \overline{\Delta I}\right)\left(\Delta\hat{I} - \overline{\Delta\hat{I}}\right)}{\sqrt{\sum\left(\Delta I - \overline{\Delta I}\right)^2\,\sum\left(\Delta\hat{I} - \overline{\Delta\hat{I}}\right)^2}}$

where $\Delta I$ and $\Delta\hat{I}$ are high-pass filtered versions of the original image I and the filtered image $\hat{I}$, obtained via a 3x3 pixel standard approximation of the Laplacian operator, and $\overline{\Delta I}$ and $\overline{\Delta\hat{I}}$ are the mean values of $\Delta I$ and $\Delta\hat{I}$, respectively. The correlation parameter should be close to unity for an optimal edge-preservation effect.
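A short sketch of CP using SciPy's convolve with a 3x3 Laplacian kernel (the kernel sign convention and boundary handling are assumptions):

```python
import numpy as np
from scipy.ndimage import convolve

# Standard 3x3 approximation of the Laplacian operator (sign convention assumed).
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=np.float64)

def correlation_parameter(original, filtered):
    """Correlation between the Laplacians of the original and filtered images."""
    d_orig = convolve(original.astype(np.float64), LAPLACIAN)
    d_filt = convolve(filtered.astype(np.float64), LAPLACIAN)
    a = d_orig - d_orig.mean()
    b = d_filt - d_filt.mean()
    return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))
```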

7 Performance measurement metrics

Structure similarity index map (SSIM): SSIM is used to compare the luminance, contrast and structure of two different images and can be treated as a similarity measure between them. The SSIM of two images X and Y is defined as

$\mathrm{SSIM}(X,Y) = \frac{\left(2\mu_X\mu_Y + C_1\right)\left(2\sigma_{XY} + C_2\right)}{\left(\mu_X^2 + \mu_Y^2 + C_1\right)\left(\sigma_X^2 + \sigma_Y^2 + C_2\right)}$

where $\mu_i$ (i = X or Y) is the mean intensity, $\sigma_i$ (i = X or Y) is the standard deviation, $\sigma_{XY}$ is the covariance of X and Y, and $C_i$ (i = 1 or 2) is a constant that avoids instability when $\mu_X^2 + \mu_Y^2$ or $\sigma_X^2 + \sigma_Y^2$ is very close to zero; it is defined as $C_i = (K_i L)^2$, in which $K_i \ll 1$ and L is the dynamic range of pixel values, e.g. L = 255 for an 8-bit gray scale image. In order to have an overall quality measurement of the entire image, the mean SSIM over M local windows is defined as

$\mathrm{MSSIM}(X,Y) = \frac{1}{M}\sum_{j=1}^{M}\mathrm{SSIM}(x_j, y_j)$

The MSSIM value should be close to unity for an optimal measure of similarity.

Normalized mean square error (NMSE):

$\mathrm{NMSE} = \frac{\sum_{i,j}\left[I(i,j) - I'(i,j)\right]^2}{\sum_{i,j}\left[I(i,j)\right]^2}$
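A simplified sketch that computes SSIM over the whole image in a single window; in practice SSIM is evaluated over small sliding windows and averaged to give MSSIM, which this version omits (K1 = 0.01 and K2 = 0.03 are commonly used values, assumed here):

```python
import numpy as np

def ssim_global(x, y, L=255.0, k1=0.01, k2=0.03):
    """Single-window SSIM over whole images (no sliding window)."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = np.mean((x - mu_x) * (y - mu_y))
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )
```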

8 Example: Histogram Analysis Figure: Histogram analysis for 4x4 image interpolation: (a) original image, (b) nearest neighbour interpolation, (c) bilinear interpolation, (d) bicubic interpolation, (e) anisotropic diffusion method, (f) proposed method; mri_head_2.jpg (256x256).
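A small matplotlib sketch of how such a histogram comparison might be produced (the function name and layout are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

def compare_histograms(images, titles, bins=256):
    """Plot grayscale histograms side by side for visual comparison."""
    fig, axes = plt.subplots(1, len(images), figsize=(4 * len(images), 3))
    for ax, img, title in zip(np.atleast_1d(axes), images, titles):
        ax.hist(np.asarray(img).ravel(), bins=bins, range=(0, 255))
        ax.set_title(title)
        ax.set_xlabel("intensity")
    plt.tight_layout()
    plt.show()
```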

9 Performance Measures: Speckle Reduction Algorithms For measuring the performance of speckle reduction algorithms, which reduce speckle (multiplicative) noise, the following performance measures can be used:
- Speckle Index (SI)
- Average Signal-to-Noise Ratio (Average SNR)
- Effective Number of Looks (ENL)

10 Performance measurement metrics

Speckle index (SI): Since speckle noise is multiplicative in nature, the average contrast of an image may be treated as a measure of speckle removal. The speckle index is the ratio of the local standard deviation to the local mean, averaged over the image; its discrete version for an M x N image reads

$\mathrm{SI} = \frac{1}{MN}\sum_{i=1}^{M}\sum_{j=1}^{N}\frac{\sigma(i,j)}{\mu(i,j)}$

where M x N is the size of the image, $\mu(i,j)$ is the local mean and $\sigma(i,j)$ is the local standard deviation. For optimal performance, the measured value of SI should be low. The speckle index can be regarded as an average reciprocal signal-to-noise ratio (SNR), with the signal being the mean value and the noise being the standard deviation, so that Average SNR = 1/SI.
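A sketch of SI computed with local means and standard deviations over a small sliding window (the 3x3 window size and the use of local statistics are assumptions):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_index(image, window=3):
    """Mean ratio of local standard deviation to local mean (lower is better)."""
    img = image.astype(np.float64)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img ** 2, size=window)
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0.0))
    eps = 1e-12  # guard against division by zero in flat or dark regions
    return np.mean(local_std / (local_mean + eps))

def average_snr(image, window=3):
    """Average SNR as the reciprocal of the speckle index."""
    return 1.0 / speckle_index(image, window)
```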

11 Performance measurement metrics

Effective number of looks (ENL): The number of looks in an intensity image is a measure of the statistical fluctuations introduced by speckle, which results from interference between randomly positioned scatterers. ENL therefore gives an idea of the smoothness of regions of the image that should have a homogeneous appearance but are corrupted by noise. ENL is generally defined as

$\mathrm{ENL} = \left(\frac{\mu_t}{\sigma_t}\right)^2$

where t denotes the target area (region of interest), and $\mu_t$ and $\sigma_t$ are the pixel mean and standard deviation of the target area of the image. In this work, the target area is the whole image. A large value of ENL reflects better quantitative performance of the filter.
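A minimal sketch of ENL; the optional mask argument for selecting a homogeneous region of interest is illustrative (the slide uses the whole image):

```python
import numpy as np

def enl(image, mask=None):
    """Effective number of looks over a target region (whole image by default)."""
    img = image.astype(np.float64)
    region = img if mask is None else img[mask]
    return (region.mean() / region.std()) ** 2
```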

12 Performance Measures: Edge Detection Algorithms

Edge detection error rate: Let $n_0$ be the number of edge pixels declared in the noiseless image and $n_1$ the number of missed or new edge pixels after adding noise. If $n_0$ is held fixed for the noiseless as well as the noisy image, then the edge detection error rate is defined as

$P_e = \frac{n_1}{n_0}$

Another measure of the noise performance of edge detection operators (Pratt's figure of merit) is given by the quantity

$F = \frac{1}{\max(N_I, N_D)}\sum_{i=1}^{N_D}\frac{1}{1 + \alpha d_i^2}$

where $d_i$ is the distance between a pixel declared as an edge and the nearest ideal edge pixel, $\alpha$ is a calibration constant, and $N_I$ and $N_D$ are the numbers of ideal and detected edge pixels, respectively.
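A sketch of the figure-of-merit term, assuming boolean edge maps and the commonly used calibration constant alpha = 1/9:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def pratt_fom(ideal_edges, detected_edges, alpha=1.0 / 9.0):
    """Figure of merit of a detected edge map against an ideal edge map."""
    ideal = ideal_edges.astype(bool)
    detected = detected_edges.astype(bool)
    n_ideal = int(np.count_nonzero(ideal))
    n_detected = int(np.count_nonzero(detected))
    if n_ideal == 0 or n_detected == 0:
        return 0.0
    # Distance from every pixel to the nearest ideal edge pixel.
    dist_to_ideal = distance_transform_edt(~ideal)
    d = dist_to_ideal[detected]
    return float(np.sum(1.0 / (1.0 + alpha * d ** 2)) / max(n_ideal, n_detected))
```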

13 Performance Measures: Image Segmentation Algorithms The performance of a segmentation algorithm can be evaluated on sample images using three segmentation performance measures, namely:
- Probabilistic Rand Index (PRI) [Unnikrishnan R. et al. (2007)]
- Variation of Information (VOI) [Meila M. (2005)]
- Global Consistency Error (GCE) [Martin D. et al. (2001)]

14 Performance Measures: Segmentation Algorithms REFERENCES
Meila M. (2005), "Comparing Clusterings: An Axiomatic View," in Proc. 22nd Int. Conf. on Machine Learning, pp. 577-584.
Unnikrishnan R., Pantofaru C., and Hebert M. (2007), "Toward Objective Evaluation of Image Segmentation Algorithms," IEEE Trans. Pattern Anal. Mach. Intell., Vol. 29, No. 6, pp. 929-944.
Martin D., Fowlkes C., Tal D. and Malik J. (2001), "A Database of Human Segmented Natural Images and its Application to Evaluating Segmentation Algorithms and Measuring Ecological Statistics," in Proc. 8th Int. Conf. on Computer Vision, Vol. 2, pp. 416-423.

15 Classification and Analysis: Feature Evaluation

Example: Selecting useful features is important in cancer analysis/classification (malignant/benign classification). Important features can be extracted and evaluated for analysis to reach this goal. We can supply a matrix of dimension 20 x 100, meaning 20 images and 100 features, in which each row of the matrix represents one feature at a specific angle and distance. The matrix can then be organized by the k-means clustering method to separate the two groups of malignant and benign tumors.

For the classification of benign and malignant tumors for cancer detection, a 2 x 2 confusion matrix (con) can be formulated, where position (1,1) shows the true classification of benign, position (2,2) represents the true classification of malignant, position (1,2) wrongly classifies malignant instead of benign, and position (2,1) wrongly classifies benign instead of malignant tumors.

For evaluating the feature extraction and classification algorithms, the following measures, derived from this confusion matrix, are used:
- Error (%) = (100/40) * (con(1,2) + con(2,1)), i.e. the percentage of the 40 test samples that are misclassified
- Accuracy (%) = 100 - Error

16 Classification and Analysis: Confusion Matrix
- Position (1,1), i.e. con(1,1): true classification of benign
- Position (1,2), i.e. con(1,2): wrongly classified as malignant instead of benign
- Position (2,1), i.e. con(2,1): wrongly classified as benign instead of malignant
- Position (2,2), i.e. con(2,2): true classification of malignant
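A small sketch of how the 2x2 confusion matrix and the error/accuracy figures might be computed, assuming labels coded as 0 = benign and 1 = malignant (the coding and function names are illustrative; the code uses 0-based indices in place of the 1-based positions above):

```python
import numpy as np

def confusion_matrix_2x2(true_labels, predicted_labels):
    """2x2 confusion matrix: rows are true classes, columns are predictions."""
    con = np.zeros((2, 2), dtype=int)
    for t, p in zip(true_labels, predicted_labels):
        con[t, p] += 1  # con[0,0] = true benign, con[1,1] = true malignant
    return con

def error_and_accuracy(con):
    """Percentage error and accuracy derived from the confusion matrix."""
    n = con.sum()
    error = 100.0 * (con[0, 1] + con[1, 0]) / n
    return error, 100.0 - error
```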

17 END

