
Slide 1: Evolutionary Optimization of JPEG Quantization Tables for Compressing Iris Polar Images in Iris Recognition Systems
Mario Konrad*, Herbert Stögner*, and Andreas Uhl*,**
* Carinthia Tech Institute, University of Applied Sciences, Austria
** Department of Computer Sciences, Salzburg University, Austria

Slide 2: Outline
- Introduction
- JPEG Quantization Table Optimization
- Experiments
  - Discovery of the Optimal Settings
  - Resulting Matrices
- Conclusion & Future Work

Slide 3: Introduction
Recognition performance in iris biometrics strongly depends on image quality. Applying compression algorithms to iris images raises the question of whether those algorithms can be adapted for biometric purposes. Two example scenarios for compressing sample data:
- Transmission of sample data after sensor data acquisition: in distributed biometric systems, the data acquisition stage is often located apart from the feature extraction and matching stage, so the sensor data have to be transferred to the respective location over a network link (wireless channels, low bandwidth, high latency).
- Storage of reference data: storing the original sensor data in template databases in addition to the features makes it possible to switch to a different feature extraction technique later if required. The amount of data to be stored is obviously high.

Slide 4: Introduction
- In the literature, the performance of compression formats in biometric contexts has so far been analysed only under default settings (JPEG, JPEG2000).
- In previous work we used customized quantization tables for compressing polar iris images and demonstrated their potential to improve recognition results.
- In this work, we apply a genetic algorithm (GA) to successively create quantization matrices for different target compression ratios in different application scenarios.
- We investigate the impact of various GA settings on the resulting recognition performance of the iris recognition system.
- We are able to reduce the FAR and the FRR compared to the error rates of images compressed with JPEG's standard matrix, and in some cases even compared to uncompressed images.
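For readers less familiar with JPEG internals: compression quantizes each 8×8 block of DCT coefficients elementwise with a 64-entry table, and those 64 entries are exactly what is customized here. A minimal sketch in Python (the helper names are ours, not the authors'; the table is the standard JPEG luminance table from Annex K of ITU-T T.81):

```python
# Quantize/dequantize one 8x8 block of DCT coefficients with a given table.
# The 64 table entries (integers in 1..255) are what the GA optimizes.

STANDARD_LUMINANCE = [  # Annex K of the JPEG standard (ITU-T T.81)
    16, 11, 10, 16, 24, 40, 51, 61,
    12, 12, 14, 19, 26, 58, 60, 55,
    14, 13, 16, 24, 40, 57, 69, 56,
    14, 17, 22, 29, 51, 87, 80, 62,
    18, 22, 37, 56, 68, 109, 103, 77,
    24, 35, 55, 64, 81, 104, 113, 92,
    49, 64, 78, 87, 103, 121, 120, 101,
    72, 92, 95, 98, 112, 100, 103, 99,
]

def quantize(dct_block, table):
    """Round each DCT coefficient to the nearest multiple of its table entry."""
    return [round(c / q) for c, q in zip(dct_block, table)]

def dequantize(quantized, table):
    """Reconstruct (lossy) coefficients from the quantized values."""
    return [v * q for v, q in zip(quantized, table)]
```

Larger table entries discard more of the corresponding frequency, so tuning the entries trades file size against exactly the frequency content the iris matcher relies on.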

Slide 5: Quantization Table Optimization
Evolutionary table optimization applies a genetic algorithm:
- A fitness function judges each member of a population; the better-performing tables are selected from each generation.
- Each individual consists of 64 genes, one per table entry, each an integer in [1, 255].
- The best individuals pass their genes on to the next generation.
- Crossover: the process of mixing the genes of the most successful generation members.
- Mutation: random formation of a new individual.
The mean population fitness should converge towards an optimal solution of the given problem and produce better-performing tables.
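The loop described above can be sketched as follows. This is a hypothetical minimal implementation, not the authors' code; in the paper the fitness function runs the full compress-and-match pipeline, which is replaced here by a toy placeholder:

```python
import random

GENES, LO, HI = 64, 1, 255          # one gene per quantization-table entry

def crossover(a, b):
    """Mix genes of two successful parents (uniform crossover)."""
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(parent, interval=64):
    """Form a new individual by randomly shifting each gene of an elite member."""
    return [max(LO, min(HI, g + random.randint(-interval, interval)))
            for g in parent]

def evolve(fitness, pop_size=20, elite=2, crossover_frac=0.6, generations=50):
    """Evolve quantization tables; lower fitness (error rate) is better."""
    pop = [[random.randint(LO, HI) for _ in range(GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                      # best tables first
        parents = pop[:pop_size // 2]              # better-performing half
        n_cross = round(crossover_frac * (pop_size - elite))
        children = [crossover(*random.sample(parents, 2)) for _ in range(n_cross)]
        mutants = [mutate(random.choice(pop[:elite]))
                   for _ in range(pop_size - elite - n_cross)]
        pop = pop[:elite] + children + mutants     # elite members survive
    return min(pop, key=fitness)

# Placeholder fitness: in the paper this compresses the image set with the
# table and returns the mean FAR/FRR obtained from a threshold sweep.
toy_fitness = lambda table: sum(abs(g - 16) for g in table)
```

Because the elite members are carried over unchanged, the best fitness in the population can never get worse from one generation to the next.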

Slide 6: Quantization Table Optimization
(figure)

Slide 7: Experiments: Settings
Setup:
- Recognition: 1-D implementation of the Daugman iris recognition algorithm, provided by Libor Masek.
- Polar images: extracted from the CASIA V1.0 database.
- Scenarios for image matching: (a) an uncompressed image is compared with a compressed image; (b) both images are compressed.
- Images used for the genetic algorithm: 70 polar images from 10 different persons, yielding 4,410 impostor and 420 legitimate Hamming-distance values.
- Images used for evaluating the optimized matrices: 348 polar images from 50 different persons, yielding 118,680 impostor and 2,076 legitimate Hamming-distance values (for the scenario with two compressed images these values are halved due to symmetry).
- The images used in the optimization process are not a subset of the image set used for evaluation.
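The comparison counts quoted above check out arithmetically if every image is matched against every other image and ordered pairs are counted separately (a hypothetical helper; the 7-images-per-person split is an assumption consistent with 70 images from 10 persons):

```python
def comparison_counts(images_per_person):
    """Count ordered pairwise comparisons, split into legitimate
    (same person) and impostor (different persons) matches."""
    total = sum(images_per_person)
    all_pairs = total * (total - 1)                      # all ordered pairs
    legitimate = sum(n * (n - 1) for n in images_per_person)
    return legitimate, all_pairs - legitimate

# GA image set: 70 polar images, assumed 7 per person from 10 persons
legit, imp = comparison_counts([7] * 10)   # (420, 4410), as on the slide
```

The evaluation set is consistent too: 348 · 347 = 120,756 ordered pairs = 118,680 impostor + 2,076 legitimate.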

Slide 8: Experiments: GA Parameters
Discovery of the optimal settings:
- We analyse the various genetic-algorithm parameters in order to improve convergence speed.
- Only the most relevant algorithm settings have been changed; a table on the slide lists these parameters and the values used in the test stage.
- Most of the other settings are kept as in the algorithm's implementation.

Slide 9: Experiments: Fitness Function
Discovery of the optimal settings — fitness function:
- The fitness function evaluates the different quantization tables. It focuses on the mean error rates of the image set produced by one specific quantization matrix.
- To gather these values, all images are compared with each other to obtain the pairwise Hamming distances, which yields two vectors of legitimate and impostor comparison values.
- A threshold sweep then provides FAR and FRR results for each step; the mean error rates should be minimized during the optimization process.
- The algorithm terminates when the population's mean fitness value remains unchanged for 10 generations (after at least 50 processed generations).
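A sketch of such a threshold sweep (hypothetical code, one plausible reading of the slide: normalized Hamming distances in [0, 1], a match accepted when the distance falls below the threshold, and the fitness taken as the best achievable mean of FAR and FRR):

```python
def far_frr(legitimate, impostor, threshold):
    """FAR: impostor pairs accepted (distance <= threshold).
       FRR: legitimate pairs rejected (distance > threshold)."""
    far = sum(d <= threshold for d in impostor) / len(impostor)
    frr = sum(d > threshold for d in legitimate) / len(legitimate)
    return far, frr

def fitness(legitimate, impostor, steps=100):
    """Best mean error rate over a sweep of the decision threshold in [0, 1]."""
    rates = [far_frr(legitimate, impostor, t / steps) for t in range(steps + 1)]
    return min((far + frr) / 2 for far, frr in rates)
```

When the legitimate and impostor distance distributions separate completely, some threshold yields FAR = FRR = 0 and the fitness reaches its optimum of 0.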

Slide 10: Results: Population & Elite Count
Discovery of the optimal settings:
- Population size: varied between 10, 20, and 30 individuals; it impacts the optimization performance as well as the computational demand. 10 individuals/60 generations and 20 individuals/30 generations both yield the same fitness value, whereas 30 individuals/20 generations provide a different fitness. We therefore selected a medium population size of 20 individuals.
- Elite members: the number of best-fitness individuals that pass their genes on and survive unchanged into the next generation. Two elite members perform better than just one in nearly every test, so we apply two elite members in each generation.

Slide 11: Results: Crossover
Discovery of the optimal settings — crossover fraction:
- The crossover fraction specifies the percentage of individuals whose genes are created by a crossover mixture of two parent individuals.
- We obtain the best results with a 60% crossover fraction.
- A higher fraction (80%) reduces the number of individuals available for mutation and elitism.
- A smaller fraction (40%) performs well only in small populations, evidently because the elite members then make up a larger share of the population.
(figures: 40% crossover vs. 60% crossover)
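The interplay the slide describes — elite survivors, crossover children, and mutation children partitioning a fixed-size population — can be made concrete. This is hypothetical arithmetic following a common GA convention (e.g. MATLAB's `ga`); we do not know the authors' exact implementation:

```python
def generation_split(pop_size, elite, crossover_frac):
    """Partition one generation into elite survivors, crossover
    children, and whatever remains for mutation children."""
    n_crossover = round(crossover_frac * (pop_size - elite))
    n_mutation = pop_size - elite - n_crossover
    return elite, n_crossover, n_mutation

# With 20 individuals and 2 elite members:
# 60% crossover leaves 7 slots for mutation; 80% leaves only 4.
```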

Slide 12: Results: Mutation
Discovery of the optimal settings — mutation interval:
- A mutated individual is created by inheriting randomly changed genes from elite members of the previous generation.
- The mutation interval defines the amount of gene mutation allowed (e.g., a gene of value 96 and an interval of [1 32] produce a random new gene between 64 and 128).
- With an interval of [1 32] we observe fewer and smaller steps during the optimization; we therefore define a mutation interval of [1 64].
(figures: mutation interval [1 32] vs. [1 64])
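The slide's numeric example can be reproduced directly (hypothetical helper; interpreting the interval [1 32] as the allowed magnitude of a random gene shift, with clipping to the valid gene range [1, 255] as our own assumption):

```python
import random

def mutate_gene(value, max_shift, lo=1, hi=255):
    """Shift a gene by a random amount in [1, max_shift], in either
    direction, clipped to the valid range of table entries."""
    shift = random.choice([-1, 1]) * random.randint(1, max_shift)
    return max(lo, min(hi, value + shift))

# Gene of value 96 with interval [1 32]: the result lies between 64 and 128.
samples = [mutate_gene(96, 32) for _ in range(1000)]
```

A wider interval such as [1 64] allows larger jumps through the search space, which matches the slide's observation of bigger optimization steps.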

Slide 13: Results: Quantization Tables
Resulting matrices:
- 15 tables for compression rates between 2 and 16 are produced for each scenario.
- For two compressed images, 7 tables outperform the uncompressed case and another 6 matrices beat the performance of the standard matrix.
- For one compressed image, 6 matrices reach the performance of the uncompressed case and 8 tables outperform the standard matrix.
- Comparing two compressed images generally leads to better performance than comparing an uncompressed with a compressed image.
- Interestingly, some quantization matrices perform quite similarly while the tables themselves look very different.

Slide 14: Results: Similar Performance (figure)
Slide 15: Results: Similar ROC (figure)
Slide 16: Results: Example Matrices (figure)
Slide 17: Results: CR 5, Uncompressed vs. Compressed (figure)
Slide 18: Results: CR 5, Compressed vs. Compressed (figure)
Slide 19: Results: CR 12, Uncompressed vs. Compressed (figure)
Slide 20: Results: CR 12, Compressed vs. Compressed (figure)
Slide 21: Results: Average Hamming Distance (figure)
Slide 22: Results: Comparing Scenarios (figure)

Slide 23: Conclusion & Future Work
Conclusion:
- The identified tables perform significantly better than the default table in most cases, both in terms of average Hamming distances and in terms of ROC behaviour. In some cases the performance of uncompressed images is beaten.
- The optimization procedure itself incurs high computational costs, because the fitness function must be evaluated for each individual in every generation. However, the optimization process has to be run only once for a given image database; afterwards, recognition results improve and the amount of data to be handled is reduced.
Future work:
- In upcoming research we will extend the analysis in order to optimize matrices for rectangular iris images and for different image databases.

Slide 24: Thank you for your attention! Questions?
