NCJRS Virtual Library


A selective feature information approach for iris image-quality measure

NCJ Number: 305369
Journal: IEEE Transactions on Information Forensics and Security, Volume 3, Issue 3 (September 2008), Pages 572-577
Author(s): Craig Belcher; Yingzi Du
Date Published: September 2008
Length: 6 pages
Annotation

This research article examines image quality and its impact on the accuracy of positive identification in iris-recognition systems.

Abstract

Poor-quality images can significantly reduce the accuracy of iris-recognition systems because they do not contain enough feature information. Existing quality measures, however, have focused on parameters or factors other than feature information. The feature information available for measurement is a combination of the distinctiveness of the iris patterns and the amount of iris region available. Because some irises may have only a small area of changing patterns, the proposed approach automatically selects the portions of the iris with the most distinguishable changing patterns and measures the feature information there. The combination of occlusion and dilation determines the amount of iris region available and is also considered in the proposed quality measure. The quality score is the fused result of the feature information score, the occlusion score, and the dilation score. The relationship between the quality score and recognition accuracy is evaluated using 2-D Gabor and 1-D Log-Gabor wavelet approaches and validated on a diverse data set. In addition, the proposed method is compared with the convolution matrix, spectrum energy, and Mexican hat wavelet methods, which represent a variety of approaches to iris-quality measurement. The experimental results show that the proposed quality score is highly correlated with recognition accuracy and is capable of predicting the recognition results. (Publisher abstract provided)
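The abstract describes the quality score as a fusion of a feature information score, an occlusion score, and a dilation score. The sketch below illustrates one plausible way such a fusion could be computed; the function names, the portion-selection heuristic, the normalization constants, and the multiplicative fusion rule are assumptions made for illustration, not the authors' published formulation.

```python
import numpy as np

def feature_information_score(iris_band, num_portions=8, top_k=4):
    """Hypothetical feature-information score: split the unwrapped iris band
    into angular portions, measure local variation in each, and keep only the
    most distinctive portions (the 'selective' idea described in the abstract).
    The portioning and variance measure here are illustrative assumptions."""
    portions = np.array_split(iris_band, num_portions, axis=1)
    distinctiveness = [float(np.std(p)) for p in portions]
    best = sorted(distinctiveness, reverse=True)[:top_k]
    # Normalize to [0, 1] with an arbitrary scale chosen for this sketch.
    return min(1.0, float(np.mean(best)) / 64.0)

def occlusion_score(mask):
    """Fraction of the iris region not occluded by eyelids or eyelashes,
    given a boolean mask where True marks usable iris pixels."""
    return float(np.mean(mask))

def dilation_score(pupil_radius, iris_radius):
    """Score that decreases as the pupil dilates relative to the iris;
    the linear form is an assumption for illustration."""
    ratio = pupil_radius / iris_radius
    return float(np.clip(1.0 - ratio, 0.0, 1.0))

def quality_score(iris_band, mask, pupil_radius, iris_radius):
    """Fuse the three scores. A simple product is used here so that a low
    value in any component pulls the overall quality down; the fusion rule
    in the paper itself may differ."""
    fis = feature_information_score(iris_band)
    occ = occlusion_score(mask)
    dil = dilation_score(pupil_radius, iris_radius)
    return fis * occ * dil

# Example with synthetic data: a random "unwrapped" iris band and mask.
rng = np.random.default_rng(0)
band = rng.integers(0, 255, size=(64, 512)).astype(float)
mask = rng.random((64, 512)) > 0.2          # roughly 80% of pixels usable
print(round(quality_score(band, mask, pupil_radius=30, iris_radius=100), 3))
```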
