A Scientific Comparison of Two Face Recognition Algorithms

Abstract

The objective of this paper is to compare two face recognition algorithms, Eigenfaces and Local Binary Patterns (LBP), in terms of face recognition accuracy. Four datasets with images of different sizes, quality, and dimensions (AR, Yale, UFI, and AT&T) are used in an experiment comparing the two algorithms. We find that LBP achieves higher accuracy than Eigenfaces on most datasets.

Introduction

Due to increasing security concerns, face recognition is gaining more attention since it plays a pivotal role in both face identification and face verification. Face identification determines the identity of a person, while face verification confirms that a person is who he or she claims to be. Face recognition has been researched for several decades, and many algorithms have been developed to solve the identification and verification problems. Each algorithm performs differently depending on the task; for example, Local Binary Patterns (LBP) performs well in face verification [1].

The remainder of the paper is organized as follows. Section II gives background on the face recognition workflow and on the LBP and Eigenfaces algorithms. Section III provides the literature review and related work. Section IV describes the research methodology in terms of scope, planning and design, and implementation. Experimental results are presented in Section V. Finally, Section VI presents the conclusion and future work.

Background

Face recognition techniques are divided into geometric and photometric approaches. Geometric approaches build a model using the shape, size, and position of individual features such as the eyes, nose, mouth, and the shape of the head. Photometric approaches, on the other hand, use statistical techniques to extract feature values from the image [7]. Figure 1 illustrates the steps of the face recognition workflow.

Figure 1: Steps in the face recognition workflow.

Step 1: Feature extraction

The first step in face recognition is to extract features from individual images in order to build the discriminative information required for recognition. Algorithms such as LBP, Fisherfaces, and Eigenfaces are used to extract these features [8].
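As a concrete illustration of such a feature, the basic LBP operator compares each pixel of a grayscale image with its eight neighbors and encodes the comparisons as an 8-bit number; histograms of these codes, computed over a grid of cells and concatenated, form the feature vector. The following is a minimal sketch of the 3x3 operator (the function lbpImage is our own illustration, not part of OpenCV):

    #include <opencv2/core.hpp>

    // Compute the basic 3x3 LBP code image of a grayscale (CV_8UC1) image.
    // Each neighbor greater than or equal to the center pixel contributes
    // one bit to an 8-bit code.
    static cv::Mat lbpImage(const cv::Mat& gray) {
        cv::Mat codes = cv::Mat::zeros(gray.rows - 2, gray.cols - 2, CV_8UC1);
        for (int r = 1; r < gray.rows - 1; ++r) {
            for (int c = 1; c < gray.cols - 1; ++c) {
                const uchar center = gray.at<uchar>(r, c);
                uchar code = 0;
                code |= (gray.at<uchar>(r - 1, c - 1) >= center) << 7;
                code |= (gray.at<uchar>(r - 1, c    ) >= center) << 6;
                code |= (gray.at<uchar>(r - 1, c + 1) >= center) << 5;
                code |= (gray.at<uchar>(r,     c + 1) >= center) << 4;
                code |= (gray.at<uchar>(r + 1, c + 1) >= center) << 3;
                code |= (gray.at<uchar>(r + 1, c    ) >= center) << 2;
                code |= (gray.at<uchar>(r + 1, c - 1) >= center) << 1;
                code |= (gray.at<uchar>(r,     c - 1) >= center) << 0;
                codes.at<uchar>(r - 1, c - 1) = code;
            }
        }
        return codes;
    }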

Step 2: Building Classification Model

The second step is to use machine learning techniques to build a model from the extracted features that performs the recognition or classification. Supervised learning offers several techniques for building such a discriminative model, for example support vector machines (SVM), decision trees, ensemble methods, and deep neural networks [8].
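As an illustration of this step, a linear SVM could be trained on the extracted feature vectors using OpenCV's ml module. This is only a sketch: the function and variable names are ours, and the experiment reported below relies on the nearest-neighbor prediction built into OpenCV's face recognizers rather than on a separate classifier.

    #include <opencv2/core.hpp>
    #include <opencv2/ml.hpp>

    // Train a linear SVM on extracted feature vectors and classify one probe.
    // features: one CV_32F row per training image; labels: CV_32S subject id per row.
    int classifyWithSvm(const cv::Mat& features, const cv::Mat& labels,
                        const cv::Mat& probeFeature) {
        cv::Ptr<cv::ml::SVM> svm = cv::ml::SVM::create();
        svm->setType(cv::ml::SVM::C_SVC);
        svm->setKernel(cv::ml::SVM::LINEAR);
        svm->train(features, cv::ml::ROW_SAMPLE, labels);
        return static_cast<int>(svm->predict(probeFeature));
    }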

Research Methodology

To meet the objective of this paper, we conduct an online experiment comparing the LBP and Eigenfaces algorithms in terms of face recognition accuracy.

  1. Experiment Scope

    The scope of the experiment is to analyze the LBP and Eigenfaces face recognition algorithms for the purpose of comparing their performance with respect to recognition accuracy.

  2. Experiment Planning and Design

    An online experiment is conducted on two different face recognition algorithms, LBP and Eigenfaces. Four datasets have been selected to help build a solid conclusion about the two algorithms. The experiment can be considered general in the sense that the objective is to compare the two face recognition techniques from a research perspective.

    The null hypothesis (H0) for this experiment is that the numbers of correctly recognized faces using LBP and Eigenfaces are equal, while the alternative hypothesis (H1) is that they are not equal.

    The independent variable is the thresholding parameter of each algorithm. For LBP, the threshold is applied to the chi-square distance between the training image and the test image; for Eigenfaces, it is applied to the L2 norm of the difference between the training image and the test image in the PCA projection. The dependent variable is the number of accurately recognized faces.
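    For reference, and assuming the standard definitions (OpenCV's exact chi-square normalization may differ slightly), the two distances can be written as follows, where H and H' are the LBP histograms of the test and training images and \Omega, \Omega' are their coefficient vectors in the PCA (eigenface) projection:

        \chi^2(H, H') = \sum_i \frac{(H_i - H'_i)^2}{H_i + H'_i},
        \qquad
        d(\Omega, \Omega') = \lVert \Omega - \Omega' \rVert_2

    In both cases, a probe face is rejected (prediction -1) when its smallest distance to any training image exceeds the threshold, as described later in Tables 3 and 4.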

    The experiment uses a paired, one-factor, two-treatment comparison design. The treatments are the algorithms (LBP and Eigenfaces), and they are applied to the subjects, which are the datasets in this case, as shown in Table 1.

    Table 1: Experiment Treatments
    Subjects Treatment 1 Treatment 2
    AR LBP Eigenfaces
    Yale Eigenfaces LBP
    UFI LBP Eigenfaces
    AT&T Eigenfaces LBP
  3. Datasets

    Four image datasets are used in this experiment. Each dataset has a different size, quality, and resolution. In each trial, every dataset is randomly divided into training and test sets, as shown in Table 2; a sketch of the random split is given after the table.

    Table 2: Datasets
    Dataset Size # Subjects (People) # Images Image Dimensions # Training Images # Testing Images
    AR 1.53 GB 136 3315 (768x576) 2924 391
    Yale 92 MB 39 2414 (168x192) 2149 265
    UFI 96 MB 605 4921 (128x128) 4315 605
    AT&T 4.7 MB 40 400 (92x112) 360 30
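    The random split described above can be sketched as follows; the function and variable names (splitDataset, imagePaths, trainFraction) are our own illustration, and the actual per-dataset split sizes are those listed in Table 2.

        #include <algorithm>
        #include <random>
        #include <string>
        #include <vector>

        // Shuffle the image list and split it into training and test sets.
        // A fresh random order is drawn in each trial.
        void splitDataset(std::vector<std::string> imagePaths, double trainFraction,
                          std::vector<std::string>& train,
                          std::vector<std::string>& test) {
            std::mt19937 rng(std::random_device{}());
            std::shuffle(imagePaths.begin(), imagePaths.end(), rng);
            const std::size_t nTrain =
                static_cast<std::size_t>(imagePaths.size() * trainFraction);
            train.assign(imagePaths.begin(), imagePaths.begin() + nTrain);
            test.assign(imagePaths.begin() + nTrain, imagePaths.end());
        }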
  4. Implementation
    • Software

      OpenCV is an open-source computer vision library with support for C/C++, Java, and Python. It is released under the BSD license and is therefore free for academic and commercial use [10]. We used OpenCV with the C++ programming language since it provides a user-friendly and easy-to-use API. A disadvantage of the OpenCV face recognition API is that we were not able to take advantage of the 16 cores (32 threads) of the machine.

      The LBP parameters provided by OpenCV are shown in Table 3. In our experiments, we fixed all the parameters except the threshold.

      Table 3: LBP Parameters
      Parameter Description Default Value
      radius Radius used for the circular local binary pattern 1
      neighbors Number of samples to build a circular local binary pattern from 8
      grid_x Number of cells in the horizontal direction 8
      grid_y Number of cells in the vertical direction 8
      threshold Threshold applied in the prediction. If the distance to the nearest neighbor is larger than the threshold, the prediction is -1 Infinity

      The Eigenfaces parameters provided by OpenCV are shown in Table 4. In our experiments, we set num_components to its maximum value (i.e., the number of training samples/images) and varied the threshold as we did for LBP. A minimal sketch of how both recognizers are configured is given after Table 4.

      Table 4: Eigenfaces Parameters
      Parameter Description Default Value
      num_components Number of eigenfaces to use in prediction. 1
      threshold Threshold applied in the prediction. If the distance to the nearest neighbor is larger than the threshold, the prediction is -1 Infinity
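      The following sketch shows how both recognizers can be created with the parameters of Tables 3 and 4 and how the threshold is varied. It assumes the cv::face module of OpenCV 3/4 (the older OpenCV 2.x factory functions createLBPHFaceRecognizer and createEigenFaceRecognizer take the same parameters); the training images and labels are assumed to be loaded from the datasets of Table 2.

        #include <opencv2/core.hpp>
        #include <opencv2/face.hpp>
        #include <vector>

        // Train both recognizers for one run of the experiment.
        // images: grayscale training faces; labels: subject id per image;
        // threshold: the independent variable varied between runs.
        void trainRecognizers(const std::vector<cv::Mat>& images,
                              const std::vector<int>& labels,
                              double threshold) {
            // LBPH: radius, neighbors, grid_x, grid_y kept at their defaults (Table 3).
            cv::Ptr<cv::face::LBPHFaceRecognizer> lbph =
                cv::face::LBPHFaceRecognizer::create(1, 8, 8, 8, threshold);
            lbph->train(images, labels);

            // Eigenfaces: num_components = 0 keeps all components (the maximum, Table 4).
            cv::Ptr<cv::face::EigenFaceRecognizer> eigen =
                cv::face::EigenFaceRecognizer::create(0, threshold);
            eigen->train(images, labels);

            // predict() returns -1 when the nearest-neighbor distance exceeds the threshold:
            // int id = lbph->predict(probeImage);
        }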
    • Hardware

      The machine we used to conduct our experiment is a high-performance computer from the computer engineering department. Its specifications are shown in Table 5.

      Table 5: Machine Specification
      Component Value
      CPU Dual Intel® Xeon E5-2640
      RAM 64 GB
      GPU Dual Nvidia Tesla K20X 6GB GDDR5

Experiment Results

Figure 2 shows the first four eigenfaces and eigenvalues of each dataset together with the mean face. The averaged results of running both algorithms, LBP and Eigenfaces, on our subjects (datasets) are shown in Table 6.

Figure 2: Eigenfaces and Eigenvalues of Datasets.
Table 6: Experimental Results
Dataset LBP Eigenfaces
Correct Incorrect Accuracy Correct Incorrect Accuracy
AR 280 86 71.65% 288 28 73.73%
Yale 164 17 61.80% 152 46 57.38%
UFI 162 199 26.73% 152 453 25.13%
AT&T 27 1 68.10% 24 0 60.00%

Conclusion and Future Work

To conclude, this paper's purpose was to compare two well-known algorithms (LBP and Eigenfaces) in terms of accuracy at a high level. According to our experiments, LBP yields better accuracy on all datasets except the AR dataset. A possible reason why Eigenfaces did not perform well on some datasets might be that some of the images are very dark (i.e., low lighting and low contrast); this was the case for the Yale dataset. While our results suggest that there is no significant difference between the two algorithms (except on Yale), we cannot generalize this finding to other datasets. We can only conclude that on the other three datasets (AR, UFI, and AT&T) LBP and Eigenfaces perform the same on average.

A side note worth mentioning about Eigenfaces is that it requires a large amount of memory (RAM) and time to train. For instance, training on the AR dataset took around 5.8 hours and 32 GB of RAM for Eigenfaces, whereas it took only 3 minutes and 1.5 GB of RAM for LBP. The reason Eigenfaces takes so long is that it is implemented in OpenCV as a sequential algorithm rather than a parallel one. The core of Eigenfaces is PCA, which can be implemented in a parallel and efficient way.

There are many opportunities for extending this work. The accuracy of both algorithms could be increased by adding more training data; the resulting high-dimensionality problem could be mitigated by using parallel computing and clustering. In addition, a different machine learning algorithm, such as Support Vector Machines, could be used for the classification task, where we might see better accuracy.

References