
Face Recognition based on Kernel Radial Basis Function



Presentation Transcript


  1. Neural Networks: Face Recognition based on Kernel Radial Basis Function. Ali Razmkhah, Dr. Ebrahimi. 9 Jan 2010, Sahand University of Technology.

  2. Introduction
  • Face recognition is a very challenging research problem due to variations in illumination, facial expression, and pose.
  • Applications: human-computer interaction, biometrics, security.
  • It is also a typical pattern recognition problem whose solution would help in solving other classification problems.
  • RBF neural networks have been successfully applied to face recognition.
  • A new algorithm, called Kernel RBF Networks (KRBF Networks), uses RBF networks to extract features from a kernel feature space.

  3. Subspace Analysis
  • A successful face recognition methodology depends largely on the particular choice of features used by the (pattern) classifier.
  • Linear subspace analysis, such as Principal Component Analysis (PCA) and Fisher Linear Discriminant Analysis (FLDA): these methods extract features only from the input space, without considering nonlinear relationships between the components of the input data.
  • Nonlinear subspace analysis, such as Kernel PCA (KPCA) and Kernel FLDA (KFDA): these methods extract features from a feature space that captures nonlinear information about the input data through kernel methods, such as the nonlinear relationships between two or more image pixels. A minimal sketch contrasting the two approaches follows below.
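To make the contrast concrete, here is a minimal scikit-learn sketch (illustrative, not from the slides; the data is a random stand-in for flattened face images) extracting 20 features with linear PCA and with polynomial-kernel KPCA:

    import numpy as np
    from sklearn.decomposition import PCA, KernelPCA

    # Placeholder data: 40 "images" of 28*23 = 644 flattened pixels each.
    rng = np.random.default_rng(0)
    X = rng.random((40, 28 * 23))

    # Linear subspace analysis: features come from the input space only.
    X_pca = PCA(n_components=20).fit_transform(X)

    # Nonlinear subspace analysis: features come from a polynomial-kernel space.
    X_kpca = KernelPCA(n_components=20, kernel="poly", degree=2).fit_transform(X)

    print(X_pca.shape, X_kpca.shape)  # (40, 20) (40, 20)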

  4. Kernel Feature Space
  • First, the input data is projected into a feature space F by a nonlinear mapping
  Φ: R^N → R^F, F ≫ N (1)
  • We can use the kernel method to extract features from the feature space without computing Φ explicitly:
  Φ(x_i) · Φ(x_j) = k(x_i, x_j) (2)
  • Here a polynomial kernel is used to project a vector into the high-dimensional space and capture nonlinear information:
  Φ(x) · Φ(y) = k(x, y) = (x · y)² (3)
  • For x = (x₁, x₂) and y = (y₁, y₂), this expands to
  k(x, y) = (x₁², x₁x₂, x₂x₁, x₂²)(y₁², y₁y₂, y₂y₁, y₂²)^T
  so data in the low-dimensional input space is implicitly projected into the high-dimensional feature space, as the sketch below verifies numerically.
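As a quick check of equation (3), the following sketch (illustrative values, not from the slides) confirms that the degree-2 polynomial kernel equals the dot product of the explicit feature maps, without ever forming the feature space:

    import numpy as np

    def phi(v):
        """Explicit degree-2 feature map for a 2-D vector."""
        x1, x2 = v
        return np.array([x1 * x1, x1 * x2, x2 * x1, x2 * x2])

    def poly_kernel(x, y):
        """Polynomial kernel of degree 2: k(x, y) = (x . y)^2."""
        return float(np.dot(x, y)) ** 2

    x = np.array([1.0, 2.0])
    y = np.array([3.0, 0.5])
    print(poly_kernel(x, y))              # 16.0, computed in the input space
    print(float(np.dot(phi(x), phi(y))))  # 16.0, same value via the explicit map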

  5. KRBF Networks
  • The architecture of KRBF networks is just like that of RBF networks.
  • Training includes the following steps:
  1. Use a kernel k-means algorithm to cluster the input data (the standard k-means algorithm is used in ordinary RBF networks).
  2. Train the parameters of the radial basis functions (the hidden layer nodes) according to the training samples of each cluster.
  3. Train the weights between the hidden layer and the output layer.

  6. Kernel k-means Algorithm
  • K-means algorithm: for input data x_1, x_2, …, x_N, the k-means clustering algorithm partitions the N samples into K clusters according to the Euclidean distance and returns the center of each cluster.
  • Kernel k-means instead partitions the samples in the kernel feature space.
  • The key point is computing Euclidean distance in the kernel feature space: if x_i is a face vector and Φ(x_i) is its projection in the feature space, the squared distance between two projections can be written purely in terms of the kernel,
  ‖Φ(x_i) − Φ(x_j)‖² = k(x_i, x_i) − 2 k(x_i, x_j) + k(x_j, x_j),
  which leads to the distance-to-center formula on the following slides.

  7. Kernel k-means Algorithm
  • The centers of the clusters in kernel space are
  m_c = (1/N_c) Σ_i δ(x_i ∈ C_c) Φ(x_i),
  where δ(·) is an indicator function and N_c is the number of samples in cluster C_c.
  • The distance between a vector Φ(x) and a center m_c in feature space is then given by formula (A) below.

  8. Kernel k-means Algorithm
  ‖Φ(x) − m_c‖² = k(x, x) − (2/N_c) Σ_{x_i ∈ C_c} k(x, x_i) + (1/N_c²) Σ_{x_i, x_j ∈ C_c} k(x_i, x_j) (A)
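A direct transcription of formula (A) into code might look like the following sketch; the kernel, the cluster members, and the query point are illustrative assumptions:

    import numpy as np

    def kernel_distance_sq(x, cluster, kernel):
        """Squared feature-space distance (A) from x to the center of `cluster`."""
        n_c = len(cluster)
        self_term = kernel(x, x)
        cross_term = sum(kernel(x, xi) for xi in cluster)
        center_term = sum(kernel(xi, xj) for xi in cluster for xj in cluster)
        return self_term - 2.0 * cross_term / n_c + center_term / n_c**2

    # Illustrative use with the degree-2 polynomial kernel of equation (3).
    poly2 = lambda a, b: float(np.dot(a, b)) ** 2
    cluster = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
    print(kernel_distance_sq(np.array([1.0, 1.0]), cluster, poly2))  # 2.5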

  9. Kernel k-means Algorithm
  • The kernel k-means algorithm comprises the following steps:
  1. Randomly assign the samples to K initial clusters.
  2. Compute the distance from each sample to each center with formula (A) and reassign each sample to its nearest cluster.
  3. Repeat step 2 until the assignments converge.
  4. For each cluster C_c, select the sample closest to the center m_c as the representative of C_c.
  A compact implementation sketch follows.
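This sketch follows the four steps above, using a precomputed Gram matrix so that formula (A) reduces to array operations; the cluster count, kernel, and data are illustrative assumptions, not values from the paper:

    import numpy as np

    def kernel_kmeans(K, n_clusters, n_iter=100, seed=0):
        """Cluster samples given their Gram matrix K[i, j] = k(x_i, x_j)."""
        n = K.shape[0]
        labels = np.random.default_rng(seed).integers(0, n_clusters, size=n)  # step 1
        for _ in range(n_iter):
            dist = np.full((n, n_clusters), np.inf)
            for c in range(n_clusters):
                m = labels == c
                n_c = m.sum()
                if n_c == 0:
                    continue
                # formula (A), evaluated for every sample at once
                dist[:, c] = (np.diag(K)
                              - 2.0 * K[:, m].sum(axis=1) / n_c
                              + K[np.ix_(m, m)].sum() / n_c**2)
            new_labels = dist.argmin(axis=1)                                  # step 2
            if np.array_equal(new_labels, labels):                            # step 3
                break
            labels = new_labels
        reps = []                                                             # step 4
        for c in range(n_clusters):
            idx = np.flatnonzero(labels == c)
            if idx.size:
                d = (K[idx, idx] - 2.0 * K[np.ix_(idx, idx)].mean(axis=1)
                     + K[np.ix_(idx, idx)].mean())
                reps.append(int(idx[d.argmin()]))   # member closest to the center
        return labels, reps

    X = np.random.default_rng(1).random((20, 5))
    K = (X @ X.T) ** 2                  # Gram matrix of the degree-2 polynomial kernel
    labels, reps = kernel_kmeans(K, n_clusters=4)
    print(labels, reps)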

  10. Training Parameters of the RBF
  • Here we select the Gaussian RBF for the hidden layer, with the centers m_c and widths σ_c as parameters:
  φ_c(x) = exp(−‖Φ(x) − m_c‖² / (2σ_c²))
  • The centers m_c are the cluster centers found by kernel k-means; the distances ‖Φ(x) − m_c‖² are computed by (A), and σ_c can be obtained by computing the variance of the samples of each cluster in the kernel feature space.
  • Training the weights between the hidden and output layers: the outputs of the hidden units lie between 0 and 1, and the closer the input is to the center of the Gaussian, the larger the response of the node. The output y_j of output unit j is a weighted sum of the hidden responses:
  y_j = Σ_c w_jc φ_c(x)
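One way the hidden and output layers could be realized is sketched below. The σ values here are placeholders rather than the cluster variances, and the least-squares fit for the output weights is a common choice that the slides do not specify; treat this as an assumption-laden sketch, not the authors' method:

    import numpy as np

    def rbf_hidden(X, train, labels, sigmas, kernel, n_clusters):
        """Gaussian RBF layer: unit c responds to the distance (A) from cluster c."""
        H = np.zeros((len(X), n_clusters))
        for c in range(n_clusters):
            members = train[labels == c]
            n_c = len(members)
            center = sum(kernel(a, b) for a in members for b in members) / n_c**2
            for i, x in enumerate(X):
                d2 = kernel(x, x) - 2.0 * sum(kernel(x, m) for m in members) / n_c + center
                H[i, c] = np.exp(-d2 / (2.0 * sigmas[c] ** 2))  # response in (0, 1]
        return H

    poly2 = lambda a, b: float(np.dot(a, b)) ** 2
    rng = np.random.default_rng(2)
    train = rng.random((12, 5))
    labels = np.arange(12) % 3                  # three non-empty clusters
    Y = np.eye(3)[np.arange(12) % 3]            # one-hot class targets
    H = rbf_hidden(train, train, labels, np.ones(3), poly2, n_clusters=3)
    W, *_ = np.linalg.lstsq(H, Y, rcond=None)   # hidden-to-output weights w_jc
    print((H @ W).argmax(axis=1))               # y_j = sum_c w_jc * phi_c(x)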

  11. Experiments
  • Our experiments are performed on two benchmarks: the ORL database and the FERET database.
  • We reduce the face images from 112×92 to 28×23 pixels to make computation efficient (a downsampling sketch follows below).
  • We chose a polynomial function as the kernel function for the space projection.
  • ORL database: 400 images of size 112×92; 40 persons, with 10 images of each person.
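Since both dimensions shrink by exactly a factor of four (112 = 4×28, 92 = 4×23), a simple block-mean downsample reproduces the reduction; whether the authors used this exact scheme is an assumption, and the random image stands in for an actual ORL face:

    import numpy as np

    img = np.random.default_rng(3).random((112, 92))     # placeholder face image
    small = img.reshape(28, 4, 23, 4).mean(axis=(1, 3))  # 4x4 block mean
    print(small.shape)                                   # (28, 23)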

  12. Experiments
  • Each set of ten images for a person is randomly partitioned into a training subset of five images and a test set of the other five images.
  • Results: the recognition rate increases with the number of hidden layer nodes, and KRBF networks achieve higher recognition rates than RBF networks.

  13. Experiments
  • In the following experiment, each set of ten images for a person is randomly partitioned into training subsets of 2, 4, 6, and 9 images, with corresponding test sets of the other 8, 6, 4, and 1 images. The same training and test data are used for both methods.
  • Results: KRBF networks achieve better recognition rates than RBF networks as the amount of training data increases.

  14. Experiments
  • FERET database: 420 images of size 112×92; 70 persons, with six different images of each person.
  • Each set of six images for a person is randomly partitioned into a training subset of four images and a test set of the other two images.

  15. Experiments
  • FERET database results: the recognition rate increases with the number of hidden layer nodes, and KRBF networks achieve higher recognition rates than RBF networks.

  16. Experiments
  • FERET database results: the recognition rate increases as the amount of training data grows.

  17. CONCLUSION
  • RBF networks use a k-means algorithm to cluster the sample vectors.
  • In the testing process, the distances from each test sample to the center of each cluster serve as the features.
  • KRBF networks apply RBF networks in the kernel feature space and extract features from that space.
  • When KRBF networks are employed for whole-face recognition, better results are achieved than with RBF networks.
