


## 3-D Face Recognition Based on Warped Example Faces


**3-D Face Recognition Based on Warped Example Faces**
Advisor: 萬書言
Presenter: 何炳杰
Date: 2010/05/05

**Outline**
• Abstract
• Introduction
• Generic Face Warping
• Linear Combination of Face Warpings
• Feature Extraction Procedure
• Feature Classification Using Mahalanobis Distance
• Experimental Results
• Conclusion

**Abstract**
• In this paper, we describe a novel scheme for 3-D face recognition that can automatically identify faces from range images and is insensitive to holes, facial expression, and hair.
• In our scheme, a number of carefully selected range images constitute a set of example faces, and another range image is chosen as a "generic face."
• The generic face is then warped to match each of the example faces in the least-mean-square sense.

**Abstract**
• Each such warp is specified by a vector of displacement values.
• In the feature extraction operation, when a target face image comes in, the generic face is warped to match it. The geometric transformation used in the warping is a linear combination of the example face warping vectors.
• Our technique is tested on a data set containing more than 600 range images.
• Experimental results in the access-control scenario show the effectiveness of the extracted features.

**Introduction**
• Face recognition has received broad attention in both academia and industry due to its practical relevance to homeland security.
• Facial appearance can uniquely identify a person, and it is a primary factor that people use to recognize each other.
• Although human face images can be affected by illumination, facial expression, makeup, etc., they are noninvasive in nature and easy to collect in environments where other biometrics (e.g., fingerprint, iris) require the cooperation of the subjects.

**Introduction**
• In this paper, we propose a scheme based on a number of carefully selected 3-D faces called "example faces" and a 3-D face called a "generic face." The set comprising the example faces is then defined as the "example set." In the design process, the generic face is warped to match each example face. Each such warping can be represented by a displacement vector.
• In a feature extraction operation, when a target face image comes in, the generic face is warped to match it.

**Introduction**
• The geometric transformation used in the warping is defined by a linear combination of the displacement vectors specified by the warpings from the generic face to the example faces.
• The coefficients in the linear combination are then used as features of the target image and passed to a classifier for recognition.
• To illustrate the idea of synthesizing 3-D faces based on a linear combination of warps derived from example faces, consider the 3-D feature space shown in Fig. 1.

**Introduction**
Here, we have three example faces, each of which corresponds to a particular geometric transformation (shown in Fig. 2), namely, the one that makes the generic face best match that example face.

**Introduction**
• Each axis represents one coefficient in a three-term weighted sum. Each example face lies at 1.0 along its corresponding axis. Further, any point in that 3-D space corresponds to a particular linear combination of those three warps (shown in Fig. 3).

**Introduction**
• In the feature extraction operation, the algorithm adjusts the coefficients to minimize the root-mean-square (rms) error between the target face and the warped generic face.
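As a toy illustration of this fitting step: if the synthesized face is modeled as the generic face plus a weighted sum of example-face warp vectors (a linearization; the actual warping resamples the image geometrically), then the rms-minimizing coefficients follow from ordinary least squares. All names, sizes, and values below are synthetic stand-ins, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: flattened range images (depth values) for a generic
# face and K example faces.  All values here are synthetic.
L, K = 200, 3
generic = rng.normal(size=L)
examples = rng.normal(size=(K, L))

# Warp vector of example i: its offset from the generic face.
warps = examples - generic                      # shape (K, L)

# A target face synthesized from known coefficients, plus noise.
true_c = np.array([0.5, 0.3, 0.2])
target = generic + true_c @ warps + 0.01 * rng.normal(size=L)

# With this linear model, minimizing the rms error between the
# warped generic face and the target is ordinary least squares.
c, *_ = np.linalg.lstsq(warps.T, target - generic, rcond=None)
rms = np.sqrt(np.mean((generic + c @ warps - target) ** 2))
```

The recovered coefficient vector `c` then plays the role of the extracted features described in the text.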
• The coefficient vector corresponding to the minimum then constitutes a set of feature values that represents that face.

**Introduction**
The feature extraction procedure of our scheme is illustrated in Fig. 4.

**Introduction**
• This procedure starts with an initial coefficient vector, typically with all entries equal to 1/K.
• Based on this initial coefficient vector, the generic face is warped to generate a synthesized face. Then, the rms error between the synthesized face and the target face T is computed.

**Introduction**
• To summarize, our proposed scheme can be outlined as follows.
1) During the design process, warp a generic model of a face to match each of the K example faces, retaining the parameters of each of the K warps in a warp vector.
2) In the feature extraction procedure, warp the generic face using a geometric transformation that is a linear combination of the warp vectors that were developed for the example faces.
3) Adjust the coefficients in the weighted sum to minimize the rms error between the input target face image and the warped generic face.
4) Take the coefficients that result in the minimum rms error as the extracted features that describe the face.
5) Use a linear classifier based on Mahalanobis distance to classify the extracted features.

**Generic Face Warping**
• Preface:
- The goal of generic face warping is to establish point-to-point correspondences between the generic face and the example faces.
• We can consider the warped generic face, based on a combination of face warps derived from different example faces, as a synthesized face.
• The main thrust is the generic face warping procedure that establishes the point-to-point correspondence between the generic face and the example faces.

**Generic Face Warping**
• A. Necessity for Alignment:
• We need to first develop the proper face warpings between the generic face and the example faces.
• In other words, we need to find the point-to-point correspondence between the generic face and the example faces.
Otherwise, when different warps derived from example faces are combined, points with different physical meanings will be added up, resulting in blurred faces.

**Generic Face Warping**
For instance, if we develop two face warpings between the generic face and two example faces [shown as example faces in Fig. 5(a) and (b)] without alignment, a combination of the face warpings will generate a blurred face, as shown in Fig. 5(c).

**Generic Face Warping**
With the alignment, however, pixels with the same physical meaning are aligned together and generate a valid human face, as shown in Fig. 5(d).

**Generic Face Warping**
• B. Landmark Points:
• In order to find the point-to-point correspondences between the generic face and the example faces, we first manually define 70 landmark points (e.g., eye corners and mouth corners) on each of the K example faces and on the generic face, as shown in Fig. 6. Note that the number of landmark points can vary.

**Generic Face Warping**
• B. Landmark Points:
• In our face recognition scheme, we find that 70 landmark points are enough to accurately describe the correspondences between the generic face and an example face.
• The x and y coordinates of these 70 landmark points are concatenated to generate a pair of position vectors for each face, i = 1, …, K.

**Generic Face Warping**
• B. Landmark Points:
[Figure: landmark coordinates (x1, y1), (x2, y2), (x3, y3), (x4, y4) concatenated into position vectors]

**Generic Face Warping**
• B. Landmark Points:
• The position vectors of the landmark points on the generic face are denoted in the same way.
• With these position vectors, the displacement vectors (i = 1, …, K) are defined as the differences between the position vectors of the ith example face and those of the generic face.

**Generic Face Warping**
• B. Landmark Points:
Based on these displacement vectors, we can further generate new displacement vectors as weighted sums of them, where the weights are the weighting coefficients.
Meaning: depending on the weighting coefficients, the new displacement vectors represent a set of displacements of the 70 landmark points on the generic face.
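The landmark bookkeeping above can be sketched as follows; the coordinates, the number of example faces, and the weighting coefficients are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative landmarks: 70 (x, y) points on the generic face and
# on each of K example faces (coordinates are synthetic).
K, N = 3, 70
generic_pts = rng.uniform(0, 500, size=(N, 2))
example_pts = generic_pts + rng.normal(scale=5.0, size=(K, N, 2))

# Displacement vector of example i: landmark positions on example i
# minus the corresponding positions on the generic face.
displacements = example_pts - generic_pts        # shape (K, N, 2)

# New displacement vectors: a weighted combination of the K warps,
# i.e. a new set of displacements for the generic face's landmarks.
alpha = np.array([0.2, 0.5, 0.3])
combined = np.tensordot(alpha, displacements, axes=1)   # shape (N, 2)
```

Because the landmarks are aligned by index, the weighted sum mixes only points with the same physical meaning, which is exactly why the alignment step matters.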
**Generic Face Warping**
• C. Face Warping:
- In this section, we describe the warping procedure that calculates the displacements of every point on the generic face based on the displacement vectors. After this procedure, a point in the generic face will be mapped to a point in the warped face.

**Generic Face Warping**
• C. Face Warping:
• Image size: 750 × 500 range image.
• Process: We first divide the generic face into Delaunay triangles based on the 70 landmark points, as shown in Fig. 7.

**Generic Face Warping**
• C. Face Warping:
Fig. 7. Range face image divided by Delaunay triangles.

**Generic Face Warping**
• C. Face Warping:
• The coordinates of the triangle vertices are then mapped to their warped positions.
• We can focus on a pair of such corresponding Delaunay triangles, as shown in Fig. 8, in which one triangle is mapped to the other.

**Generic Face Warping**
• C. Face Warping:
• Now, given a point inside the first triangle, we want to find the corresponding position in the second.
• If we restrict ourselves to linear mappings, the mapping is determined by a set of coefficients that need to be found.

**Generic Face Warping**
• C. Face Warping:
- Because of the correspondences between the vertices of the two triangles, we obtain the system of equations (4) and (5).

**Generic Face Warping**
• C. Face Warping:
- From (4) and (5), we can obtain (6).

**Generic Face Warping**
• C. Face Warping:
• Note that the system matrix is invertible as long as the triangle has a nonzero area, so the coefficients can be solved for, as in (7) and (8).

**Generic Face Warping**
• C. Face Warping:
• From (8) and (9), we can rewrite the linear mapping (3) in terms of the position vector P.

**Generic Face Warping**
• C. Face Warping:
- If we draw vertical and horizontal lines passing through the landmark points of the generic face, we can further divide the face into more than 300 small grids, as shown in Fig. 9.

**Generic Face Warping**
• C. Face Warping:
• Suppose a particular grid has four vertices, i = 1, 2, 3, 4.
- Let the corresponding displacements of the vertices be given; then the displacement of a pixel at a position inside the grid is an interpolation of the four vertex displacements.
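Both mapping steps just described can be sketched in code: solving the six coefficients of the per-triangle linear map from the three vertex correspondences (solvable exactly when the triangle has nonzero area, as the text notes), and interpolating a pixel's displacement inside an axis-aligned grid cell from its four corner displacements. All coordinates and displacement values below are illustrative.

```python
import numpy as np

def triangle_affine(src, dst):
    """Solve the six coefficients of the linear map x' = a*x + b*y + c,
    y' = d*x + e*y + f taking triangle src onto triangle dst."""
    A = np.zeros((6, 6))
    rhs = np.zeros(6)
    for i, ((x, y), (xp, yp)) in enumerate(zip(src, dst)):
        A[2 * i] = [x, y, 1, 0, 0, 0]
        A[2 * i + 1] = [0, 0, 0, x, y, 1]
        rhs[2 * i], rhs[2 * i + 1] = xp, yp
    # The 6x6 system is invertible whenever src has nonzero area.
    return np.linalg.solve(A, rhs)

def apply_affine(co, p):
    a, b, c, d, e, f = co
    return np.array([a * p[0] + b * p[1] + c, d * p[0] + e * p[1] + f])

def grid_displacement(p, cell, corner_disp):
    """Bilinear weights for a point p inside an axis-aligned grid cell
    (x0, y0, x1, y1); corner_disp holds the corner displacements
    ordered (x0,y0), (x1,y0), (x0,y1), (x1,y1)."""
    x0, y0, x1, y1 = cell
    u = (p[0] - x0) / (x1 - x0)
    v = (p[1] - y0) / (y1 - y0)
    w = np.array([(1 - u) * (1 - v), u * (1 - v), (1 - u) * v, u * v])
    return w @ corner_disp

src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
dst = np.array([[2.0, 1.0], [4.0, 1.0], [2.0, 3.0]])
coeff = triangle_affine(src, dst)

disp_at_center = grid_displacement(
    (5.0, 5.0), (0.0, 0.0, 10.0, 10.0),
    np.array([[1.0, 0.0], [3.0, 0.0], [1.0, 2.0], [3.0, 2.0]]))
```

An affine map sends the centroid of `src` to the centroid of `dst`, which gives a quick sanity check on the solved coefficients.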
**Generic Face Warping**
• C. Face Warping: W denotes the weighting coefficients of this interpolation.

**Linear Combination of Face Warpings**
• A. Synthesized Faces From a Linear Combination of Face Warpings:
- Each example face has a pair of displacement images, one per coordinate direction.
- The displacement image of the ith example face is indexed by the image coordinates x and y.

**Linear Combination of Face Warpings**
• A. Synthesized Faces From a Linear Combination of Face Warpings:
- From (12), we can reconstruct the example face from the generic face and its displacement images.

**Linear Combination of Face Warpings**
• A. Synthesized Faces From a Linear Combination of Face Warpings:
- The obtained displacement images of the example images can be combined to generate new displacement images.

**Linear Combination of Face Warpings**
• A. Synthesized Faces From a Linear Combination of Face Warpings:
• These new displacement images can additionally be used to create a new synthesized face.

**Linear Combination of Face Warpings**
If we give each displacement image the same weight 1/K, we can generate the average warped generic face, as shown in Fig. 10.

**Linear Combination of Face Warpings**
• B. Selection of Example Faces:
If we think of faces as "points" in a 2-D feature space, the example faces form a "cloud."

**Linear Combination of Face Warpings**
• B. Selection of Example Faces:
- The larger the "cloud," the wider the range of valid synthesized faces that can be generated.
- Hence, we should select extreme faces (e.g., long faces and short faces) as example faces, so that the "cloud" spanned by the example faces will be as large as possible.
(Implication: in the system's face database, long faces and short faces are relatively numerous.)

**Linear Combination of Face Warpings**
• From Section III-A, given a range face E, we can obtain the corresponding displacement images.
• These displacement images can be concatenated into a vector V.
• Let the length of the vector V be L.

**Linear Combination of Face Warpings**
• B. Selection of Example Faces:
- By calculating the mean vector of these H vectors and subtracting it from each, we can obtain the data matrix D.
• H: the number of range images in the data set.
• The covariance matrix of D can therefore be expressed as P = D Dᵀ (up to a 1/H normalization).
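The construction of the data matrix, together with the small-matrix eigendecomposition shortcut that follows from it, can be sketched in toy form. The sizes and values below are synthetic, and the symbol names (`D`, `v`, `u`) are illustrative rather than the paper's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: H concatenated displacement vectors of length L,
# stored as columns (real range images would make L far larger).
L, H = 500, 8
vectors = rng.normal(size=(L, H))

# Data matrix D: subtract the mean vector from each column.
D = vectors - vectors.mean(axis=1, keepdims=True)

# The L x L covariance P = D @ D.T is too large to eigendecompose
# directly, so work with the small H x H matrix D.T @ D instead.
eigvals, eigvecs = np.linalg.eigh(D.T @ D)

# If (D.T @ D) v = lam * v, multiplying by D on the left gives
# (D @ D.T)(D v) = lam * (D v), so D v is an eigenvector of P
# with the same eigenvalue lam.
v = eigvecs[:, -1]                 # largest-eigenvalue eigenvector
u = D @ v                          # corresponding eigenvector of P
```

Sorting `eigvals` from large to small (as `eigh` returns them ascending, the last column is the largest) mirrors the eigenvector ranking described in the text.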
Note that the covariance matrix P is L × L, where L can be a very large number.

**Linear Combination of Face Warpings**
• B. Selection of Example Faces:
- In order to save computational complexity, we instead compute the eigenvectors of the much smaller matrix Dᵀ D.
- Let v be an eigenvector of Dᵀ D with corresponding eigenvalue λ, so that Dᵀ D v = λ v.

**Linear Combination of Face Warpings**
• B. Selection of Example Faces:
• By multiplying the matrix D on the left of the above equation, the following equation can be obtained: D Dᵀ (D v) = λ (D v), where D v is clearly an eigenvector of the covariance matrix P. We can then sort the eigenvectors according to the values of their corresponding eigenvalues, from large to small.

**Linear Combination of Face Warpings**
• B. Selection of Example Faces:
- In order to reduce the noise in target images caused by facial expressions, we add example faces with facial expressions to the example set, as illustrated in Fig. 12.
[Fig. 12: example faces without expressions and with expressions]

**Linear Combination of Face Warpings**
• B. Selection of Example Faces:
[Figure: example faces with neutral expressions and with laughing expressions]

**Feature Extraction Procedure**
• The purpose:
• Given a new target face image T, the goal of the feature extraction procedure is to extract reliable facial features, such that the extracted features are sufficient to distinguish the face from other faces.
• Based on the aligned face warps derived from the example faces, the authors design an algorithm that makes the warped generic face (the synthesized face) as similar to the target face as possible.
• In this section, the authors first introduce the concept of a cost function to describe the difference between the synthesized face and the target face.
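Per the outline, the extracted coefficient features are ultimately classified with a linear classifier based on Mahalanobis distance. The sketch below uses entirely synthetic features and a pooled shared covariance, which is an assumption for illustration, not necessarily the paper's exact classifier setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic warp-coefficient features for two enrolled subjects;
# dimensions and values are illustrative only.
feats_a = rng.normal(loc=0.0, scale=0.1, size=(20, 3))
feats_b = rng.normal(loc=1.0, scale=0.1, size=(20, 3))

# Pooled (shared) covariance estimated from the centered features.
pooled = np.vstack([feats_a - feats_a.mean(axis=0),
                    feats_b - feats_b.mean(axis=0)])
cov_inv = np.linalg.inv(np.cov(pooled, rowvar=False))

def mahalanobis(x, mean):
    """Mahalanobis distance from feature vector x to a class mean."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# A probe is assigned to the subject whose class mean is nearest
# in Mahalanobis distance.
probe = np.array([0.05, -0.02, 0.03])
d_a = mahalanobis(probe, feats_a.mean(axis=0))
d_b = mahalanobis(probe, feats_b.mean(axis=0))
```

Unlike Euclidean distance, this metric whitens the feature space by the covariance, so directions in which the coefficients naturally vary more are penalized less.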
