Face recognition using PCA
DANG THE HUONG
VINH UNIVERSITY

CONTENTS
• IDEA
• OPERATIONS
• MERITS
• DEMERITS
• APPLICATIONS

IDEA
• PCA
• Eigenfaces: the idea
• Eigenvectors and eigenvalues
• Learning eigenfaces from training sets of faces
• Covariance
• Recognition and reconstruction

PCA
PCA means Principal Component Analysis. PCA was invented in 1901 by Karl Pearson. It involves the calculation of the eigenvalue decomposition of a data covariance matrix, or the singular value decomposition of a data matrix, usually after mean-centering the data for each attribute.

Algorithm
Three basic steps are involved in PCA:
• Identification (by eigenfaces)
• Recognition (matching eigenfaces)
• Categorization (by grouping)

EIGENVECTORS
In digital image processing we convert 2-D images into matrix form for analysis, and every matrix can be characterized by its eigenvectors. An eigenvector is a vector v that obeys the rule

  Av = µv

where A is a matrix and µ is a scalar (called the eigenvalue). E.g. one eigenvector of

  A = [2 3; 2 1]  is  v = [3; 2],  since  Av = [12; 8] = 4 · [3; 2],

so for this eigenvector of this matrix the eigenvalue is 4.

EIGENFACES
Think of a face as being a weighted combination of some "component" or "basis" faces. These basis faces are called eigenfaces.
[Slide figure: a face expressed as a weighted sum of eigenfaces, with example weights −8029, 2900, 1751, 1445, 4238, 6193.]

Eigenfaces: representing faces
Each N×N training image a, b, c, …, h is flattened into an N²-dimensional column vector, e.g. a = (a1, a2, …, aN²)ᵀ. We compute the average face

  m = (1/M)(a + b + … + h),  i.e.  mi = (ai + bi + … + hi)/M,

where M = 8 is the number of training faces. Then we subtract it from each training face:

  am = a − m, bm = b − m, …, hm = h − m,  with components ai − mi, bi − mi, and so on.

Now we build the matrix A, which is N² by M:

  A = [am bm cm dm em fm gm hm]

The covariance matrix, which is N² by N², is

  Cov = A·Aᵀ

The covariance matrix has eigenvectors and eigenvalues. Eigenvectors with larger eigenvalues correspond to directions in which the data varies more. Finding the eigenvectors and eigenvalues of the covariance matrix for a set of data is termed principal component analysis. E.g. for the covariance matrix

  C = [0.617 0.615; 0.615 0.717]

the eigenvectors and eigenvalues are

  ν1 = [−0.735; 0.678] with µ1 = 0.049,  and  ν2 = [0.678; 0.735] with µ2 = 1.284.

The covariance of two variables x1 and x2 over n samples is:

  cov(x1, x2) = Σ_{i=1..n} (x1ⁱ − x̄1)(x2ⁱ − x̄2) / (n − 1)

Recognition
A face image xk can be projected into this face space by

  pk = Uᵀ(xk − m),  where k = 1, …, M

and U is the matrix whose columns are the eigenfaces. To recognize a face r = (r1, …, rN²)ᵀ:
• Subtract the average face from it: rm = r − m, with components ri − mi.
• Compute its projection onto the face space U: Ω = Uᵀ·rm.
• Compute the distances in face space between the face and all known faces: εi = ‖Ω − Ωi‖ for i = 1, …, M.
• Compute the threshold: θ = max ‖Ωi − Ωj‖ over i, j = 1, …, M.

Distinguish between three cases, where ξ is the distance between the face and its reconstruction from face space:
• If ξ ≥ θ, it is not a face: the distance between the image and its reconstruction is larger than the threshold.
• If ξ < θ and min{εi} < θ, it is a known face.
• If ξ < θ and εi ≥ θ for all i = 1, …, M, it is a new face: the distance in face space between it and every known face is larger than the threshold.

RECONSTRUCTION
The image is reconstructed in the 3rd case, i.e. when ξ < θ and εi ≥ θ for all i = 1, …, M. Using the MATLAB code, the original image and the reconstructed image are shown.
[Slide figure: example of an original image next to its reconstruction.]
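The slides mention MATLAB code but do not show it. The following is a minimal MATLAB/Octave sketch (not the original program) of the whole pipeline above: flattening the training images, the average face, eigenfaces from the covariance matrix, the projections Ω, the distances εi, the threshold θ, and the three-way decision. The file names (face1.pgm … face8.pgm, probe.pgm) and the choice of M = 8 training images are assumptions for illustration; it needs MATLAB R2016b+ or Octave for implicit expansion.

  % A minimal eigenfaces sketch in MATLAB/Octave, not the original program.
  % Assumptions: 8 same-sized grayscale training images with hypothetical
  % file names face1.pgm..face8.pgm, plus a probe image probe.pgm.
  M = 8;                                   % number of training faces (a..h)
  first = double(imread('face1.pgm'));
  X = zeros(numel(first), M);              % each column is one flattened face
  X(:, 1) = first(:);
  for k = 2:M
      img = double(imread(sprintf('face%d.pgm', k)));
      X(:, k) = img(:);                    % flatten N x N image to N^2 vector
  end

  m = mean(X, 2);                          % the average face
  A = X - m;                               % mean-subtracted faces, N^2 x M

  % Diagonalize the small M x M matrix A'*A instead of the huge N^2 x N^2
  % covariance A*A': if (A'*A)v = mu*v, then (A*A')(A*v) = mu*(A*v).
  [V, D] = eig(A' * A);
  [~, order] = sort(diag(D), 'descend');   % strongest components first
  K = M - 1;                               % mean subtraction leaves one ~0 eigenvalue
  U = A * V(:, order(1:K));                % eigenfaces, N^2 x K
  U = U ./ sqrt(sum(U.^2, 1));             % normalize each eigenface column

  Omega = U' * A;                          % projections of known faces, K x M

  % --- Recognition of a probe image r ---
  r = double(imread('probe.pgm'));
  rm = r(:) - m;                           % subtract the average face
  w = U' * rm;                             % projection Omega of the probe
  xi = norm(rm - U * w);                   % distance to its own reconstruction
  eps_i = sqrt(sum((Omega - w).^2, 1));    % distances to all known faces

  theta = 0;                               % threshold: max pairwise distance
  for i = 1:M                              % between known projections,
      for j = 1:M                          % as defined on the slides
          theta = max(theta, norm(Omega(:, i) - Omega(:, j)));
      end
  end

  if xi >= theta
      disp('not a face');                  % reconstruction error above threshold
  elseif min(eps_i) < theta
      [~, best] = min(eps_i);              % closest known face
      fprintf('known face: training image %d\n', best);
  else
      disp('new face');                    % far from every known face in face space
  end

The small-matrix shortcut is the standard one: for realistic image sizes the N² × N² covariance matrix is far too large to form explicitly, while A'*A is only M × M and each of its eigenvectors v maps to an eigenface A*v with the same eigenvalue.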
MERITS
• Relatively simple
• Fast
• Robust

DEMERITS
• Expression: changes in feature location and shape
• Variations in lighting conditions:
  – different lighting conditions for enrolment and query
  – bright light causing image saturation

APPLICATIONS
Various potential applications, such as:
• Person identification
• Human-computer interaction
• Security systems

Thank You