
Properties of Kernels


Presentation Transcript


  1. Properties of Kernels Presenter: Hongliang Fei Date: June 11, 2009

  2. Overview • Inner product and Hilbert space • Characteristics of kernels • The kernel matrix • Kernel construction

  3. Hilbert spaces • Linear function: Given a vector space X over the reals, a function f: X → R is linear if f(ax) = af(x) and f(x+z) = f(x) + f(z) for all x, z ∈ X and a ∈ R. • Inner product space: A vector space X over the reals R is an inner product space if there exists a real-valued symmetric bilinear (linear in each argument) map ⟨·,·⟩ that satisfies ⟨x, x⟩ ≥ 0 for all x ∈ X. The inner product is strict if ⟨x, x⟩ = 0 implies x = 0.
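
To make these axioms concrete, here is a minimal sketch (my addition, assuming NumPy and the standard dot product on R^n as the inner product); it checks symmetry, linearity in the first argument, and non-negativity of ⟨x, x⟩ on random vectors.

    import numpy as np

    rng = np.random.default_rng(0)
    x, z, w = rng.normal(size=(3, 5))      # three random vectors in R^5
    a = 2.5

    def dot(u, v):
        return float(np.dot(u, v))

    # Symmetry: <x, z> = <z, x>
    assert np.isclose(dot(x, z), dot(z, x))
    # Linearity in the first argument: <a*x + w, z> = a<x, z> + <w, z>
    assert np.isclose(dot(a * x + w, z), a * dot(x, z) + dot(w, z))
    # Non-negativity: <x, x> >= 0, and <0, 0> = 0
    assert dot(x, x) >= 0 and dot(np.zeros(5), np.zeros(5)) == 0
    print("standard dot product satisfies the inner product axioms on these samples")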

  4. Hilbert spaces • A Hilbert space F is an inner product space with the additional properties that it is separable and complete. • Completeness refers to the property that every Cauchy sequence {hn} n≥1 of elements of F converges to an element h ∈ F. • A space F is separable if and only if it admits a countable orthonormal basis.

  5. Cauchy–Schwarz inequality • In an inner product space, ⟨x, z⟩² ≤ ⟨x, x⟩ ⟨z, z⟩, and the equality sign holds in a strict inner product space if and only if x and z are rescalings of the same vector.
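
A quick numerical check (my sketch, assuming NumPy and the standard dot product): for random x and z the squared inner product never exceeds ⟨x, x⟩⟨z, z⟩, and equality holds when z is a rescaling of x.

    import numpy as np

    rng = np.random.default_rng(1)
    for _ in range(1000):
        x, z = rng.normal(size=(2, 10))
        lhs = np.dot(x, z) ** 2
        rhs = np.dot(x, x) * np.dot(z, z)
        assert lhs <= rhs + 1e-12            # Cauchy-Schwarz: <x, z>^2 <= <x, x><z, z>

    # Equality when z is a rescaling of x
    x = rng.normal(size=10)
    z = -3.0 * x
    assert np.isclose(np.dot(x, z) ** 2, np.dot(x, x) * np.dot(z, z))
    print("Cauchy-Schwarz holds on all sampled pairs")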

  6. Gram matrix • Given a set of vectors S = {x1, ..., xn}, the Gram matrix is the n × n matrix G with entries Gij = ⟨xi, xj⟩. • If a kernel function κ is used to evaluate the inner products in a feature space with feature map φ, the matrix with entries Kij = κ(xi, xj) = ⟨φ(xi), φ(xj)⟩ is called the kernel matrix.
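
As an illustration (kernel choice and bandwidth are my own, not from the slides), the sketch below builds the plain Gram matrix Gij = ⟨xi, xj⟩ and a Gaussian-kernel matrix Kij = κ(xi, xj) for a small sample.

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(6, 3))              # 6 points in R^3

    # Gram matrix of plain inner products: G[i, j] = <x_i, x_j>
    G = X @ X.T

    # Kernel matrix for a Gaussian kernel k(x, z) = exp(-||x - z||^2 / (2 sigma^2))
    sigma = 1.0                              # bandwidth chosen arbitrarily for the demo
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2 * sigma ** 2))

    print(G.shape, K.shape)                  # both (6, 6) and symmetric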

  7. Positive semi-definite matrices • A symmetric matrix A is positive semi-definite iff vᵀAv ≥ 0 for all v, or equivalently iff its eigenvalues are all non-negative. • A symmetric matrix A is positive definite iff vᵀAv > 0 for all v ≠ 0, or equivalently iff its eigenvalues are all positive. • Gram and kernel matrices are positive semi-definite.
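
The eigenvalue characterization is easy to verify numerically; this sketch (my example) builds a Gram matrix from random data and confirms its eigenvalues are non-negative up to floating-point tolerance.

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(5, 8))
    G = X @ X.T                              # Gram matrix, symmetric by construction

    # v^T G v = ||X^T v||^2 >= 0 for every v, so G is positive semi-definite
    eigvals = np.linalg.eigvalsh(G)
    print(eigvals)
    assert np.all(eigvals >= -1e-10)         # non-negative up to numerical error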

  8. Finitely positive semi-definite functions • A function κ: X × X → R satisfies the finitely positive semi-definite property if it is a symmetric function for which the matrices formed by restriction to any finite subset of the space X are positive semi-definite.
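
To illustrate the definition (my sketch, using a Gaussian kernel function as the example), one can draw arbitrary finite subsets of points and check that every restricted matrix is symmetric and positive semi-definite.

    import numpy as np

    def gaussian_kernel(x, z, sigma=1.0):
        # example kernel function kappa(x, z); any finitely PSD function would do
        return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

    rng = np.random.default_rng(4)
    for _ in range(20):
        n = rng.integers(2, 10)                          # size of the finite subset
        pts = rng.normal(size=(n, 4))                    # arbitrary finite subset of X = R^4
        K = np.array([[gaussian_kernel(a, b) for b in pts] for a in pts])
        assert np.allclose(K, K.T)                       # symmetric
        assert np.all(np.linalg.eigvalsh(K) >= -1e-10)   # positive semi-definite
    print("all sampled restrictions are PSD")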

  9. Mercer kernel theorem • A function κ: X × X → R which is either continuous or has a finite domain can be decomposed as κ(x, z) = ⟨φ(x), φ(z)⟩, that is, into a feature map φ into a Hilbert space F applied to both its arguments followed by the evaluation of the inner product in F, if and only if it satisfies the finitely positive semi-definite property.
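
For a finite domain the decomposition can be made explicit: eigendecompose the kernel matrix K = VΛVᵀ and take φ(xi) to be the i-th row of VΛ^(1/2), so that ⟨φ(xi), φ(xj)⟩ = Kij. The sketch below (my illustration, with an arbitrary PSD matrix standing in for a kernel on a 6-element domain) checks this.

    import numpy as np

    rng = np.random.default_rng(5)
    A = rng.normal(size=(6, 6))
    K = A @ A.T                              # a PSD matrix = kernel matrix on a 6-element domain

    # Eigendecomposition K = V diag(lam) V^T with lam >= 0
    lam, V = np.linalg.eigh(K)
    lam = np.clip(lam, 0.0, None)            # guard against tiny negative round-off

    # Feature map: Phi[i] = phi(x_i) = i-th row of V diag(sqrt(lam))
    Phi = V * np.sqrt(lam)

    # Inner products in the feature space reproduce the kernel values
    assert np.allclose(Phi @ Phi.T, K)
    print("K_ij = <phi(x_i), phi(x_j)> verified")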

  10. The kernel matrix • Implementation issues • Kernels and prior knowledge • Kernel Selection • Kernel Alignment

  11. Kernel Selection • Ideally, select the optimal kernel based on our prior knowledge of the problem domain. • In practice, consider a family of kernels defined in a way that again reflects our prior expectations. • Simple way: require only a limited amount of additional information from the training data. • Elaborate way: combine label information.

  12. Kernel Alignment • Measures the similarity between two kernels. • The alignment A(K1, K2) between two kernel matrices K1 and K2 is given by A(K1, K2) = ⟨K1, K2⟩F / √(⟨K1, K1⟩F ⟨K2, K2⟩F), where ⟨K1, K2⟩F = Σij K1(i, j) K2(i, j) is the Frobenius inner product between matrices.
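
A direct implementation of this formula (my sketch; the two kernel matrices are arbitrary examples) normalizes the Frobenius inner product by the two Frobenius norms:

    import numpy as np

    def alignment(K1, K2):
        # A(K1, K2) = <K1, K2>_F / sqrt(<K1, K1>_F <K2, K2>_F)
        num = np.sum(K1 * K2)                            # Frobenius inner product
        den = np.sqrt(np.sum(K1 * K1) * np.sum(K2 * K2))
        return num / den

    rng = np.random.default_rng(6)
    X = rng.normal(size=(8, 3))
    K_lin = X @ X.T                                      # linear kernel matrix
    sq = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
    K_rbf = np.exp(-sq / 2.0)                            # Gaussian kernel matrix

    print(alignment(K_lin, K_lin))                       # 1.0: a kernel is perfectly aligned with itself
    print(alignment(K_lin, K_rbf))                       # between 0 and 1 for two PSD matrices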

  13. Kernel Construction
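
As a hedged illustration of the standard closure properties for constructing new kernels from existing ones (scaling aκ1 with a > 0, sum κ1 + κ2, product κ1κ2, and κ(x, z) = f(x)f(z)); the example kernels and data below are my own choices, not the slide's content:

    import numpy as np

    rng = np.random.default_rng(7)
    X = rng.normal(size=(7, 4))

    K1 = X @ X.T                                         # linear kernel matrix
    sq = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
    K2 = np.exp(-sq / 2.0)                               # Gaussian kernel matrix

    def is_psd(K, tol=1e-10):
        return np.all(np.linalg.eigvalsh((K + K.T) / 2) >= -tol)

    # Closure properties: each combination below is again a valid kernel matrix
    assert is_psd(3.0 * K1)                 # a * kappa1 with a > 0
    assert is_psd(K1 + K2)                  # kappa1 + kappa2
    assert is_psd(K1 * K2)                  # kappa1 * kappa2 (elementwise / Hadamard product)
    f = X[:, 0]                             # any real-valued function of x, here the first coordinate
    assert is_psd(np.outer(f, f))           # kappa(x, z) = f(x) f(z)
    print("combined kernel matrices are all PSD")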

  14. Operations on Kernel matrices • Simple transformation • Centering data • Subspace projection: chapter 6 • Whitening: Set all eigenvalues to 1 (spherically symmetric)
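
Two of these operations can be written directly at the kernel-matrix level; the sketch below (my illustration) centers the implicit feature vectors via K ← HKH with H = I - (1/n)11ᵀ, then whitens by setting every non-zero eigenvalue of the centered matrix to 1.

    import numpy as np

    rng = np.random.default_rng(8)
    X = rng.normal(size=(6, 3))
    K = X @ X.T                                        # example kernel matrix (linear kernel)
    n = K.shape[0]

    # Centering: K_c = H K H with H = I - (1/n) 1 1^T centers the feature vectors phi(x_i)
    H = np.eye(n) - np.ones((n, n)) / n
    K_centered = H @ K @ H

    # Whitening: set every non-zero eigenvalue to 1 (spherically symmetric in feature space)
    lam, V = np.linalg.eigh(K_centered)
    keep = lam > 1e-10
    K_white = V[:, keep] @ V[:, keep].T

    print(np.allclose(K_centered.sum(axis=0), 0))      # rows/columns of a centered kernel sum to 0
    print(np.round(np.linalg.eigvalsh(K_white), 6))    # eigenvalues of the whitened matrix are 0 or 1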

  15. That’s all. Any questions?
