Chapter 10 The Support Vector Method For Estimating Indicator Functions jpzhang@fudan.edu.cn Intelligent Information Processing Laboratory, Fudan University
Optimal hyperplane • The optimal hyperplane has remarkable statistical properties. • It is used to construct a new class of learning machines: • Support vector machines.
The optimal hyperplane • The optimal hyperplane for nonseparable sets • Statistical properties of the optimal hyperplane • Proof of the theorems • The Idea of the Support Vector Machine
One More Approach to the Support Vector Method • Selection of SV Machine Using Bounds • Examples of SV Machines For Pattern Recognition • SV Method for transductive inference • Multiclass classification • Remarks on generalization of the SV method
Properties • The objective function does not depend explicitly on the dimensionality of the vector x • It depends only on the inner products of pairs of vectors. • This allows us to construct separating hyperplanes in high-dimensional spaces.
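A minimal numpy sketch of this property (the data and multipliers are illustrative, not from the text): the dual objective W(α) = Σᵢαᵢ − ½ Σᵢⱼ αᵢαⱼyᵢyⱼ⟨xᵢ, xⱼ⟩ touches the training vectors only through their Gram matrix of pairwise inner products, so the dimensionality of x never appears in the optimization itself.

```python
import numpy as np

def dual_objective(alpha, y, gram):
    """W(alpha) = sum_i alpha_i - 1/2 sum_ij alpha_i alpha_j y_i y_j <x_i, x_j>.
    The data enters only through `gram`, the matrix of pairwise inner
    products, so the dimensionality of x is never referenced here."""
    yg = (y[:, None] * y[None, :]) * gram
    return alpha.sum() - 0.5 * alpha @ yg @ alpha

# Toy data: the dimensionality matters only when forming the Gram matrix.
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -1.0]])
y = np.array([1.0, 1.0, -1.0])
gram = X @ X.T                      # all the geometry the objective needs
alpha = np.array([0.1, 0.1, 0.2])   # a feasible point: sum_i alpha_i y_i = 0
print(dual_objective(alpha, y, gram))
```

Replacing `gram` with a kernel matrix leaves this function unchanged, which is exactly why the same objective works in high-dimensional feature spaces.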
Proof of the theorems • (omitted)
The Idea of the Support Vector Machine • Support Vector Machine: • Maps the input vectors x into a high-dimensional feature space Z through a nonlinear mapping chosen a priori. • In this space, an optimal separating hyperplane is constructed.
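A small sketch of these two steps (the toy data, the quadratic map, and the hand-picked hyperplane are my own illustration): one-dimensional data that no threshold can separate becomes linearly separable after a nonlinear map into a feature space Z chosen a priori.

```python
import numpy as np

# One-dimensional data that no single threshold separates:
# the +1 class sits between the two -1 points.
X = np.array([-2.0, -0.5, 0.5, 2.0])
y = np.array([-1.0, 1.0, 1.0, -1.0])

def phi(x):
    """Nonlinear map into feature space Z, chosen a priori: z = (x, x^2)."""
    return np.stack([x, x**2], axis=-1)

# In Z a single hyperplane w.z + b = 0 separates the classes.
w, b = np.array([0.0, -1.0]), 1.0      # hand-picked: the curve x^2 = 1
pred = np.sign(phi(X) @ w + b)
print(pred)                            # agrees with y on all four points
```

In the full method the hyperplane in Z is not hand-picked but is the optimal (maximum-margin) one; the sketch only shows why the mapping step makes a linear separation possible.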
Problem • How to find a separating hyperplane that generalizes well (the conceptual problem): • The dimensionality of the feature space is huge • Yet the hyperplane must generalize well • How to treat such high-dimensional spaces computationally (the technical problem): • The curse of dimensionality
Generalization in high-dimensional space • Conceptual • Optimal hyperplane • Its generalization ability is high even if the feature space has a high dimensionality. • Technical • One does not need to consider the feature space in explicit form. • It suffices to calculate the inner products between support vectors and the vectors of the feature space.
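The technical point can be checked numerically. A minimal sketch (the specific vectors are illustrative): for the explicit degree-2 feature map φ(x) = (x₁², √2·x₁x₂, x₂²), the inner product in the feature space equals the kernel K(x, z) = (x·z)² computed entirely in the input space, so the feature space never has to be constructed explicitly.

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map for 2-D input."""
    return np.array([x[0]**2, np.sqrt(2.0) * x[0] * x[1], x[1]**2])

def kernel(x, z):
    """The same inner product, computed in the input space only."""
    return (x @ z) ** 2

x, z = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(phi(x) @ phi(z), kernel(x, z))   # the two values coincide
```

For a degree-d polynomial kernel in n dimensions the explicit feature space has on the order of n^d coordinates, while the kernel costs one inner product plus one power; this is what makes high-dimensional Z computationally harmless.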
One More Approach to the Support Vector Method • Minimizing the Number of Support Vectors • Generalization for the Nonseparable Case • Linear Optimization Method for SV Machines
Minimizing the Number of Support Vectors • The optimal hyperplane has an expansion on the support vectors • If the method of constructing the hyperplane has a unique solution, • then the generalization ability of the constructed hyperplane depends on the number of support vectors.
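The expansion on the support vectors can be written as the decision rule f(x) = sign(Σᵢ αᵢyᵢ⟨xᵢ, x⟩ + b), where the sum runs over support vectors only. A sketch on a two-point toy problem whose multipliers are worked out by hand (my own example, not from the text):

```python
import numpy as np

# Two 1-D support vectors at +/-1. For this pair, w = sum_i alpha_i y_i x_i
# with the constraint sum_i alpha_i y_i = 0 gives alpha_1 = alpha_2 = 0.5
# and b = 0: the margin boundaries pass through the points themselves.
sv_x = np.array([1.0, -1.0])
sv_y = np.array([1.0, -1.0])
alpha = np.array([0.5, 0.5])
b = 0.0

def decide(x):
    """Decision rule written as an expansion on the support vectors only."""
    return np.sign(np.sum(alpha * sv_y * sv_x * x) + b)

print(decide(2.0), decide(-0.3))
```

Only the support vectors appear in `decide`, which is why a hyperplane expressible through few support vectors is the cheap one to evaluate and, per the statement above, the one expected to generalize well.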