
Multiple Parameter Selection of Support Vector Machine


Presentation Transcript


  1. Multiple Parameter Selection of Support Vector Machine Hung-Yi Lo

  2. Outline • Phonetic Boundary Refinement Using Support Vector Machine (ICASSP’07, ICSLP’07) • Automatic Model Selection for Support Vector Machine (Distance Metric Learning for Support Vector Machine)

  3. Automatic Model Selection for Support Vector Machine (Distance Metric Learning for Support Vector Machine)

  4. Automatic Model Selection for SVM • The problem of choosing a good parameter or model setting for better generalization ability is called model selection. • The support vector machine has two parameters: • the regularization variable C • the Gaussian kernel width parameter γ • Support vector machine formulation (a quadratic program): min_{w,b,ξ} (1/2)‖w‖² + C Σᵢ ξᵢ s.t. yᵢ(wᵀxᵢ + b) ≥ 1 − ξᵢ, ξᵢ ≥ 0 • Gaussian kernel: K(x, y) = exp(−γ‖x − y‖²)
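As a concrete sketch of the Gaussian kernel defined on this slide (the sample vectors below are made up for illustration, not from the slides):

```python
import numpy as np

def gaussian_kernel(x, y, gamma):
    """Gaussian (RBF) kernel: K(x, y) = exp(-gamma * ||x - y||^2)."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.exp(-gamma * np.dot(diff, diff)))

x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])   # squared distance ||x - y||^2 = 5
# A larger gamma (a narrower kernel width) makes similarity fall off faster.
print(gaussian_kernel(x, y, gamma=0.1))   # exp(-0.5) ~ 0.607
print(gaussian_kernel(x, y, gamma=1.0))   # exp(-5.0) ~ 0.0067
```

This is the same kernel the model-selection procedure tunes: the pair (C, γ) is what the search in the following slides optimizes.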

  5. Automatic Model Selection for SVM • C.-M. Huang, Y.-J. Lee, Dennis K. J. Lin and S.-Y. Huang, "Model Selection for Support Vector Machines via Uniform Design", Computational Statistics and Data Analysis, special issue on Machine Learning and Robust Data Mining (to appear).

  6. Automatic Model Selection for SVM • Strengths: • Automates the training process of SVM; nearly no human effort is needed. • The objective of the model selection procedure is directly related to testing performance. In my experimental experience, the test accuracy is always better than the results of human tuning. • The nested uniform-design-based method is much faster than an exhaustive grid search. • Weaknesses: • No closed-form solution; an experimental search is required. • Time consuming.
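The nested search idea can be sketched as a coarse-to-fine procedure in (log₂ C, log₂ γ) space. The `evaluate` function below is a hypothetical stand-in for cross-validation accuracy (a made-up smooth surface, purely for illustration); in practice it would train an SVM with the given parameters and return validation accuracy:

```python
import itertools

def evaluate(log2_C, log2_gamma):
    """Hypothetical stand-in for k-fold cross-validation accuracy.
    A real implementation would train an SVM with (C, gamma) and
    return the validation accuracy; here it is a dummy surface
    peaking at log2_C = 3, log2_gamma = -5."""
    return -((log2_C - 3.0) ** 2 + (log2_gamma + 5.0) ** 2)

def stage_search(center_C, center_g, half_width, steps):
    """Evaluate a steps x steps grid centered at (center_C, center_g)."""
    offsets = [-half_width + 2 * half_width * i / (steps - 1)
               for i in range(steps)]
    candidates = itertools.product([center_C + d for d in offsets],
                                   [center_g + d for d in offsets])
    return max(candidates, key=lambda p: evaluate(*p))

# Stage 1: coarse search over a wide range; stage 2: refine around the best.
c1, g1 = stage_search(0.0, 0.0, half_width=10.0, steps=5)
c2, g2 = stage_search(c1, g1, half_width=2.5, steps=5)
best_C, best_gamma = 2.0 ** c2, 2.0 ** g2
```

The two-stage grid above visits 50 points where a single exhaustive grid of the same final resolution would need far more, which is the speed advantage the slide claims for the nested design (the actual method uses uniform-design point sets rather than rectangular grids).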

  7. Distance Metric Learning • L. Yang, "Distance Metric Learning: A Comprehensive Survey", Ph.D. survey. • Much work has been done on learning a quadratic (Mahalanobis) distance measure: d(xᵢ, xⱼ) = √((xᵢ − xⱼ)ᵀ Q (xᵢ − xⱼ)), where xᵢ is the input vector for the i-th training case and Q is a symmetric, positive semi-definite matrix. • Distance metric learning is equivalent to a feature transformation: writing Q = AᵀA, the distance under Q equals the Euclidean distance between Axᵢ and Axⱼ.
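The equivalence between metric learning and feature transformation can be checked numerically: with Q = AᵀA, the Mahalanobis distance under Q equals the ordinary Euclidean distance after mapping both points through A. A minimal numpy sketch with randomly generated vectors:

```python
import numpy as np

def mahalanobis(x, y, Q):
    """d(x, y) = sqrt((x - y)^T Q (x - y)) for symmetric PSD Q."""
    d = x - y
    return float(np.sqrt(d @ Q @ d))

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))    # any linear map
Q = A.T @ A                        # symmetric positive semi-definite
x, y = rng.standard_normal(3), rng.standard_normal(3)

# Learning the metric Q is the same as learning the transformation A:
lhs = mahalanobis(x, y, Q)
rhs = float(np.linalg.norm(A @ x - A @ y))
print(abs(lhs - rhs) < 1e-10)   # True
```

This is why the survey treats quadratic metric learning and linear feature transformation interchangeably.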

  8. A taxonomy of distance metric learning: • Supervised distance metric learning: global distance metric learning by convex programming; local adaptive distance metric learning; Relevant Component Analysis; Neighborhood Components Analysis. • Unsupervised distance metric learning: linear embedding (PCA, MDS); nonlinear embedding (LLE, ISOMAP, Laplacian Eigenmaps). • Large-margin-based distance metric learning: Large Margin Nearest Neighbor; distance metric learning based on SVM (casting kernel margin maximization into an SDP problem). • Kernel methods for distance metric learning: kernel alignment with SDP; learning with an idealized kernel.

  9. Distance Metric Learning • Strength: • Usually has a closed-form solution. • Weakness: • The objective of distance metric learning is based on some data-distribution criterion, not on the evaluation performance.

  10. Automatic Multiple Parameter Selection for SVM • Gaussian kernel with one width per dimension: K(x, y) = exp(−Σₖ γₖ (xₖ − yₖ)²) • Traditionally, each dimension of the feature vector is normalized to zero mean and unit standard deviation, so that each dimension contributes equally to the kernel. • However, some features should be more important than others. Using a separate width γₖ per dimension is equivalent to diagonal distance metric learning: Q = diag(γ₁, …, γₙ).
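The per-dimension-width kernel and its reading as a diagonal metric can be sketched as follows (the vectors and widths below are made up): scaling feature k by √γₖ and then applying the ordinary Gaussian kernel with γ = 1 gives exactly the same value.

```python
import numpy as np

def gaussian_kernel_multi(x, y, gammas):
    """K(x, y) = exp(-sum_k gamma_k * (x_k - y_k)^2)."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.exp(-np.sum(gammas * d * d)))

x = np.array([1.0, 2.0, 3.0])
y = np.array([0.0, 2.5, 1.0])
gammas = np.array([0.5, 4.0, 0.1])   # one width per feature

# Equivalent view: rescale each feature by sqrt(gamma_k), then use gamma = 1.
scale = np.sqrt(gammas)
d = scale * x - scale * y
same = float(np.exp(-np.dot(d, d)))
print(abs(gaussian_kernel_multi(x, y, gammas) - same) < 1e-12)   # True
```

A large γₖ makes feature k dominate the kernel; γₖ near zero effectively removes it, which is the link to the feature-selection view on the next slide.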

  11. Automatic Multiple Parameter Selection for SVM • I would like to approach this task by experimental search, incorporating a data-distribution criterion as a heuristic. • This is much more time consuming and might only be applicable to small data sets. • Feature selection is a similar task that can also be solved by experimental search, with each diagonal entry of the metric matrix restricted to zero or one. • It is applicable to large data sets. • However, many publications on it already exist.

  12. Thank you!
