Stray Example Sheltering by Loss Regularized SVM and kNN Preprocessor
Academic Year 97
Semester 2
Publication Date 2009-02-01
Title Stray Example Sheltering by Loss Regularized SVM and kNN Preprocessor
Title (Other Language)
Authors Yang, Chan-yun; Hsu, Che-chang; Yang, Jr-syu
Affiliation Department of Mechanical and Electro-Mechanical Engineering, Tamkang University
Publisher New York: Springer New York LLC
Journal Title, Volume/Issue, Pages Neural Processing Letters 29(1), pp. 7-27
Abstract This paper presents a new model that merges a non-parametric k-nearest-neighbor (kNN) preprocessor into an underlying support vector machine (SVM) to provide shelters for meaningful training examples, especially stray examples scattered among counterpart examples with different class labels. Motivated by the idea of adding heavier penalties to stray examples to obtain a stricter loss function for optimization, the model acts to shelter those examples. The model consists of a filtering kNN emphasizer stage and a classical classification stage. First, the filtering kNN emphasizer stage collects information from the training examples and produces weights for the stray examples. Then, an underlying SVM with parameterized real-valued class labels carries those weights, which represent the varying levels of emphasis placed on the examples, into the classification. The emphasis weights, imposed as heavier penalties, modify the regularization in the SVM's quadratic programming and raise the training accuracy of the resultant decision function. The novel idea of real-valued class labels for conveying the emphasis weights provides an effective way to guide the classification solution with this additional information. The adoption of the kNN preprocessor as a filtering stage is effective because it is independent of the SVM in the classification stage. Owing to its local density estimation, the kNN method can distinguish stray examples from regular examples by considering only their local neighborhoods in the input space. In this paper, detailed experimental results and a simulated application are given to examine the corresponding properties. The results show that the model is promising with respect to its original expectations.
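The abstract describes a two-stage procedure: a kNN emphasizer that detects stray training examples, followed by an SVM whose penalties are reweighted accordingly. Below is a minimal sketch of that idea, assuming scikit-learn as the toolkit. Note that the paper conveys the kNN-derived emphasis through real-valued class labels inside the SVM's quadratic program, whereas this sketch approximates the same effect with per-example penalty weights (sample_weight); the helper name stray_weights and the weighting rule (1 plus a multiple of the fraction of opposite-class neighbors) are illustrative assumptions, not the authors' exact formulation.

import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def stray_weights(X, y, k=5, emphasis=2.0):
    """Stage 1: filtering kNN emphasizer (illustrative approximation).

    For each training example, inspect its k nearest neighbors (excluding
    itself) and measure the fraction that carry a different class label.
    Examples surrounded by the opposite class ("stray" examples) receive a
    heavier weight, so their misclassification is penalized more strictly.
    """
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)              # idx[:, 0] is the point itself
    neighbor_labels = y[idx[:, 1:]]        # labels of the k true neighbors
    disagree = (neighbor_labels != y[:, None]).mean(axis=1)
    return 1.0 + emphasis * disagree       # regular examples keep weight 1

# Stage 2: underlying SVM carrying the emphasized weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)

w = stray_weights(X, y, k=5)
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y, sample_weight=w)             # heavier penalty for stray points
print("training accuracy:", clf.score(X, y))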
Keywords k-nearest-neighbor preprocessor; Stray training examples; Support vector machines; Classification; Pattern recognition
Language en
ISSN 1370-4621 (print); 1573-773X (electronic)
Journal Type
Indexed In
Industry-Academia Collaboration
Corresponding Author
Peer Review
Country USA
Open Call for Papers
Publication Format Print; Electronic
Related Links

Institutional Repository Link ( http://tkuir.lib.tku.edu.tw:8080/dspace/handle/987654321/50525 )