Citation: ZHAO San-yuan, SHEN Ting-zhi, SUN Chen-sheng, LIU Peng-zhang, YUE Lei. Feature subset selection method for AdaBoost training[J]. Journal of Beijing Institute of Technology, 2011, 20(3): 399-402.
[1] Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding[J]. Science, 2000, 290: 2323-2326.
[2] Geladi P, Kowalski B. Partial least-squares regression: a tutorial[J]. Analytica Chimica Acta, 1986, 185: 1-17.
[3] Rosipal R, Krämer N. Overview and recent advances in partial least squares[C]//Saunders C. Subspace, Latent Structure and Feature Selection. Bohinj, Slovenia: Springer, 2006: 34-51.
[4] Ng A Y. Feature selection, L1 vs. L2 regularization, and rotational invariance[C]//Brodley C E. Proceedings of the Twenty-First International Conference on Machine Learning. Banff, Alberta, Canada: ACM, 2004: 78.
[5] Guyon I, Weston J, Barnhill S, et al. Gene selection for cancer classification using support vector machines[J]. Machine Learning, 2002, 46(3): 389-422.
[6] Bar-Hillel A, Levi D, Krupka E, et al. Part-based feature synthesis for human detection[C]//Daniilidis K. Proceedings of the European Conference on Computer Vision. Heraklion, Crete, Greece: Springer, 2010: 127-142.
[7] Guyon I, Elisseeff A. An introduction to variable and feature selection[J]. Journal of Machine Learning Research, 2003, 3: 1157-1182.
[8] Xu Lu, Jiang Jianhui, Lin Weiqi, et al. Optimized sample-weighted partial least squares[J]. Talanta, 2007, 71: 561-566.
[9] Friedman J, Hastie T, Tibshirani R. Additive logistic regression: a statistical view of boosting[J]. The Annals of Statistics, 2000, 28: 337-407.
[10] Schölkopf B, Smola A J. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond[M]. Cambridge: MIT Press, 2002: 305-315.