Multinomial Kernel Logistic Regression via Bound Optimization Approach
Multinomial logistic regression is probably the most popular probabilistic discriminative classifier for multiclass classification problems. In this paper, a kernel variant of multinomial logistic regression is proposed by combining Newton's method with a bound optimization approach. This formulation admits highly efficient approximation methods that effectively overcome the conceptual and numerical difficulties of standard multiclass kernel classifiers. We also provide an approximate cross validation (ACV) method for choosing the hyperparameters that govern the performance of the proposed approach. Experimental results are presented to demonstrate the performance of the proposed procedure.
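The bound optimization idea referenced above can be sketched as follows. This is a minimal illustrative implementation of Böhning's (1992) fixed-bound Newton step for the linear multinomial case, not the paper's exact kernel algorithm; the function names and the ridge term are our own assumptions. The negative log-likelihood Hessian is upper-bounded by the constant matrix B = ½(I − 11ᵀ/K) ⊗ XᵀX, so every iteration reuses one precomputed inverse. A kernel variant follows the same pattern with the Gram matrix in place of X.

```python
import numpy as np

def softmax(Z):
    """Row-wise softmax with the usual max-shift for numerical stability."""
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def fit_multinomial_bound(X, y, n_classes, n_iter=200, ridge=1e-3):
    """Multinomial logistic regression via Bohning's bound optimization.

    The Hessian of the negative log-likelihood is dominated by the fixed
    matrix B = 0.5 * (I - 11^T/K) kron X^T X, so each iteration is a
    Newton-like step W <- W - B^{-1} g with a constant, precomputed B.
    A small ridge term keeps B invertible (illustrative choice).
    """
    n, d = X.shape
    K = n_classes
    Y = np.eye(K)[y]                          # one-hot labels, shape (n, K)
    W = np.zeros((d, K))                      # one weight column per class
    A = 0.5 * (np.eye(K) - np.ones((K, K)) / K)
    B = np.kron(A, X.T @ X) + ridge * np.eye(d * K)
    B_inv = np.linalg.inv(B)                  # computed once, reused each step
    for _ in range(n_iter):
        P = softmax(X @ W)                    # class probabilities, (n, K)
        g = X.T @ (P - Y)                     # gradient, (d, K)
        # vec() stacks class columns (Fortran order) to match kron(A, X^T X).
        step = B_inv @ g.reshape(-1, order="F")
        W = W - step.reshape(d, K, order="F")
    return W
```

Because B dominates the true Hessian everywhere, each step is guaranteed to decrease the objective, which is what makes the fixed, factorizable curvature matrix attractive compared with recomputing a full Newton Hessian at every iteration.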
- Blake, C. L. and Merz, C. J. (1998). UCI Repository of machine learning databases. University of California, Department of Information and Computer Science. Available from: http://www.ics.uci.edu/~mlearn/MLRepository.html
- Bohning, D. (1992). Multinomial logistic regression algorithm. Annals of the Institute of Statistical Mathematics, 44, 197-200
- Craven, P. and Wahba, G. (1979). Smoothing noisy data with spline functions: estimating the correct degree of smoothing by the method of generalized cross-validation. Numerische Mathematik, 31, 377-403
- Krishnapuram, B., Carin, L., Figueiredo, M. A. T. and Hartemink, A. J. (2005). Sparse multinomial logistic regression: fast algorithms and generalization bounds. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27, 957-968
- Mercer, J. (1909). Functions of positive and negative type and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London, 209, 415-446
- Minka, T. (2003). A comparison of numerical optimizers for logistic regression. Technical Report, Department of Statistics, Carnegie Mellon University
- Suykens, J. A. K. and Vandewalle, J. (1999). Multiclass least squares support vector machines. Proceedings of the International Joint Conference on Neural Networks, 900-903
- Vapnik, V. N. (1995). The Nature of Statistical Learning Theory. Springer-Verlag, New York
- Vapnik, V. N. (1998). Statistical Learning Theory. Springer-Verlag, New York
- Wahba, G., Lin, Y. and Zhang, H. (1999). Generalized approximate cross validation for support vector machine, or, another way to look at margin-like quantities. Technical Report No. 1006, University of Wisconsin
- Weston, J. and Watkins, C. (1998). Multi-class SVM. Technical Report 98-04, Royal Holloway University of London
- Rifkin, R. and Klautau, A. (2004). In defense of one-vs-all classification. Journal of Machine Learning Research, 5, 101-141
- Kimeldorf, G. S. and Wahba, G. (1971). Some results on Tchebycheffian spline functions. Journal of Mathematical Analysis and Applications, 33, 82-95
Papers citing this article (2)
- 2008. "" Communications of the Korean Statistical Society, 15(6): 1003-1011
- 2008. "" Communications of the Korean Statistical Society, 15(3): 441-450