
Article Details

Sparse Multinomial Kernel Logistic Regression

Shim, Joo-Yong (Department of Applied Statistics, Catholic University of Daegu); Bae, Jong-Sig (Department of Mathematics, Sungkyunkwan University); Hwang, Chang-Ha (Division of Information and Computer Science, Dankook University)
  • Abstract

    Multinomial logistic regression is a well-known multiclass classification method in the field of statistical learning. More recently, sparse multinomial logistic regression models have found application in microarray classification, where explicit identification of the most informative observations is of value. In this paper, we propose a sparse multinomial kernel logistic regression model in which the sparsity arises from the use of a Laplacian prior, and a fast exact algorithm is derived by employing a bound optimization approach. Experimental results are then presented to illustrate the performance of the proposed procedure.
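
    To make the record self-contained for a quick read, the model and objective described above can be written out as follows. This is a minimal sketch in standard notation, not the paper's own derivation; the symbols $f_c$, $\alpha_{ic}$, $b_c$, $\lambda$ and the kernel $K$ are notational assumptions of this sketch.

        \[
          f_c(x) \;=\; \sum_{i=1}^{n} \alpha_{ic}\, K(x, x_i) + b_c ,
          \qquad
          P(y = c \mid x) \;=\; \frac{\exp f_c(x)}{\sum_{k=1}^{C} \exp f_k(x)} ,
          \qquad c = 1, \dots, C,
        \]
        \[
          \hat{\alpha} \;=\; \arg\min_{\alpha}\;
          -\sum_{i=1}^{n} \log P(y_i \mid x_i ; \alpha)
          \;+\; \lambda \sum_{i=1}^{n} \sum_{c=1}^{C} \lvert \alpha_{ic} \rvert ,
          \qquad \lambda > 0 .
        \]

    The $\ell_1$ penalty is the negative log of a Laplacian prior $p(\alpha) \propto \exp(-\lambda \|\alpha\|_1)$, and it drives many coefficients $\alpha_{ic}$ exactly to zero, which is the source of the sparsity. In a bound-optimization (MM) scheme with a Böhning-type bound, the Hessian of the negative log-likelihood is upper-bounded by a constant matrix, so each iteration minimizes a quadratic surrogate plus the $\ell_1$ term rather than the original objective.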


  • Keywords

    Bound optimization · Laplacian regularization · multinomial logistic regression · sparsity · support vector machine
