Open Science Index

Commenced in January 2007 Frequency: Monthly Edition: International Publications Count: 29530
A Kernel Classifier using Linearised Bregman Iteration
Abstract:
In this paper we introduce a novel kernel classifier based on an iterative shrinkage algorithm developed for compressive sensing. We adopt Bregman iteration with soft and hard shrinkage functions and a generalized hinge loss to solve the l1-norm minimization problem for classification. In experiments on face recognition and digit classification, with SVM as the benchmark, our method achieves error rates close to those of SVM but does not outperform it. We also find that soft shrinkage yields higher accuracy, and in some cases greater sparsity, than hard shrinkage.
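The paper's exact classifier is not reproduced here; as a minimal sketch of the underlying technique, the following implements linearized Bregman iteration for l1-norm minimization with interchangeable soft and hard shrinkage functions, as named in the abstract. All function names, parameter values, and the test problem are illustrative assumptions, not the authors' code.

```python
import numpy as np

def soft_shrink(v, mu):
    # soft thresholding: sign(v) * max(|v| - mu, 0)
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

def hard_shrink(v, mu):
    # hard thresholding: keep entries with |v| > mu, zero the rest
    return v * (np.abs(v) > mu)

def linearized_bregman(A, b, mu=2.0, delta=None, iters=3000, shrink=soft_shrink):
    """Sketch of linearized Bregman iteration for min ||x||_1 s.t. Ax = b.

    v accumulates residual gradients; x is obtained by shrinking v.
    """
    if delta is None:
        # step size bounded by 1 / ||A||_2^2 for stability (illustrative choice)
        delta = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    v = np.zeros(A.shape[1])
    for _ in range(iters):
        v += A.T @ (b - A @ x)    # gradient step on the residual
        x = delta * shrink(v, mu) # shrinkage step enforcing sparsity
    return x

# Illustrative sparse-recovery demo (synthetic data, not from the paper)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
x_true = np.zeros(100)
x_true[[3, 20, 40, 60, 80]] = [1.0, -2.0, 1.5, -1.0, 2.0]
b = A @ x_true
x_hat = linearized_bregman(A, b)
```

Swapping `shrink=hard_shrink` reproduces the hard-shrinkage variant the abstract compares against; in the classification setting, the least-squares residual above would be replaced by the (generalized hinge) loss gradient.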
References:

[1] Donoho D. L., "Compressed sensing", IEEE Trans. Inform. Theory, Vol. 52, pp. 1289-1306, 2006.
[2] Daubechies I., Defrise M., De Mol C., "An iterative thresholding algorithm for linear inverse problems with a sparsity constraint", Comm. Pure Appl. Math., Vol. 57, pp. 1413-1457, 2004.
[3] Beck A., Teboulle M., "A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems", SIAM J. Imaging Sciences, forthcoming.
[4] Yin W., Osher S., Goldfarb D., Darbon J., "Bregman Iterative Algorithms for l1-Minimization with Applications to Compressed Sensing", SIAM J. Imaging Sciences, Vol. 1, No. 1, pp. 143-168, 2008.
[5] Tibshirani R., "Regression shrinkage and selection via the lasso", J. R. Stat. Soc. Ser. B, Vol. 58, pp. 267-288, 1996.
[6] Zhu J., Rosset S., Hastie T., Tibshirani R., "1-norm Support Vector Machines", Advances in Neural Information Processing Systems 16, 2003.
[7] Raina R., Battle A., Lee H., Packer B., Ng A. Y., "Self-taught learning: Transfer learning from unlabeled data", ICML 2007.
[8] Lee H., Battle A., Raina R., Ng A. Y., "Efficient sparse coding algorithms", Advances in Neural Information Processing Systems 19 (NIPS-06), pp. 801-808, 2007.
[9] Langford J., Li L., Zhang T., "Sparse Online Learning via Truncated Gradient", Journal of Machine Learning Research, Vol. 10, pp. 777-801, 2009.
[10] Koh K., Kim S., Boyd S., "An Interior-Point Method for Large-Scale l1-Regularized Logistic Regression", Journal of Machine Learning Research, Vol. 8, pp. 1519-1555, 2007.
[11] Hale E., Yin W., Zhang Y., "A Fixed-point continuation method for l1-regularization with application to compressed sensing", CAAM Technical Report TR07-07, Rice University, Houston, TX, 2007.
[12] Vapnik V. N., Statistical Learning Theory, Wiley-Interscience, 1998.
[13] Rennie J. D. M., "Smooth Hinge Classification", Technical Report, MIT, 2005.
[14] Bredies K., Lorenz D. A., "Iterated hard shrinkage for minimization problems with sparsity constraints", SIAM Journal on Scientific Computing, Vol. 30, No. 2, pp. 657-683, 2008.
[15] Graham D. B., Allinson N. M., "Characterizing Virtual Eigensignatures for General Purpose Face Recognition", in Face Recognition: From Theory to Applications, NATO ASI Series F, Computer and Systems Sciences, Vol. 163, H. Wechsler, P. J. Phillips, V. Bruce, F. Fogelman-Soulie and T. S. Huang (eds.), pp. 446-456, 1998.
[16] Campbell C., Cristianini N., "Simple Learning Algorithms for Training Support Vector Machines", Technical Report, University of Bristol, 1998.
[17] LeCun Y., Bottou L., Bengio Y., Haffner P., "Gradient-based learning applied to document recognition", Proceedings of the IEEE, Vol. 86, No. 11, pp. 2278-2324, November 1998.