
International Science Index


Modeling Language for Constructing Solvers in Machine Learning: Reductionist Perspectives
Abstract:
For a given problem, designing an efficient algorithm has long been the central object of study. An alternative approach, orthogonal to this one, is reduction: for a given problem, the reduction approach studies how to convert the original problem into subproblems. This paper proposes a formal modeling language that supports this reduction approach so that a solver can be constructed quickly. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a new problem. Note that our formal modeling language is not intended to provide an efficient notation for data mining applications, but to assist a designer who develops solvers in machine learning.
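To make the reductionist idea concrete, a minimal sketch (not the paper's modeling language, and all names below are illustrative assumptions) is the classic one-vs-rest reduction: a multiclass learning problem is converted into several binary subproblems, each subproblem is handed to an arbitrary binary learner, and the resulting subproblem solvers are combined into a solver for the original problem.

```python
# Sketch of a reduction: multiclass classification reduced to binary
# subproblems via one-vs-rest. The binary learner is pluggable, which
# is the point of the reduction: the multiclass solver is assembled
# from solvers of simpler subproblems.

def one_vs_rest_reduce(X, y, classes, train_binary):
    """Reduce a multiclass dataset to one binary subproblem per class
    and train a binary scorer on each."""
    scorers = {}
    for c in classes:
        labels = [1 if yi == c else 0 for yi in y]  # binary subproblem
        scorers[c] = train_binary(X, labels)
    return scorers

def one_vs_rest_predict(scorers, x):
    """Combine subproblem solvers: predict the class whose binary
    scorer assigns the highest score to x."""
    return max(scorers, key=lambda c: scorers[c](x))

# Toy binary learner used only for illustration: score a point by its
# negative squared distance to the centroid of the positive examples.
def centroid_learner(X, labels):
    pos = [xi for xi, li in zip(X, labels) if li == 1]
    cx = sum(p[0] for p in pos) / len(pos)
    cy = sum(p[1] for p in pos) / len(pos)
    return lambda x: -((x[0] - cx) ** 2 + (x[1] - cy) ** 2)

X = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (11, 0)]
y = ["a", "a", "b", "b", "c", "c"]
solver = one_vs_rest_reduce(X, y, ["a", "b", "c"], centroid_learner)
print(one_vs_rest_predict(solver, (5, 5.5)))  # "b"
```

Swapping `centroid_learner` for any other binary learner leaves the reduction unchanged, which is the separation of concerns a modeling language for reductions aims to capture.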
