Neural Network Learning Based on Chaos
Abstract:
Chaos and fractals are relatively young fields of physics and mathematics that offer a new way of viewing the universe and have inspired approaches to many open problems. In this paper, a novel algorithm based on a chaotic sequence generator is proposed, designed to adapt its search and reach the global optimum. The algorithm adapts in two stages: the first is a breadth-first (coarse, global) search and the second is a depth-first (fine, local) search. The proposed algorithm is evaluated on two benchmark functions, the Camel function and the Schaffer function, and is then applied to optimizing the training of multilayer neural networks.
Digital Object Identifier (DOI):
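The abstract outlines a two-stage chaos optimization scheme (a coarse, breadth-first pass over the whole domain followed by a fine, depth-first refinement around the best point found) but does not give the update rules. The sketch below illustrates one plausible reading in Python; the logistic-map chaotic generator, the fixed shrink rate for the local neighbourhood, and the Schaffer F6 formulation are assumptions made here for illustration and are not taken from the paper itself.

```python
import numpy as np

def schaffer(x, y):
    """Schaffer F6 benchmark; global minimum of 0 at (0, 0). Assumed form, not from the paper."""
    num = np.sin(np.sqrt(x**2 + y**2))**2 - 0.5
    den = (1 + 0.001 * (x**2 + y**2))**2
    return 0.5 + num / den

def logistic_sequence(n, dim, mu=4.0, seed=0.345):
    """Generate a chaotic sequence in (0, 1) per dimension using the logistic map."""
    seq = np.empty((n, dim))
    x = np.full(dim, seed) + 1e-3 * np.arange(dim)  # distinct initial value per dimension
    for i in range(n):
        x = mu * x * (1.0 - x)
        seq[i] = x
    return seq

def chaos_optimize(f, lo, hi, n_global=2000, n_local=2000, shrink=0.9):
    """Two-stage chaos search: global (breadth-first) sweep, then local (depth-first) refinement."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    dim = lo.size

    # Stage 1: map chaotic samples over the whole domain and keep the best point.
    best_x, best_f = None, np.inf
    for c in logistic_sequence(n_global, dim):
        x = lo + c * (hi - lo)
        fx = f(*x)
        if fx < best_f:
            best_x, best_f = x, fx

    # Stage 2: chaotic perturbations inside a shrinking neighbourhood of the best point.
    radius = 0.1 * (hi - lo)
    for c in logistic_sequence(n_local, dim):
        x = np.clip(best_x + (2.0 * c - 1.0) * radius, lo, hi)
        fx = f(*x)
        if fx < best_f:
            best_x, best_f = x, fx
        radius *= shrink
    return best_x, best_f

if __name__ == "__main__":
    x_best, f_best = chaos_optimize(schaffer, lo=[-10, -10], hi=[10, 10])
    print(f"best point {x_best}, value {f_best:.6f}")
```

To use such an optimizer for network training, as the abstract describes, f would be the training error evaluated at a flattened weight vector of the multilayer network, with lo and hi bounding the weights.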

References:

[1] M. Nakagawa, Chaos and Fractals in Engineering, World Scientific, Singapore, 1999.
[2] L. Chen and K. Aihara, "Global searching ability of chaos neural networks," IEEE Trans. Circuits Syst., vol. 46, no. 8, 1999.
[3] I. Tokuda, K. Aihara, and T. Nagashima, "Adaptive annealing for chaotic optimization," Phys. Rev. E, vol. 58, no. 4, 1998.
[4] M. S. Tavazoei and M. Haeri, "An optimization algorithm based on chaotic behavior and fractal nature," J. Comput. Appl. Math., 2006.
[5] D. Yang, G. Li, and G. Cheng, "On the efficiency of chaos optimization algorithms for global optimization," Chaos, Solitons & Fractals, 2006.
[6] Z. Guo, S. Wang, and J. Zhuang, "A novel immune evolutionary algorithm incorporating chaos optimization," Pattern Recognition Lett., vol. 27, no. 1, 2006, pp. 2-8.
[7] S. Liu and Z. Hou, "Weighted gradient direction based chaos optimization algorithm for nonlinear programming problem," Proc. Intelligent Control and Automation, vol. 3, 2002, pp. 1779-1783.
[8] Y. You, W. Sheng, and S. Wang, "Study of chaos genetic algorithms and its application in neural networks," Proc. TENCON (Computers, Communication, Control and Power Eng.), vol. 1, 2002, pp. 232-235.
[9] R. Qi, F. Qian, S. Li, and Z. Wang, "Chaos-Genetic Algorithm for Multiobjective Optimization," Proc. Intelligent Control and Automation (WCICA), vol. 1, 2006, pp. 1563-1566.
[10] H. Lu, H. Zhang, and L. Ma, "A new optimization algorithm based on chaos," J. Zhejiang Univ. Sci. A, vol. 7, no. 4, 2006, pp. 539-542.
[11] S. Liu, Z. Hou, and M. Wang, "A hybrid algorithm for optimal power flow using the chaos optimization and the linear interior point algorithm," Proc. PowerCon, vol. 2, 2002, pp. 793-797.
[12] M. J. Ji and H. W. Tang, "Application of chaos in simulated annealing," Chaos, Solitons & Fractals, vol. 21, 2004, pp. 933-941.
[13] B. Liu, L. Wang, H. Y. Yin, F. Tang, and D. X. Huang, "Improved particle swarm optimization combined with chaos," Chaos, Solitons & Fractals, vol. 25, 2005, pp. 1261-1271.
[14] L. Behera, S. Kumar, and A. Patnaik, "On Adaptive Learning Rate That Guarantees Convergence in Feedforward Networks," IEEE Trans. Neural Networks, vol. 17, no. 5, 2006, pp. 1116-1125.
[15] X. Yu, M. Onder Efe, and O. Kaynak, "A backpropagation learning framework for feedforward neural networks," Proc. IEEE Int. Symposium on Circuits and Systems (ISCAS), vol. 3, 2001, pp. 700-702.
[16] T. L. Fine, Feedforward Neural Network Methodology, Springer, New York, 1999, pp. 130-131.
[17] K. Bertels, L. Neuberg, S. Vassiliadis, and D. G. Pechanek, "On Chaos and Neural Networks: The Backpropagation Paradigm," Artificial Intelligence Rev., vol. 15, 2001, pp. 165-187.