International Science Index

Particle Swarm Optimization with Reduction for Global Optimization Problems
Abstract:
This paper presents a particle swarm optimization algorithm with particle reduction for global optimization problems. Particle swarm optimization (PSO) is a multi-point search algorithm inspired by the collective motion of flocks of birds or schools of fish, in which multiple particles cooperate to find the best solution. PSO is flexible enough to adapt to a wide range of optimization problems. However, when an objective function has many local minima, particles may become trapped in one of them. To avoid this, the proposed algorithm initially prepares a large number of particles and updates their positions by PSO. Particles are then sequentially removed according to their evaluation values until a predetermined number remains, and PSO continues until the termination condition is met. To show the effectiveness of the proposed algorithm, we search for the minima of test functions and compare the results with those of existing algorithms. Furthermore, the influence of the initial number of particles on the best value obtained by our algorithm is discussed.
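The reduction idea described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: the update rule is standard PSO with inertia weight, and all parameter values (inertia w, acceleration coefficients c1 and c2, the reduction schedule, and the function names themselves) are assumptions chosen for the example. A large initial swarm is evolved, and the worst-evaluated particles are periodically discarded until the predetermined swarm size is reached.

```python
import numpy as np

def pso_with_reduction(f, dim, n_init=60, n_final=20, reduce_every=10,
                       iters=200, w=0.7, c1=1.5, c2=1.5,
                       bounds=(-5.0, 5.0), seed=0):
    """Minimize f over a box with PSO, periodically discarding the
    worst-evaluated particles until n_final remain (hypothetical sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_init, dim))   # particle positions
    v = np.zeros((n_init, dim))              # particle velocities
    pbest = x.copy()                         # personal best positions
    pval = np.apply_along_axis(f, 1, x)      # personal best values
    g = pbest[np.argmin(pval)]               # global best position

    for t in range(iters):
        # Standard PSO velocity/position update with inertia weight.
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pval
        pbest[improved], pval[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pval)]
        # Reduction step: drop a few of the worst particles (by evaluation
        # value) every reduce_every iterations, down to n_final particles.
        if t % reduce_every == 0 and len(x) > n_final:
            keep = np.argsort(pval)[:max(n_final, len(x) - 5)]
            x, v, pbest, pval = x[keep], v[keep], pbest[keep], pval[keep]
    return g, float(f(g))
```

On a simple multimodal or unimodal test function such as the sphere function, `pso_with_reduction(lambda z: float(np.sum(z**2)), dim=2)` starts with 60 particles and finishes with 20, illustrating how the reduction concentrates the search budget on the better regions found early on.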