
International Science Index


Genetic Algorithm Based Deep Learning Parameters Tuning for Robot Object Recognition and Grasping
Abstract:
This paper addresses the problem of tuning deep learning (DL) parameters with a genetic algorithm (GA) in order to improve the performance of the DL method. We present a GA-based DL method for robot object recognition and grasping. The GA optimizes the DL parameters during the learning procedure, guided by a fitness function, until a sufficiently good solution is found. When the evolution process terminates, we obtain the optimal values of the DL parameters. To evaluate the performance of our method, we consider object recognition and robot grasping tasks. Experimental results show that our method is efficient for both.
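A GA loop of this kind (a population of candidate parameter sets evolved by selection, crossover, and mutation, scored by a fitness function) can be sketched as below. This is a minimal illustration, not the paper's implementation: the search space, the toy fitness surrogate, and all names here are assumptions; in the paper, fitness would be the performance of the deep network trained with the candidate parameters.

```python
import random

# Hypothetical search space standing in for the DL parameters being tuned.
SPACE = {
    "hidden_units": list(range(50, 501, 50)),
    "learning_rate": [0.001, 0.005, 0.01, 0.05, 0.1],
}

def random_individual():
    """Sample one candidate parameter set uniformly from the space."""
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(ind):
    # Toy surrogate for validation performance of the trained network:
    # peaks at 300 hidden units and a learning rate of 0.01.
    return (-abs(ind["hidden_units"] - 300) / 300
            - abs(ind["learning_rate"] - 0.01))

def crossover(a, b):
    """Uniform crossover: each parameter taken from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    """Resample each parameter with probability `rate`."""
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(pop_size=20, generations=30, seed=0):
    random.seed(seed)
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]  # truncation selection with elitism
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best)
```

Because the elite individuals are carried over unchanged, the best fitness in the population never decreases across generations; the loop terminates after a fixed generation budget rather than a fitness threshold, which is a common simplification.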
Digital Article Identifier (DAI):
