Capturing an Unknown Moving Target in Unknown Territory using Vision and Coordination
Abstract:
In this paper we present an extension of Vision Based LRTA* (VLRTA*), called Vision Based Moving Target Search (VMTS), for capturing an unknown moving target in unknown territory with randomly generated obstacles. The target's position is unknown to the agents, and they cannot predict it using any probabilistic method. Agents have omnidirectional vision but can look in only one direction at a time; their view is blocked by obstacles in the search space, so an agent cannot see through obstacles. The proposed algorithm is evaluated on a large number of scenarios, on grids ranging in size from 10x10 to 100x100, with obstacles randomly placed to occupy 0% to 50% of the search space in increments of 10%. Experiments used 2 to 9 agents for each randomly generated maze at a given obstacle ratio. The observed results suggest that VMTS is effective in terms of target-location time, solution quality, and virtual-target handling. In addition, VMTS becomes more efficient when the number of agents is increased in proportion to the obstacle ratio.
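Since VMTS extends the LRTA* family of real-time search algorithms, the grid setup described above can be illustrated with a minimal sketch of a single LRTA* move step on an obstacle grid. This is not the authors' VMTS implementation (which adds vision constraints and multi-agent coordination); the function names and the Manhattan-distance default heuristic are assumptions for illustration.

```python
def neighbors(pos, grid):
    """4-connected neighbours inside the grid that are not obstacles (grid cell == 0)."""
    x, y = pos
    n = len(grid)
    out = []
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < n and 0 <= ny < n and grid[nx][ny] == 0:
            out.append((nx, ny))
    return out

def manhattan(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def lrta_step(pos, target, h, grid):
    """One LRTA* move: raise h(pos) to the best neighbour's f-value, then move there.

    h is the agent's learned heuristic table; cells not yet visited
    default to the Manhattan distance to the (currently believed) target.
    """
    def f(cell):
        # unit move cost plus estimated remaining cost
        return 1 + h.get(cell, manhattan(cell, target))

    nbrs = neighbors(pos, grid)
    if not nbrs:
        return pos  # boxed in by obstacles; stay put
    best = min(nbrs, key=f)
    # learning update: h never decreases, so repeated visits escape local minima
    h[pos] = max(h.get(pos, manhattan(pos, target)), f(best))
    return best
```

In the multi-agent, limited-vision setting of the paper, each agent would call a step like this against its own believed target position (e.g. the last seen location, or a virtual target), sharing learned information through coordination.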