1 A Software Architecture and Tools for Autonomous Robots that Learn on Mission
K. Kawamura, M. Wilkes, A. Peters, D. Gaines*
Vanderbilt University Center for Intelligent Systems
* Jet Propulsion Laboratory
http://shogun.vuse.vanderbilt.edu/cis/DARPA/
February 2002 MARS PI Meeting

2 Objective
Develop a multi-agent based robot control architecture for humanoid and mobile robots that can:
- accept high-level commands from a human
- learn from experience to modify existing behaviors, and
- share knowledge with other robots

3 Accomplishments
1. Multi-agent based robot control architectures for humanoid and mobile robots have been developed
2. Agent-based human-robot interfaces have been developed for humanoid and mobile robots
3. The SES (Sensory EgoSphere), a short-term robot memory, was developed and transferred to the NASA/JSC Robonaut group
4. An SES- and LES (Landmark EgoSphere)-based navigation algorithm was developed and tested
5. SES knowledge sharing among mobile robots was developed and tested
6. The SAN-RL (Spreading Activation Network - Reinforcement Learning) method was applied to mobile robots for dynamic path planning

4 Presentation / Demo
1. Multi-agent based robot control architecture
   - Humanoid
   - Mobile robots
2. Agent-based human-robot interfaces
   - Humanoid (face-to-face)
   - Mobile robots (GUI)
3. Sensory EgoSphere (SES)
   - Humanoid
   - Mobile robots
4. SES- and LES-based navigation
5. SES and LES knowledge sharing
6. Dynamic path planning through SAN-RL

5 Multi-Agent Based Robot Control Architecture for Humanoids
Novel approach: a distributed architecture that expressly represents the human and the humanoid internally.
[Architecture diagram: Self Agent and Human Agent connected to atomic agents, the Sensory EgoSphere (SES Manager), the Database Associative Memory (DBAM Manager), and a Human Database]
Publication [1,2]

6 Multi-Agent Based Robot Control Architecture for Mobile Robots
Novel approach: a distributed, agent-based architecture to gather mission-relevant information from robots.
[Architecture diagram: Self Agent connected to the SES and LES (Egosphere Manager), the Database Associative Memory (DBAM Manager), atomic agents, a Commander Interface Agent, Path Planning, and Peer Agents]
Publication [7]
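The slide's architecture routes percepts from atomic agents through a shared short-term memory that other agents query. A minimal sketch of that posting/query pattern, with all class and method names invented for illustration (this is not the actual Vanderbilt IMA API):

```python
class SESManager:
    """Toy stand-in for the SES Manager agent: atomic agents post
    percepts into a shared short-term memory that other agents
    (e.g. a Commander Interface Agent) can later query."""

    def __init__(self):
        self._entries = []

    def post(self, agent_name, percept):
        # An atomic agent registers a percept it has produced.
        self._entries.append((agent_name, percept))

    def query(self, agent_name=None):
        # Return all percepts, or only those from one atomic agent.
        return [p for a, p in self._entries
                if agent_name is None or a == agent_name]
```

A Commander Interface Agent could then call `query()` to assemble mission-relevant information without talking to each sensor agent directly.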

7 Agent-based Human-Robot Interfaces for Humanoids
Novel approach: modeling the human's and the humanoid's intent for interaction.
Human Agent (HA)
- observes and monitors the communications and actions of people
- extracts a person's intention for interaction
- communicates with people
Self Agent (SA)
- monitors the humanoid's activity and performance for self-awareness and reporting to the human
- determines the humanoid's intention and response and reports to the human
Publication [3,4,5]

8 Agent-based Human-Robot Interface for Mobile Robots
Novel approach: an interface that adapts to the current context of the mission, in addition to user preferences, by using User Interface Components (UIC) and an agent-based architecture.
[Screenshots: Camera UIC, Sonar UIC]
Publication [7]

9 Sensory EgoSphere (SES) for Humanoids
- Objects in ISAC's immediate environment are detected
- Objects are registered onto the SES at the interface nodes closest to the objects' perceived locations
- Information about a sensory object is stored in a database with the node location and other indices
Publication [2]
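The registration step above can be sketched in a few lines: nodes live on a tessellated sphere, and an object is filed under the node whose direction is closest to the object's perceived direction. A minimal sketch using a bare icosahedron as a (much coarser) stand-in for the geodesic tessellation; all function names are illustrative:

```python
import math

def normalize(v):
    # Project a 3-D vector onto the unit sphere.
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def icosahedron_nodes():
    # The 12 icosahedron vertices (0, ±1, ±phi) and cyclic permutations,
    # normalized: a very coarse stand-in for the tessellated egosphere.
    phi = (1 + math.sqrt(5)) / 2
    verts = []
    for a in (-1.0, 1.0):
        for b in (-phi, phi):
            verts += [(0.0, a, b), (a, b, 0.0), (b, 0.0, a)]
    return [normalize(v) for v in verts]

def nearest_node(nodes, direction):
    # The closest node maximizes the dot product with the perceived direction.
    d = normalize(direction)
    return max(nodes, key=lambda n: sum(x * y for x, y in zip(n, d)))

def register(ses, nodes, direction, info):
    # File the object's info under the nearest tessellation node.
    ses.setdefault(nearest_node(nodes, direction), []).append(info)
```

A database row would then index the stored info by node location, as the slide describes.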

10 Sensory EgoSphere Display for Humanoids
Provides a tool for a person to visualize what ISAC has detected

11 Sensory EgoSphere (SES) for Mobile Robots
- The SES can be used to enhance a graphical user interface and to increase situational awareness
- In a GUI, the SES translates mobile robot sensory data from the sensing level to the perception level in a compact form
- The SES is also used for perception-based navigation with a Landmark EgoSphere
- The SES can also be used for supervisory control of mobile robots
- Perceptual and sensory information is mapped onto a geodesically tessellated sphere
- Distance information is not explicitly represented on the SES
- A sequence of SESs will be stored in the database
[Figures: 2-D egocentric view and top view of the SES]
Publication [6]

12 SES- and LES-Based Navigation
Novel approach: range-free, perception-based navigation.
- Navigation is based on egocentric representations
- The SES represents the current perception of the robot
- The LES represents the expected state of the world
- Comparing the two provides the best estimate of the direction toward a desired region
Future work
Publication [8]
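One way to make the "comparison" step concrete: since the representation is range-free, each egosphere reduces to landmark bearings, and the robot can search for the rotation that best aligns its current SES with the expected LES. This is only an illustrative sketch, not the algorithm of Publication [8]; the dictionaries and the brute-force candidate search are assumptions:

```python
import math

def circ_diff(a, b):
    # Smallest absolute angular difference between two bearings (radians).
    d = (a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

def best_rotation(ses, les, n_candidates=72):
    """Toy range-free comparison of a current SES against an expected
    LES, both given as {landmark: bearing_radians}. Returns the
    candidate rotation that best aligns the shared landmarks."""
    shared = set(ses) & set(les)

    def mismatch(h):
        # Total bearing error if the robot's frame is rotated by h.
        return sum(circ_diff(ses[l] + h, les[l]) for l in shared)

    return min((2 * math.pi * k / n_candidates for k in range(n_candidates)),
               key=mismatch)
```

The residual per-landmark mismatch at the best rotation could then drive the choice of travel direction toward the desired region.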

13 SES and LES Knowledge Sharing
Novel approach: a team of robots that share SES and LES knowledge.
SES information:
- Skeeter creates an SES
- Skeeter finds the object
- Skeeter shares SES data with Scooter
- Scooter calculates the heading to the object
- Scooter finds the object
LES information:
- Scooter has the map of the environment
- Scooter generates via-LESs
- Scooter shares LES data with Skeeter
- Skeeter navigates to the target using perception-based navigation (PBN)
[Diagram: object found, shared SES data, via-LES #1, via-LES #2, target LES]
Future work
Publication [9]
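The "Scooter calculates the heading to the object" step can be sketched as a frame transformation. The sketch assumes the robots share a world frame and that a range estimate accompanies the shared SES entry (the SES itself stores no explicit distance); function and parameter names are illustrative:

```python
import math

def heading_to_shared_object(sharer_pose, bearing, rng, receiver_pose):
    """Convert a shared sighting (bearing + assumed range in the
    sharer's frame) to the heading the receiving robot should take.
    Poses are (x, y, theta) in a common world frame."""
    sx, sy, st = sharer_pose
    # Object position in the world frame, from the sharer's report.
    ox = sx + rng * math.cos(st + bearing)
    oy = sy + rng * math.sin(st + bearing)
    rx, ry, rt = receiver_pose
    # Heading the receiver must turn to, relative to its own orientation.
    return math.atan2(oy - ry, ox - rx) - rt
```

With this, Skeeter's sighting lets Scooter aim at an object it has never seen itself.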

14 Dynamic Path Planning through SAN-RL (Spreading Activation Network - Reinforcement Learning)
Novel approach: action selection with learning for the mobile robot.
Behavior priorities:
1. Use the shortest time
2. Avoid the enemy
3. Equal priority
Operation (on Scooter):
- A high-level command with multiple goals is given
- Initial data is obtained from the database (learning mode)
- SAN-RL activates/deactivates the robot's behaviors (atomic agents)
- After training finishes, data is sent back to the database
Publication [10]
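The core idea, spreading activation from goals to behaviors, with reinforcement learning tuning the link weights, can be sketched in miniature. This is a toy illustration only, not the SAN-RL algorithm of Publication [10]; the data structures and update rule are assumptions:

```python
def spread(active_goals, links):
    # links: {(goal, behavior): weight}. A behavior's activation is the
    # sum of weights on links coming from currently active goals.
    act = {}
    for (goal, behavior), w in links.items():
        if goal in active_goals:
            act[behavior] = act.get(behavior, 0.0) + w
    return act

def select_behavior(activations):
    # Activate the most strongly supported behavior.
    return max(activations, key=activations.get)

def reinforce(links, goal, behavior, reward, lr=0.1):
    # Toy reinforcement-learning update: reward strengthens (or
    # weakens) the goal-to-behavior link, so future selections adapt.
    links[(goal, behavior)] = links.get((goal, behavior), 0.0) + lr * reward
```

Repeated training episodes would shift the weights, and hence which behaviors SAN-RL activates for a given set of mission goals.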

15 Publications
1. K. Kawamura, R.A. Peters II, D.M. Wilkes, W.A. Alford, and T.E. Rogers, "ISAC: Foundations in Human-Humanoid Interaction", IEEE Intelligent Systems, July/August 2000.
2. K. Kawamura, A. Alford, K. Hambuchen, and M. Wilkes, "Towards a Unified Framework for Human-Humanoid Interaction", Proceedings of the First IEEE-RAS International Conference on Humanoid Robots, September 2000.
3. K. Kawamura, T.E. Rogers and X. Ao, "Development of a Human Agent for a Multi-Agent Based Human-Robot Interaction", submitted to the First International Joint Conference on Autonomous Agents and Multi-Agent Systems (AAMAS 2002), Bologna, Italy, July 15-19, 2002.
4. T. Rogers and M. Wilkes, "The Human Agent: A Work in Progress toward Human-Humanoid Interaction", Proceedings of the 2000 IEEE International Conference on Systems, Man and Cybernetics, Nashville, October 2000.
5. A. Alford, M. Wilkes, and K. Kawamura, "System Status Evaluation: Monitoring the State of Agents in a Humanoid System", Proceedings of the 2000 IEEE International Conference on Systems, Man and Cybernetics, Nashville, October 2000.
6. K. Kawamura, R.A. Peters II, C. Johnson, P. Nilas, and S. Thongchai, "Supervisory Control of Mobile Robots Using Sensory EgoSphere", IEEE International Symposium on Computational Intelligence in Robotics and Automation, Banff, Canada, July 2001.
7. K. Kawamura, D.M. Wilkes, S. Suksakulchai, A. Bijayendrayodhin, and K. Kusumalnukool, "Agent-Based Control and Communication of a Robot Convoy", Proceedings of the 5th International Conference on Mechatronics Technology, Singapore, June 2001.
8. K. Kawamura, R.A. Peters II, D.M. Wilkes, A.B. Koku, and A. Sekman, "Towards Perception-Based Navigation Using EgoSphere", Proceedings of the International Society of Optical Engineering Conference (SPIE), October 28-20, 2001.
9. K. Kawamura, D.M. Wilkes, A.B. Koku, and T. Keskinpala, "Perception-Based Navigation for Mobile Robots", accepted, Proceedings of the Multi-Robot System Workshop, Washington, DC, March 18-20, 2002.
10. D.M. Gaines, M. Wilkes, K. Kusumalnukool, S. Thongchai, K. Kawamura, and J. White, "SAN-RL: Combining Spreading Activation Networks with Reinforcement Learning to Learn Configurable Behaviors", Proceedings of the International Society of Optical Engineering Conference (SPIE), October 28-20, 2001.

16 Acknowledgements
This work has been partially sponsored under DARPA MARS Grant #DASG60-99-1-0005 and NASA/JSC - UH/RICIS Subcontract #NCC9-309-HQ.
Additionally, we would like to thank the following CIS students:
Mobile Robot Group: Bugra Koku, Carlotta Johnson, Turker Keskinpala, Anak Bijayendrayodhin, Kanok Kusumalnukool, Jian Peng
Humanoid Robotics Group: Tamara Rogers, Kim Hambuchen, Christina Campbell

