
1 A framework of safe robot planning
Roland Pihlakas
Institute of Technology, University of Tartu
August 2008

2 What is safe?
A safe action or state is:
– a goal which is explicitly given to the robot;
– an explicitly permitted change in the robot’s environment.
Everything else is unknown and possibly unsafe, and therefore should not be caused.
Some automatically calculated sub-goals can be unsafe too, unless permitted.
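A minimal Prolog sketch of this definition, assuming hypothetical facts goal/1 and permitted/1 that hold for the explicitly given goals and the explicitly permitted changes (the example facts are purely illustrative):

    % "Safe" means: an explicit goal, or an explicitly permitted change.
    safe(Change) :- goal(Change).
    safe(Change) :- permitted(Change).

    % Illustrative configuration for a cleaning robot:
    goal(floor(clean)).
    permitted(dust_bin(fuller)).

    % ?- safe(floor(clean)).   succeeds: an explicit goal
    % ?- safe(vase(broken)).   fails: unknown, hence possibly unsafe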

3 The problem
Why not enumerate the bad states:
– too much work;
– humans are poor at systematically predicting things, especially when they are not directly interested in doing so.
Why not tell the robot all the sub-goals / steps towards the goal:
– too much work;
– again, unexpected consequences of the given steps.

4 The problem
Opposing interests:
– to let the human get the job of robot configuration / instruction done quickly;
– to still have control and no surprises.

5 Proposed solution
Bad is implicit.
Usually enumerate only:
– goals;
– “okay” changes => permissions.
These are perhaps simpler to enumerate.
Analogy: public vs. private law.

6 A second analogy
Three laws in the order of priority:
– Do not do anything that is not explicitly permitted.
– Fulfill the goals that are given to you.
– Optionally fulfill the optional goals.
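A small Prolog sketch of this priority ordering; the predicates permitted/1, obligatory_goal/1, optional_goal/1 and achieves/2 are assumed stand-ins for the real configuration:

    :- dynamic permitted/1, obligatory_goal/1, optional_goal/1, achieves/2.

    % Law 1: never select an action that is not explicitly permitted.
    admissible(Action) :- permitted(Action).

    % Laws 2 and 3: clause order gives obligatory goals priority over optional ones.
    select_action(Action) :-
        obligatory_goal(Goal), achieves(Action, Goal), admissible(Action).
    select_action(Action) :-
        optional_goal(Goal), achieves(Action, Goal), admissible(Action).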

7 A useful concept
If you can undo something, it is usually safe, assuming that the current state is safe.
From this follow:
– the principle of avoiding irreversible actions;
– two special classes for actions and their corresponding results, called “irreversible actions” and “reversible actions”.
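A Prolog sketch of the two classes, built on a hypothetical undo/2 relation that names the action undoing a given action (all names here are illustrative assumptions):

    :- use_module(library(lists)).
    :- dynamic undo/2.

    reversible(Action)   :- undo(Action, _).
    irreversible(Action) :- \+ undo(Action, _).

    % Example: moving an object can be undone by moving it back.
    undo(move(Obj, From, To), move(Obj, To, From)).

    % The avoidance principle as a preference: try reversible candidates
    % first, irreversible ones only as a last resort.
    prefer(Candidates, Action) :- member(Action, Candidates), reversible(Action).
    prefer(Candidates, Action) :- member(Action, Candidates), irreversible(Action).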

8 A few motivating examples
Street cleaning
Room cleaning
Making room on a hard disk
Littering

9 About permissions
Permissions usually concern:
– changes in given dimensions;
– usually not specific actions.

10 A simple language example
Goal x = 2;
Allow y = any;
Reverse z;
Dontdisturb w;
Guarantee for q1 = 44 is q2 = 37;
Context a = 177;
Askauth allow b = any;
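One possible internal representation of such a configuration as Prolog facts (an assumption for illustration, not a format defined by the slides):

    goal(x, 2).                       % Goal x = 2;
    allow(y, any).                    % Allow y = any;
    reverse_after_use(z).             % Reverse z;
    dont_disturb(w).                  % Dontdisturb w;
    guarantee(q1 = 44, q2 = 37).      % Guarantee for q1 = 44 is q2 = 37;
    context(a, 177).                  % Context a = 177;
    ask_authorisation(allow(b, any)). % Askauth allow b = any;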

11 Data flow
Sensors, certain functions calculating some value, etc.
The configuration – three data structures:
– preconditions / context;
– keep-always conditions;
– goal conditions.
Automatic plan generation
Precondition checking
Causal relations / prediction module
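A sketch of how the three data structures and the precondition check could look in Prolog (predicate names are assumptions made for this illustration):

    :- dynamic holds/1.

    precondition(battery_level_above(20)).   % preconditions / context
    keep_always(floor(undamaged)).           % keep-always conditions
    goal_condition(room(clean)).             % goal conditions

    % Precondition checking: every stored precondition must currently hold.
    preconditions_ok :- forall(precondition(C), holds(C)).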

12 Adding optional goals to the language
An example:
Obligatory {
    Goal robot.location = “in front of TV”;
}
Optional {
    Goal floor.still_clean = yes;
}

13 Second extension – cost calculation
Let us add a budget for achieving the optional goals:
Optional {
    Goal x4 = 3;
    Cost x4 = 10;
    Scale x4 = *;    // infinite resolution
    Dontdisturb x3;
    Cost x3 = 4;
    [Scale x3 = per unit];
}
Budget = 55;
Max steps = 20;
Max time = 10min;
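A minimal Prolog sketch of the budget check, assuming each plan step is annotated with a cost (the step/2 representation is an assumption for illustration):

    plan_cost([], 0).
    plan_cost([step(_Action, Cost) | Rest], Total) :-
        plan_cost(Rest, RestCost),
        Total is Cost + RestCost.

    within_budget(Plan, Budget) :-
        plan_cost(Plan, Total),
        Total =< Budget.

    % ?- within_budget([step(vacuum, 10), step(dust, 4)], 55).   succeeds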

14 The protocol
When giving permissions, make sure that the context is correctly specified!
Opposing interests of the human user:
– to give many permissions and get the job over with quickly;
– to give only the necessary permissions and to specify their proper context.
SELinux analogy

15 Robot learning
The sandbox
Levels of sandboxes
Bonus: understanding the concept of reversibility helps the robot repeat motor actions until its coordination is good enough.

16 “Passive” safety
Distinguishes the user’s commands from auto-generated ones; does not override the user:
– The robot distinguishes clearly between the orders that were given to it and the sub-goals it has set for itself.
– By default it avoids only its own mistakes.
– Even more: the robot may refuse to act, but it will never do anything other than what humans have permitted.

17 Planning module
Uses minimax search:
– in case of uncertainty, assumes the worst outcome;
– optimises the cost of the plan.
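A minimal sketch of the “assume the worst” scoring in Prolog; candidate_plan/1 and outcome_cost/2 are hypothetical facts listing the possible costs of each plan under uncertainty:

    :- use_module(library(lists)).

    candidate_plan(p1).
    candidate_plan(p2).
    outcome_cost(p1, 3).  outcome_cost(p1, 9).   % p1: worst case 9
    outcome_cost(p2, 5).  outcome_cost(p2, 6).   % p2: worst case 6

    worst_case(Plan, Worst) :-
        findall(Cost, outcome_cost(Plan, Cost), Costs),
        max_list(Costs, Worst).

    % Pick the plan whose worst-case cost is lowest.
    best_plan(Best) :-
        findall(Worst-Plan, (candidate_plan(Plan), worst_case(Plan, Worst)), Scored),
        keysort(Scored, [_-Best | _]).

    % ?- best_plan(P).   % P = p2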

18 Errata
The robot may stop when it encounters unexpected / unknown situations, unless instructed otherwise using context-specific goals.

19 Implementation language
Prolog:
– has a built-in parser (for configuration processing);
– has a variable data type;
– automatic memory management;
– has useful data types for expressing constraints or uncertainty;
– conveniently short syntax for failing calls and resuming alternatives at upper levels in the planning module;
– “scriptable”.
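A small illustration of the backtracking point above: if the first decomposition of a task fails a precondition check, Prolog automatically resumes with the next alternative. The predicates decompose/2 and precondition_holds/1 are illustrative assumptions, not part of the framework:

    :- use_module(library(lists)).

    decompose(clean_room, [vacuum_floor, empty_bin]).
    decompose(clean_room, [sweep_floor, empty_bin]).

    precondition_holds(vacuum_floor) :- fail.   % e.g. vacuum cleaner unavailable
    precondition_holds(sweep_floor).
    precondition_holds(empty_bin).

    plan_for(Task, Steps) :-
        decompose(Task, Steps),
        forall(member(Step, Steps), precondition_holds(Step)).

    % ?- plan_for(clean_room, Steps).
    % Steps = [sweep_floor, empty_bin].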

20 Future
Multiple contexts / goal sets
Online planning, partial plans
Online diagnostics and remedial action in case of danger, faults, etc.
Automatically finding unnecessary rights
A more powerful prediction module
Time constraints

21 Future
Asking for authorisation during planning:
– Askauth allow x = any;
Asking the user to choose and authorise one plan from a set of automatically proposed alternatives
Different user levels
Understanding changes caused by external agents or natural forces

22 Immediate future
Prediction module – using high-dimensional regression.
Planning module – planning / search to be done backwards from the goal, not by breadth-first or depth-first search.
– This also allows breaking the planning task into independent parts.
Perception module – hierarchical composition of entities in the sensory space-time, assigning some of them to keywords.
– Bonus: the same structure can possibly also be used for plan generation / constraint satisfaction.
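A toy Prolog sketch of searching backwards from the goal: each goal is replaced by the preconditions of an action that achieves it, until only conditions that already hold remain. The achieves/3 action descriptions below are assumptions made for this illustration:

    :- use_module(library(lists)).
    :- dynamic holds/1.

    achieves(boil_water,    hot_water,  [have_water, kettle_on]).
    achieves(fill_kettle,   have_water, []).
    achieves(switch_kettle, kettle_on,  []).

    backward_plan(Goal, []) :- holds(Goal).
    backward_plan(Goal, Plan) :-
        achieves(Action, Goal, Preconds),
        backward_plans(Preconds, SubPlan),
        append(SubPlan, [Action], Plan).

    backward_plans([], []).
    backward_plans([G | Gs], Plan) :-
        backward_plan(G, P1),
        backward_plans(Gs, P2),
        append(P1, P2, Plan).

    % ?- backward_plan(hot_water, Plan).
    % Plan = [fill_kettle, switch_kettle, boil_water].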
