1 Lecture 9: Behavior Languages
CS 344R: Robotics Benjamin Kuipers

2 Alternative Approaches To Sequencers
Roger Brockett: MDL. Hristu-Varsakelis & Andersson: MDLe. Jim Firby: RAPS. … there are others … The right answer is not completely clear.

3 Motion Description Languages
Problem: Describe continuous motion in a complex environment as a finite set of symbolic elements. Applicability = sequencing. Termination = condition or time-out. Roger Brockett defined MDL. Extended to MDLe by Manikonda, Krishnaprasad, and Hendler.

4 This is an instance of our framework for control laws
A local control law is a triple: A, Hi, Ω. Applicability predicate A(y). Control policy u = Hi(y). Termination predicate Ω(y).
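
As a rough sketch (not from the slides), this triple can be written directly as a small Python structure; the field names and the example law below are assumptions for illustration only.

    from dataclasses import dataclass
    from typing import Callable, Sequence

    @dataclass
    class ControlLaw:
        """A local control law as a triple (A, Hi, Omega)."""
        applicable: Callable[[Sequence[float]], bool]          # A(y): may this law run now?
        policy: Callable[[Sequence[float]], Sequence[float]]   # u = Hi(y)
        terminated: Callable[[Sequence[float]], bool]          # Omega(y): should it stop?

    # Hypothetical example: drive forward while there is free space ahead (y[0] = front range).
    go_until_wall = ControlLaw(
        applicable=lambda y: y[0] > 0.5,    # applicable only with > 0.5 m of free space
        policy=lambda y: (0.3, 0.0),        # constant forward velocity, no turn
        terminated=lambda y: y[0] <= 0.5,   # terminate when the obstacle is close
    )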

5 The Kinetic State Machine
The MDLe state evolution model is: ẋ = f(x) + G(x) U(t, x), y = h(x). This is an instance of our general model. There is also: a set of timers Ti; a set of boolean features ξi(y). U(t, x) is a general control law which can be suspended by the timer Ti or the interrupt ξi(y).
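
A minimal numerical sketch of this evolution model, assuming simple Euler integration and caller-supplied f, G, U, and interrupt (none of these names come from the MDLe papers):

    import numpy as np

    def simulate_ksm(f, G, U, x0, dt=0.01, T=5.0, xi=None):
        """Euler-integrate dx/dt = f(x) + G(x) U(t, x) until the timer T expires
        or the interrupt xi(y) fires (here y = x for simplicity)."""
        x, t = np.array(x0, dtype=float), 0.0
        while t < T:
            if xi is not None and xi(x):              # interrupt suspends the control law
                break
            x = x + dt * (f(x) + G(x) @ U(t, x))      # kinetic state machine step
            t += dt
        return x, t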

6 The Kinetic State Machine

7 Q: What is the role of G(x)?
In the state evolution model, x is in R^n. The motor vector U(t, x) is in R^k. G is an n×k matrix whose columns g_i are vector fields in R^n. Each column represents the effect on x of one component of the motor vector.
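
As a concrete (assumed) example, take a unicycle robot with state x = (x, y, θ) in R^3 and motor vector U = (v, ω) in R^2; then G(x) is 3×2 and each column is the vector field for one motor component:

    import numpy as np

    def G(x):
        """Columns g1, g2: effect of forward speed v and turn rate omega
        on the unicycle state (x, y, theta)."""
        theta = x[2]
        g1 = np.array([np.cos(theta), np.sin(theta), 0.0])   # forward motion
        g2 = np.array([0.0, 0.0, 1.0])                       # pure rotation
        return np.column_stack([g1, g2])                     # 3x2 matrix

    # x_dot = G(x) @ np.array([v, omega]), with f(x) = 0 for this drift-free robot.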

8 MDL Programs The simplest MDL program is an atom (U, ξ, T). To run an atom,
apply U to the kinetic state machine model until the interrupt function ξ(y) goes false, or until T units of time elapse.
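
A sketch of an atom runner under these rules; step (the plant model) and y_of (the observation function) are assumed interfaces, not part of MDLe as published:

    def run_atom(U, xi, T, step, y_of, x0, dt=0.01):
        """Run one atom (U, xi, T): apply control U to the plant until the
        interrupt function xi(y) goes false or T units of time elapse."""
        x, t = x0, 0.0
        while t < T and xi(y_of(x)):
            x = step(x, U(t, x), dt)   # advance the kinetic state machine one step
            t += dt
        return x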

9 Compose Atoms to Behaviors
Given atoms σ1, …, σk, define the behavior (σ1 σ2 … σk, ξb, Tb), which means to do the atoms sequentially until the interrupt ξb or the time-out Tb occurs. Behaviors nest recursively to make plans.
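
Continuing the same sketch, a behavior could be a list of atoms wrapped with its own interrupt ξb and time-out Tb (again an assumed structure, not the authors' implementation); nesting follows because a behavior has the same (sequence, interrupt, timer) shape as an atom:

    def run_behavior(atoms, xi_b, T_b, step, y_of, x0, dt=0.01):
        """Run atoms (U, xi, T) in sequence; the whole behavior is also
        cut off when xi_b(y) goes false or the time-out T_b expires."""
        x, t = x0, 0.0
        for (U, xi, T) in atoms:
            t_atom = 0.0
            while (t < T_b and xi_b(y_of(x)) and
                   t_atom < T and xi(y_of(x))):
                x = step(x, U(t, x), dt)
                t += dt
                t_atom += dt
            if t >= T_b or not xi_b(y_of(x)):   # behavior-level interrupt or time-out
                break
        return x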

10 Example Interrupts (bumper) (wait T) (atIsection b)
For atIsection, b specifies 4 bits: whether an obstacle is required in each direction (front, left, back, right). The interrupt occurs when a location with that structure is detected.

11 Example Atoms (Atom interrupt_condition control_law)
(Atom (wait T) (rotate θ)) (Atom (bumper OR atIsection(b)) (go v, ω)) (Atom (wait T) (goAvoid ψ, kf, kt)) (Atom (ri(t)==rj(t)) (align ri rj)) Select ideas from here for your controllers. goAvoid moves in direction ψ, with gains kf and kt on the forward and turn controllers, responding to distances to obstacles.
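
Using the assumed run_atom sketch above, the second example atom might look roughly like this; bumper and at_intersection are hypothetical sensor predicates, not the course's actual robot interface:

    # Hypothetical sensor predicates; on a real robot these would query hardware.
    def bumper(y):
        return y.get("bump", False)

    def at_intersection(y, b):
        return y.get("isect") == b

    v, omega, b = 0.3, 0.0, (1, 0, 1, 0)   # assumed parameter values

    def xi_go(y):
        # The atom keeps running while neither interrupt condition holds.
        return not (bumper(y) or at_intersection(y, b))

    go_atom = (lambda t, x: (v, omega),   # U(t, x): constant motor command
               xi_go,                     # interrupt function xi(y)
               30.0)                      # time-out T in seconds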

12 Environment Model A graph of local maps.
We will study local metrical maps later. Likewise topological maps. Edges in the graph represent behaviors. Compact and effective: Local metrical maps are reliable. Describe geometry only where necessary.
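
One way to picture this environment model (an assumption, not the authors' data structure) is an adjacency map whose edges carry behaviors:

    # Hypothetical graph of local maps: nodes are places, each edge stores the
    # behavior that drives the robot from one place to the next.
    topo_map = {
        "place_A": [("corridor_east_behavior", "place_B")],
        "place_B": [("corridor_east_behavior", "place_C"),
                    ("corridor_west_behavior", "place_A")],
        "place_C": [("corridor_west_behavior", "place_B")],
    }

    def behavior_between(topo_map, src, dst):
        """Return the behavior labeling the edge from src to dst, if any."""
        for behavior, place in topo_map.get(src, []):
            if place == dst:
                return behavior
        return None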

13 Experiment They built a model of three places in their laboratory.
They demonstrated MDLe plans for travel between pairs of places.

14 Limitations Simple sequential FSM model:
No parallelism or combination of control laws. No success/failure exits from control laws. Much can be packed into the interrupt conditions. Limited evaluation: No exploration or learning. No test of reliability.

15 Next: Observers Probabilistic estimates of the true state, given the observations. Basic concepts: Probability distribution; Gaussian model; Expectations.

16 Estimates and Uncertainty
Conditional probability density function

17 Gaussian (Normal) Distribution
Completely described by N(μ, σ): mean μ, standard deviation σ (variance σ²).
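
For reference, the standard N(μ, σ) density (not reproduced as text in this transcript) is, in LaTeX notation:

    p(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right)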

18 The Central Limit Theorem
The sum of many random variables with the same mean, but with arbitrary conditional density functions, converges to a Gaussian density function. If a model omits many small unmodeled effects, then the resulting error should converge to a Gaussian density function.
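
A quick numerical illustration of this claim (my own demo, with arbitrarily chosen non-Gaussian summands):

    import numpy as np

    rng = np.random.default_rng(0)
    # Each sample is the sum of 50 uniform(0, 1) variables: the summands are far
    # from Gaussian, but the sums are close to Gaussian with mean 25, variance 50/12.
    sums = rng.uniform(0.0, 1.0, size=(10000, 50)).sum(axis=1)
    print(sums.mean(), sums.var())   # roughly 25 and 4.17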

19 Expectations Let x be a random variable.
The expected value E[x] is the mean: E[x] = ∫ x p(x) dx, the probability-weighted mean of all possible values. The sample mean approaches it. The expected value of a vector x is taken component by component.
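
A small check that the sample mean approaches the expected value (an assumed example using an exponential distribution whose mean is known to be 2.0):

    import numpy as np

    rng = np.random.default_rng(1)
    samples = rng.exponential(scale=2.0, size=100_000)   # E[x] = 2.0 for this distribution
    print(samples.mean())   # the sample mean comes out close to 2.0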

20 Variance and Covariance
The variance is E[(x - E[x])²]. The covariance matrix is E[(x - E[x])(x - E[x])^T].
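
These definitions translate directly into a sample estimate (a sketch; numpy's np.cov computes essentially the same quantity):

    import numpy as np

    def sample_covariance(X):
        """X holds one sample per row. Returns an estimate of
        E[(x - E[x])(x - E[x])^T] from the samples."""
        centered = X - X.mean(axis=0)
        return centered.T @ centered / (len(X) - 1)   # unbiased sample covariance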

21 Covariance Matrix Along the diagonal, the C_ii are variances.
Off-diagonal, the C_ij are essentially correlations.

22 Independent Variation
x and y are independent Gaussian random variables (N = 100 samples), generated with σx = 1, σy = 3. Covariance matrix:
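
A sketch that reproduces this kind of experiment (the random seed is my own choice; for independent x and y with σx = 1, σy = 3, the covariance matrix should come out near diag(1, 9)):

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(0.0, 1.0, size=100)   # sigma_x = 1
    y = rng.normal(0.0, 3.0, size=100)   # sigma_y = 3
    print(np.cov(x, y))   # close to [[1, 0], [0, 9]]; off-diagonal terms near zero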

23 Dependent Variation c and d are random variables.
Generated with c = x + y, d = x - y. Covariance matrix:
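
Continuing the same kind of sketch, c = x + y and d = x - y are strongly anti-correlated even though x and y are not; with σx = 1 and σy = 3 the theoretical covariance matrix is [[10, -8], [-8, 10]]:

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(0.0, 1.0, size=100)
    y = rng.normal(0.0, 3.0, size=100)
    c, d = x + y, x - y
    print(np.cov(c, d))   # near [[10, -8], [-8, 10]]: large off-diagonal terms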

