Maze-Solving Mindstorms NXT Robot
Our Mission
– Investigate the capabilities of the NXT robot
– Explore development options
– Build something interesting!
Problem Outline
– The robot is placed in a “grid” of same-sized squares (due to obscure and annoying technical limitations, the robot always starts at the “southwest” corner of the maze, facing “north”)
– Each square can be blocked on 0-4 sides (we just used note cards!)
– The maze is rectangularly bounded
– One square is a “goal” square (we indicate this by covering the floor of the goal square with white note cards)
– The robot has to get to the goal square
Robot Design
The robot uses the basic “driving base” from the NXT building guide, plus two light sensors (pointed downward) and one ultrasonic distance sensor (pointed forward). The light sensors are used to detect the goal square, and the distance sensor is used to detect walls.
Robot Design, cont’d (photo: light sensors and ultrasonic sensor)
Robot Design, cont’d
Search Algorithm
– Simple depth-first search (DFS)
– The robot scans each cell for walls and constructs a DFS tree rooted at the START cell
– As the DFS tree is constructed, it records which cells have been explored and provides paths for backtracking
– The DFS halts when the GOAL cell is found
Maze Structure (diagram of an example maze with the GOAL and START cells marked)
DFS Tree Example (diagram of the DFS tree over the same maze, from START to GOAL)
DFS Tree Data Structure
– Two-dimensional array: Cell maze[MAX_HEIGHT][MAX_WIDTH]
– typedef struct {
      bool isExplored;            // initially false
      Direction parentDirection;  // initially NO_DIRECTION
      WallStatus wallStatus[4];   // initially all UNKNOWN
  } Cell;
– Actually implemented as parallel arrays due to RobotC limitations
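The structure above can be written out as a self-contained plain-C sketch. The maze bounds and the enum members beyond those named on the slide are assumptions for illustration, not taken from the actual robot code:

```c
#include <stdbool.h>

#define MAX_HEIGHT 8   /* assumed maze bounds; the slides don't state them */
#define MAX_WIDTH  8

typedef enum { NORTH, EAST, SOUTH, WEST, NO_DIRECTION } Direction;
typedef enum { UNKNOWN, OPEN, BLOCKED } WallStatus;

typedef struct {
    bool       isExplored;       /* false until the robot visits this cell   */
    Direction  parentDirection;  /* which neighbor we came from (DFS parent) */
    WallStatus wallStatus[4];    /* one entry per side, indexed by Direction */
} Cell;

Cell maze[MAX_HEIGHT][MAX_WIDTH];

/* Set every cell to the defaults listed on the slide. */
void initMaze(void) {
    for (int r = 0; r < MAX_HEIGHT; r++)
        for (int c = 0; c < MAX_WIDTH; c++) {
            maze[r][c].isExplored = false;
            maze[r][c].parentDirection = NO_DIRECTION;
            for (int d = 0; d < 4; d++)
                maze[r][c].wallStatus[d] = UNKNOWN;
        }
}
```

As the slide notes, RobotC could not express an array of structs, so the real implementation split each field into its own parallel array.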
DFS Algorithm
while (true) {
    if robot is at GOAL cell
        victoryDance();
    if there is an unexplored, unobstructed neighbor {
        mark the current cell as the neighbor’s parent;
        proceed to the neighbor;
    } else if robot is not at the START cell
        backtrack;
    else
        return;  // no GOAL cell exists, so we exit
}
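The loop above can be sketched in plain C as a recursive DFS over a simulated wall map. This is not the robot's actual code (RobotC allocates all variables statically, so the real version used the parent-direction array to backtrack rather than recursion), and the grid size and wall encoding are assumptions:

```c
#include <stdbool.h>

#define H 3
#define W 3

typedef enum { NORTH, EAST, SOUTH, WEST } Dir;

/* Row index grows northward, matching the slides' (0,0)-southwest layout. */
static const int dr[4] = { 1, 0, -1, 0 };
static const int dc[4] = { 0, 1, 0, -1 };

/* walls[r][c][d] == true means side d of cell (r,c) is blocked.
   Returns true once the goal cell is reached ("victoryDance"),
   false after exhausting every reachable cell. */
bool dfs(int r, int c, int goalR, int goalC,
         bool walls[H][W][4], bool visited[H][W]) {
    visited[r][c] = true;
    if (r == goalR && c == goalC) return true;   /* victoryDance() */
    for (int d = 0; d < 4; d++) {
        int nr = r + dr[d], nc = c + dc[d];
        if (nr < 0 || nr >= H || nc < 0 || nc >= W) continue;
        if (walls[r][c][d] || visited[nr][nc]) continue;
        if (dfs(nr, nc, goalR, goalC, walls, visited)) return true;
    }
    return false;  /* backtrack to the caller (the parent cell) */
}
```

The call stack plays the role of the red "parent" arrows in the walkthrough that follows: returning from a call is exactly the backtracking step.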
Example: 3x3 maze (diagram, with the GOAL cell marked)
We start out at (0,0) – the “southwest” corner of the maze. The location of the goal is unknown.
Check for a wall – the way forward is blocked.
So we turn right.
Check for a wall – no wall in front of us.
So we go forward; the red arrow indicates that (0,0) is (1,0)’s predecessor.
We sense a wall.
Turn right.
We sense a wall here too, so we’ll have to look north.
Turn left…
Turn left again; now we’re facing north.
The way forward is clear…
…so we go forward. – “When you come to a fork in the road, take it.” –Yogi Berra on depth-first search
We sense a wall – we can’t go forward…
…so we’ll turn right.
This way is clear…
…so we go forward.
Blocked.
How about this way?
Clear!
Whoops, there’s a wall here.
We already know that the way to the right is blocked, so we try turning left instead.
A wall here too! Now there are no unexplored neighboring squares that we can reach, so we backtrack! (Retrace the red arrow.)
We turn to face the red arrow…
…and go forward. Now we’ve backtracked to a square that might have an unexplored neighbor. Let’s check!
Ah-ha!
Onward!
Drat!
There’s got to be a way out of here…
Not this way!
Two 90-degree turns to face west…
No wall here!
So we move forward and…
What luck! Here’s the goal. Final step: execute the victory dance.
Movement and Sensing
The search algorithm above requires five basic movement/sensing operations:
– “Move forward” into the square we’re facing
– “Turn left” 90 degrees
– “Turn right” 90 degrees
– “Sense wall” in front of us
– “Sense goal” in the current square
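The three motion primitives can be modeled as updates to an idealized pose (grid position plus heading). This is a hypothetical bookkeeping sketch of what the primitives are *supposed* to do, ignoring the real-world error discussed later; the names and encoding are assumptions:

```c
/* Heading encoding (assumed): 0 = north, 1 = east, 2 = south, 3 = west. */
typedef struct { int x, y, heading; } Pose;

void turnRight(Pose *p) { p->heading = (p->heading + 1) % 4; }
void turnLeft(Pose *p)  { p->heading = (p->heading + 3) % 4; }

/* Advance one square in the direction we're facing. */
void moveForward(Pose *p) {
    static const int dx[4] = { 0, 1, 0, -1 };  /* per-heading x step */
    static const int dy[4] = { 1, 0, -1, 0 };  /* per-heading y step */
    p->x += dx[p->heading];
    p->y += dy[p->heading];
}
```

On the real robot each of these maps to a fixed motor command; the pose is only ever an estimate.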
Movement and Sensing, cont’d
Sensing turns out not to be such a big problem:
– If the ultrasonic sensor reads less than a certain distance, there’s a wall in front of us; otherwise there’s not
– Goal sensing is similar: if the floor is “bright enough”, we’re at the goal
Movement and Sensing, cont’d
The motion operations are a major challenge, however. Imagine trying to drive a car straight ahead, exactly ten feet, with your eyes closed – that’s more or less what “move forward” is supposed to do, at least ideally. In the current implementation, we just make our best estimate by turning the wheels a fixed number of degrees, and make no attempt to correct for error.
– We’ll talk about other options later
Language Options
There are several languages and programming environments available for the NXT system:
– NXT-G
– Microsoft Robotics Studio
– RobotC
– etc.
NXT-G
Lego provides the graphical “NXT-G” software, which is based on LabVIEW (which we’ve seen before).
NXT-G, cont’d
– NXT-G is designed to be easy for beginning programmers to use
– We found it rather limiting: placing blocks/wires on the diagram takes longer than typing
– Furthermore, NXT-G lacks support for arrays, which is problematic for our application
Microsoft Robotics Studio To be honest, we just couldn’t get the darn thing to work! We’re sure it’s great, though.
RobotC
– A simple “C-like” language for programming the NXT (and other platforms), developed at CMU
– Compiles to bytecode that is executed on a VM
– More-or-less complete support for NXT sensors and motors
RobotC, cont’d
Limited subset of C:
– All variables are allocated statically (so no recursion)
– Somewhat limited type system: for example, arrays are limited to two dimensions, and you can’t have arrays of structs, as far as we can figure
– Maximum of eight procedures and 256 variables
RobotC, cont’d
Still in beta:
– Currently available as a free, time-limited demo download
– Will eventually be released commercially, with licenses costing $20-$30
– Requires modified firmware, which breaks compatibility with NXT-G programs
– Some features seem to be just plain buggy and/or incomplete
– But it supports arrays!
Okay, let’s see a demonstration!
Error Correction
– As you may have noticed, it doesn’t work perfectly
– Ideally, the robot should always turn exactly 90 degrees and should always be exactly centered inside the square
– As we said, the “movement primitives” – go forward, turn left, turn right – are not perfectly precise; any “slips” or problems with traction will throw everything off
– Error tends to compound
Error Correction, cont’d To some extent, error is inevitable; the robot doesn’t really have “vision” per se. However, if we fudged the environment a little bit, it would probably be possible to correct for much of the error.
Error Correction, cont’d One possibility: Mark the floor of each tile with lines that can be picked up by the light sensors. If placed correctly, the “alignment markers” could help the robot both to center itself along the X/Y axes, and to make sure it turns exactly 90 degrees.
Error Correction, cont’d
Another possibility: use the ultrasonic sensor to make sure the robot doesn’t run into walls, even if it “thinks” it should still be moving forward. Unfortunately, we didn’t get a chance to implement these ideas.
Sources
– Peter Dempsey
– Pericles Kariotis
– Adam Procter