Introduction to Robotics Essentials, Registration and Orientation
Miami Valley Computer Science Circle: an enrichment program focusing on Computational Thinking for high school and advanced middle school students.
Instructor: Mr. Ken Nelson (Senior Network Security Engineer at Wright State University; M.S. in CS, MBA)
Saturday, January 14th, 10:00 AM - 12:00 PM; 1st and 3rd Saturday of every month
Room 170, Mathematics and Microbiology (MM), Wright State University (WSU)
If you have any questions, please contact Dr. Munsup Seoh or Dr. Lining Qi.
Helpful robotics sites
(search "Arduino", "Raspberry Pi", or "robot" for projects)
(log in to wsuvpn with your account, then go here and search for books)
What is robotics? What disciplines are involved in robotics?
Mechanical engineering, electrical engineering, computer programming, and more.
How do robots move? Wheeled, tracked, legged (2, 4, 6, 8 ...), flying (quadcopter), and more (snake, fish).
What is robotics (cont.): How are robots controlled?
Remote control, phone apps
Programmed by computers (wired/wireless) (Linux/Windows/Raspberry Pi): C/C++, Scratch, Artoo, Johnny-Five (JavaScript), Python ...
Control board (Arduino, Raspberry Pi ...)
Autonomous code (follow an object, track, navigate a maze, self-learning neural net)
What can robots do? Build cars? Explore Mars?
Automate many industrial applications (welding, material handling, cutting, assembly, mining ...): pretty much any manual job that is repetitive
Military? Bomb disposal, space repairs (ROV/RMS)
Medical? Remote surgery
Fix oil spills, environmental hazard cleanup
Farming? Pruning/weeding/spraying/irrigation ...
Fast food? Flip burgers, make fries, take orders (autonomous kiosk)
Future robots? Exoskeletons for the handicapped, autonomous androids?
Pets, maids, nannies?
Self-driving cars? (Tesla, Google ...)
RPA (automation) does what you tell it, vs. AI (machine learning), which refines itself through adaptive learning.
An estimated 6% of jobs will be robotic by 2020. (As seen in shows like I, Robot / Humans.)
Basic Components: Arduino Uno, Raspberry Pi 3 Model B
Chassis/mobile platform
Motor controllers and motors
Sensors (IR, sonic, visual)
Cameras
Battery
Pi Car will be our introduction to robotics
This is a basic, fairly cheap car: roughly $80 plus the Raspberry Pi controller.
We will learn how to set up the Raspberry Pi and program on it.
We will put the car together and add a camera and a sensor.
The car can be controlled wirelessly from a computer and can push video to a webpage (which we can use to control the car when we can't see it).
The car also has a sensor to prevent it from running into obstacles.
Hardware components
Raspberry Pi 3
4-wheel robot smart car chassis kit
External battery power bank
L298N stepper motor controller board
HC-SR04 sonar module distance sensor
Raspberry Pi camera module + (flex mount or glass case)
40-pin male/female breadboard jumper wires
External battery power bank (USB), optional
Wireless USB adapter dongle
Breadboard (small 3x2)
Wireless keyboard (to wirelessly control the car)
Software: VLC
Tools: insulation tape and screwdriver
Preparing the Raspberry Pi
You can install NOOBS (New Out Of Box Software) or download Raspbian.
If you purchased the Pi with an SD card, it should have NOOBS installed; NOOBS has Raspbian and other products ready to go.
Attach the wireless internet and keyboard adapters to the Pi's USB slots.
Attach an HDMI cable from the Pi to a monitor.
You can do the updates and configuration now, or after building the car, so you don't have to redo some of the Pi configuration.
The Build: Connect motors to the motor driver board
Build the chassis. Remember to attach a red (top) and a black (bottom) wire to each motor.
Connect the motors to the motor driver board.
Connect the Raspberry Pi to the motor driver:
GPIO pin 2 to 5V
GPIO pin 6 to GND
GPIO pin 7 to IN4 (ENA)
GPIO pin 11 to IN3 (5V)
GPIO pin 13 to IN2 (5V)
GPIO pin 15 to IN1 (ENB)
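A quick way to sanity-check this wiring is a short script like the one below. This is a hedged sketch, not from the original slides: it assumes the RPi.GPIO library with BOARD (physical) pin numbering, since the slide lists physical pin numbers; pins 2 (5V) and 6 (GND) are power connections and need no code. Which IN combination means "forward" depends on how the motor leads are soldered, so swap levels if the wheels spin backward.

```python
import time

# Physical (BOARD) pin numbers from the wiring list above.
IN1, IN2, IN3, IN4 = 15, 13, 11, 7

def forward_levels():
    """IN1..IN4 logic levels that drive both motors one way
    (assumed to be 'forward'; depends on motor wiring)."""
    return {IN1: 1, IN2: 0, IN3: 1, IN4: 0}

if __name__ == "__main__":
    import RPi.GPIO as GPIO  # only available on the Pi itself
    GPIO.setmode(GPIO.BOARD)
    for pin in (IN1, IN2, IN3, IN4):
        GPIO.setup(pin, GPIO.OUT)
    for pin, level in forward_levels().items():
        GPIO.output(pin, level)  # spin both motors
    time.sleep(2)
    GPIO.cleanup()               # releasing the pins stops the motors
```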
Build (cont.): Attach the HC-SR04 ultrasonic distance sensor
Plug the sensor into the breadboard.
Add a 1 kΩ resistor to the black cable to protect the Raspberry Pi.
Connect the wires to the Pi and the motor driver:
(echo) yellow cable to pin 16 on the Pi (female-male)
(trigger) orange cable to pin 12 on the Pi (female-male)
(ground) black cable to GND on the motor driver (female-male)
(power) red cable to 5V on the motor driver (female-male)
Build (cont.): Add the camera to the Pi
Position the power pack at the back of the chassis (for the Pi).
Setup the Pi and the car
Configure the Pi:
1. Click "Expand Filesystem" to make sure the full SD card is in use.
2. Click "Interfaces" and ensure SSH and the camera are enabled.
3. Click "Localisation" and set the language and country.
4. Click OK and reboot.
5. Once it reboots, click the network icon, select your wifi router, and enter the password.
6. Run: sudo apt-get update && sudo apt-get upgrade
7. Run: sudo apt-get install vlc (answer "y" when prompted)
8. On the smartphone/PC, install VLC as well.
Now the Pi is ready and you can start coding.
Create files to control the car
mkdir PiCar
cd PiCar
Copy the files in from a jump drive or wirelessly, or create them:
sudo nano main.py (or sudo vi main.py)
sudo nano sensor.py
sudo nano camera.sh
The car is now done.
main.py
What are some things our car needs to be able to do?
Forward? Reverse? Turn right? Turn left? Stop?
How do we make it do things? Autonomous? Key input?
main.py
Each task is a procedure. This makes the code easier to manage and change.
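The actual main.py isn't reproduced in this transcript. The sketch below is a hypothetical outline of the one-task-per-procedure idea, assuming the L298N wiring from the build slides and the W/S/A/D/P driving keys; the pin numbers, drive-state table, and helper names are illustrative, not from the original.

```python
# Hypothetical outline of main.py: each drive task is one small entry.
# Pin states are (IN1, IN2, IN3, IN4) tuples so the drive logic can be
# tested without GPIO hardware.

DRIVE_STATES = {
    "forward": (1, 0, 1, 0),
    "reverse": (0, 1, 0, 1),
    "left":    (0, 0, 1, 0),  # stop one side, run the other to pivot left
    "right":   (1, 0, 0, 0),  # run one side, stop the other to pivot right
    "stop":    (0, 0, 0, 0),
}

KEYMAP = {"w": "forward", "s": "reverse", "a": "left", "d": "right", "p": "stop"}

def action_for_key(key):
    """Map a keypress to a drive action; unknown keys mean 'stop'."""
    return KEYMAP.get(key.lower(), "stop")

if __name__ == "__main__":
    import RPi.GPIO as GPIO  # hardware-only import
    PINS = (15, 13, 11, 7)   # IN1..IN4, physical pins from the wiring slide
    GPIO.setmode(GPIO.BOARD)
    for p in PINS:
        GPIO.setup(p, GPIO.OUT)
    try:
        while True:
            key = input("WASD to drive, p to stop, q to quit: ")
            if key == "q":
                break
            for pin, level in zip(PINS, DRIVE_STATES[action_for_key(key)]):
                GPIO.output(pin, level)
    finally:
        GPIO.cleanup()
```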
sensor.py
What is our sensor capable of doing?
Measure distance, angle, height? Track an object? Track a line? Determine weight, pressure?
How does the sensor work? Light? Sound? Vision? Tactile?
How many sensors do we have? A .py file for each sensor? How many sensors should we have?
Tasks to complete? A procedure for each task?
sensor.py
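The code for this slide isn't included in the transcript; below is a hypothetical sketch of what sensor.py might look like for the HC-SR04 wired earlier (trigger on physical pin 12, echo on pin 16). The sensor reports distance via the round-trip time of an ultrasonic pulse, so the conversion from pulse duration to centimeters is kept in a small, testable function (speed of sound is about 34300 cm/s, halved because the pulse travels out and back).

```python
import time

def pulse_to_cm(duration_s):
    """Convert an HC-SR04 echo pulse duration (seconds) to distance in cm.
    Sound travels ~34300 cm/s; the pulse covers the distance twice."""
    return duration_s * 34300 / 2

if __name__ == "__main__":
    import RPi.GPIO as GPIO  # only available on the Pi
    TRIG, ECHO = 12, 16      # physical pin numbers from the build slides
    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)
    # Fire a 10-microsecond trigger pulse
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)
    # Time the echo pulse
    start = end = time.time()
    while GPIO.input(ECHO) == 0:
        start = time.time()
    while GPIO.input(ECHO) == 1:
        end = time.time()
    print("Distance: %.1f cm" % pulse_to_cm(end - start))
    GPIO.cleanup()
```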
camera.sh
#!/bin/bash
raspivid -o - -t 0 -hf -w 600 -h 400 -fps 30 | cvlc -v stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8554}' :demux=h264
We are just turning the camera on, setting its size and frame rate, and choosing where to send the output. This camera is not used for input; it only streams what it sees, so it's a pretty simple command. Some robots use a special camera for input, like the Kinect for Xbox.
How to run the car/code
1. Power on the Pi (with the HDMI cable connected to a monitor).
2. Once booted, open up Terminal.
3. Change directory to PiCar (type cd PiCar).
4. Run the car file (type 'sudo python main.py').
5. The PiCar is now running and controllable via the wireless keyboard. (Remember to remove the HDMI cable before driving the car!)
The following keys drive the car: W = forward, S = reverse, A = turn left, D = turn right, P = stop.
The Pi will also be semi-autonomous: if anything gets within 15 cm of the front sensor, the PiCar will automatically reverse for a second.
Stream the camera online
On the Pi:
1. Open Terminal.
2. Change directory to PiCar (type cd PiCar).
3. Run camera.sh (type ./camera.sh).
On the client computer/smartphone:
1. Launch VLC.
2. Click 'File', then 'Open Network Stream'.
3. Type the stream address: http://<the Pi's IP address>:8554 (run hostname -I on the Pi to see its IP).
4. Click 'Open'; after a few seconds a new window will open with the stream. (There may be a few seconds of lag.)
The advanced car
With the addition of a second motor controller (making the front and rear wheels independent), a few extra longer-range sensors, and much more complex code, we can turn our car into this.
Our mini autonomous robot
The robot we will build is a scaled-down version of this. If time permits we may do both!
Cost (Pololu direct / Amazon Prime)
6x6x6 mm push button switch: $5.29
2-position on-off switch: $5.00
12 x 12 in, 0.25 in acrylic plexiglass: $12.99
4 x AA 6V battery holder: $ (AAA or AA are the same voltage, but AA gives more life)
Edimax EW-7811Un wifi adapter: $7.99
Raspberry Pi 3 Model B: $ (I bought the complete starter kit to get the switch, breadboard, SD card with software loaded, clear case, power supply, etc. for $89.99; it can be had for $79.99 with a black case)
Total cost: $ (most parts will be reusable in other things you might build)
Obstacle Avoidance
For the obstacle-avoidance operating mode I use Sharp infrared distance sensors, provided by Pololu, mounted on tiny boards (see photos and component list below). I attached two sensors with a 5 cm sensing range at the front edges, angled at 45°, and a 10 cm type in the middle.
Obstacle Avoidance Code
#!/usr/bin/python
# ========================================================
# Python script for PiBot-A: obstacle avoidance
# Version by Thomas Schoch -
# ========================================================

from pololu_drv8835_rpi import motors, MAX_SPEED
from time import sleep
import wiringpi2 as wp2

# Signal handler for SIGTERM
import signal, sys
def sigterm_handler(signal, frame):
    motors.setSpeeds(0, 0)
    sys.exit(0)
signal.signal(signal.SIGTERM, sigterm_handler)

# GPIO pins of sensors
GPIO_right = 21
GPIO_middle = 26
GPIO_left = 20

# Configure sensors as input (BCM GPIO numbering)
wp2.wiringPiSetupGpio()
wp2.pinMode(GPIO_right, wp2.GPIO.INPUT)
wp2.pinMode(GPIO_middle, wp2.GPIO.INPUT)
wp2.pinMode(GPIO_left, wp2.GPIO.INPUT)

try:
    # Start moving forward
    motors.setSpeeds(MAX_SPEED, MAX_SPEED)
    while True:  # Main loop
        # Read sensor input (positive logic)
        INPUT_right = not wp2.digitalRead(GPIO_right)
        INPUT_middle = not wp2.digitalRead(GPIO_middle)
        INPUT_left = not wp2.digitalRead(GPIO_left)
        # Set motor speeds dependent on sensor input
        if INPUT_left and INPUT_right:
            # Obstacle immediately ahead: move a bit bkwrd,
            # turn left a little bit and then proceed fwrd
            motors.setSpeeds(-200, -200)
            sleep(1)
            motors.setSpeeds(-200, 200)
            sleep(0.3)
        elif INPUT_middle:
            # turn left
            motors.setSpeeds(100, MAX_SPEED)
        elif INPUT_left:
            # turn right
            motors.setSpeeds(MAX_SPEED, 200)
        elif INPUT_right:
            # turn left
            motors.setSpeeds(200, MAX_SPEED)
        else:
            # No sensor input: drive forward
            motors.setSpeeds(MAX_SPEED, MAX_SPEED)
        # Repeat this loop every 0.1 seconds
        sleep(0.1)
finally:
    # Stop motors in case of <Ctrl-C> or SIGTERM:
    motors.setSpeeds(0, 0)
Line Follower
The line follower is realized with the Pololu "QTR-3A Reflectance Sensor Array", which has three IR emitter and receiver (phototransistor) pairs.
Line Follower code
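The line-follower code itself isn't reproduced in this transcript. As a hypothetical sketch in the style of the obstacle-avoidance script, and assuming the three QTR-3A channels have already been read into booleans (left, middle, right; True when that sensor sees the black line), the steering decision might look like this. The speed values are illustrative, not from the original; 480 is MAX_SPEED in Pololu's DRV8835 Raspberry Pi driver library.

```python
# Hypothetical steering logic for a three-sensor line follower.
# Returns (left_speed, right_speed); MAX is the cruise speed.

MAX = 480  # MAX_SPEED of the Pololu DRV8835 Raspberry Pi library

def line_follow_speeds(left, middle, right):
    """Decide motor speeds from three line sensors (True = line seen)."""
    if middle and not left and not right:
        return (MAX, MAX)   # centered on the line: full speed ahead
    if left:
        return (100, MAX)   # line drifting left: slow the left motor to turn left
    if right:
        return (MAX, 100)   # line drifting right: turn right
    return (200, 200)       # line lost: creep forward and try to reacquire it

if __name__ == "__main__":
    # On the robot, a loop would read the sensors (e.g. via wiringpi)
    # and feed the result to motors.setSpeeds() from pololu_drv8835_rpi.
    print(line_follow_speeds(False, True, False))
```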
Maze Solver
For the maze solver I use the same sensors as for the line follower. That means I have only 3 sensors, rather than the 5 or more that seem to be usual. Notice that the robot sways a little whenever it reaches a node of the maze: it looks to the left and right to see whether there is a black line forking, a crossing, or a T-junction. That way it compensates for the absence of the outer sensors that are typically used for this purpose.
Maze Solver – thought process
This works as follows (example; the image above shows the moving sensors):
1. The robot is driving straight ahead.
2. Reaching the node, the robot thinks: "I'm drifting to the left" (the right sensor is on the line), hence it makes a correction to the right.
3. As a result all sensors read black, so the robot knows: "I'm over some node, and I'm too far to the right" (because of the last movements).
4. The robot sways to the left, looking with the left sensor for a line. No line found, so the robot knows a line is branching to the right. It corrects its direction by turning back to the right a little.
5. The robot drives past the node until the node is under the axle in the center of the robot. It now sees a line, so it knows the node was a right T-junction. In this position the robot decides where to move and, if necessary, turns on the spot.
The timing is crucial: driving some hundredths of a second longer or shorter makes a big difference as the battery voltage goes up and down. Hence the robot starts with a 360° turn to the left and right and back, performing a calibration (see video). The time this takes is measured and is taken into account when calculating the time for each particular movement.
Strategy for solving the maze:
1. The target (the black disk) is placed anywhere in the maze.
2. The robot is placed anywhere in the maze.
3. The robot searches for and finds the end point following the left-hand rule.
4. The robot calculates the shortest path with an algorithm similar to dead-end filling.
5. The robot drives along the shortest path back to its starting position and "nods its head".
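The left-hand rule mentioned in the strategy can be captured in a few lines. The helper below is illustrative, not taken from robot-ms.py: given which exits exist at a junction (relative to the robot's current heading), it always prefers the leftmost available turn.

```python
# Hypothetical helper: choose a turn by the left-hand rule.
# Exit flags are relative to the robot's current heading.

def left_hand_turn(left, straight, right):
    """Prefer left, then straight, then right; otherwise turn around."""
    if left:
        return "left"
    if straight:
        return "straight"
    if right:
        return "right"
    return "back"  # dead end: turn around
```

Following this rule from any starting point in a simply connected maze eventually reaches the target; the dead-end-filling pass in step 4 then prunes the recorded route down to the shortest path.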
Maze Solver Code
See the notepad file (robot-ms.py).