
Luis Mejias, Srikanth Saripalli, Pascual Campoy and Gaurav Sukhatme.


1 Luis Mejias, Srikanth Saripalli, Pascual Campoy and Gaurav Sukhatme

2  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

3  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

4  Goal:  vision-guided autonomous flying robots  Application:  Law enforcement, search and rescue, inspection and surveillance  Technique:  Object detection, tracking, inertial navigation, GPS and nonlinear system modeling

5  In this paper:  Two UAVs – Avatar and COLIBRI  Visual tracking => control commands

6  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

7  Hummingbird (A. Conway, 1995)  Model-scale helicopter  Uses GPS only  4 GPS antennas  Precision: 1 cm in position, 1 degree in attitude

8  AVATAR (Jun, 1999)  Onboard INS & GPS  Kalman Filter for State Estimation  Simulation

9  Vision-guided Helicopter (Amidi, 1996, 1997)  Onboard DSP-based vision processor  Combine GPS and IMU data

10  Vision-augmented navigation system (Bosse, 1997)  Uses vision in the loop to control a helicopter  Visual odometer (Amidi, 1998)  A notable vision-based technique used on autonomous helicopters  (Wu et al., 2005)  Vision is used as an additional sensor and fused with inertial and heading measurements for control

11  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

12  AVATAR  Gas-powered radio-controlled model helicopter  RT-2 DGPS system provides positional accuracy of 2 cm  ISIS-IMU provides rate information to the onboard computer, where it is fused using a 16-state Kalman filter  Ground station: a laptop that sends high-level control commands and differential GPS corrections  Autonomous flight is achieved using a behavior-based control architecture

13  COLIBRI  Gas-powered model helicopter  Fitted with an XScale-based flight computer augmented with GPS, IMU, and magnetometer, fused with a Kalman filter  VIA mini-ITX 1.25 GHz onboard computer with 512 MB RAM, a wireless interface, and a FireWire color camera  Ground station: a laptop to send high-level control commands and for visualization

14  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

15  Image segmentation and thresholding  Convert the image to grayscale  Use the value of the “target color” as the threshold  Segment the image into a binary image where the object of interest is represented by 1s and the background by 0s
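A minimal sketch of this thresholding step, assuming OpenCV in Python; the target_gray value is a placeholder, not a value from the paper:

```python
import cv2

def segment_target(frame_bgr, target_gray=200):
    """Convert to grayscale and threshold so the target becomes 1s, background 0s."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Pixels at or above the assumed target value become 1, the rest 0.
    _, binary = cv2.threshold(gray, target_gray, 1, cv2.THRESH_BINARY)
    return binary
```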

16  Square Finding  Find contours (represented by polylines) in the binary image  Use a polygon-approximation algorithm to reduce the number of points in each polyline  Result: simplified square candidates
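A sketch of the square-finding step, again assuming OpenCV (OpenCV 4 findContours signature); the 0.02 approximation tolerance is an illustrative choice, not from the paper:

```python
import cv2

def find_squares(binary):
    """Find contours, simplify them, and keep convex 4-vertex shapes."""
    contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    squares = []
    for c in contours:
        peri = cv2.arcLength(c, True)
        # Polygon approximation reduces the points in the polyline.
        approx = cv2.approxPolyDP(c, 0.02 * peri, True)
        if len(approx) == 4 and cv2.isContourConvex(approx):
            squares.append(approx)
    return squares
```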

17  Template Matching  The user selects a detected window (the target) from the GUI  A patch is selected around the location of the target  A local search window is used to find the best match between the target patch and the detected contours, which decides which window to track
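A hedged sketch of local-window template matching using normalized cross-correlation; the search radius and helper name are illustrative assumptions, not the paper's implementation:

```python
import cv2

def match_template_local(frame_gray, template, prev_xy, search_radius=40):
    """Search for the best template match inside a window around the previous location."""
    x, y = prev_xy
    h, w = template.shape
    x0 = max(x - search_radius, 0)
    y0 = max(y - search_radius, 0)
    roi = frame_gray[y0:y0 + h + 2 * search_radius, x0:x0 + w + 2 * search_radius]
    # Normalized cross-correlation; the peak gives the best match location.
    scores = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    return (x0 + best_loc[0], y0 + best_loc[1]), best_score
```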

18  Kalman Filter  Once a suitable match is found, a Kalman filter is used to track the feature positions  Input: x and y coordinates of the features  Output: estimates of these coordinates in the next frame
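A minimal constant-velocity Kalman filter that tracks the feature's (x, y) coordinates as described on this slide; the noise parameters q and r are placeholders, not the paper's values:

```python
import numpy as np

class FeatureKF:
    """Constant-velocity Kalman filter over the state [x, y, vx, vy]."""
    def __init__(self, x, y, dt=1.0, q=1e-2, r=1.0):
        self.s = np.array([x, y, 0.0, 0.0])          # state estimate
        self.P = np.eye(4)                           # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], float)    # constant-velocity motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)     # only x, y are observed
        self.Q = q * np.eye(4)                       # process noise
        self.R = r * np.eye(2)                       # measurement noise

    def predict(self):
        """Output: estimated (x, y) in the next frame."""
        self.s = self.F @ self.s
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.s[:2]

    def update(self, zx, zy):
        """Input: measured x and y coordinates of the feature."""
        z = np.array([zx, zy])
        innovation = z - self.H @ self.s
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.s = self.s + K @ innovation
        self.P = (np.eye(4) - K @ self.H) @ self.P
```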

19  The user selects the object of interest from the GUI  The location of the object is used to generate the visual references

20  Lateral visual reference

21  Vertical visual reference
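One plausible way to derive the lateral and vertical visual references shown above is to scale the pixel error between the tracked object and the image center into bounded velocity references; the gains and saturation limit below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def visual_references(target_px, image_size, k_lat=0.002, k_vert=0.002, v_max=0.5):
    """Map the pixel error from the image center to lateral/vertical velocity references."""
    u, v = target_px
    width, height = image_size
    err_u = u - width / 2.0        # horizontal pixel error
    err_v = v - height / 2.0       # vertical pixel error
    vy_ref = np.clip(k_lat * err_u, -v_max, v_max)    # lateral reference (east)
    vz_ref = np.clip(k_vert * err_v, -v_max, v_max)   # vertical reference (down)
    return vy_ref, vz_ref
```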

22  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

23  A hierarchical behavior-based control architecture  The output of the Kalman filter is compared with the desired values to produce an error signal for the controller

24  The controller is based on decoupled PID control
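A minimal sketch of decoupled PID control, with one independent loop per controlled axis; the gains are placeholders, not the paper's tuned values:

```python
class PID:
    """Single-axis PID; the decoupled controller uses one independent instance per axis."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, reference, measurement):
        err = reference - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# One loop per axis, e.g. lateral and vertical velocity (placeholder gains).
lateral_pid = PID(kp=0.8, ki=0.05, kd=0.1, dt=0.05)
vertical_pid = PID(kp=0.8, ki=0.05, kd=0.1, dt=0.05)
```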

25  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

26  At the Del Valle Urban Search and Rescue Training site in Santa Clarita, California  AVATAR, four trials  First, the helicopter is commanded to fly autonomously to a given GPS waypoint  As soon as it detects the target window, the controller switches from GPS-based to vision-based control

27 Location of the features in the image

28 Helicopter position in meters: left figure, vertical axis is easting; right figure, vertical axis is northing

29  At ETSII Campus in Madrid, Spain  COLIBRI  Seven experimental trials on two different days

30 Velocity reference (vy_r) compared with the helicopter velocity (vy): lateral displacement (east)

31 Velocity reference (vz_r) compared with the helicopter velocity (vz): altitude displacement (down)

32 Helicopter displacements during the entire flight trial

33  Video: colibrivideoWeb.wmv

34  Introduction  Related work  Testbed  Visual preprocessing  Control Architectures  Experiments  Conclusion

35  Demonstrated an approach to visually control an autonomous helicopter: a vision algorithm commands the UAV when GPS has dropouts  Experimentally demonstrated by performing vision-based window-tracking tasks on two different platforms, at different locations and under different conditions

36  The topic is interesting  The visual algorithm is shown to be effective in the experiments  But the writing is weak  Poor explanation of features, templates, and matching  Incomplete explanation of figures

