Sensors in Robotics
Li Guan
Fall ’06 COMP 790-072 Robotics, Computer Science Dept., UNC-Chapel Hill
(Figures: Roland Siegwart, Sensors for Mobile Robotics; feature extraction; Savannah River Site nuclear surveillance robot)

Classification of Sensors
What is measured:
Proprioceptive sensors measure values internal to the system (robot), e.g. motor speed, wheel load, heading of the robot, battery status.
Exteroceptive sensors acquire information from the robot's environment, e.g. distances to objects, intensity of the ambient light, unique features.
How it is measured:
Passive sensors measure energy coming from the environment.
Active sensors emit their own energy and measure the reaction; better performance, but some influence on the environment.

General Classification

General Classification (Cont.)

Outline
Recent Vision Sensors
Sensor Fusion Framework
Multiple Sensor Cooperation

A Taxonomy (figure from Marc Pollefeys, COMP790-089 3D Photography)

A Taxonomy (cont.) (figure from Marc Pollefeys, COMP790-089 3D Photography)

Projector as camera

Multi-Stripe Triangulation
To go faster, project multiple stripes. But which stripe is which?
Answer #1: assume surface continuity, e.g. Eyetronics’ ShapeCam.

Multi-Stripe Triangulation
To go faster, project multiple stripes. But which stripe is which?
Answer #2: colored stripes (or dots).

Multi-Stripe Triangulation
To go faster, project multiple stripes. But which stripe is which?
Answer #3: time-coded stripes.

Time-Coded Light Patterns
Assign each stripe a unique illumination code over time [Posdamer 82]. (Figure axes: time vs. space.)
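The slides do not give code for this, so here is a minimal sketch of the idea, assuming binary-reflected Gray-code stripe patterns (a common choice; the slide's exact coding is not specified) and idealized thresholded camera observations:

```python
import numpy as np

def gray_code_patterns(num_bits, width):
    """Generate num_bits stripe patterns; each projector column gets a
    unique Gray code over time (one bit per projected frame)."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                    # binary-reflected Gray code
    # patterns[t, x] = bit t of column x's code (1 = bright stripe), MSB first
    return np.array([(gray >> b) & 1 for b in reversed(range(num_bits))])

def decode_column(observed_bits):
    """Recover the projector column a camera pixel saw from its thresholded
    intensities over time (Gray code -> column index)."""
    gray = 0
    for bit in observed_bits:                    # MSB first
        gray = (gray << 1) | int(bit)
    binary, g = gray, gray >> 1
    while g:                                     # Gray-to-binary conversion
        binary ^= g
        g >>= 1
    return binary

patterns = gray_code_patterns(num_bits=10, width=1024)  # 10 frames label 1024 stripes
print(decode_column(patterns[:, 300]))                  # -> 300
```

With n projected frames, 2^n stripes can be told apart, which is why time coding scales better than relying on surface continuity or color alone.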

Pulsed Time of Flight
Basic idea: send out a pulse of light (usually laser) and time how long it takes to return.

Time of Flight Computation
Pulsed laser: measure the elapsed time directly; requires resolving picoseconds.
Beat frequency between a frequency-modulated continuous wave and its received reflection.
Phase-shift measurement to produce a range estimate; technically easier than the above two methods.
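As a quick worked illustration (not from the slides), the pulsed variant computes range from the round-trip time; the numbers also show why picosecond timing resolution is needed:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_pulse(round_trip_time_s):
    """Pulsed time of flight: the pulse travels out and back,
    so range = c * t / 2."""
    return C * round_trip_time_s / 2.0

# A 5 mm range error corresponds to a timing error of 2 * 0.005 / c ~ 33 ps,
# which matches the ~30 ps timing requirement quoted on the next slide.
print(range_from_pulse(66.7e-9))   # ~66.7 ns round trip -> ~10 m target
```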

Pulsed Time of Flight
Advantages: large working volume (up to 100 m).
Disadvantages: not-so-great accuracy (at best ~5 mm); requires getting timing to ~30 picoseconds; accuracy does not scale with working volume.
Often used for scanning buildings, rooms, archeological sites, etc.

Depth Cameras
3DV's Z-cam: superfast shutter + standard CCD.
Cut the light off while the pulse is coming back, so intensity I ~ Z; but I ~ albedo as well, so use an unshuttered reference view to normalize.

Phase Shift Measurement

Phase Shift Measurement (Cont.)
Note the ambiguity in the measured phase!
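A minimal sketch of this (not from the slides; it assumes sinusoidal amplitude modulation at frequency f_mod and a phase measurement wrapped to [0, 2π)), showing both the range formula and the ambiguity interval noted above:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(phase_rad, f_mod_hz):
    """Continuous-wave ToF: range = (c / (2 * f_mod)) * (phase / (2 * pi)).
    The result is only defined up to the ambiguity interval c / (2 * f_mod)."""
    unambiguous_range = C / (2.0 * f_mod_hz)
    return (phase_rad / (2.0 * math.pi)) * unambiguous_range, unambiguous_range

# 20 MHz modulation gives ~7.5 m of unambiguous range; a target at 9 m
# would alias to ~1.5 m, which is the ambiguity the slide points out.
r, max_r = range_from_phase(math.pi / 2, 20e6)
print(r, max_r)   # ~1.87 m, ~7.49 m
```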

Canesta 3D Camera
2D array of time-of-flight sensors. Jitter is too big on a single measurement, but averages out over many: 10,000 measurements give ~100× improvement, since noise falls off as 1/√N.

CSEM 3D Sensor

Other Vision Sensors
Omni-directional Camera

Other Vision Sensors (cont.)
Depth from Focus/Defocus

Outline
Recent Vision Sensors
Sensor Fusion Framework
Multiple Sensor Cooperation

Sensor Errors
Systematic errors: deterministic errors caused by factors that can (in theory) be modeled and predicted, e.g. calibration of a laser sensor or of the distortion caused by the optics of a camera.
Random errors: non-deterministic errors; no prediction is possible, but they can be described probabilistically, e.g. hue instability or black-level noise of a camera.
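As a small illustration of the distinction (not from the slides; the numbers are made up), the systematic part is removed with a calibrated error model, while the random part can only be characterized statistically:

```python
import random

def raw_reading(true_range_m):
    """Hypothetical range sensor: a fixed scale and offset error (systematic)
    plus zero-mean Gaussian noise (random)."""
    return 1.02 * true_range_m + 0.05 + random.gauss(0.0, 0.01)

def calibrated_reading(true_range_m):
    """The systematic part can be modeled and removed by calibration;
    the random part remains and is only described probabilistically."""
    z = raw_reading(true_range_m)
    return (z - 0.05) / 1.02           # inverse of the calibrated error model

samples = [calibrated_reading(2.0) for _ in range(1000)]
print(sum(samples) / len(samples))     # ~2.0 m; residual spread ~1 cm
```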

Probabilistic Sensor Fusion

Sensor Fusion Example: Probabilistic Visual Hull
Multiple inward-looking camera sensors reconstruct the environment.
Jean-Sebastien Franco et al., ICCV 2005; figures from http://graphics.csail.mit.edu/~wojciech/vh/reduction.html

Fusion of Multi-View Silhouette Cues Using a Space Occupancy Grid (ICCV 2005)
Unreliable silhouettes: do not make a hard decision about their location.
Do sensor fusion: use all image information simultaneously.

Bayesian Formulation
Idea: we wish to infer the content of the scene from images, as a probability grid.
Modeling the forward problem (explaining image observations given the grid state) is easy; it can be accounted for in a sensor model.
Bayesian inference lets us formulate the initial inverse problem from that sensor model.
Simplification for tractability: voxels are analyzed and processed independently.

Modeling
Notation: I = color information in images; B = background color models; F = silhouette detection variable (0 or 1), hidden; G_X = occupancy at voxel X (0 or 1).
Sensor model: probability of the image observations given the voxel occupancies (formula shown as a figure).
Inference: posterior occupancy grid over G_X (formula shown as a figure).
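The exact formulas were shown as figures; below is a simplified per-voxel sketch of the idea (my own illustration, not the authors' exact formulation, with assumed detection and false-alarm rates), fusing soft silhouette cues from all views without making a hard decision in any single view:

```python
import numpy as np

def voxel_occupancy_posterior(silhouette_probs, prior=0.5,
                              p_detect=0.9, p_false_alarm=0.1):
    """Simplified per-voxel Bayesian fusion of multi-view silhouette cues.

    silhouette_probs[i] = probability that the pixel voxel X projects to in
    view i is classified as foreground (a soft silhouette cue).
    Assumed sensor model: P(cue | occupied) = p_detect,
                          P(cue | empty)    = p_false_alarm.
    Views are treated as independent given the voxel state.
    """
    log_odds = np.log(prior / (1.0 - prior))
    for s in silhouette_probs:
        p_cue_occ   = s * p_detect      + (1 - s) * (1 - p_detect)
        p_cue_empty = s * p_false_alarm + (1 - s) * (1 - p_false_alarm)
        log_odds += np.log(p_cue_occ / p_cue_empty)
    return 1.0 / (1.0 + np.exp(-log_odds))

# Voxel seen as foreground in 8 of 9 views: occupancy stays high even though
# one silhouette is unreliable -- no hard per-view decision is made.
print(voxel_occupancy_posterior([0.9] * 8 + [0.2]))
```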

Visualization

Further, We Can Infer Occlusion
Foreground object inference is robust to partial occlusion by static occluders.
This enables detection of discrepancies between the inferred foreground volume and where its silhouette is actually observed.
Example: Old Well dataset with 9 cameras, frame #118, voxels > 90%.

Occlusion Inference Example
9 views, 30 fps, 720×480, calibrated, about 1.2 min.

Current Result
Binary occluder (demo video).

Other References
M. A. Abidi and R. C. Gonzalez, Data Fusion in Robotics and Machine Intelligence, Academic Press, 1992.
P. K. Allen, Robotic Object Recognition Using Vision and Touch, Kluwer Academic Publishers, 1987.
A. I. Hernandez, G. Carrault, F. Mora, L. Thoraval, G. Passariello, and J. M. Schleich, “Multisensor fusion for atrial and ventricular activity detection in coronary care monitoring,” IEEE Transactions on Biomedical Engineering, vol. 46, no. 10, pp. 1186–1190, 1999.
A. Hernandez, O. Basset, I. Magnin, A. Bremond, and G. Gimenez, “Fusion of ultrasonic and radiographic images of the breast,” in Proc. IEEE Ultrasonics Symposium, pp. 1437–1440, San Antonio, TX, USA, 1996.

Outline
Recent Vision Sensors
Sensor Fusion Framework
Multiple Sensor Cooperation

Sensor Communication
Different types of sensors/drivers: image sensors (camera, MRI, radar…); sound sensors (microphones, hydrophones, seismic sensors); temperature sensors (thermometers); motion sensors (radar gun, speedometer, tachometer, odometer, turn coordinator…).
Sensor data transmission: size, format, frequency.
SensorTalk (Honda Research Institute), 2005.

Objectives of SensorTalk
Handle a variety of sensors: different requirements (output frequency), different input/output.
High re-usability of driver and application code (cross-platform).
Multi-user access to the sensor.
Build sensors from simpler sensors.
Work together with RoboTalk: think of a sensor as a robot (pan-tilt-zoom camera); think of a robot as a sensor (NASA Mars Exploration Rover, ASIMO…).

Objective
A communication tool: coordinate different types of sensors; facilitate different types of applications.
A protocol: a set of rules to write the drivers and applications; a set of methods to support multiple clients (e.g. write-locking); a set of modes to transmit output data.

Basic Idea
A model of a sensor.

Model of a Sensor
A service with parameters: static parameters (input signal, output signal) and tunable parameters.
A client can query all parameters.
A client can change tunable parameters that are not locked.

Example #1: Heat Sensor
Parameters: output format (integer, double); output value unit (Kelvin, °C); gain; publishing frequency (1 Hz ~ 29.99 Hz); resolution of output value; …

Example #2: Camera
Parameters: output format (RGB, JPG); image resolution (1024×768 pixels); projection matrix (3×4 double matrix); focal length; radial distortion correction map (1024×768×2 double array); publishing frequency (1 Hz ~ 100 Hz); …

Example #3: Visual Hull Sensor
Parameters: number of camera views; parameters related to each camera (projection matrix of every view); output format; volume resolution; publishing frequency (1 Hz ~ 60 Hz); …
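A minimal sketch of this sensor model (my own illustration; the names and fields are assumptions, not the SensorTalk API), using the camera example above: static vs. tunable parameters, querying, and lock-guarded changes:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorService:
    """A sensor as a service with parameters (sketch; not the real SensorTalk API)."""
    static_params: dict                 # fixed by the driver (e.g. input/output signal)
    tunable_params: dict                # may be changed by clients
    lock_owner: Optional[int] = None    # client ID holding the write lock, if any

    def query(self):
        """Any client can query all parameters."""
        return {**self.static_params, **self.tunable_params}

    def set_param(self, client_id, name, value):
        """Only tunable parameters not locked by another client can be changed."""
        if name not in self.tunable_params:
            raise KeyError(f"{name} is not tunable")
        if self.lock_owner not in (None, client_id):
            raise PermissionError("parameters are write-locked by another client")
        self.tunable_params[name] = value

camera = SensorService(
    static_params={"output_format": "RGB", "resolution": (1024, 768)},
    tunable_params={"publishing_frequency_hz": 30.0, "gain": 1.0},
)
camera.set_param(client_id=1, name="publishing_frequency_hz", value=60.0)
print(camera.query())
```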

SensorTalk Design
Serve multiple users: one base frequency; per-client requested transmission mode (DIRECT, CONTINUOUS, BATCH); per-client requested publishing rate; per-client requested frame compression.
Locking of parameters; read output frame / stop reading output frame.

SensorTalk Scenario (server and client both up)
Client: subscribe → Server: create a client structure, return client ID.
Client: ask for description → Server: return description.
Client: control parameter “A” → Server: call function to change “A”, return new “A”.

SensorTalk Scenario (cont.)
Client: get 1 frame (DIRECT) → Server: get 1 frame from driver, return the frame → Client: process the frame.
Client: get frames (CONTINUOUS) → Server: repeatedly get 1 frame from the driver and return it.

SensorTalk Scenario (cont.)
Client: stop stream (CONTINUOUS) → Server: stop getting frames, return SUCCESS.
Client: release, disconnect, close program → Server: delete the client structure with that ID, wait for other connections.
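A minimal client-side sketch of that scenario (method and message names are hypothetical; the real SensorTalk API is not shown in the slides), with an in-process stand-in for the connection so it runs without a server:

```python
class FakeConnection:
    """In-process stand-in for a SensorTalk server connection (illustration only)."""
    def request(self, *args):
        print("->", args)
        return args
    def frames(self, client_id):
        yield from ({"frame": i} for i in range(3))

class SensorTalkClient:
    """Hypothetical client mirroring the scenario slides (not the real API)."""
    def __init__(self, connection):
        self.conn = connection
        self.client_id = None

    def subscribe(self):
        # Server creates a client structure and returns a client ID.
        self.client_id = self.conn.request("SUBSCRIBE")

    def set_parameter(self, name, value):
        # Server changes the tunable parameter and returns its new value.
        return self.conn.request("CONTROL", self.client_id, name, value)

    def get_frame(self):
        # DIRECT mode: one request, one frame.
        return self.conn.request("GET_FRAME", self.client_id)

    def stream(self, on_frame):
        # CONTINUOUS mode: server keeps pushing frames until stopped.
        self.conn.request("START_STREAM", self.client_id)
        try:
            for frame in self.conn.frames(self.client_id):
                on_frame(frame)
        finally:
            self.conn.request("STOP_STREAM", self.client_id)

    def release(self):
        # Server deletes the client structure and waits for other connections.
        self.conn.request("RELEASE", self.client_id)

client = SensorTalkClient(FakeConnection())
client.subscribe()
client.set_parameter("publishing_frequency_hz", 30.0)
client.stream(on_frame=print)
client.release()
```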

Demo
2 virtual cameras, 1 “Visual Hull” sensor. Dataset from http://www.mpi-sb.mpg.de/departments/irg3/kungfu/ (demo video).

A Counterpart: RoboTalk
Figures: mobile robot with pan-tilt camera (copyright Lucasfilm Ltd.); Honda ASIMO humanoid robot.
Allen Y. Yang, Hector Gonzalez-Banos, Victor Ng-Thow-Hing, James Davis, “RoboTalk: controlling arms, bases and androids through a single motion interface,” IEEE Int. Conf. on Advanced Robotics (ICAR), 2005.

Robot? Sensor?
A PTZ (pan/tilt/zoom) camera: movable on its horizontal (pan), vertical (tilt), and focal-length (zoom) axes.
The Mars Exploration Rover: a specialized sensing robot…

Why Not Just SensorTalk/RoboTalk?
Robot: QoS high, throughput low.
Sensor: QoS low, throughput may be huge!

Conclusion
Recent vision sensors.
Sensor fusion framework (more in SLAM).
Multiple sensor cooperation (more in multiple-robot coordination).
1st Summer School on Perception and Sensor Fusion in Mobile Robotics, September 11~16, 2006, Fermo, Italy: http://psfmr.univpm.it/2005/material.htm
Thanks. Any questions?