Sensors in Robotics Li Guan

1 Sensors in Robotics Li Guan
Fall '06, COMP Robotics, Computer Science Dept., UNC-Chapel Hill
Figures: Roland Siegwart, sensors for mobile robotics and feature extraction; Savannah River Site nuclear surveillance robot

2 Classification of Sensors
What is measured:
Proprioceptive sensors measure values internal to the system (robot), e.g. motor speed, wheel load, heading of the robot, battery status.
Exteroceptive sensors acquire information from the robot's environment, e.g. distances to objects, intensity of the ambient light, unique features.
How it is measured:
Passive sensors measure energy coming from the environment.
Active sensors emit their own energy and measure the reaction; better performance, but some influence on the environment.

3 General Classification

4 General Classification (Cont.)

5 Outline
Recent Vision Sensors
Sensor Fusion Framework
Multiple Sensor Cooperation

6 A Taxonomy Figure from Marc Pollefeys, COMP790-089 3D Photography

7 A Taxonomy (cont.) Figure from Marc Pollefeys, COMP790-089 3D Photography

8 (figure)

9 Projector as camera

10 Multi-Stripe Triangulation
To go faster, project multiple stripes. But which stripe is which?
Answer #1: assume surface continuity, e.g. Eyetronics' ShapeCam

11 Multi-Stripe Triangulation
To go faster, project multiple stripes. But which stripe is which?
Answer #2: colored stripes (or dots)

12 Multi-Stripe Triangulation
To go faster, project multiple stripes. But which stripe is which?
Answer #3: time-coded stripes

13 Time-Coded Light Patterns
Assign each stripe a unique illumination code over time [Posdamer 82]
(figure: the stripe codes laid out over time and space)
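As a sketch of the idea (plain binary coding; the exact pattern details in [Posdamer 82] may differ), each of 2^n stripe positions gets a unique n-bit code, projected as n black-and-white patterns over time; the on/off sequence seen at a camera pixel identifies its stripe:

```python
import numpy as np

def binary_stripe_patterns(n_bits, width):
    """n_bits black/white patterns; the on/off sequence observed over
    time at an image column uniquely identifies its stripe index."""
    cols = np.arange(width)
    stripe_id = cols * (2 ** n_bits) // width             # stripe index per column
    return np.stack([(stripe_id >> b) & 1                 # one pattern per bit,
                     for b in range(n_bits - 1, -1, -1)])  # MSB first

patterns = binary_stripe_patterns(n_bits=4, width=640)    # shape (4, 640)
# A pixel seeing on,off,on,on over the 4 frames lies in stripe 0b1011 = 11 of 16.
```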

14 Pulsed Time of Flight
Basic idea: send out a pulse of light (usually a laser) and time how long it takes to return.

15 Time of Flight Computation
Pulsed laser: measure the elapsed time directly; requires resolving picoseconds.
Beat frequency between a frequency-modulated continuous wave and its received reflection.
Phase-shift measurement to produce a range estimate; technically easier than the above two methods.

16 Pulsed Time of Flight
Advantages:
large working volume (up to 100 m)
Disadvantages:
not-so-great accuracy (at best ~5 mm)
requires getting timing to ~30 picoseconds
does not scale with working volume
Often used for scanning buildings, rooms, archeological sites, etc.
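A quick check of these numbers, using the round-trip relation d = c·t/2 (a generic time-of-flight calculation, not tied to any particular scanner):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_s):
    """Range from a pulsed time-of-flight measurement: d = c * t / 2."""
    return C * round_trip_s / 2.0

print(tof_range(667e-9))           # a 100 m target returns after ~667 ns
print(tof_range(30e-12) * 1e3)     # 30 ps timing error -> ~4.5 mm range error,
                                   # consistent with the ~5 mm accuracy above
```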

17 Depth cameras
3DV's Z-cam: superfast shutter + standard CCD.
Cut the light off while the pulse is coming back; then intensity I ~ Z, but also I ~ albedo, so use an unshuttered reference view to factor the albedo out.

18 Phase Shift Measurement

19 Phase Shift Measurement (Cont.)
Note the ambiguity in the measured phase!
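A minimal sketch of the computation behind that ambiguity, assuming a continuous wave modulated at frequency f: the measured phase φ gives d = c·φ/(4π·f), but φ wraps at 2π, so ranges separated by c/(2f) are indistinguishable:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_shift_range(phase_rad, mod_freq_hz):
    """Range from the measured phase of a modulated continuous wave;
    ambiguous modulo the interval c / (2 * f)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

f = 20e6                                   # assume 20 MHz modulation
print(phase_shift_range(math.pi, f))       # ~3.75 m
print(C / (2 * f))                         # ambiguity interval: ~7.5 m
```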

20 Canesta 3D camera 2D array of time-of-flight sensors
Jitter is too big on a single measurement, but averages out over many: uncorrelated noise shrinks as 1/√N, so 10,000 measurements give a ~100x improvement.
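A quick numerical check of that 1/√N claim (the noise figure below is made up for illustration, not Canesta's spec):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.05                                    # per-reading jitter, illustrative
readings = sigma * rng.normal(size=(1_000, 10_000))
averaged = readings.mean(axis=1)                # average 10,000 readings each
print(sigma / averaged.std())                   # ~100x jitter reduction
```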

21 CSEM 3D Sensor

22 Other Vision Sensors Omni-directional Camera

23 Other Vision Sensors (cont.)
Depth from Focus/Defocus

24 Outline
Recent Vision Sensors
Sensor Fusion Framework
Multiple Sensor Cooperation

25 Sensor Errors
Systematic errors: deterministic errors, caused by factors that can (in theory) be modeled and hence predicted, e.g. miscalibration of a laser sensor or distortion caused by the optics of a camera.
Random errors: non-deterministic errors; no prediction is possible, but they can be described probabilistically, e.g. hue instability of a camera, black-level noise of a camera. (See the toy example below.)
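A toy illustration of the distinction, with a made-up range-sensor error model (the scale, offset, and noise values are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def raw_reading(true_range):
    bias = 0.02 * true_range + 0.1     # systematic: a modelable scale + offset
    noise = rng.normal(0.0, 0.01)      # random: only describable statistically
    return true_range + bias + noise

def calibrated_reading(true_range):
    # Calibration inverts the known systematic model; random noise remains.
    return (raw_reading(true_range) - 0.1) / 1.02

r = [calibrated_reading(5.0) for _ in range(1000)]
print(np.mean(r), np.std(r))           # ~5.0 m mean, ~0.01 m residual noise
```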

26 Probabilistic Sensor Fusion

27 Sensor Fusion Example: Probabilistic Visual Hull
Multiple camera sensors, inward looking; reconstruct the environment.
Figures from Jean-Sébastien Franco et al., ICCV '05.

28 Fusion of Multi-View Silhouette Cues Using a Space Occupancy Grid (ICCV `05)
Unreliable silhouettes: do not make a hard decision about object location.
Do sensor fusion: use all image information simultaneously.

29 Bayesian formulation
Idea: we wish to find the content of the scene from images, as a probability grid.
Modeling the forward problem (explaining image observations given the grid state) is easy; it can be accounted for in a sensor model.
Bayesian inference enables the formulation of our initial inverse problem from the sensor model.
Simplification for tractability: independent analysis and processing of voxels.

30 Modeling
Sensor model variables:
I: color information in the images
B: background color models
F: silhouette detection variable (0 or 1), hidden
G_X: occupancy at voxel X (0 or 1)
Inference target: the occupancy grid G_X (schematic formula below)
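Schematically (this is the generic Bayes computation for this setup, not necessarily the paper's exact notation), the per-voxel inference inverts the sensor model, marginalizing over the hidden silhouette detection F:

$$
P(G_X = 1 \mid I, B) = \frac{\sum_{F} P(I \mid F, B)\, P(F \mid G_X = 1)\, P(G_X = 1)}{\sum_{g \in \{0,1\}} \sum_{F} P(I \mid F, B)\, P(F \mid G_X = g)\, P(G_X = g)}
$$

The independence simplification from the previous slide is what lets this be evaluated voxel by voxel.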

31 Visualization

32 Further, we can infer occlusion
Foreground object inference is robust to partial occlusion by static occluders.
This enables detection of discrepancies between the inferred foreground volume and where its silhouette is actually observed.
Example: Old Well dataset with 9 cameras, frame #118, voxels with occupancy > 90%.


34 Occlusion Inference Example
9 views, 30 fps, 720×480, calibrated, about 1.2 min.

35 Current Result
Binary occluder (a demo video)

36 Other References
M. A. Abidi and R. C. Gonzalez, Data Fusion in Robotics and Machine Intelligence, Academic Press, 1992.
P. K. Allen, Robotic Object Recognition Using Vision and Touch, Kluwer Academic Publishers, 1987.
A. I. Hernandez, G. Carrault, F. Mora, L. Thoraval, G. Passariello, and J. M. Schleich, "Multisensor fusion for atrial and ventricular activity detection in coronary care monitoring," IEEE Transactions on Biomedical Engineering, vol. 46, no. 10, pp. 1186-1190, 1999.
A. Hernandez, O. Basset, I. Magnin, A. Bremond, and G. Gimenez, "Fusion of ultrasonic and radiographic images of the breast," in Proc. IEEE Ultrasonics Symposium, pp. 1437-1440, San Antonio, TX, USA, 1996.

37 Outline
Recent Vision Sensors
Sensor Fusion Framework
Multiple Sensor Cooperation

38 Sensor Communication
Different types of sensors/drivers:
image sensors: camera, MRI, radar…
sound sensors: microphones, hydrophones, seismic sensors
temperature sensors: thermometers
motion sensors: radar gun, speedometer, tachometer, odometer, turn coordinator
Sensor data transmission: size, format, frequency
SensorTalk (Honda Research Institute), '05

39 Objective of SensorTalk
A variety of sensors: different requirements (output frequency), different input/output
High reusability of driver and application code (cross-platform)
Multi-user access to the sensor
Build sensors from simpler sensors
Work together with RoboTalk:
think of a sensor as a robot, e.g. a pan-tilt-zoom camera
think of a robot as a sensor, e.g. the NASA Mars Exploration Rover, ASIMO…

40 Objective
A communication tool:
coordinate different types of sensors
facilitate different types of applications
A protocol:
a set of rules to write the drivers and applications
a set of methods to support multiple clients (e.g. write-locking)
a set of modes to transmit output data

41 Basic Idea
A model of a sensor

42 Model of a Sensor A service with parameters
Static parameters (input signal, output signal)
Tunable parameters
A client can query all parameters
A client can change tunable parameters that are not locked (see the sketch below)
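A minimal sketch of this model (class and method names here are illustrative, not SensorTalk's actual API):

```python
class SensorModel:
    """A sensor as a service: static parameters are read-only; tunable
    parameters may be changed unless another client holds a lock."""

    def __init__(self, static_params, tunable_params):
        self.static = dict(static_params)     # e.g. input/output signal types
        self.tunable = dict(tunable_params)   # e.g. gain, publishing frequency
        self.locks = {}                       # parameter name -> client id

    def query_parameters(self):
        return {**self.static, **self.tunable}   # clients can query everything

    def lock(self, client_id, name):
        self.locks.setdefault(name, client_id)   # first locker wins

    def set_parameter(self, client_id, name, value):
        if name in self.static:
            raise PermissionError(f"{name} is a static parameter")
        if self.locks.get(name, client_id) != client_id:
            raise PermissionError(f"{name} is locked by another client")
        self.tunable[name] = value

heat = SensorModel({"output_format": "double"},
                   {"gain": 1.0, "frequency_hz": 10.0})
heat.lock(client_id=1, name="gain")
heat.set_parameter(client_id=1, name="gain", value=2.0)   # ok: lock holder
```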

43 Example #1: Heat Sensor
Parameters:
output format (integer, double)
output value unit (Kelvin, °C)
gain
publishing frequency (1 Hz ~ 29.99 Hz)
resolution of output value

44 Example #2: Camera
Parameters:
output format (RGB, JPG)
image resolution (1024×768 pixels)
projection matrix (3×4 double matrix; see the projection example below)
focal length
radial distortion correction map (1024×768×2 double array)
publishing frequency (1 Hz ~ 100 Hz)
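For instance, a client that has queried the 3×4 projection matrix can map 3D points to pixels with the standard pinhole computation (generic geometry, not SensorTalk-specific code; the matrix below is a toy example):

```python
import numpy as np

def project(P, X):
    """Project 3D point X to pixel (u, v) with a 3x4 projection matrix P."""
    x = P @ np.append(X, 1.0)    # homogeneous image coordinates
    return x[:2] / x[2]          # perspective divide

# Toy P: 1000 px focal length, principal point at the image center (512, 384).
P = np.array([[1000.0, 0.0, 512.0, 0.0],
              [0.0, 1000.0, 384.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
print(project(P, np.array([0.1, 0.2, 2.0])))   # -> [562. 484.]
```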

45 Example #3: Visual Hull Sensor
Parameters:
number of camera views
parameters related to each camera, e.g. the projection matrix of every view
output format
volume resolution
publishing frequency (1 Hz ~ 60 Hz)

46 SensorTalk Design
Serve multiple users:
one base frequency
per-client transmission mode: DIRECT, CONTINUOUS, or BATCH
per-client publishing rate
per-client frame compression
locking of parameters
read output frame / stop reading output frame

47 SensorTalk Scenario
Server up; client up.
Client -> Server: Subscribe. Server creates a client structure and returns a client ID.
Client -> Server: ask for description. Server returns the description.
Client -> Server: control parameter "A". Server calls a function to change "A" and returns the new "A".

48 SensorTalk Scenario (cont.)
Client -> Server: get 1 frame (DIRECT). Server gets 1 frame from the driver and returns it; the client processes the frame.
Client -> Server: get frames (CONTINUOUS). Server repeatedly gets a frame from the driver and returns it, frame after frame.

49 SensorTalk Scenario (cont.)
Client -> Server: stop stream (CONTINUOUS). Server stops getting frames and returns SUCCESS.
Client -> Server: release; the client disconnects and closes its program. Server deletes the client structure with that ID and waits for other connections (the whole exchange is condensed in the sketch below).
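The three scenario slides, condensed into runnable pseudocode (the stand-in server and all method names are illustrative; the real SensorTalk wire protocol is not specified here):

```python
class StubSensorServer:
    """Stand-in for a SensorTalk server, just to make the flow concrete."""
    def __init__(self):
        self.next_id, self.params = 0, {"A": 1}
    def subscribe(self):                       # create a client structure,
        self.next_id += 1                      # return the client ID
        return self.next_id
    def get_description(self, cid):
        return dict(self.params)               # static + tunable parameters
    def set_parameter(self, cid, name, value):
        self.params[name] = value
        return value                           # return the new value
    def get_frame(self, cid, mode="DIRECT"):
        return {"data": 42}                    # one frame from the driver
    def get_frames(self, cid, n, mode="CONTINUOUS"):
        return [{"data": i} for i in range(n)] # server keeps pushing frames
    def stop_stream(self, cid):
        return "SUCCESS"
    def release(self, cid):
        pass                                   # delete the client structure

server = StubSensorServer()
cid = server.subscribe()
print(server.get_description(cid))             # {'A': 1}
server.set_parameter(cid, "A", 5)              # control parameter "A"
one = server.get_frame(cid, mode="DIRECT")     # single-frame request
for frame in server.get_frames(cid, n=3):      # continuous stream
    pass                                       # ...process each frame...
server.stop_stream(cid)
server.release(cid)                            # server awaits other clients
```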

50 Demo
2 virtual cameras, 1 "Visual Hull" sensor
Dataset from
A demo video

51 A Counterpart - RoboTalk
Copyright Lucasfilm Ltd. Figures: mobile robot with pan-tilt camera; Honda Asimo humanoid robot.
Allen Y. Yang, Hector Gonzalez-Banos, Victor Ng-Thow-Hing, James Davis, "RoboTalk: controlling arms, bases and androids through a single motion interface," IEEE Int. Conf. on Advanced Robotics (ICAR), 2005.

52 (figure)

53 Robot? Sensor?
A PTZ (pan/tilt/zoom) camera: movable about its horizontal (pan) and vertical (tilt) axes, and along its focal length (zoom).
The Mars Exploration Rover: a specialized sensing robot…

54 Why not just SensorTalk/RoboTalk
Robot: QoS high, throughput low.
Sensor: QoS low, throughput may be huge!

55 Conclusion
Recent Vision Sensors
Sensor Fusion Framework (more in SLAM)
Multiple Sensor Cooperation (more in multiple-robot coordination)
1st Summer School on Perception and Sensor Fusion in Mobile Robotics, September 11~16, 2006, Fermo, Italy
Thanks! Any questions?

