An Introduction to Mobile Robotics CSE350/450-011 Sensor Systems (Continued) 2 Sep 03.

1 An Introduction to Mobile Robotics CSE350/450-011 Sensor Systems (Continued) 2 Sep 03

2 Objectives for Today
Any questions?
Finish discussion of inertial navigation
Brief review of discrete/numerical integration
Review of TOF sensors
A few thoughts on modeling sensor noise and the Gaussian distribution

3 Assumptions in our Inertial Navigation Model
Coriolis effects negligible
Flat earth
No effects from vehicle vibration
“Perfect” sensor orientation
–With respect to the vehicle
–With respect to the earth’s surface

4 Transforming Accelerations into Position Estimates
In a perfect world, position follows from double integration of the measured acceleration: x(t) = x(0) + v(0)·t + ∫∫ a(τ) dτ.
It’s not a perfect world. We have noise and bias in our acceleration measurements: a_meas(t) = a(t) + b + v(t), where b is a constant bias and v(t) is random noise.
As a result, the double integration accumulates ERROR TERMS: the bias alone contributes (1/2)·b·t² of position error, growing quadratically with time.
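The quadratic growth of bias-induced position error can be seen in a minimal NumPy sketch (the deck uses Matlab later; the idea is identical). All parameter values here are assumed for illustration: a stationary vehicle whose accelerometer reports only a constant bias b.

```python
import numpy as np

# Assumed parameters: a stationary vehicle (true acceleration = 0)
# whose accelerometer carries a constant bias b. Double integration of
# the biased measurement yields a position error of (1/2)*b*t^2.
dt, T = 0.01, 10.0             # sample period [s], duration [s]
b = 0.05                       # accelerometer bias [m/s^2] (assumed)
t = np.arange(0.0, T, dt)
a_meas = np.zeros_like(t) + b  # measured accel = truth (0) + bias

v = np.cumsum(a_meas) * dt     # first integration: velocity estimate
x = np.cumsum(v) * dt          # second integration: position estimate

# Final position error closely matches the closed form 0.5*b*T^2
print(x[-1], 0.5 * b * T**2)
```

Even this small bias (0.05 m/s²) produces roughly 2.5 m of position error after only 10 seconds, which is why slide 8 insists on frequent recalibration.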

5 But What About Orientation?
In a perfect world, heading follows from integrating the gyro rate: θ(t) = θ(0) + ∫ ω(τ) dτ.
It’s not a perfect world. We have noise and bias in our gyroscopic measurements: ω_meas(t) = ω(t) + b_g + v(t).
As a result, the heading estimate drifts: the gyro bias alone contributes b_g·t of orientation error, growing linearly with time.

6 From Local Sensor Measurements to Inertial Frame Position Estimates
[Figure: local x-y frame attached to the SENSOR; inertial East-North (E-N) frame FIXED in the plane]

7 The Impact of Orientation Bias
Ignoring noise, let’s assume that our sensor frame is oriented in an eastward direction and ω = 0. The gyro bias still makes the estimated heading drift linearly with time, so measured accelerations are resolved into the wrong inertial directions; double integration then turns that linearly growing angle error into a position error whose dominant term grows as t³. ERROR SCALES CUBICALLY!!!

8 Inertial Navigation Strategy
Noise & bias cannot be eliminated
Bias in accelerometers/gyros induces errors in position that scale quadratically/cubically with time
Bias impact can be reduced through frequent recalibrations to zero out the current bias
Bottom line:
–Inertial navigation provides reasonable position estimates over short distances/time periods
–Inertial navigation must be combined with other sensor inputs for extended position estimation
QUESTION: How do we perform the integrations with a discrete sensor?
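One common answer to the closing question is the trapezoidal rule: with evenly spaced samples, summing the averages of adjacent samples approximates the integral better than a simple running sum. A minimal sketch (function name and test signal are illustrative, not from the slides):

```python
import numpy as np

def trapezoid_integrate(samples, dt):
    """Cumulative trapezoidal integral of evenly spaced samples."""
    out = np.zeros_like(samples, dtype=float)
    out[1:] = np.cumsum((samples[1:] + samples[:-1]) * 0.5 * dt)
    return out

dt = 0.01
t = np.arange(0.0, 1.0 + dt, dt)
a = 2.0 * np.ones_like(t)         # constant 2 m/s^2 test signal
v = trapezoid_integrate(a, dt)    # velocity: exactly 2*t here
x = trapezoid_integrate(v, dt)    # position: t^2 (trapezoid is exact
                                  # for the linear integrand v = 2t)
print(v[-1], x[-1])               # -> 2.0 and 1.0
```

The trapezoidal rule is exact for linear signals and has O(dt²) error in general; higher-order schemes exist, but in practice sensor noise and bias, not quadrature error, dominate inertial navigation accuracy.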

9 Time-of-Flight Sensors: Ultrasonic (aka SONAR)
Emits high-frequency sound; a receiver captures the echo
Rigidly mounted to provide distance at a fixed relative bearing
Inexpensive and lightweight
Range to ≈ 10 meters
Error ≈ 2%
Potential error sources:
–Specular reflection
–Crosstalk
–Multi-path
[Images: iRobot® B21R, Polaroid™ transducer]
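The underlying range computation is simple: the sensor measures the round-trip echo time, and the one-way range is half the speed of sound times that time. A sketch (not the Polaroid driver; the speed-of-sound constant is an assumed room-temperature value):

```python
# Time-of-flight ranging: the sensor measures the round-trip echo
# time, so range = speed_of_sound * t / 2.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed constant)

def sonar_range(echo_time_s):
    """Convert a round-trip echo time [s] to a one-way range [m]."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

# A 29.2 ms echo corresponds to roughly 5 m of range
print(sonar_range(0.0292))
```

Because the speed of sound varies with temperature and humidity, part of the ≈2% error budget comes from treating it as constant.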

10 Time-of-Flight Sensors: Laser Range Finders (LRF)
“Most accurate” exteroceptive sensor available
Relies on detecting the backscatter from a pulsed IR laser beam
Range ≈ 80 meters
180° scans at 0.25° resolution
Error: 5 mm SD at ranges < 8 meters
Negatives:
–Weight ≈ 10 pounds
–Cost ≈ $5K
–Power consumption ≈ 20W/160W
–Difficulty in detecting transparent or dark matte surfaces
[Image: SICK® LMS-200]

11 Modeling Sensor Noise: Some Initial Ideas
Assume that we can remove sensor bias through calibration
All sensor measurements are still wrong, as they are corrupted by random sensor noise
Goal: Develop algorithms which are robust to sensor noise
Problem: How do we model the noise if its distribution is unknown?

12 Modeling Sensor Noise: Some Initial Ideas
Some possible solutions:
–Collect empirical data and develop a consistent model
–Gaussian assumption: v ~ N(μ, σ²)

13 Some Other Definitions
Population mean: μ = E[X] = ∫ x·p(x) dx
The population mean is also referred to as the expected value for the distribution

14 Some Other Definitions (cont’d)
Population variance: σ² = E[(X − μ)²]
σ is referred to as the standard deviation, and is always positive

15 Why a Gaussian?
Central Limit Theorem
Mathematical convenience:
–Gaussian addition distribution
–Gaussian subtraction distribution
–Gaussian ratio distribution
–“Invariance” to convolution
–“Invariance” to linear transformation
Empirical data / experimental support
“It’s the normal distribution”

16 The “Standard” 2-D Gaussian
p(x, y) = 1/(2π·σ_x·σ_y) · exp( −(x − μ_x)²/(2σ_x²) − (y − μ_y)²/(2σ_y²) )
This formula is only valid when the principal axes of the distribution are aligned with the x-y coordinate frame (more later)
QUESTION: Assuming that our accelerometers are corrupted by Gaussian noise, would we expect the distribution for position to be Gaussian as well?

17 So How Can I Sample a Gaussian Distribution in Matlab?
The randn function:
>> help randn
RANDN Normally distributed random numbers.
RANDN(N) is an N-by-N matrix with random entries, chosen from a normal distribution with mean zero, variance one and standard deviation one.
>> x = randn
x =
-1.6656
>> x = randn(2,3)
x =
0.1253 -1.1465 1.1892
0.2877 1.1909 -0.0376
QUESTION: How do I generate a general 1D Gaussian distribution from this?
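The usual answer is to scale and shift: if z ~ N(0, 1), then μ + σ·z ~ N(μ, σ²). In Matlab that is the one-liner x = mu + sigma*randn; the same idea is sketched below in Python/NumPy (the parameter values and sample count are assumed for illustration):

```python
import numpy as np

# If z ~ N(0, 1), then mu + sigma * z ~ N(mu, sigma^2).
rng = np.random.default_rng(0)    # seeded for reproducibility
mu, sigma = 5.0, 2.0              # example parameters (assumed)
z = rng.standard_normal(100_000)  # samples from N(0, 1)
x = mu + sigma * z                # samples from N(mu, sigma^2)

print(x.mean(), x.std())          # close to 5.0 and 2.0
```

This works because the Gaussian family is closed under linear transformation, one of the "mathematical convenience" properties listed on slide 15.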

18 A (Very) Brief Overview of Computer Vision Systems
Cameras are the natural extension of biological vision to robotics
Advantages of cameras:
–Tremendous amounts of information
–Natural medium for human interface
–Small size
–Passive
–Low power consumption
Disadvantages of cameras:
–Explicit estimates for parameters of interest (e.g. range, bearing, etc.) are computationally expensive to obtain
–Accuracy of estimates strongly tied to calibration
–Calibration can be quite cumbersome
[Images: Pulnix™ & Pt Grey™ cameras]

19 Sample Robotics Application: Obstacle Avoidance

20 Single Camera System
After an appropriate calibration, every pixel can be associated with a unique ray in space with an associated azimuth angle θ and elevation angle φ
An individual camera provides NO EXPLICIT DISTANCE INFORMATION
[Figure: perspective camera model with CCD]

21 (x i R,y i R ) Optical Center f CCD Stereo Vision Geometry (X,Y,Z) Optical Center CCD B = Baseline (x i L,y i L ) x Z f=focal length B/2

22 Stereo Geometry (cont’d)
A point (X, Y, Z) projects to (x_i^L, y_i^L) in the left image and (x_i^R, y_i^R) in the right image
disparity = (x_i^R − x_i^L)
NOTE: This formulation assumes that the two images are already rectified.
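For rectified images, depth follows from similar triangles: Z = f·B / d, where f is the focal length in pixels, B the baseline, and d the disparity magnitude. A sketch with assumed numbers (the slide writes disparity as x_i^R − x_i^L; the sign depends on the image coordinate convention, so the sketch below takes the magnitude to be positive for points in front of the rig):

```python
# Rectified-stereo depth: with focal length f [pixels], baseline B [m],
# and disparity d [pixels] (positive for points in front of the rig),
# similar triangles give depth Z = f * B / d.
def stereo_depth(f_px, baseline_m, disparity_px):
    """Depth Z [m] from disparity, assuming rectified images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px

# Example: f = 400 px, B = 0.12 m, disparity = 8 px -> Z = 6.0 m
print(stereo_depth(400.0, 0.12, 8.0))
```

Because Z is inversely proportional to disparity, a one-pixel disparity error matters far more for distant points than for nearby ones, which is why stereo depth accuracy degrades with range.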

23 Sample Stereo Reconstruction from Point Grey Bumblebee™ Camera
*Images from www.ptgrey.com

24 Next Time…
How do we extract features from images?
–Edge segmentation
–Color segmentation
–Corner extraction

