1
Satellites in Our Pockets: An Object Positioning System using Smartphones
Justin Manweiler, Puneet Jain, Romit Roy Choudhury
Presented by TsungYun, 2012-08-27
2
Outline
– Introduction
– Primitives for Object Localization
– System Design
– Evaluation
– Future Work
– Conclusion
3
Introduction
Augmented Reality (AR)
– Location-based query: "Restaurants around me?"
– Distant-object-based query: "How expensive are rooms in that nice hotel far away?" "Is that cell tower I can see from my house too close for radiation effects?"
4
Introduction
Wikitude – http://www.youtube.com/user/Wikitude
Out-of-band tagging
– Objects in the environment should be annotated out-of-band
– e.g., someone visited Google Earth and entered a tag
5
Introduction
Problem
– Can a distant object be localized by looking at it through a smartphone?
– The problem would have been far more difficult five years ago
Smartphones now carry the needed sensors
– Camera, GPS, accelerometer, compass, and gyroscope
6
Introduction
OPS (Object Positioning System)
– Computer vision
– Smartphone sensors
– Mismatch optimization
Contributions
– Localization for distant objects within view
– System design and implementation on the Android Nexus S platform
7
Introduction OPS overview
8
Primitives for Object Localization
(A) Compass triangulation
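As a concrete illustration of compass triangulation (a minimal sketch, not code from the paper), the snippet below intersects two bearing rays shot from two known positions in a local east-north plane; all positions, bearings, and helper names are made up for illustration.

```python
import numpy as np

def bearing_to_direction(bearing_deg):
    """Convert a compass bearing (degrees clockwise from north) to a 2D
    unit vector in a local east-north plane."""
    theta = np.radians(bearing_deg)
    return np.array([np.sin(theta), np.cos(theta)])  # (east, north)

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays from positions p1 and p2 (meters, east-north).
    Returns the intersection point, or None if the rays are nearly parallel."""
    d1 = bearing_to_direction(bearing1_deg)
    d2 = bearing_to_direction(bearing2_deg)
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1, t2.
    A = np.column_stack([d1, -d2])
    if abs(np.linalg.det(A)) < 1e-9:
        return None
    t1, _ = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
    return np.asarray(p1) + t1 * d1

# Two viewpoints ~2 m apart, both sighting an object roughly to the north-east.
print(triangulate([0.0, 0.0], 45.0, [2.0, 0.0], 44.0))
```

Note how a one-degree difference in bearings over a two-meter baseline already places the object tens of meters away, which foreshadows the sensitivity discussed on the next slide.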
9
Primitives for Object Localization
We cannot ask the user to walk too far
– The distance between camera views is much smaller than the distance from the camera to the object
– Compass precision therefore becomes crucial
– Smartphone sensors are not designed to support such a level of precision
GPS can also be impacted
– Weather, clock error, …
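To make the precision requirement concrete (a back-of-the-envelope illustration, not a number from the paper), a compass error of Δθ displaces a bearing line laterally by roughly the object distance times tan Δθ:

```latex
\text{lateral error} \approx d \cdot \tan(\Delta\theta),
\qquad d = 100\,\text{m},\ \Delta\theta = 5^{\circ}
\;\Rightarrow\; 100 \cdot \tan 5^{\circ} \approx 8.7\,\text{m}.
```

With a baseline of only a meter or two, the two bearing rays are nearly parallel, so this lateral error is amplified even further when the rays are intersected.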
10
Primitives for Object Localization
(B) Visual trilateration
– Trilateration is used in GPS, but not for distant-object positioning
11
Primitives for Object Localization The possible position lies on a curve – Visual angle: Computer vision + accelerometer
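For intuition only (a generic trilateration formulation, not necessarily the paper's exact construction): if the view from position $p_i$ yields an estimated distance $d_i$ to the object, the object is constrained to a circle around $p_i$, and combining several views becomes a least-squares problem:

```latex
\|x - p_i\| = d_i \quad \text{(one curve per view)}, \qquad
\hat{x} = \arg\min_x \sum_i \bigl(\|x - p_i\| - d_i\bigr)^2 .
```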
12
Primitives for Object Localization
(C) Visual triangulation
– Parallax: multiple views of an object from different angles produce visual distortions
– The properties of parallax, and of visual perception in general, are well understood
– We can find the interior angle
– The possible positions still form a curve
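One way to see why a single interior angle still leaves a curve of possible positions (a standard geometric fact, not a formula quoted from the paper): if the angle is measured at the object looking back at the two camera positions, then all points from which a baseline of length b is seen under the same angle θ lie on a circular arc through the baseline's endpoints, with radius

```latex
R = \frac{b}{2\sin\theta}.
```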
13
Primitives for Object Localization
14
Combining Triangulation and Trilateration
15
Primitives for Object Localization
We do not obtain a single point of intersection across all curves
– Due to errors from GPS, the compass, and inaccurate parameter estimation from the visual dimensions
– Increasing the number of camera views will help, but it also increases the number of curves (each with some error)
– We rely on optimization techniques to find a single point of convergence (a sketch follows below)
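A minimal sketch of how such a convergence point could be found (illustrative only; the paper's actual objective and solver appear in the System Design slides). It treats each bearing line and each distance curve as a residual and lets a nonlinear least-squares solver pick the point that best satisfies all of them; all numbers are made up.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical measurements: camera positions (east, north) in meters,
# compass bearings to the object (degrees clockwise from north), and
# rough distance estimates to the object (meters).
cams = np.array([[0.0, 0.0], [1.0, 0.2], [2.1, -0.1]])
bearings = np.radians([44.0, 46.0, 43.0])
dists = np.array([95.0, 102.0, 98.0])

def residuals(obj):
    res = []
    for p, theta, d in zip(cams, bearings, dists):
        v = obj - p
        # Bearing residual: measured vs. implied bearing, wrapped to [-pi, pi].
        implied = np.arctan2(v[0], v[1])          # east over north
        res.append(np.arctan2(np.sin(implied - theta), np.cos(implied - theta)))
        # Distance residual: deviation from the estimated range curve (normalized).
        res.append((np.linalg.norm(v) - d) / d)
    return res

fit = least_squares(residuals, x0=np.array([50.0, 50.0]))
print("estimated object position (m):", fit.x)
```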
16
System Design
17
Structure from Motion (SFM)
– State-of-the-art computer vision technique
– Input: multiple photos from the user
– Feature detection, bundle adjustment, Levenberg-Marquardt algorithm
– Output: (a) a 3D point cloud of the geometry, (b) the relative positions and orientations of the cameras
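OPS uses SFM as an existing technique; purely as a rough illustration of its first stage (feature detection and matching, not the full bundle adjustment), a sketch with OpenCV might look like the following. The detector choice (ORB) and the file names are assumptions, not details from the paper.

```python
import cv2

# Load two of the user's photographs (placeholder file names).
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
assert img1 is not None and img2 is not None, "replace with real photo paths"

# Detect keypoints and compute descriptors in each view.
orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors across the two views; these correspondences feed the
# later SFM stages (relative pose estimation and bundle adjustment).
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} tentative correspondences between the two views")
```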
18
System Design
Structure from Motion (SFM)
19
System Design
The other issues
– Capturing user intent
OPS must be able to automatically infer which object in view the user is most likely interested in
Assume the object of interest is roughly at the center of the camera's viewfinder
– Privacy
The user can upload only the keypoints and feature descriptors
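A small sketch of both ideas, center-weighted feature selection and uploading only features rather than pixels; the weighting radius, variable names, and serialization format are assumptions, not details from the paper.

```python
import numpy as np

def select_central_features(keypoints_xy, descriptors, image_size, radius_frac=0.25):
    """Keep only features whose keypoints fall near the viewfinder center,
    as a simple proxy for the user's object of interest.
    keypoints_xy: (N, 2) pixel coordinates; descriptors: (N, D) array."""
    w, h = image_size
    center = np.array([w / 2.0, h / 2.0])
    radius = radius_frac * min(w, h)
    keep = np.linalg.norm(keypoints_xy - center, axis=1) <= radius
    return keypoints_xy[keep], descriptors[keep]

# Hypothetical data: 500 random keypoints with 32-byte descriptors.
rng = np.random.default_rng(0)
kps = rng.uniform([0, 0], [1920, 1080], size=(500, 2))
desc = rng.integers(0, 256, size=(500, 32), dtype=np.uint8)

central_kps, central_desc = select_central_features(kps, desc, (1920, 1080))

# Privacy: only keypoint coordinates and descriptors leave the phone,
# never the photograph's pixels (file name is a placeholder).
np.savez_compressed("upload.npz", keypoints=central_kps, descriptors=central_desc)
print(f"uploading {len(central_kps)} of {len(kps)} features")
```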
20
System Design
We utilize SFM as a "black box" utility
However, the GPS/compass readings themselves will be noisy
Optimization steps
– Minimize the compass error
– Minimize the GPS noise
– OPS optimization on the object location
21
System Design
Triangulation via minimization of the compass error
– This scales to support an arbitrary number of GPS and compass bearing pairs
– We want the points from all C(n, 2) pairs to converge to a single point
– A minimization problem: add an error term to each compass value (sketched below)
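A minimal sketch of this style of formulation, assuming the unknowns are one compass error term per photograph plus the object position, and the objective keeps the corrections small while forcing every corrected bearing to pass through the object; the weighting and parameterization are guesses, not the paper's exact objective.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical GPS positions (meters, east-north) and raw compass bearings
# (radians clockwise from north) for n photographs of the same object.
gps = np.array([[0.0, 0.0], [1.2, 0.3], [2.4, -0.2], [3.1, 0.4]])
raw_bearings = np.radians([44.0, 47.5, 41.0, 45.5])
n = len(gps)

def residuals(params):
    obj = params[:2]            # object position (east, north)
    eps = params[2:]            # one compass error term per photograph
    res = list(eps)             # penalize large corrections
    for p, theta, e in zip(gps, raw_bearings, eps):
        v = obj - p
        implied = np.arctan2(v[0], v[1])       # bearing implied by geometry
        diff = implied - (theta + e)           # corrected bearing must match it
        res.append(10.0 * np.arctan2(np.sin(diff), np.cos(diff)))
    return res

x0 = np.concatenate([[60.0, 60.0], np.zeros(n)])
fit = least_squares(residuals, x0)
print("object position (m):", fit.x[:2])
print("compass corrections (deg):", np.degrees(fit.x[2:]))
```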
22
System Design
Triangulation via minimization of the compass error
23
System Design
Minimization of GPS noise, relative to vision
– Adjust the GPS readings of the positions from which the user took the photographs
– Solve for a scaling factor λ that proportionally expands the distances in the SFM point cloud to match the equivalent real-world distances (a sketch follows below)
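One simple way such a scale factor could be obtained, shown as a least-squares fit over camera-to-camera distances; the closed form, data, and variable names are mine, not quoted from the paper.

```python
import numpy as np
from itertools import combinations

# Hypothetical camera positions: relative, unit-less SFM coordinates and
# noisy GPS positions (meters, east-north) for the same four photographs.
sfm_cams = np.array([[0.0, 0.0], [0.8, 0.1], [1.7, -0.1], [2.3, 0.2]])
gps_cams = np.array([[0.0, 0.0], [1.1, 0.2], [2.2, -0.3], [3.0, 0.5]])

pairs = list(combinations(range(len(sfm_cams)), 2))
d_sfm = np.array([np.linalg.norm(sfm_cams[i] - sfm_cams[j]) for i, j in pairs])
d_gps = np.array([np.linalg.norm(gps_cams[i] - gps_cams[j]) for i, j in pairs])

# Least-squares scale: minimize sum over pairs of (d_gps - lam * d_sfm)^2,
# which has the closed form below.
lam = float(d_gps @ d_sfm / (d_sfm @ d_sfm))
print("scale factor lambda:", lam)
print("SFM distances in meters:", lam * d_sfm)
```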
24
System Design Minimization of GPS Noise, Relative to Vision
25
System Design OPS Optimization on Object Location
26
System Design OPS Optimization on Object Location
27
System Design
Extending the location model to 3D
– Pitch: rotational movement orthogonal to the plane of the phone screen, relative to the horizon
– Adjustment: from our 3D point cloud, there is a unique mapping of every 3D point back to each original 2D image
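As a rough illustration of how pitch can lift the 2D estimate into 3D (a simplified flat-ground relation, not necessarily the paper's exact adjustment): if the horizontal distance to the object is d and the sight line to a point on it is pitched φ above the horizon, that point sits approximately

```latex
h \approx d \cdot \tan(\varphi)
```

above the camera height.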
28
Evaluation
Experiment
– More than 50 buildings
– Distance from the user to the building: 30~150 m
far enough away that it makes sense to use the system
limited by the user's ability to clearly see the object and focus a photograph
– 4 pictures; consecutive photographs were taken between 0.75 m and 1.5 m apart
29
Evaluation
Experiment
– Processing time: 30~60 s, primarily attributable to structure from motion
– Quality of photographs matters
Lighting
Blur
Overexposure
30
Evaluation
33
Evaluation
Introduce some noise
– Gaussian distribution with mean 0 and a varied standard deviation
– Sensitivity to GPS error
– Sensitivity to compass error
Sensitivity to photograph detail
– Varied resolution
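A small sketch of this kind of noise injection (the readings, noise levels, and function names are placeholders, not the paper's evaluation code); each perturbed set of readings would then be run through the full OPS pipeline and the resulting localization error recorded.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical clean sensor readings for one trial: GPS positions in
# meters (east-north) and compass bearings in degrees.
gps = np.array([[0.0, 0.0], [1.2, 0.3], [2.4, -0.2], [3.1, 0.4]])
bearings = np.array([44.0, 47.5, 41.0, 45.5])

def perturb(gps_xy, bearings_deg, gps_sigma_m, compass_sigma_deg):
    """Return noisy copies of the readings: zero-mean Gaussian noise with
    the given standard deviations, as in the sensitivity experiments."""
    noisy_gps = gps_xy + rng.normal(0.0, gps_sigma_m, size=gps_xy.shape)
    noisy_bearings = bearings_deg + rng.normal(0.0, compass_sigma_deg, size=bearings_deg.shape)
    return noisy_gps, noisy_bearings

# Sweep the GPS noise level while holding compass noise fixed.
for sigma in [1.0, 3.0, 5.0, 10.0]:
    noisy_gps, noisy_bearings = perturb(gps, bearings, gps_sigma_m=sigma, compass_sigma_deg=2.0)
    print(f"GPS sigma {sigma:>4} m -> first noisy position: {noisy_gps[0]}")
```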
34
Evaluation
37
Future Work
Live feedback to improve photograph quality
– Provide feedback to the user
Improving GPS precision with dead reckoning
– Dead reckoning is the process of calculating one's current position by using a previously determined position
Continual estimation of relative positions with video
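For context, a minimal dead-reckoning update (a textbook step-and-heading model, not a design from the paper; the step length and headings are made-up values):

```python
import numpy as np

def dead_reckon(start_xy, step_lengths_m, headings_deg):
    """Integrate step lengths along compass headings (degrees clockwise
    from north) starting from a previously determined position."""
    pos = np.asarray(start_xy, dtype=float)
    track = [pos.copy()]
    for step, heading in zip(step_lengths_m, headings_deg):
        theta = np.radians(heading)
        pos = pos + step * np.array([np.sin(theta), np.cos(theta)])  # (east, north)
        track.append(pos.copy())
    return np.array(track)

# A user walking ~0.7 m per step, roughly eastward and then turning north.
print(dead_reckon([0.0, 0.0], [0.7] * 6, [90, 90, 85, 45, 10, 0]))
```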
38
Conclusion
Localization for distant objects using the camera's view, without any out-of-band effort
Real design and implementation on the Android Nexus S platform
Achieves promising results
– The prime limitation comes from GPS error
The two equations to calculate the "height"