EU-funded FP7 project, Oct 2011 – Sep 2014: Co-evolution of Future AR Mobile Platforms. Paul Chippendale, Fondazione Bruno Kessler (FBK), Italy

Move away from the Augmented Keyhole

User-centric, not device-centric: HMDs lock the display to the viewer, but what about handheld displays?

Device-World registration: What is the device's real-world location? Which direction is it pointing?

Device-World registration: What is the device's real-world location? GPS, Cell/WiFi tower triangulation (~10 m)

Device-World registration: Which direction is it pointing? Magnetometer, gyros, accelerometers (~5–20°). Error sources: MEMS production variability, sensor ageing, and soft/hard-iron influences that vary across devices, environments and camera pose.
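
To make the fusion point concrete, here is a minimal complementary-filter sketch in Python; the function name, sample rates and the 0.98 gain are illustrative assumptions, not VENTURI code:

```python
import math

def fuse_heading(prev_heading_deg, gyro_rate_dps, dt, mag_heading_deg, gain=0.98):
    """Complementary filter: integrate the low-noise but drifting gyro and
    gently pull the result towards the noisy but drift-free magnetometer."""
    # Propagate the heading with the gyro rate (deg/s) over the time step dt (s).
    predicted = prev_heading_deg + gyro_rate_dps * dt
    # Blend with the magnetometer, handling the 359 -> 0 degree wrap-around.
    error = (mag_heading_deg - predicted + 180.0) % 360.0 - 180.0
    return (predicted + (1.0 - gain) * error) % 360.0

# Example: the gyro reports a 2 deg/s turn while the magnetometer disagrees slightly.
heading = 90.0
for _ in range(100):                      # 100 samples at 100 Hz = 1 s
    heading = fuse_heading(heading, 2.0, 0.01, 95.0)
print(round(heading, 1))
```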

Is ±10 m and ±20° sufficient for nailed-down AR?
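
A rough back-of-the-envelope calculation, with distances chosen purely for illustration, shows what those error bounds mean for an augmentation that should stay pinned to a real object:

```python
import math

def label_offset_m(distance_m, heading_error_deg):
    """Lateral drift of an augmentation anchored to an object at the given
    distance, caused by a heading error alone (simple geometry)."""
    return distance_m * math.tan(math.radians(heading_error_deg))

for d in (10, 50, 200):
    print(f"object {d:3d} m away: ~{label_offset_m(d, 20):.1f} m of lateral drift "
          f"from a 20 degree heading error, on top of ~10 m of GPS error")
```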

But what about hand-held AR? The device becomes an augmented window.

User-Device-World registration: What is the device's real-world location? Which direction is it pointing? Where is the user with respect to the screen?

Just wait for better AR devices!
o Surely if we wait, sensor errors will disappear? Unlikely! Sensor errors are tolerable for non-AR applications, so handset manufacturers focus on price, power and form-factor.
o Can't we just model the error in software? Not really! Platform diversity and swift evolution make error modelling expensive and quickly obsolete.

So what can we do?
o The AR community should work with handset manufacturers and make recommendations.
o Use computer vision to work alongside the sensors.

VENTURI project...
o Match AR requirements to the platform
o Efficiently exploit CPUs & GPUs
o Improve sensor-camera fusion by creating a common clock (traditionally only audio/video are considered); see the sketch below
o Apply smart power-management policies
o Optimize the AR chain by exploiting both on-board and cloud processing/storage
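
For the common-clock bullet, a minimal sketch of what fusion looks like once timestamps are comparable: IMU samples are resampled onto camera frame times. The names and the constant clock offset are assumptions for illustration only:

```python
import numpy as np

def imu_at_frame_times(frame_ts, imu_ts, imu_values, clock_offset_s=0.0):
    """Resample one IMU channel onto camera frame timestamps.

    frame_ts:       (N,) frame timestamps in the camera clock [s]
    imu_ts:         (M,) IMU timestamps in the IMU clock [s]
    imu_values:     (M,) IMU readings, e.g. gyro z rate [rad/s]
    clock_offset_s: camera_clock - imu_clock, estimated elsewhere
    """
    # Shift the IMU timeline into the camera clock, then interpolate linearly.
    return np.interp(frame_ts, imu_ts + clock_offset_s, imu_values)

# Toy usage: 30 Hz frames, 200 Hz gyro, 5 ms clock offset.
frames = np.arange(0.0, 1.0, 1 / 30)
imu_t = np.arange(0.0, 1.0, 1 / 200)
gyro_z = np.sin(2 * np.pi * imu_t)
print(imu_at_frame_times(frames, imu_t, gyro_z, clock_offset_s=0.005)[:3])
```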

Seeing the world
o Improve device-world pose by: matching visual features to 3D models of the world; matching the camera feed to the visual appearance of the world; fusing camera and sensors for ambiguity reasoning and tracking
o Use the front-facing camera to estimate user-device pose via face tracking
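
One way to picture the face-tracking idea is a pinhole-model estimate of where the user's head is relative to the screen; this OpenCV sketch uses a stock Haar face detector, and the focal length and average face width are assumed constants rather than calibrated values:

```python
import cv2

FOCAL_PX = 600.0       # assumed front-camera focal length in pixels
FACE_WIDTH_M = 0.16    # assumed average face width in metres

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def user_pose_from_frame(gray_frame):
    """Rough user-device pose from the front camera: distance from the apparent
    face size (pinhole model) and the horizontal offset of the face centre
    relative to the image centre."""
    faces = detector.detectMultiScale(gray_frame, 1.1, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # take the largest face
    distance_m = FOCAL_PX * FACE_WIDTH_M / w
    offset_px = (x + w / 2.0) - gray_frame.shape[1] / 2.0
    return distance_m, offset_px
```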

Urban 3D model matching
o Use high-resolution building models (e.g. laser-scanned), globally registered to a geo-referenced coordinate system
o Use marker-less 3D tracking to correlate distinctive camera features to the 3D building models; subsequent tracking uses inertial sensors and visual optical flow
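
A common way to turn such 2D-3D correspondences into a device pose is a RANSAC PnP solve; the sketch below is a generic OpenCV illustration, assuming the matching stage and camera calibration supply the inputs:

```python
import numpy as np
import cv2

def pose_from_model_matches(pts_3d_world, pts_2d_image, camera_matrix):
    """Estimate camera pose from image features matched to 3D points on a
    geo-registered building model, rejecting bad matches with RANSAC."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.asarray(pts_3d_world, dtype=np.float64),
        np.asarray(pts_2d_image, dtype=np.float64),
        camera_matrix, None)              # None: assume undistorted image points
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)            # rotation vector -> 3x3 rotation matrix
    camera_position_world = (-R.T @ tvec).ravel()
    return R, tvec, camera_position_world, inliers
```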

Terrain 3D model matching
o A synthetic model of the world is rendered from Digital Elevation Models; salient features from the camera feed (depth discontinuities) are matched to similar synthetic features
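
As a toy illustration of the matching step (the profile extraction and DEM rendering are assumed to happen elsewhere), a skyline profile from the camera can be scored against profiles rendered at candidate headings:

```python
import numpy as np

def best_heading(camera_profile, synthetic_profiles):
    """camera_profile: (W,) skyline elevation per image column.
    synthetic_profiles: dict {heading_deg: (W,) profile rendered from the DEM}.
    Returns the heading whose rendered skyline best matches the camera one."""
    def ncc(a, b):
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float(np.mean(a * b))      # normalised cross-correlation
    return max(synthetic_profiles,
               key=lambda h: ncc(camera_profile, synthetic_profiles[h]))
```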

Appearance matching
o Use the approximate location to gather nearby images from the cloud
o Exploit sensor data to provide a clue for orientation alignment
o Computer vision algorithms match feature descriptors from the camera feed to similar features in the cloud images
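
A minimal sketch of the descriptor-matching step, using ORB and brute-force Hamming matching as a stand-in for whatever descriptor a real system would choose; the distance threshold is an illustrative assumption:

```python
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def match_against_cloud_image(query_gray, cloud_gray, max_distance=40):
    """Match descriptors from the live camera frame against one geo-tagged
    image retrieved from the cloud; return the plausible matches."""
    _, des_query = orb.detectAndCompute(query_gray, None)
    _, des_cloud = orb.detectAndCompute(cloud_gray, None)
    if des_query is None or des_cloud is None:
        return []
    matches = matcher.match(des_query, des_cloud)
    return [m for m in matches if m.distance < max_distance]
```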

SLAM + Matching
o Simultaneous Localization And Mapping: build a map of an unknown environment while at the same time navigating that environment using the map
o The mapped environment has no real-world scale nor absolute geo-coordinates; exploit the prior approaches to complete the registration
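
The registration step that gives the SLAM map a real-world scale and geo-coordinates can be pictured as fitting a similarity transform between a few SLAM-frame points and their geo-referenced counterparts; a simplified Umeyama-style sketch, assuming at least three non-degenerate correspondences:

```python
import numpy as np

def align_slam_to_geo(slam_pts, geo_pts):
    """Fit scale s, rotation R and translation t so that geo ~ s * R @ slam + t
    in the least-squares sense (Umeyama alignment)."""
    slam_pts, geo_pts = np.asarray(slam_pts, float), np.asarray(geo_pts, float)
    mu_s, mu_g = slam_pts.mean(0), geo_pts.mean(0)
    X, Y = slam_pts - mu_s, geo_pts - mu_g
    U, D, Vt = np.linalg.svd(Y.T @ X / len(slam_pts))   # cross-covariance SVD
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:                       # keep a proper rotation
        S[2, 2] = -1
    R = U @ S @ Vt
    scale = np.trace(np.diag(D) @ S) / X.var(0).sum()
    t = mu_g - scale * R @ mu_s
    return scale, R, t
```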

Mobile context understanding
o User/environment context estimation:
o PDR (pedestrian dead reckoning) enriched with vision; see the sketch below
o User activity modelling
o Sensing geo-objects
o Harvest/create geo-social content
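
A minimal sketch of the PDR idea, where each detected step advances the position estimate along the current heading; the fixed step length and all names are illustrative assumptions (a vision front-end would correct the accumulated drift):

```python
import math

def pdr_update(position, heading_deg, step_length_m=0.7):
    """Advance the (east, north) position by one detected step along the
    current heading."""
    east, north = position
    h = math.radians(heading_deg)
    return (east + step_length_m * math.sin(h),
            north + step_length_m * math.cos(h))

pos = (0.0, 0.0)
for heading in (0, 0, 90, 90, 90):        # two steps north, then three steps east
    pos = pdr_update(pos, heading)
print(tuple(round(c, 2) for c in pos))    # -> (2.1, 1.4)
```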

Context-sensitive AR delivery
o Inject AR data in a natural manner according to: the environment, occlusions, lighting and shadows, and user activity
o Exploit user and environment context to select the best delivery modality (text, graphics, audio, etc.), i.e. scalable/simplifiable audio-visual content
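
Purely as an illustration (the rules and context fields below are invented, not the project's), modality selection from a simple context description might look like:

```python
def pick_modality(context):
    """Choose how to present an AR annotation given a simple context dict,
    e.g. {"walking": True, "ambient_light": "low", "headphones": True}."""
    if context.get("walking") and context.get("headphones"):
        return "audio"                    # eyes are busy, ears are free
    if context.get("ambient_light") == "low":
        return "high-contrast text"       # detailed graphics would be hard to read
    return "3D graphics"                  # default: full visual augmentation

print(pick_modality({"walking": True, "headphones": True}))   # -> audio
```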

User Interactions
o Explore evolving AR delivery and interaction
o In-air interfaces: device, hand and face tracking
o 3D audio
o Pico-projection for multi-user, social AR
o HMDs

Prototypes
One consolidated prototype at the end of each year, evaluated through use-cases, with constraints progressively relaxed:
o Gaming – VeDi 1.0
o Blind assistant – VeDi 2.0
o Tourism – VeDi 3.0

VeDi 1.0
Objective: Stimulate software and hardware cross-partner integration and showcase state-of-the-art indoor AR registration.
Scenario: A multi-player, table-top AR game resembling a miniature city; players must accomplish a set of AR missions in the city that adhere to physical constraints.
Software: Sensor-aided marker-less 3D feature tracking; the city is geometrically reconstructed offline for correct occlusion handling and model registration.
Hardware: The demo runs on an experimental ST-Ericsson prototype mobile platform.

FP7-ICT Networked Media and Search Systems, End-to-end Immersive and Interactive Media Technologies: "creating a pervasive Augmented Reality paradigm, where information is presented in a user-centric rather than a device-centric way". Coordinated by Paul Chippendale, Fondazione Bruno Kessler.