X3D Extension for (Mobile) AR Contents
International AR Standards Workshop, Seoul, Korea, Oct 11-12, 2010
Gerard J. Kim (WG 6 AR Standards Study Group Coordinator), Korea University


Approach
Extensibility to existing frameworks:
- X3D (scene graph), because AR is implemented as VR!
- KML, OpenGIS, …, because we need location representation
Generality/flexibility to accommodate:
- Different AR platforms (~platform independence): mobile, desktop, HMD, …
- Sensors and devices: vision based, marker based, location based, …
Focused on file format (scene graph based?) vs. contents representation; machine consumption

Various display types and platforms
[Diagram of display configurations: video see-through (camera, video combiner, display) vs. optical see-through (camera, display, optical combiner); after R. Azuma, 1997]

AR/MR Implementation

Various sensing

MR/AR Contents
- Context: a condition or situation that triggers an augmentation and mixing of real and virtual objects
- Resource: raw data or information used for augmentation
- Content: one or more pairings of contexts and resources, plus behaviors that use the resources
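For illustration, a minimal sketch of one such context-resource pairing, using the ImagePatch node proposed later in this presentation (the DEF names and the model file are hypothetical):

<!-- Context: a physical marker described by an ImagePatch (proposed below) -->
<ImagePatch DEF='MARKER' filename='marker.png'/>
<!-- Resource: a virtual model to overlay on the marker -->
<Transform DEF='OVERLAY'>
  <Inline url='"robot.x3d"'/>
</Transform>
<!-- Behavior: keep the resource registered to the tracked marker pose -->
<ROUTE fromNode='MARKER' fromField='position' toNode='OVERLAY' toField='set_translation'/>
<ROUTE fromNode='MARKER' fromField='orientation' toNode='OVERLAY' toField='set_rotation'/>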

Related work
Jung et al. (InstantReality Suite)
- Extension of Sensor nodes: physical contexts
- Extension of Viewpoint nodes: specification of camera parameters
- Layers: one layer serves as background video
- Extension of X3DLightNode: lighting effects

SFImageSensor : X3DDirectSensorNode {
  SFImage  [in/out] value
  SFBool   []       out   FALSE
  SFString []       label
  …
}

DEF frame SFImageSensor { label "Video Frames" }
ROUTE frame.value_changed TO surfaceTex.set_image

Major proposals
Extend "View" node: resolution between the "live" camera and the virtual camera
Define "Live" camera node (G. Lee / ETRI)
- Not necessarily for "AR" contents (e.g. video textures)
- Parameters set by user
More detailed parameter specification for "View":
- Set by user
- Routed from the "Live" camera node, with the possibility of behavioral manipulation
- Routed from a sensor (the camera could be tracked separately)
- Default: same as the world (note that the view can be relative to anything)

Major proposals (cont.)
Extend the movie texture node (for the AR background)
- Also proposed by G. Lee / Instant Reality
Extend existing virtual "Sensor" nodes
- Existing: e.g. Visibility, Proximity, Touch sensors, …
- New: RangeSensor, UIClickSensor, …
New X3DARNodes for describing target real objects
- ImagePatch, 3DObject, GPSLocation, SingleValue, …

Not included in this proposal
- Lighting and rendering issues
- Depth sensing and occlusion effects
- Extended points of interest (e.g. path, hierarchical POI)
- Platform type specification (e.g. resolution differences)

[Diagram: AR contents bridge the real/physical world and the X3D (virtual) world. Virtualized physical contexts enter the scene through AR nodes + sensors, the live camera, and movie textures, and connect to the view (virtual camera) and other X3D nodes via ROUTEs.]

Abstraction of MR/AR contents as a collection of contexts and resources connected by event-ins and event-outs.

<ROUTE fromField='touchTime' fromNode='TOUCH' toField='startTime' toNode='TIME'/> <ROUTE fromField='fraction_changed' fromNode='TIME' toField='set_fraction' toNode='INTERP_POS'/> <ROUTE fromField='value_changed' fromNode='INTERP_POS' toField='translation' toNode='BALL'/>
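These routes assume standard X3D nodes behind the DEF names TOUCH, TIME, INTERP_POS and BALL; a minimal sketch of such a scene (the geometry and key values are illustrative):

<Transform DEF='BALL'>
  <Shape>
    <Appearance><Material/></Appearance>
    <Sphere radius='0.5'/>
  </Shape>
  <!-- TouchSensor fires touchTime when the sibling geometry is clicked -->
  <TouchSensor DEF='TOUCH'/>
</Transform>
<TimeSensor DEF='TIME' cycleInterval='2'/>
<PositionInterpolator DEF='INTERP_POS' key='0 1' keyValue='0 0 0  0 2 0'/>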

Proposed node hierarchy:
X3DNode
  X3DARNode: ImagePatch, 3DObject, GPSLocation, SingleValue, UIDevice, …
  X3DChildNode
    X3DSensorNode
      X3DEnvironmentalSensorNode: VisibilitySensor, ProximitySensor, RangeSensor, …
      UIConfigNode: UIClickSensor, UIScrollSensor, …

- Vision-based feature recognition and tracking (e.g. fiducials, markers, 3D points)
- Non-vision-based environmental sensor events and values (e.g. RFID, GPS, distance)
- User interaction device events and values (e.g. buttons, touch screen, jog dial)
- Context information (e.g. user age)

Real Object        X3DARNode    Main Attributes                  Sensor used
Marker             ImagePatch   ID, Position, Orientation        Visibility
3D point           3DObject     ID, Type, Position, Orientation  Visibility
GPS                GPSLocation  ID, Coordinate                   Range
RFID               SingleValue  Value (Boolean)                  Existence
Ultrasonic sensor  SingleValue  Distance (Integer)               Proximity
Button             UIDevice     Value (Boolean)                  UIClickSensor
User Age           SingleValue  Age (Integer)                    Range

X3DARNode
Placeholders for physical objects within the AR/MR world "implementation".

X3DARNode : X3DNode {
  SFNode   [in, out] metadata
  SFNode   [in, out] parent
  SFString [in, out] description
  SFBool   [in, out] enabled
}

X3DARNode is the base type for the Marker, Location and General Event nodes, …

ImagePatch (Marker) & VisibilitySensor

ImagePatch : X3DARNode {
  SFNode     [in, out] metadata
  SFNode     [in, out] parent
  SFString   [in, out] description
  SFBool     [in, out] enabled
  SFString   [in, out] filename
  SFVec3f    [in, out] position
  SFRotation [in, out] orientation
}

VisibilitySensor : X3DEnvironmentalSensorNode {
  SFVec3f [in, out] center
  SFBool  [in, out] enabled
  SFNode  [in, out] metadata
  SFVec3f [in, out] size
  SFTime  [out]     enterTime
  SFTime  [out]     exitTime
  SFBool  [out]     isActive
}
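A hedged usage sketch of this pair: the ImagePatch describes the marker image, and the VisibilitySensor reports when it is detected. How the sensor is bound to the AR node is not spelled out in the slides; the binding and the ANIM TimeSensor below are assumptions:

<ImagePatch DEF='MARKER' description='fiducial marker' filename='marker.png' enabled='true'/>
<VisibilitySensor DEF='MARKER_VIS' enabled='true'/>
<TimeSensor DEF='ANIM' cycleInterval='3'/>
<!-- Start an animation the moment the marker becomes visible -->
<ROUTE fromNode='MARKER_VIS' fromField='enterTime' toNode='ANIM' toField='startTime'/>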

Location & RangeSensor

GPSLocation : X3DSensorNode {
  SFNode   [in, out] metadata
  SFNode   [in, out] parent
  SFString [in, out] description
  SFBool   [in, out] enabled
  SFInt32  [in, out] device_description
  SFBool   [out]     status
  MFString [out]     values
}

RangeSensor : X3DEnvironmentalSensorNode {
  SFVec3f  [in, out] center
  SFBool   [in, out] enabled
  SFNode   [in, out] metadata
  SFVec3f  [in, out] size
  SFTime   [out]     enterTime
  SFTime   [out]     exitTime
  SFBool   [out]     isActive
  SFInt32  [in, out] sequence
  SFString [in, out] lBound
  SFString [in, out] uBound
  SFString [in, out] value
}
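A hedged sketch of location-triggered content: the GPSLocation supplies coordinates and the RangeSensor fires when they fall inside its bounds. The coordinate strings and the SHOW_POI TimeSensor are illustrative assumptions, and the slides do not specify how the sensor is associated with the GPSLocation node:

<GPSLocation DEF='GPS' description='device GPS receiver' enabled='true'/>
<!-- Bounds given as lat/long strings (format assumed) -->
<RangeSensor DEF='ZONE' lBound='37.5800 127.0250' uBound='37.5900 127.0350' enabled='true'/>
<TimeSensor DEF='SHOW_POI'/>
<!-- Trigger the point-of-interest augmentation on entering the zone -->
<ROUTE fromNode='ZONE' fromField='enterTime' toNode='SHOW_POI' toField='startTime'/>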

Live camera
LiveCamera = MR/AR capture camera, within the Scene node
- The image field is the out value
- Camera internal parameters → projmat field
- Camera external parameters → set to the world, but can be tracked

LiveCamera {
  SFString   [in, out] label    "default"
  SFString   [out]     parent
  SFImage    [out]     image
  SFMatrix4f [out]     projmat  "1 0 0 …"
  SFBool     [out]     on       FALSE
  SFBool     [out]     tracking FALSE
  SFVec3f    [out]     position
  SFRotation [out]     orientation
}

Routing from LiveCam
From: Live Camera node "image" field
To:
- Background (LiveURL field)
- Shape (MovieTexture field)

Live video → Background

<Background groundAngle=' ' groundColor=' ' skyAngle=' ' skyColor=' ' backUrl='mountns.png' frontUrl='mountns.png' leftUrl='mountns.png' rightUrl='mountns.png'/>
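The routing itself might look like the sketch below, assuming a LiveCamera DEF'd as CAM and the proposed LiveURL field on Background; the field name and its typing follow the slides and are not part of standard X3D:

<LiveCamera DEF='CAM' label='default'/>
<Background DEF='BG' frontUrl='mountns.png'/>
<!-- Feed live frames into the background in place of the static image -->
<ROUTE fromNode='CAM' fromField='image' toNode='BG' toField='LiveURL'/>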

MovieTexture Node
- Add MovieTexture to the X3DTextureNode hierarchy
- Use it for TextureBackground
- Fix TextureBackground relative to the camera
- Allow connection to a live camera (not just through a streaming server)

MovieTexture Node

<MovieTexture loop='true' url='"wrlpool.mpg"'/>
<Coordinate point=' '/>

Live Camera → Movie Texture
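A hedged sketch of this connection: the LiveCamera image is routed into a texture applied to scene geometry. The receiving set_image field mirrors the InstantReality example earlier in the deck and is an assumption here; standard MovieTexture has no such field:

<LiveCamera DEF='CAM' label='default'/>
<Shape>
  <Appearance>
    <MovieTexture DEF='LIVE_TEX' loop='true'/>
  </Appearance>
  <Box size='4 3 0.01'/>
</Shape>
<!-- Push each captured frame into the texture (set_image is assumed) -->
<ROUTE fromNode='CAM' fromField='image' toNode='LIVE_TEX' toField='set_image'/>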

Live Camera and Virtual Camera
Calibrating the virtual camera according to the parameters of the live capture camera:
- Internal parameters = projection matrix
- External parameters = camera pose
Manual:
- Direct specification
Routing:
- From the Live camera
- From the Sensor

Method 1

Viewpoint : X3DViewpointNode {
  SFMatrix4f [in]      projmat
  SFVec3f    [in, out] position
  SFRotation [in, out] orientation
  SFNode     [in, out] liveCamera
  # Add distortion parameters here
}
…
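Under Method 1, the calibration described above could be routed from the live camera into the extended Viewpoint; a minimal sketch (the DEF names are hypothetical):

<LiveCamera DEF='CAM' label='default'/>
<Viewpoint DEF='AR_VIEW'/>
<!-- Internal parameters: copy the capture camera's projection matrix -->
<ROUTE fromNode='CAM' fromField='projmat' toNode='AR_VIEW' toField='projmat'/>
<!-- External parameters: follow the tracked camera pose -->
<ROUTE fromNode='CAM' fromField='position' toNode='AR_VIEW' toField='set_position'/>
<ROUTE fromNode='CAM' fromField='orientation' toNode='AR_VIEW' toField='set_orientation'/>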

Method 2 …

Other Activities
- Draft document
- Teleconferences with Web3D
- Implementation: k-MART
- Domestic workshops: April, POSTECH, Pohang, Korea; June, KIST, Seoul, Korea

Future
- More extensions
- Examples
- Implementations
- International consensus