Mixed Reality Systems -Lab IV – Augmented Reality- Christoph Anthes.

Overview
- ARToolKit
- Combining ARToolKit with OpenSG
- Initialisation
- ARToolKit Loop
- Helper Functions
- Tracking Objects
- Interaction with objects
- Combining ARToolKit with inVRs

ARToolKit
- Current version available at the SourceForge page
- C and C++ API
- Cross-platform API (Windows, Linux, MacOS, IRIX)
- OpenGL used for rendering, GLUT used for event handling
- The video library used depends on the chosen platform

ARToolKit Architecture – General Picture
- Built on GLUT with OpenGL, C and C++
- Makes use of the video-device-specific and graphics drivers
- Often used only as an independent tracking library
- Your own application is designed to be built on top of OpenGL and ARToolKit
- Interfaces to larger tracking libraries exist (e.g. OpenTracker)

ARToolKit Architecture – ARToolKit Focus
- AR module: core module with marker tracking, calibration and parameter collection
- Video module: a collection of video routines for capturing the video input frames; a wrapper around the standard platform SDK video capture routines
- Gsub module: graphics routines based on the OpenGL and GLUT libraries
- Gsub_lite module: replaces Gsub with a more efficient collection of graphics routines, independent of any particular windowing toolkit

ARToolKit Coordinate Systems
- Several coordinate systems are used; the most important are camera and marker coordinates
- From camera to screen coordinates, a transformation via a distortion function can be performed
- The Z-axis of the marker points upward
- The Z-axis of the camera points into the scene
- The top left corner of the screen is (0,0)
- arGetTransMat() returns the coordinates of the marker in the camera coordinate system
- arMatrixInverse() returns the coordinates of the camera in the marker coordinate system
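The switch between the two coordinate systems is just a rigid-transform inversion. The following sketch shows the idea behind arMatrixInverse() on a plain 3x4 pose matrix [R|t] as ARToolKit stores it; the type name `Trans34` and the function name are illustrative, not ARToolKit's own.

```cpp
#include <array>

// A marker pose as ARToolKit reports it: a 3x4 matrix [R|t] with the
// rotation R in the upper-left 3x3 and the translation t in the last column.
using Trans34 = std::array<std::array<double, 4>, 3>;

// Invert a rigid transform: R' = R^T, t' = -R^T * t. This is what
// arMatrixInverse() does conceptually when turning "marker in camera
// coordinates" into "camera in marker coordinates".
Trans34 invertRigid(const Trans34& m) {
    Trans34 inv{};
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 3; ++c)
            inv[r][c] = m[c][r];          // transpose of the rotation part
    for (int r = 0; r < 3; ++r) {
        double t = 0.0;
        for (int c = 0; c < 3; ++c)
            t += inv[r][c] * m[c][3];
        inv[r][3] = -t;                   // -R^T * t
    }
    return inv;
}
```

For a pure translation the inverse simply negates the translation column, which makes the convention easy to verify by hand.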

ARToolKit Basic Application

ARToolKit Corresponding Function Calls in the plain ARToolKit API

ARToolKit Connections to Scene Graphs in General
Three steps:
- Initialising the camera
- Transforming the object
- "Real occluders" – slightly advanced

OpenSG
- Creation of a new node with fscEdit

OpenSceneGraph
- Extension of the scene view class
- Display of occluding geometry
- Overwriting the colour buffer with the video image
- Finally, display of the recognised objects

Two big approaches: OSGART and OSGAR

Combining ARToolKit and OpenSG
- OpenSG provides examples for interconnecting with either ARToolKit or ARToolKit Plus
- We are going to work with ARToolKit, so we first need the additional include files:
- gsub.h – contains the main display functions used in ARToolKit
- video.h – provides multi-platform video input support for ARToolKit
- param.h – contains the principal routines for loading, saving and modifying camera parameters
- ar.h – provides image analysis and marker detection routines

Combining ARToolKit and OpenSG
If we take a look at our example, we start with a set of forward declarations, some of which we have not seen in the previous OpenSG or inVRs tutorials. Additional methods are found at the end of the code.

We start with the setup and cleanup methods:
- initARToolkit() is used for the initialisation of the ARToolKit components of the example
- initOpenSG() initialises the OpenSG setup of the example
- initGlut() registers the GLUT callbacks as we have seen in previous examples
- setupCamera() provides a setup for our interconnected webcam
- cleanupARToolkit() stops the video capture of ARToolKit and closes the video stream processing
- cleanupOpenSG() frees used variables, stops the binding to ARToolKit, and calls osgExit()

Let's have a detailed look at the setup methods.

Combining ARToolKit and OpenSG
initARToolkit()
- Wraps and calls the different internal setup functions for ARToolKit
- The cleanup method is registered as a callback at program termination

Combining ARToolKit and OpenSG
setupCamera()
- The camera setup is defined in this function
- Camera parameters from the calibration file are parsed and evaluated
- A conversion has to take place in order to write the data out in the right format
- The ModelViewMatrix and the ProjectionMatrix are set
- This setup is stored inside an OpenSG camera object
- The camera parameters are stored in binary files, e.g. currentParams.mat

Combining ARToolKit and OpenSG
setupCamera()
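The conversion step inside setupCamera() can be pictured as follows: pinhole intrinsics from the calibration file (focal lengths and principal point, in pixels) are rewritten as a column-major OpenGL projection matrix. This is a simplified sketch of what argConvGLcpara() has to achieve; ARToolKit's real routine also handles the distortion parameters, and the sign conventions below are assumptions, not taken from the lab code.

```cpp
#include <array>

// Build a column-major OpenGL-style projection matrix from pinhole camera
// intrinsics. fx, fy: focal lengths in pixels; cx, cy: principal point in
// pixels; width, height: image size; nearP, farP: clip planes.
std::array<double, 16> intrinsicsToGLProjection(double fx, double fy,
                                                double cx, double cy,
                                                double width, double height,
                                                double nearP, double farP) {
    std::array<double, 16> p{};               // column-major, all zeros
    p[0]  = 2.0 * fx / width;                 // x focal scaling
    p[5]  = 2.0 * fy / height;                // y focal scaling
    p[8]  = 2.0 * cx / width - 1.0;           // principal point offset x
    p[9]  = 2.0 * cy / height - 1.0;          // principal point offset y
    p[10] = -(farP + nearP) / (farP - nearP); // depth range mapping
    p[11] = -1.0;                             // perspective divide by -z
    p[14] = -2.0 * farP * nearP / (farP - nearP);
    return p;
}
```

With the principal point exactly at the image centre, the two offset terms vanish, which matches the familiar glFrustum-style matrix.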

Combining ARToolKit and OpenSG
initOpenSG()
- The standard OpenSG setup is performed
- A GLUT window is created and initialised
- A root node with an anonymous group core is created
- The previously described camera setup is triggered
- A background object is interconnected with a video texture
- The SimpleSceneManager is initialised and interconnected with the window and the root node of the scene
- The background image as well as the camera are attached to the viewport of the just-generated window
- All changes are committed to OpenSG
- Finally, the OpenSG cleanup function is registered to be triggered at the termination of the application

Combining ARToolKit and OpenSG
initOpenSG()

Combining ARToolKit and OpenSG
Then we have the ARToolKit processing methods:
- captureFrame() retrieves the image from the camera
- detectMarkers() triggers marker detection and outputs the number of found markers
- applyMarkerTrans() applies the transformation from a given marker to an OpenSG transformation core

ARToolKit Loop
- These methods represent steps 2–4 of our ARToolKit loop
- Step 1, the initialisation, was given with the previous set of functions
- Step 5, the rendering, is performed by OpenSG
- The steps have to be processed frame by frame, so they are called inside the display loop of the application
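The per-frame ordering can be sketched as a schematic loop. The real bodies call into ARToolKit and OpenSG; here each step only records its name so the sequence — capture, detect, apply, render — is visible. The struct and member names are illustrative, not those of the example code.

```cpp
#include <string>
#include <vector>

// Schematic of one iteration of the display loop described above.
struct FrameLoop {
    std::vector<std::string> log;
    void captureFrame()     { log.push_back("capture"); } // step 2: grab video image
    void detectMarkers()    { log.push_back("detect"); }  // step 3: find markers
    void applyMarkerTrans() { log.push_back("apply"); }   // step 4: update transforms
    void render()           { log.push_back("render"); }  // step 5: OpenSG redraw
    void displayCallback() {                              // called once per frame
        captureFrame();
        detectMarkers();
        applyMarkerTrans();
        render();
    }
};
```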

Combining ARToolKit and OpenSG
captureFrame()
- This method retrieves an image which was captured by ARToolKit from the incoming video stream
- The image data is set in an OpenSG image and an update notification is issued
- This frame will later on be rendered as a background image
- It is also used by ARToolKit for the image processing performed in the next step

Combining ARToolKit and OpenSG
detectMarkers()
- In this method the markers visible in the image are detected
- The number of detected markers as well as the markers themselves are returned by reference
- A threshold parameter determines the binarisation of the image
- Output describing the number of detected markers as well as their ids is written to the console
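The role of the threshold parameter is easy to see in isolation: before ARToolKit can look for the square black marker borders, every pixel of the greyscale frame is classified as bright or dark. A minimal sketch of that binarisation step (the function name and return convention are illustrative):

```cpp
#include <cstdint>
#include <vector>

// Binarise a greyscale image: pixels above the threshold become 255
// (bright), all others 0 (dark). Marker detection then searches the dark
// regions for square borders.
std::vector<std::uint8_t> binarise(const std::vector<std::uint8_t>& grey,
                                   std::uint8_t threshold) {
    std::vector<std::uint8_t> out(grey.size());
    for (std::size_t i = 0; i < grey.size(); ++i)
        out[i] = (grey[i] > threshold) ? 255 : 0;
    return out;
}
```

Raising the threshold makes more of the image count as dark, which helps in bright scenes but can merge the marker border with shadows.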

Combining ARToolKit and OpenSG
applyMarkerTrans()
- Extracts the transformation information from a marker and applies it to a given OpenSG model
- An STL map maintains the binding between markers and OpenSG objects
- The marker transformation is requested from ARToolKit via arGetMarkerTrans
- It is then converted into an OpenSG matrix
- If a marker is found in the map, it becomes activated again and the transformation matrix just retrieved is applied to an OpenSG object
- The changes are finally committed

Combining ARToolKit and OpenSG
applyMarkerTrans()
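The bookkeeping inside applyMarkerTrans() reduces to a map lookup. The sketch below keeps only that core: an STL map binds a pattern id to the object that should follow the marker. The `Object` struct stands in for the OpenSG transform node of the example; names and fields are assumptions.

```cpp
#include <map>

// Stand-in for the scene-graph object bound to a marker.
struct Object {
    bool   active = false;
    double trans[3] = {0.0, 0.0, 0.0};  // stand-in for the transform core
};

std::map<int, Object> patternMap;       // pattern id -> bound object

// Called per detected marker: look the pattern up, reactivate it and
// apply the freshly retrieved marker transformation.
bool applyMarkerTrans(int patternId, const double markerTrans[3]) {
    auto it = patternMap.find(patternId);
    if (it == patternMap.end())
        return false;                   // marker not registered in the map
    it->second.active = true;
    for (int i = 0; i < 3; ++i)
        it->second.trans[i] = markerTrans[i];
    return true;
}
```

Using `find` rather than `operator[]` avoids silently inserting entries for markers that were detected but never registered via createPattern().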

Combining ARToolKit and OpenSG
Additional methods which are used as helper functions and appear in the code:
- arMatrixToOSGMatrix() performs a data conversion from ARToolKit to OpenSG
- argConvGLcpara() is used for the conversion of ARToolKit camera parameters to OpenGL parameters
- getTranslation() returns the translation vector of a transformation matrix

Combining ARToolKit and OpenSG
arMatrixToOSGMatrix()
- Some very basic reformatting of an ARToolKit matrix to an OpenSG matrix is performed in this method
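The kind of reformatting involved: ARToolKit stores a pose as a row-major 3x4 array, while OpenSG's Matrix is a full 4x4. The sketch copies the twelve pose values and completes the bottom row with (0, 0, 0, 1). Whether an additional transpose is needed depends on the target class's storage convention, so treat the layout as an assumption; the `Matrix44` type is illustrative.

```cpp
#include <array>

using Matrix44 = std::array<std::array<double, 4>, 4>;

// Expand ARToolKit's 3x4 pose matrix into a homogeneous 4x4 matrix.
Matrix44 arMatrixToMatrix44(const double ar[3][4]) {
    Matrix44 m{};                        // zero-initialised
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 4; ++c)
            m[r][c] = ar[r][c];          // copy rotation and translation
    m[3][3] = 1.0;                       // homogeneous bottom row 0 0 0 1
    return m;
}
```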

Combining ARToolKit and OpenSG
argConvGLcpara()
- Used to transform the ARToolKit intrinsic camera parameter matrix format to an OpenGL matrix format
- More details on camera calibration and the parameters are given in the computer vision class in the winter semester

Combining ARToolKit and OpenSG
Additional methods which are used as helper functions and appear in the code:
- createPattern() connects a marker with an OpenSG sub scene graph
- createBackground() creates an image background based on the image gathered from the ARToolKit video stream
- createModel() loads a sub scene graph from disk and equips it with an additional transformation core

Before we start coding, let's have a more detailed look at these methods.

Combining ARToolKit and OpenSG
createPattern()
- This helper method interconnects a pattern from ARToolKit with a sub scene graph provided by OpenSG; it uses createModel() as a helper
- The pattern is then registered in an STL map

Combining ARToolKit and OpenSG
createBackground()
- This method creates the image background of OpenSG based on the data gathered from the ARToolKit video stream

Combining ARToolKit and OpenSG
createModel()
- This is a helper method which simply loads a model and attaches it to a node with a ComponentTransform core

Now we know all the necessary helper functions and can go on with the main function and the display function.

Tracking Objects
- Our main function in this example is very simple, since most of the processing is performed in the display loop
- Initialisation of OpenSG and ARToolKit is performed
- The video loop and the display loop are triggered
- We now insert our first snippet into main in order to display an object on a marker
- Compile and execute now
Snippet 1-1

Tracking Objects
- If we now take a closer look at our display loop, we can see our 3 ARToolKit steps at the beginning
- We should now enhance the scene by adding two objects and removing the first one
- If you compile and execute your code now, you should be able to see two objects
- Now we want to perform some interaction between the objects
Snippet 2-1

Tracking Objects
- With the next snippet we check the proximity of the two objects and change the scale
- First we retrieve the transformations of the objects and calculate the distance
Snippet 2-2 – First Part

Interaction with Objects
- We calculate the scale based on the distance and apply it to the transformation matrices of our two objects
- Afterwards the changes have to be committed
Snippet 2-2 – Second Part
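The proximity interaction can be sketched with plain vectors: take the translation of both marker transforms, compute the Euclidean distance, and derive a scale factor from it. The 200 mm reference distance and the clamping below are illustrative values, not the ones from the lab snippet.

```cpp
#include <cmath>

// Scale factor from marker proximity: objects shrink linearly as the
// markers approach each other, reaching full size at 200 mm apart.
double proximityScale(const double posA[3], const double posB[3]) {
    double d = 0.0;
    for (int i = 0; i < 3; ++i) {
        double diff = posA[i] - posB[i];
        d += diff * diff;
    }
    d = std::sqrt(d);                   // Euclidean distance
    double scale = d / 200.0;           // full size at >= 200 mm apart
    if (scale > 1.0) scale = 1.0;       // never grow beyond normal size
    return scale;
}
```

The resulting factor would then be written into the objects' transformation matrices before the changes are committed.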

Interaction with Objects
If you execute your code now, you should see something like this (most likely with a different user).

Interaction with Objects
- In the next step things will become more complicated: an object should move from one marker to another marker
- Actually we want to have a frog jump from a stone on one marker to a stone on a different marker
- First we disable the scaling of the objects by inserting the following snippet
Snippet 3-1

Interaction with Objects
- We initialise a frog model and the other nodes for our animation
Snippet 3-2

Interaction with Objects
- Now we trigger the initialisation in the main method
- For the animation of motion between one stone and another we install a timer
- Inside the display loop, delta values are calculated in order to determine the time since the last frame
- An overall counter is incremented
Snippet 3-3
Snippet 3-4
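The timer bookkeeping is simple enough to sketch with the clock injected as a parameter, which keeps the logic testable. Each frame we compute the delta since the last frame and accumulate it into the overall counter that the state machine checks later. The struct and member names are illustrative.

```cpp
// Per-frame timing: delta since the last frame plus an accumulated
// counter that is reset on every state change.
struct FrameTimer {
    double lastTime = 0.0;   // timestamp of the previous frame, in ms
    double counter  = 0.0;   // accumulated time since the last reset

    // Advance to the current timestamp; returns the frame delta in ms.
    double tick(double nowMs) {
        double delta = nowMs - lastTime;
        lastTime = nowMs;
        counter += delta;
        return delta;
    }
    void reset() { counter = 0.0; }
};
```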

Interaction with Objects
For animating a frog jumping from one stone to another, we implement a state machine with 5 states:
- SITTING – the frog sits for a given time before it can try to jump; afterwards it automatically switches to the WAITING state
- WAITING – two markers have to be close enough to make the frog jump; if the proximity check returns positively, it will jump
- JUMPING – in this state a change in the scene graph is performed: the frog node is attached to a transformation node
- FLYING – in this state the actual trajectory of the frog is calculated and applied to the transformation node; once the target is reached, the state changes to LANDING
- LANDING – as in the JUMPING state, a change in the scene graph hierarchy is performed; the state machine switches to SITTING again

The whole state machine is implemented inside a single snippet, which we will now go through.
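The transitions above can be reduced to a single step function. The scene-graph operations are replaced by comments, and the timing and proximity conditions are passed in so the logic stands alone. The 2000 ms sitting time appears on the next slide; the remaining conditions are schematic, not the exact lab code.

```cpp
// The five states of the jumping frog.
enum class FrogState { SITTING, WAITING, JUMPING, FLYING, LANDING };

// One evaluation of the state machine per display-loop iteration.
FrogState step(FrogState s, double counterMs,
               bool markersClose, bool targetReached) {
    switch (s) {
    case FrogState::SITTING:           // sit for a while before trying
        return (counterMs >= 2000.0) ? FrogState::WAITING : s;
    case FrogState::WAITING:           // jump once the markers are close
        return markersClose ? FrogState::JUMPING : s;
    case FrogState::JUMPING:           // re-attach frog to the transform node
        return FrogState::FLYING;
    case FrogState::FLYING:            // animate until the target is reached
        return targetReached ? FrogState::LANDING : s;
    case FrogState::LANDING:           // re-attach frog to the new marker
        return FrogState::SITTING;
    }
    return s;
}
```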

Interaction with Objects
State machine – SITTING – WAITING
- We check the time passed in the SITTING state and switch after 2000 ms to the WAITING state; the counter is reset afterwards
- In the WAITING state we check the marker proximity and switch to the JUMPING state if successful
Snippet 3-5 Part 1

Interaction with Objects
State machine – JUMPING
- We detach the frog from the stone and attach it to a transformation core; the transformation matrix of the core is initialised with the marker's transformation matrix
- The state changes to FLYING and the counter is reset
Snippet 3-5 Part 2

Interaction with Objects
State machine – FLYING
- Retrieve the target position and the current position and determine the path of the frog
- In case the flying time becomes too high, we switch to the LANDING state
- If the distance between the frog and the target becomes too big, it turns back
Snippet 3-5 Part 3

Interaction with Objects
State machine – FLYING continued
- If everything went alright, we determine the direction vector by normalising the path
- We scale this normalised vector to move the frog at a constant speed based on the current deltaTime
- Now we calculate a transformation matrix based on this vector
- Finally the generated transformation is applied to the frog transformation
Snippet 3-5 Part 4
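The motion step described above can be sketched as: normalise the path to the target, then advance by speed times deltaTime, so the frog moves at a constant speed regardless of frame rate. The function name and the speed value are illustrative; building the full transformation matrix is omitted.

```cpp
#include <cmath>

// Advance a position toward a target at constant speed (units per ms).
void flyStep(double pos[3], const double target[3],
             double speedPerMs, double deltaMs) {
    double path[3], len = 0.0;
    for (int i = 0; i < 3; ++i) {
        path[i] = target[i] - pos[i];
        len += path[i] * path[i];
    }
    len = std::sqrt(len);
    if (len == 0.0) return;                 // already at the target
    double stepLen = speedPerMs * deltaMs;
    if (stepLen > len) stepLen = len;       // do not overshoot the target
    for (int i = 0; i < 3; ++i)
        pos[i] += path[i] / len * stepLen;  // move along the unit direction
}
```

Clamping the step length to the remaining distance is what lets the LANDING transition detect "target reached" without the frog oscillating around the stone.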

Interaction with Objects
State machine – LANDING
- The frog is detached from the transformation node and attached to the new marker
- The counter is reset and the state is set back to the initial SITTING state
- At the end of each iteration of the display loop the changes are committed
Snippet 3-5 Part 5

Interaction with Objects
This is how it should look. The final step is fully up to you to implement:
- Have the frog jump between three markers
- You will have to load an additional marker
- You will have to determine the distances between the three different markers and find the closest
- Let the frog hop from one marker to the closest marker

Combining ARToolKit with inVRs
- inVRs provides a simple binding to ARToolKit which implements interaction with markers and the scene
- The next lab will show some basic desktop interaction with ARToolKit
- To enable full AR support for the inVRs framework, the input interface as well as the output interface can be enhanced

Things to do at home:
- Try to make two frogs switch stones when they look at each other
- Take a look at multi-pattern calibration
- Try to interconnect inVRs with ARToolKit and write an ARToolKit marker input device
- Take a look at ARToolKit Plus – what are the advantages? Can you manage to interconnect ARToolKit Plus with OpenSG or inVRs?

Useful Links
- ARToolKit SourceForge entry page: http://artoolkit.sourceforge.net/
- ARToolKit – GPL version
- ARToolKit – commercial version
- ARToolKit Plus web page
- OpenSG web page
- inVRs web page

Thank You!