PlayAnywhere: A Compact Interactive Tabletop Projection-Vision System. Professor: Tsai, Lian-Jou. Student: Tsai, Yu-Ming. PPT production rate: 100%. Date: 2011/3/23.
Andrew D. Wilson, Microsoft Research, Redmond, WA. UIST '05: Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology, 2005. STU ESLab 1

Outline: Introduction, Hardware selection, Image Rectification, Touch and Hover, PlayAnywhere Visual Code, Page Tracking, Flow Move, Conclusion

Introduction (1/2) The system is based on the combination of projection for display and computer vision techniques for sensing. The projector and camera, as well as computing resources (CPU, storage, ...), are built into the same compact device. This combined projecting and sensing pod can be quickly placed on any flat surface in the user's environment, and requires no calibration of the projection or sensing system.

Introduction (2/2) We introduce a touch detection algorithm based on the observation of shadows; a fast, simple visual code format and detection algorithm; the ability to continuously track sheets of paper; and finally, an optical flow-based algorithm for the manipulation of onscreen objects.

Hardware selection (1/3) PlayAnywhere uses an NEC WT600 DLP projector to project a 40" diagonal image onto an ordinary table surface. The NEC WT600 uses four aspheric mirrors to project a normal 1024x768 rectangular image from a very oblique angle at extremely short distance. For a 40" diagonal image, the NEC WT600 requires 2.5" between its leading face and the projection surface, while a 100" diagonal image is obtained at a distance of 26".

Hardware selection (2/3) The IR illuminant is placed off axis from the single camera so that objects above the surface generate controlled shadows indicating height. PlayAnywhere uses an Opto Technology OTLH-0070-IR high-power LED package and a Sony ExView analog grayscale CCD NTSC camera. The camera is mounted near the top of the projector, giving an oblique view of the surface. A very wide-angle micro lens (2.9 mm focal length) captures the entire projected surface.

Hardware selection (3/3) Advantages: (a) Difficult and dangerous overhead installation of the projector is avoided. (b) There is no need to re-calibrate the camera and projection to the surface when the unit is moved. (c) With the oblique projection, occlusion problems typical of front-projected systems are minimized. (d) A 40" diagonal projection surface is adequate for many advanced interactive table applications.

Image Rectification (1/2) The wide-angle lens imparts significant barrel distortion on the input image, while the oblique position of the camera imparts a projective distortion, or foreshortening. As a first step, an image rectification process removes both distortions via standard bilinear interpolation techniques.
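The rectification step can be sketched as an inverse warp: for each pixel of the rectified output, map back into the distorted input and sample with bilinear interpolation. The sketch below handles only the projective part via a hypothetical inverse homography `H_inv`; in a full implementation the barrel-distortion correction would be folded into the same backward mapping.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Sample a grayscale image at fractional coordinates (x, y)
    with bilinear interpolation, clamping at the image border."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1 = min(x0 + 1, img.shape[1] - 1)
    y1 = min(y0 + 1, img.shape[0] - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot

def rectify(src, H_inv, out_shape):
    """Inverse-warp src: each output pixel (u, v) is mapped through the
    inverse homography H_inv back into the source image and sampled."""
    out = np.zeros(out_shape)
    for v in range(out_shape[0]):
        for u in range(out_shape[1]):
            p = H_inv @ np.array([u, v, 1.0])
            x, y = p[0] / p[2], p[1] / p[2]
            if 0 <= x <= src.shape[1] - 1 and 0 <= y <= src.shape[0] - 1:
                out[v, u] = bilinear_sample(src, x, y)
    return out
```

Mapping backward from output to input (rather than forward) guarantees every rectified pixel gets a value, which is why warping implementations conventionally use the inverse transform.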

Image Rectification (2/2) (figure)

Touch and Hover (1/2) One approach is to project a sheet of infrared light just above the surface and watch for fingers intercepting the light from above. Instead, we present a technique which exploits the change in appearance of shadows as an object approaches the surface.

Touch and Hover (2/2) The system can calculate the exact height of the finger over the surface if the finger and its shadow are matched to each other and tracked. The precision of the touch location is limited by the resolution of the imaged surface (determined by imaging grating charts to be about 3-4 mm, approximately 4.5 image pixels).
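The geometric idea can be sketched as follows: with the off-axis IR illuminant, a fingertip's shadow separates from the fingertip as the finger lifts off the surface, so the fingertip/shadow-tip separation is a proxy for height. The pixels-per-millimeter constant and threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical calibration: separation between fingertip and shadow tip,
# in image pixels, per millimeter of finger height. The real value would
# depend on the actual illuminant and camera geometry.
PIXELS_PER_MM = 1.5
TOUCH_THRESHOLD_MM = 3.0  # on the order of the system's 3-4 mm precision

def finger_height_mm(finger_tip, shadow_tip):
    """Estimate finger height from the fingertip/shadow-tip separation."""
    d = np.linalg.norm(np.subtract(finger_tip, shadow_tip).astype(float))
    return d / PIXELS_PER_MM

def is_touching(finger_tip, shadow_tip):
    """A finger touching the surface meets its own shadow, so the
    separation (and therefore the estimated height) collapses to ~0."""
    return finger_height_mm(finger_tip, shadow_tip) <= TOUCH_THRESHOLD_MM
```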

PlayAnywhere Visual Code (1/3) Visual codes are useful to locate and identify game pieces, printed pages, media containers, knobs and other generic objects. The visual code design uses a 12-bit code (supporting 4,096 unique codes), enough for each of the game piece types in chess, for example.

PlayAnywhere Visual Code (2/3)
1. Compute the edge intensity and orientation everywhere in the image using the Sobel filter.
2. For each pixel with sufficiently high edge intensity, use the edge orientation to establish a rotated local coordinate system.
   a. Read the pixel value corresponding to each bit in the code according to the code layout.
   b. Check the code value against a table of codes used in the current application.
3. Rank each candidate according to some criteria.
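Steps 2a-2b above can be sketched as below. The bit layout (a ring of 12 sample points), the code radius and the binarization threshold are hypothetical stand-ins for the actual PlayAnywhere code layout, which the slides do not specify.

```python
import numpy as np

# Hypothetical layout: 12 sample points on a unit circle around the code
# centre, defined before rotation into the image frame.
BIT_OFFSETS = [(np.cos(a), np.sin(a))
               for a in np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)]

def read_code(img, center, theta, radius=4.0, thresh=128):
    """Read a 12-bit code at `center`: rotate the sampling layout by the
    local edge orientation `theta`, then threshold each sampled pixel
    into one bit of the code value."""
    c, s = np.cos(theta), np.sin(theta)
    bits = 0
    for i, (dx, dy) in enumerate(BIT_OFFSETS):
        # Rotate the offset into the image frame and scale by the radius.
        x = int(round(center[0] + radius * (c * dx - s * dy)))
        y = int(round(center[1] + radius * (s * dx + c * dy)))
        if img[y, x] > thresh:
            bits |= 1 << i
    return bits

def match_code(bits, valid_codes):
    """Step 2b: accept only codes registered by the current application."""
    return bits if bits in valid_codes else None
```

Checking candidates against the application's code table rejects most spurious reads cheaply, before the ranking in step 3.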

PlayAnywhere Visual Code (3/3) The contours can be found quickly using the Hough transform applied to circles, reusing the edge orientation information computed above. For each pixel in the input image, calculate the center of the circle of a given radius from the edge orientation found at the input coordinates.
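An orientation-guided circle Hough transform for a single known radius can be sketched as follows: because the edge orientation is known, each edge pixel votes for just two candidate centers instead of a whole circle of them. Restricting to one radius is a simplification; a real detector might sweep a small range of radii.

```python
import numpy as np

def hough_circle_centers(edge_mag, edge_theta, radius, mag_thresh=0.5):
    """Accumulate circle-center votes: each strong edge pixel votes for
    the point `radius` pixels along its gradient direction (both signs,
    since the code disc may be darker or lighter than its surround)."""
    h, w = edge_mag.shape
    acc = np.zeros((h, w))
    ys, xs = np.nonzero(edge_mag > mag_thresh)
    for y, x in zip(ys, xs):
        t = edge_theta[y, x]
        for sign in (+1, -1):
            cx = int(round(x + sign * radius * np.cos(t)))
            cy = int(round(y + sign * radius * np.sin(t)))
            if 0 <= cx < w and 0 <= cy < h:
                acc[cy, cx] += 1
    return acc
```

Peaks in the accumulator are candidate code centers; the orientation constraint is what makes this fast enough to run per frame.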

Page Tracking The page tracking algorithm is based on a Hough transform with the Sobel edge and orientation information as input. Two pairs of parallel lines perpendicular to each other are verified as a page by ensuring that there are strong edges along a significant fraction of the lines in the original Sobel image.
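The parallel/perpendicular grouping can be sketched over Hough line parameters (theta, rho) as below. The edge-support check along each candidate line, which the slide mentions, is omitted here for brevity, and the angle tolerance is an assumed value.

```python
import numpy as np

def find_page(lines, angle_tol=np.radians(5)):
    """Group Hough lines, each given as (theta, rho), into two
    perpendicular pairs of parallel lines: the four edges of a
    candidate page. Returns the four lines, or None."""
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            t1, t2 = lines[i][0], lines[j][0]
            # First pair: nearly equal angle (parallel edges).
            if abs(t1 - t2) < angle_tol:
                for k in range(len(lines)):
                    for l in range(k + 1, len(lines)):
                        if k in (i, j) or l in (i, j):
                            continue
                        t3, t4 = lines[k][0], lines[l][0]
                        # Second pair: parallel, and perpendicular to the first.
                        if (abs(t3 - t4) < angle_tol
                                and abs(abs(t1 - t3) - np.pi / 2) < angle_tol):
                            return (lines[i], lines[j], lines[k], lines[l])
    return None
```

The brute-force search is acceptable because the Hough transform returns only a handful of strong lines per frame.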

Flow Move (1/2) One approach to implementing an interaction that allows translation, rotation and scaling of an onscreen object is to track fingers of hands placed on the virtual object, and then continuously calculate the change in position, orientation and scale based on the relative motion of those tracked points. Instead, optical flow computations make very few assumptions about the nature of the input images and typically only require that there be sufficient local texture on the moving object.
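Recovering the per-frame translation, rotation and scale from flow vectors can be sketched as a least-squares fit of a similarity transform, here via a complex-number formulation about the object centroid. This is one standard estimator, not necessarily the one PlayAnywhere uses.

```python
import numpy as np

def similarity_from_flow(pts, flow):
    """Least-squares similarity transform from sparse flow vectors on the
    object: pts is an (N, 2) array of point positions, flow the (N, 2)
    displacements to the next frame. Returns the centroid translation,
    the rotation angle and the scale factor about the centroid."""
    p = np.asarray(pts, float)
    q = p + np.asarray(flow, float)
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    P, Q = p - pc, q - qc
    # Treat centered 2D points as complex numbers: a rigid rotation by
    # theta plus scaling by s is multiplication by a = s * e^{i*theta}.
    zp = P[:, 0] + 1j * P[:, 1]
    zq = Q[:, 0] + 1j * Q[:, 1]
    a = (np.conj(zp) @ zq) / (np.conj(zp) @ zp)  # least-squares estimate
    return qc - pc, np.angle(a), np.abs(a)
```

Because the fit averages over all flow vectors on the object, it degrades gracefully when individual flow estimates are noisy, which suits optical flow's weak assumptions about the input.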

Flow Move (2/2) (figure)

Conclusion PlayAnywhere is an interactive projection-vision system with a number of sensing capabilities that demonstrate the flexibility of computer vision processes. Computer vision techniques have a high computational cost, and most commodity camera systems acquire images at only 30 Hz, which is not fast enough to support certain kinds of high-fidelity input. PlayAnywhere suggests a form factor that is in many ways more attractive than either rear-projection or front-projection systems.

Thank you for listening!