CIRP Annals - Manufacturing Technology 60 (2011) 1–4 Augmented assembly technologies based on 3D bare-hand interaction S.K. Ong (2)*, Z.B. Wang Mechanical Engineering Department, Faculty of Engineering, National University of Singapore 2013/06/13 Advisors: Prof. 洪弘祈, Prof. 李正隆 Presenter: 廖偉丞 Group members: 許丁友, 張琴翊

Contents 1. Abstract 2. Introduction 3. 3D bare-hand interaction method 4. Assembly data management 5. Augmented assembly process 6. Assembly sequence evaluation and feedback 7. Implementation and case study 8. Conclusion and future work

Abstract  Augmented reality has been applied to develop augmented assembly systems.  However, most reported studies used pre-defined assembly information.  AR is predominantly used to display information, and interactions between users and the augmented environment are limited. 1/2

Abstract  This paper presents 3D bare-hand interaction in an augmented assembly environment to manipulate and assemble virtual components.  A hybrid method based on constraint analysis is presented, which interprets users' manual assembly intents robustly without the need for auxiliary CAD information. 2/2

Introduction  In recent years, virtual reality and virtual prototyping techniques have been widely used to simulate and evaluate assembly in the early design stage.  The assembly planning experience is limited to a pure virtual environment due to a lack of real spatial feeling and suitable sensory feedback. 1/3

Introduction  Augmented assembly is an application of augmented reality in assembly: an augmented environment is created in which virtual objects are combined with the real environment to enhance the assembly design and planning process.  This paper develops an AA system that interprets users' manual assembly intents, supports on-line constraint recognition, and provides a robust 3D bare-hand interface to allow realistic visual feedback during assembly. 2/3

Introduction  A bare-hand interaction augmented assembly (BHAA) system has been developed. 3/3

3D bare-hand interaction method  To achieve natural and intuitive human-computer interaction (HCI), human hands can be used as interaction devices in AEs.  Computer vision (CV) based human hand detection and tracking techniques can identify bare-hand gestures from video streams and use them as commands for the system. 1/3

3D bare-hand interaction method  In the 3DNBHI method, the users’ bare hands are tracked to extract the hand contours, determine the palm centers and detect the fingertips.  The hand centers are tracked using a matching algorithm that minimizes the displacement of the pair of hand centers over two successive frames, so that these two hands can always be differentiated from the live video stream. 2/3
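The frame-to-frame matching described above can be sketched as follows. With only two hands there are just two possible assignments, so minimizing the total displacement of the pair of centers reduces to comparing the identity pairing against the swapped pairing (a minimal illustration, not the paper's implementation):

```python
import math

def match_hand_centers(prev, curr):
    """Assign current-frame hand centers to previous-frame labels by
    choosing the pairing with the smaller total displacement, so the
    left and right hands stay consistently labelled across frames.
    `prev` and `curr` each hold two (x, y) centers; returns `curr`
    reordered so that index i corresponds to prev[i]."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Total displacement for the two possible assignments.
    keep = dist(prev[0], curr[0]) + dist(prev[1], curr[1])
    swap = dist(prev[0], curr[1]) + dist(prev[1], curr[0])
    return list(curr) if keep <= swap else [curr[1], curr[0]]
```

In a live system the same comparison would run once per captured frame, with `prev` replaced by the matched result each time.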

3D bare-hand interaction method  To achieve interactions between the bare hands and virtual objects, a small virtual sphere is rendered on each fingertip. 3/3
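A minimal sketch of how such fingertip spheres could support contact tests: a touch is registered when a vertex of a virtual object falls inside the sphere attached to a fingertip. The radius value and the vertex-based test are assumptions for illustration; the paper does not specify either.

```python
import math

# Hypothetical sphere size; the slide does not give a value.
FINGER_SPHERE_RADIUS = 5.0  # millimetres, assumed

def touches(fingertip, vertex, radius=FINGER_SPHERE_RADIUS):
    """True when a virtual-object vertex lies inside the fingertip sphere."""
    return math.dist(fingertip, vertex) <= radius

def touching_object(fingertip, vertices, radius=FINGER_SPHERE_RADIUS):
    """True when any vertex of the object is inside the fingertip sphere."""
    return any(touches(fingertip, v, radius) for v in vertices)
```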

Assembly data management  A tri-layer assembly data structure (TADS) is used for assembly data management in BHAA.  The first layer consists of geometric information.  The second layer stores the assembly sequence.  The third layer holds the assembly structure, with part-pair and surface-pair constraint information. 1/1
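The three layers could be modelled roughly as below. All field names are hypothetical; the slide only names the layers and their roles.

```python
from dataclasses import dataclass

@dataclass
class GeometryLayer:      # layer 1: geometric information
    parts: dict           # part name -> geometry (e.g. mesh file path)

@dataclass
class SequenceLayer:      # layer 2: assembly sequence
    order: list           # part names in assembly order

@dataclass
class StructureLayer:     # layer 3: part-pair / surface-pair constraints
    pairs: list           # (part_a, part_b, surface-pair constraints)

@dataclass
class TADS:
    geometry: GeometryLayer
    sequence: SequenceLayer
    structure: StructureLayer
```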

Augmented assembly process  With the 3DNBHI interface, users can manipulate and assemble two different parts more intuitively and realistically.  When the two parts are sufficiently close to each other, the user can adjust their positions and orientations easily and efficiently to trigger the assembly intent interpretation and constraint recognition functions. 1/6

Augmented assembly process 2/6

Augmented assembly process Assembly feature recognition  The surface contact query method is carried out as follows:  Step#1: Check the types of the surface pairs in contact.  Step#2: Check the parameters of the surface pairs in contact. 3/6

Augmented assembly process Assembly feature recognition  When the difference Ti in each parameter for a surface pair is within a threshold range, this surface pair remains in the list of surface contacts; otherwise, this surface pair will be removed from the list of surface contacts. 4/6
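The two-step surface contact query above could be sketched as follows. The mating-type table and threshold values are assumptions; the slides only state that type checks (Step 1) precede parameter checks (Step 2).

```python
def filter_surface_contacts(contacts, thresholds):
    """Step 1: keep only surface pairs whose types can mate.
    Step 2: keep a pair only if every parameter difference Ti is
    within its threshold; otherwise drop it from the contact list.
    `contacts`: list of dicts with 'types' (a pair of surface types)
    and 'param_diffs' (parameter name -> Ti).
    `thresholds`: parameter name -> allowed absolute difference."""
    MATING_TYPES = {("plane", "plane"), ("cylinder", "cylinder")}  # assumed
    kept = []
    for c in contacts:
        if tuple(sorted(c["types"])) not in MATING_TYPES:
            continue  # Step 1: incompatible surface types
        if all(abs(t) <= thresholds[name]
               for name, t in c["param_diffs"].items()):
            kept.append(c)  # Step 2: all Ti within threshold
    return kept
```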

Augmented assembly process Constraint confirmation and refinement  For each constraint that has been recognized, the system can adjust the position and orientation of the components automatically to ensure that the constraint is met precisely. 5/6
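As one concrete example of such automatic refinement, a recognized coaxial constraint can be enforced by rotating one part so its axis is parallel to the mating axis and then translating it onto that axis. This is a sketch using Rodrigues' rotation formula; the paper does not specify the method, and the degenerate anti-parallel case is ignored.

```python
import numpy as np

def refine_coaxial(axis_a, point_a, axis_b, point_b):
    """Snap part B onto part A's axis once a coaxial constraint is
    recognized: rotate B so its axis becomes parallel to A's, then
    translate B so the two axes coincide.  Returns (R, t) for part B."""
    a = np.asarray(axis_a, float); a /= np.linalg.norm(a)
    b = np.asarray(axis_b, float); b /= np.linalg.norm(b)
    v = np.cross(b, a)               # rotation axis (unnormalized)
    c = float(np.dot(b, a))          # cosine of the rotation angle
    if np.allclose(v, 0.0):
        R = np.eye(3)                # axes already parallel
    else:
        K = np.array([[0, -v[2], v[1]],
                      [v[2], 0, -v[0]],
                      [-v[1], v[0], 0]])
        R = np.eye(3) + K + K @ K / (1.0 + c)   # Rodrigues' formula
    p = R @ np.asarray(point_b, float)          # B's axis point after rotation
    d = np.asarray(point_a, float) - p
    t = d - np.dot(d, a) * a         # move perpendicular to the common axis
    return R, t
```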

Augmented assembly process Assembly tool operation  In BHAA, the user can select an assembly tool from the TADS to carry out an assembly operation.  The assembly tool operation process is carried out as follows. Step#1: Identification; Step#2: Operation; Step#3: Withdrawal. 6/6
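The three tool-operation steps can be read as a simple phase sequence. A minimal sketch of that progression follows; the idle phase and the transition table are assumptions, inferred from the steps listed on the slide.

```python
from enum import Enum, auto

class ToolPhase(Enum):
    IDLE = auto()            # no tool active (assumed resting state)
    IDENTIFICATION = auto()  # Step 1: the selected tool is identified
    OPERATION = auto()       # Step 2: the tool performs the assembly action
    WITHDRAWAL = auto()      # Step 3: the tool is withdrawn

# Assumed linear cycle through the three steps and back to idle.
_NEXT = {
    ToolPhase.IDLE: ToolPhase.IDENTIFICATION,
    ToolPhase.IDENTIFICATION: ToolPhase.OPERATION,
    ToolPhase.OPERATION: ToolPhase.WITHDRAWAL,
    ToolPhase.WITHDRAWAL: ToolPhase.IDLE,
}

def advance(phase: ToolPhase) -> ToolPhase:
    """Move the tool to the next phase of the operation cycle."""
    return _NEXT[phase]
```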

Assembly sequence evaluation and feedback  To improve assembly efficiency and reduce assembly cost, changes in assembly directions and tools should be minimized.  During an assembly simulation using BHAA, the user can evaluate an assembly sequence to obtain a near-optimum plan considering the ease of assembly, tool and orientation changes. 1/1
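One way to score candidate sequences on these criteria is to count tool and direction changes between consecutive steps; lower is better. The weighting scheme below is illustrative, not taken from the paper.

```python
def sequence_cost(steps, w_tool=1.0, w_dir=1.0):
    """Score an assembly sequence by the number of tool changes and
    assembly-direction changes between consecutive steps.
    `steps` is a list of (tool, direction) tuples."""
    tool_changes = sum(1 for a, b in zip(steps, steps[1:]) if a[0] != b[0])
    dir_changes = sum(1 for a, b in zip(steps, steps[1:]) if a[1] != b[1])
    return w_tool * tool_changes + w_dir * dir_changes
```

Candidate sequences produced during simulation could then be ranked by this cost to pick a near-optimum plan.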

Implementation and case study  The BHAA system runs consistently at about 15 frames per second at a frame resolution of 512 × 384.  The fingertip detection method has an RMS error of 1–2 mm in all axes. 1/2

Implementation and case study 2/2

Conclusion and future work  A 3D dual-handed interaction interface is provided to facilitate AA.  The limitations are a lack of force feedback, limited realism from using only the fingertips for virtual object manipulation, and the fact that only three typical assembly constraints are considered. 1/1