1 Transparent control of avatar gestures: a prototype
Francesca Barrientos
GUIR Meeting, 28 April 2000

2 Review of interface elements
- Tracking (restricted) hand motion
  - Free form
  - Does not require watching the widget except to put down the pen
  - Facilitates proprioception
  - Detects rests between gestures, used for calibration (see the sketch below)
- Design of mapping
  - Qualitative control of motion
  - No positional accuracy in output
  - Does not require input accuracy
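To make the rest-detection idea concrete, here is a minimal sketch, not the prototype's actual code: it assumes pen samples arrive as (x, y, t) tuples and flags a rest when pen speed stays below a threshold for a full window of samples. The threshold, window size, and class name are all illustrative.

```python
# Hypothetical rest detector for pen input: flags a rest when pen speed
# stays below a threshold for a full window of samples. A detected rest
# can then trigger recalibration of the mapping.
from collections import deque
from math import hypot

class RestDetector:
    def __init__(self, speed_threshold=5.0, window=10):
        self.speed_threshold = speed_threshold  # pixels/second, illustrative
        self.window = window                    # consecutive slow samples required
        self.slow = deque(maxlen=window)        # recent "is slow?" flags
        self.prev = None                        # previous (x, y, t) sample

    def update(self, x, y, t):
        """Feed one pen sample; return True when a rest is detected."""
        if self.prev is not None:
            px, py, pt = self.prev
            dt = max(t - pt, 1e-6)              # guard against zero time step
            speed = hypot(x - px, y - py) / dt
            self.slow.append(speed < self.speed_threshold)
        self.prev = (x, y, t)
        return len(self.slow) == self.window and all(self.slow)
```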

3 Kinematic mapping
- new joint angle = old joint angle + (Δ mouse position × scaling factor); a sketch of this mapping follows
- For small movements, motion stays in the horizontal plane
- Moving the pen closer to oneself moves the avatar's hand closer to its body
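A minimal sketch of this relative mapping, under assumptions: 2D pen deltas drive two arm joints, and the joint names, axis assignments, and scaling factor are placeholders rather than the prototype's actual parameters.

```python
# Sketch of the relative mapping above: each pen displacement nudges the
# current joint angles. Joint names, axis assignments, and the scaling
# factor are illustrative placeholders, not the prototype's values.
SCALE = 0.01  # radians per pixel of pen travel (assumed)

def update_joint_angles(angles, dx, dy):
    """Return new joint angles after a pen displacement (dx, dy).

    Horizontal pen motion swings the shoulder; moving the pen toward
    the user (dy) bends the elbow, drawing the hand toward the body.
    """
    return {
        "shoulder_yaw": angles["shoulder_yaw"] + dx * SCALE,
        "elbow_flex": angles["elbow_flex"] + dy * SCALE,
    }

pose = {"shoulder_yaw": 0.0, "elbow_flex": 0.3}
pose = update_joint_angles(pose, dx=12, dy=-4)  # small, qualitative nudge
```

Because the mapping is relative (a delta added to the current pose), a detected rest can re-center the pen without moving the avatar, which is what the calibration step on the previous slide exploits.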

4 Things that worked
- Within a small range, tracking is intuitive
- Can produce free-form gestures
- Movement seems expressive
- Control is transparent

5 Limitations
- Large motions are not intuitive
- Hard to form gestures based on proximity to other parts of the body
- Mapping may behave differently on different systems
- Limited range of motion
- Want the hand to be somewhat independent
- Only one body part can be moved at a time

6 Project goals
- Build a desktop VR system for controlling avatar gestures
  - In particular, for gesticulation
  - Control by tracking a part of the user's body
- Study use of the system
  - Show that it can perform some kinds of gesture not possible with other systems
  - Understand which features contribute to the communicative power of the system

7 Gesticulation
- Gesticulation is gesture that co-occurs with speech
- The meaning of the utterance is divided between the words and the gestures
- Gestures derive their meaning from their timing with respect to speech

8 Main problems
- Limited input and complex output
- The control interface divides the user's attention

9 Other solutions for nonverbal communication
- Discrete choices (menus) of expressions
  - Usually affective (happy, sad, angry, ...)
  - Usually facial
  - Usually used with chat environments
  - Examples: Emotion wheel in ComicChat; the Palace; Gesture/Mimic panel in Vlnet

10 Other solutions, continued
- Analysis of text
  - ComicChat uses keywords, acronyms, punctuation, etc. (a toy sketch follows this slide)
- Semi-autonomous behaviors
  - BodyChat by Vilhjálmsson
- Simple kinematic controls
  - Sliders and similar widgets (e.g., Slater)
- Full-body motion capture
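For contrast with continuous control, here is a toy sketch of the text-analysis style of control. The trigger table and gesture names are invented for illustration; they are not ComicChat's actual rules.

```python
# Toy keyword/punctuation trigger table in the spirit of ComicChat-style
# text analysis; the rules and gesture names are invented for illustration.
TRIGGERS = {
    "lol": "laugh",
    "hello": "wave",
    "bye": "wave_goodbye",
}

def gesture_for_message(text):
    """Pick a canned gesture from the message text, or None."""
    lowered = text.lower()
    for token, gesture in TRIGGERS.items():
        if token in lowered:
            return gesture
    if lowered.rstrip().endswith("!"):
        return "emphatic"                # punctuation as a fallback cue
    return None                          # no trigger: avatar stays neutral

print(gesture_for_message("Hello everyone!"))  # -> wave
```

Note how this approach is discrete by construction: gestures can only be chosen from a fixed repertoire, and their timing is tied to when the message is sent, not to the speech itself.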

11 Nonverbal communication function table (examples)

Function    | Display/features                                | Speech timing | Intention/Awareness | Control
Affective   | Blushing/facial coloring                        | None          | Possible            | Physiological sensor
Illustrator | Hand motions/describe shape of object or action | Yes           | High                | Continuous and inverse kinematics

12 Nonverbal communication function table

Function or form     | Display/features                      | Speech timing | Intention/Awareness | Control
Affective            | Mimics and postures/lasting           | None          | High                | Discrete choices
Emblems              | Specific hand shape                   | None          | High                | Discrete choices
Attention regulation | Directed glances                      | None          | Low                 | Discrete input/avatar agent
Discourse marking    | Hand motions/recognizable inflections | Yes           | High                | Continuous kinematic

13 Where I fit in
- Forms and functions
  - Emphasis and color
  - Discourse marking
  - Personality?
- Features
  - Free form
  - Co-occurs with speech
  - Intention and awareness: the user has voluntary control
  - Continuous control

14 Design informed by gesture research
- The stroke is the most effortful phase
  - A resting position is part of the prototype
- Space utilization
  - Design/choice of mapping may be based on the type of gesture
- Beats have a favored direction and location
  - Generation of a beat gesture should be easily repeatable

15 Next step
- 6DOF tracker input
  - Track the position of the wrist in space instead of a position on a plane
- Still designing the kinematic mapping
  - Proprioception
  - Detect rest and preparation phases of a gesture
  - Independence of hand orientation
  - Qualitative control
  - Forward kinematic mapping (see the sketch after this slide)
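One way the forward kinematic step could look, sketched under assumptions: wrist displacement from a calibrated rest pose drives joint angles directly, with per-axis gains and no inverse kinematics solve. The gains, axis choices, and class name are hypothetical.

```python
# Hypothetical forward kinematic control from a 6DOF wrist tracker:
# displacement from a calibrated rest pose drives joint angles directly,
# with no inverse kinematics solve. Gains, axis choices, and names are
# assumptions for illustration.
import numpy as np

GAINS = np.array([1.5, 1.5, 2.0])  # radians per meter, illustrative

class ForwardKinematicMapper:
    def __init__(self):
        self.rest = None  # wrist position captured at a detected rest

    def calibrate(self, wrist_pos):
        """Record the wrist position at a detected rest as the origin."""
        self.rest = np.asarray(wrist_pos, dtype=float)

    def joint_angles(self, wrist_pos):
        """Map wrist displacement to (shoulder_yaw, shoulder_pitch, elbow)."""
        if self.rest is None:
            raise RuntimeError("call calibrate() at a rest pose first")
        delta = np.asarray(wrist_pos, dtype=float) - self.rest
        return GAINS * delta  # qualitative: no exact hand position is solved for
```

Driving joint angles directly keeps the control qualitative: the avatar's hand position is never solved for exactly, consistent with the "no positional accuracy in output" goal on slide 2.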

16 And next
- Networked virtual environment
- User interface features
  - Navigation
  - Other gesture feedback (since the user will have the avatar's point of view)
- User studies
- ...

17 Summary
- Presented a prototype gesture control system
- Reminded you of my research goals
- Suggested a framework for selecting controls suitable for different types of gestures
- Described how the design is informed by gesture research
- Outlined next steps

18 Any questions?