
Real-Time Animation of Realistic Virtual Humans

1. The 3D virtual player is controlled by a real person who wears an HMD and many sensors. 2. Unlike in video games, the animation is not predefined. 3. Unlike in rendered films, the animation runs in real time.

Real-Time Animation of Realistic Virtual Humans To achieve a real-time virtual human application we need to consider: 1. Modeling people. 2. How to represent the deformation of the human body. 3. Motion control.

Real-Time Animation of Realistic Virtual Humans 1. Body Creation and Skeleton Animation - Zhou Bin 2. Facial Deformation - Hu Yi 3. Body Deformation and Animation Framework - Yang Yufei

Real-Time Animation of Realistic Virtual Humans Body Creation: First layer: Skeleton hierarchy. Second layer: Metaballs attached to the skeleton. Third layer: Convert the metaballs to a mesh surface (skin mesh). Fourth layer: Texture.

Real-Time Animation of Realistic Virtual Humans First layer: Skeleton hierarchy 1. The skeleton is articulated much like a real human's. 2. We can define all human postures using this skeleton. 3. The relative limb sizes of the skeleton decide the figure of the virtual person.
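
A minimal sketch of how such a skeleton hierarchy can be represented and posed: each joint stores an offset (the limb size) and a rotation, and world positions are obtained by composing transforms down the tree. The class and joint names are illustrative, not the data structure of the original system.

```python
import numpy as np

class Joint:
    """One node of a skeleton hierarchy (hypothetical minimal structure)."""
    def __init__(self, name, offset, children=None):
        self.name = name
        self.offset = np.asarray(offset, dtype=float)  # limb vector from the parent: defines limb size
        self.children = children or []
        self.rotation = np.eye(3)                      # current joint rotation (posture parameter)

    def world_positions(self, parent_pos=np.zeros(3), parent_rot=np.eye(3), out=None):
        """Compose rotations down the hierarchy to place every joint in world space."""
        out = {} if out is None else out
        rot = parent_rot @ self.rotation
        pos = parent_pos + parent_rot @ self.offset
        out[self.name] = pos
        for child in self.children:
            child.world_positions(pos, rot, out)
        return out

# A tiny arm chain: the relative offsets (limb sizes) determine the figure's proportions.
hand = Joint("hand", [0.25, 0, 0])
elbow = Joint("elbow", [0.3, 0, 0], [hand])
shoulder = Joint("shoulder", [0, 0, 0], [elbow])

# A posture is just a set of joint rotations, e.g. bend the elbow 90 degrees about Z.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
elbow.rotation = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
print(shoulder.world_positions())
```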

Real-Time Animation of Realistic Virtual Humans Second layer: Metaballs attached to the skeleton Constructing the human body from columns cannot avoid gaps in the body. Constructing it from columns plus sphere joints avoids the gaps but cannot capture the body's smooth, gradual shape. Constructing it from metaballs (spheres and ellipsoids) can do both; the figure at left is built from two ellipsoids.
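
A small sketch of the metaball idea: each ellipsoidal primitive contributes a smoothly decaying field, and the body surface is the set of points where the summed field reaches a threshold, so overlapping primitives blend into one smooth shape. The particular falloff function and iso value below are illustrative assumptions, not necessarily those of the original system.

```python
import numpy as np

def ellipsoid_field(p, center, radii):
    """Smoothly decaying field of one ellipsoidal metaball primitive."""
    d2 = np.sum(((p - center) / radii) ** 2)   # normalized ellipsoidal distance squared
    return max(0.0, 1.0 - d2) ** 2             # 1 at the center, 0 beyond the ellipsoid

def inside_metaball_surface(p, primitives, iso=0.5):
    """A point is inside the blended shape when the summed field exceeds iso."""
    return sum(ellipsoid_field(p, c, r) for c, r in primitives) >= iso

# Two overlapping ellipsoids (e.g. upper and lower arm) blend into one smooth shape.
primitives = [(np.array([0.0, 0, 0]), np.array([0.3, 0.1, 0.1])),
              (np.array([0.4, 0, 0]), np.array([0.3, 0.1, 0.1]))]
print(inside_metaball_surface(np.array([0.2, 0.0, 0.0]), primitives))  # True: the gap between them is filled
```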

Real-Time Animation of Realistic Virtual Humans Second layer: Metaballs attached to the skeleton 1. These two human models are assembled from metaballs (ellipsoid components). 2. Metaballs blend smoothly and gradually, so they are well suited to modeling human bodies. 3. The different colors reflect the attributes of each metaball: blendable or unblendable, deformable or nondeformable.

Real-Time Animation of Realistic Virtual Humans Third layer: Convert metaballs to mesh surfaces (skin mesh). 1. Convert the metaballs into a set of meshes; this can be done by ray tracing and sampling the intersection points. 2. These meshes determine the lighting of the human body. 3. Texture mapping corresponds to these mesh surfaces.
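
A hedged sketch of the ray-sampling step: march along a ray, detect where the summed field crosses the iso value, and refine the intersection by bisection; the sampled points become skin-mesh vertices. The field here is a single sphere for brevity, and the marching parameters are arbitrary.

```python
import numpy as np

def field(p):
    """Summed metaball field (a single spherical primitive, for brevity)."""
    return max(0.0, 1.0 - np.dot(p, p)) ** 2

def ray_sample(origin, direction, iso=0.5, t_max=3.0, steps=200):
    """March along the ray and bisect the first crossing of the iso-surface."""
    direction = direction / np.linalg.norm(direction)
    prev_t, prev_f = 0.0, field(origin) - iso
    for i in range(1, steps + 1):
        t = t_max * i / steps
        f = field(origin + t * direction) - iso
        if prev_f * f < 0:                      # sign change: the surface was crossed
            for _ in range(20):                 # bisection refinement
                mid = 0.5 * (prev_t + t)
                fm = field(origin + mid * direction) - iso
                if prev_f * fm < 0:
                    t = mid
                else:
                    prev_t, prev_f = mid, fm
            return origin + 0.5 * (prev_t + t) * direction
        prev_t, prev_f = t, f
    return None

# Cast rays outward from inside the body toward regularly spaced directions.
print(ray_sample(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])))
```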

Real-Time Animation of Realistic Virtual Humans Fourth layer: Mapping texture Texturing has a low computational cost but observably improves the quality of the virtual object. (Figures: real human hand texture; textured 3D model; 3D Bernard Boxton without texture.)

Real-Time Animation of Realistic Virtual Humans Body Animation: First layer: Skeleton motion -- lets avatars take postures and perform actions. Second layer: Mesh surface (skin mesh) deformation -- presents the deformation of the human's skin.

Real-Time Animation of Realistic Virtual Humans Skeleton: 1. The skeleton hierarchy is defined by a set of joints corresponding to the main joints of a real human. 2. Each joint consists of a set of degrees of freedom (DOFs); the DOFs decide the ranges in which the joint can translate and rotate.
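
An illustrative sketch of a DOF with range limits: setting a value outside the allowed range is clamped, so the joint can never reach an impossible posture. The names and ranges are made up for the example, not anatomical data.

```python
import numpy as np

class DOF:
    """One degree of freedom of a joint, limited to a valid range (assumed representation)."""
    def __init__(self, name, low, high):
        self.name, self.low, self.high = name, low, high
        self.value = 0.0

    def set(self, value):
        # Clamp so the skeleton can never take an anatomically impossible posture.
        self.value = float(np.clip(value, self.low, self.high))
        return self.value

# The elbow has a single flexion DOF; the range here is illustrative only.
elbow_flexion = DOF("elbow_flexion", low=0.0, high=np.radians(150))
print(elbow_flexion.set(np.radians(200)))   # clamped to ~2.62 rad (150 degrees)
```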

Real-Time Animation of Realistic Virtual Humans Three methods of skeleton motion control: 1. The skeleton motion is captured in real time and drives the avatar. 2. The skeleton motion is predefined and activated from a database in response to the user's input, e.g. Sony's EyeToy. 3. The skeleton motion is computed dynamically and does not require the user's continual intervention, e.g. complex games and AI applications.

Real-Time Animation of Realistic Virtual Humans Human Head Modeling and Facial Animation Simulating humans requires real-time visualization and animation.

Real-Time Animation of Realistic Virtual Humans Human Head Modeling Scanning: scan the surface of the head and construct a head model. Or use a sculpting approach: the Sculptor software.

Real-Time Animation of Realistic Virtual Humans Using a sculpting approach: the Sculptor software.

Real-Time Animation of Realistic Virtual Humans Facial Animation: facial deformation model and facial motion control.

Real-Time Animation of Realistic Virtual Humans Facial Deformation Model 1. Consider the human face as a polygonal mesh. 2. Define regions on the mesh. 3. Define a control lattice on the region of interest. 4. Muscle actions are simulated by changing the control points' weights. 5. A stiffness factor controls the amount of deformation allowed for each point.
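
A minimal sketch of the control-lattice deformation described above: each vertex of the region is displaced by a weighted blend of the control points' displacements, scaled by its stiffness factor. The inverse-distance weighting is an assumed choice; the original system's weight computation may differ.

```python
import numpy as np

def deform_region(points, controls, control_disps, stiffness):
    """Move mesh points by a weighted blend of control-point displacements.
    points:        (N,3) vertices of the facial region
    controls:      (M,3) control-lattice points
    control_disps: (M,3) displacements produced by a simulated muscle action
    stiffness:     (N,)  per-vertex factor in [0,1]; 0 = rigid, 1 = fully deformable"""
    d = np.linalg.norm(points[:, None, :] - controls[None, :, :], axis=2)  # (N, M) distances
    w = 1.0 / (d + 1e-6)                                                   # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)                                      # normalize per vertex
    disp = w @ control_disps                                               # (N, 3) blended displacements
    return points + stiffness[:, None] * disp

points = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])
controls = np.array([[0.0, 0.1, 0.0], [0.2, 0.1, 0.0]])
raise_lip = np.array([[0.0, 0.02, 0.0], [0.0, 0.02, 0.0]])   # the muscle pulls the lattice upward
stiffness = np.array([1.0, 0.3])
print(deform_region(points, controls, raise_lip, stiffness))
```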

Real-Time Animation of Realistic Virtual Humans Facial Deformation Model

Real-Time Animation of Realistic Virtual Humans Facial Motion Control

Real-Time Animation of Realistic Virtual Humans The Facial Action Coding System 1. Define basic motion parameters as minimum perceptible actions (MPAs), e.g. open_mouth, close_upper_eyelids, or raise_corner_lip.
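
A small sketch of how an MPA might be represented as data: a named parameter with a normalized intensity, so an expression or viseme is just a set of MPA values. The dataclass layout and intensity range are assumptions.

```python
from dataclasses import dataclass

@dataclass
class MPA:
    """A minimum perceptible action: a named facial parameter with a normalized intensity.
    The names follow the slide's examples; the 0..1 intensity range is an assumption."""
    name: str
    intensity: float = 0.0   # 0 = neutral, 1 = fully applied

# An expression or a viseme is simply a set of MPA intensities.
smile = [MPA("raise_corner_lip", 0.7), MPA("open_mouth", 0.2)]
blink = [MPA("close_upper_eyelids", 1.0)]
print(smile, blink)
```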

Real-Time Animation of Realistic Virtual Humans The Facial Action Coding System 2. Each MPA has a corresponding set of visible features, e.g. movement of the eyebrows, jaw, or mouth.

Real-Time Animation of Realistic Virtual Humans The Facial Action Coding System 3. The real-time facial animation module uses three different input methods: video, audio or speech, and predefined actions.

Real-Time Animation of Realistic Virtual Humans I. Video Input Recognition and tracking of the facial features are based on color sample identification, edge detection, and other image-processing operations. The feature capture and tracking rate is about 20 frames per second (fps).
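
An illustrative, library-free sketch of the color-sample identification idea: mark the pixels whose color lies close to a sampled feature color and take the centroid of that mask as the feature position. The tolerance and the synthetic frame are purely for demonstration, not the paper's tracking pipeline.

```python
import numpy as np

def locate_feature(frame, color_sample, tol=30.0):
    """Find the centroid of pixels whose color is within tol of a sampled feature color.
    frame: (H, W, 3) uint8 image; color_sample: RGB triple sampled on the feature."""
    diff = frame.astype(float) - np.asarray(color_sample, dtype=float)
    mask = np.linalg.norm(diff, axis=2) < tol
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                      # feature lost: a tracker would fall back to a wider search
    return float(xs.mean()), float(ys.mean())

# A synthetic 8x8 frame with a red "lip" patch in the lower half.
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[5:7, 2:5] = (200, 40, 40)
print(locate_feature(frame, (200, 40, 40)))   # -> approximately (3.0, 5.5)
```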

Real-Time Animation of Realistic Virtual Humans II. Audio Input Segment the audio into phonemes with their durations (or convert text input to phonemes); the phonemes are then decomposed into MPAs.
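
A hedged sketch of the phoneme-to-MPA decomposition: timed phonemes from the segmenter are turned into MPA keyframes via a lookup table. The table entries and MPA names below are invented for illustration; a real system uses a full viseme table.

```python
# Hypothetical phoneme-to-MPA decomposition table (illustrative values only).
PHONEME_TO_MPAS = {
    "aa": {"open_mouth": 0.9},
    "m":  {"open_mouth": 0.0, "press_lips": 0.8},
    "oo": {"open_mouth": 0.5, "pucker_lips": 0.7},
}

def phonemes_to_mpa_track(phonemes):
    """phonemes: list of (phoneme, duration_in_seconds) from the audio/text segmenter.
    Returns a list of (start_time, mpa_dict) keyframes for the animation module."""
    track, t = [], 0.0
    for ph, dur in phonemes:
        track.append((t, PHONEME_TO_MPAS.get(ph, {})))
        t += dur
    return track

print(phonemes_to_mpa_track([("m", 0.12), ("aa", 0.20), ("oo", 0.18)]))
```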

Real-Time Animation of Realistic Virtual Humans IV. Synchronization The MPA streams coming from predefined actions, speech input, audio input, and video input are combined by interpolation.
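
A minimal sketch of this synchronization step: MPA values from the different input channels are mixed by weighted interpolation into one frame of facial parameters. The channel weights below are illustrative, not values from the original system.

```python
def blend_mpa_frames(frames, weights):
    """Blend several MPA dictionaries (one per input channel) by weighted interpolation.
    frames:  list of {mpa_name: intensity} from video, audio and predefined actions
    weights: one weight per channel; they are normalized before mixing."""
    total = sum(weights)
    out = {}
    for frame, w in zip(frames, weights):
        for name, value in frame.items():
            out[name] = out.get(name, 0.0) + value * (w / total)
    return out

video  = {"raise_eyebrows": 0.6}
audio  = {"open_mouth": 0.8}
action = {"open_mouth": 0.2, "close_upper_eyelids": 1.0}
print(blend_mpa_frames([video, audio, action], weights=[0.4, 0.4, 0.2]))
```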

Real-Time Animation of Realistic Virtual Humans V. Composition

Real-Time Animation of Realistic Virtual Humans Body deformation: ways of representing humans (polygonal representation, visual-accuracy representation, and a combination of the two), plus implementation results. Animation framework. Two case studies.

Real-Time Animation of Realistic Virtual Humans Body deformation, the first method: polygonal representation The skin wrapped around the skeleton is represented with a fixed mesh divided at the important joints where deformations occur. Pros – simple and easy to implement. Cons – the virtual human appears “rigid” and lacks realism, and visually distracting artifacts may arise at the joints. (Image from the internet.)

Real-Time Animation of Realistic Virtual Humans Body deformation, the second method: visual-accuracy representation The application computes the skin from implicit primitives and uses a physical model to deform the body's envelope. Pros – stresses visual accuracy and yields very satisfactory results in terms of realism. Cons – too computationally demanding, and therefore unsuitable for real-time applications.

Real-Time Animation of Realistic Virtual Humans Body deformation, the third method: the combined method A combination of the previous two methods, allowing a good trade-off between realism and rendering speed. Steps – construct a body mesh, then deform it by manipulating skin contours.

Real-Time Animation of Realistic Virtual Humans The combined method --- step one: constructing the body mesh Output the body data as cross-sectional contours and convert the contour data to triangle meshes, which are easy to render and perform better. A triangle strip is constructed by connecting the points of two adjacent cross-sections; connecting two different body parts proves a bit more complicated. (Image from: Daniel Thalmann, Jianhua Shen, Eric Chauvineau, 1996.)
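
A small sketch of the triangle-strip construction between two adjacent contours, assuming both contours are sampled with the same number of points: the index order zig-zags between the lower and upper contour and closes the ring around the limb.

```python
def contour_strip_indices(n_points):
    """Triangle-strip vertex order joining contour i (indices 0..n-1) and
    contour i+1 (indices n..2n-1); assumes both contours have n_points samples."""
    order = []
    for k in range(n_points + 1):          # the +1 closes the ring around the limb
        j = k % n_points
        order.append(j)                    # point on the lower contour
        order.append(n_points + j)         # matching point on the upper contour
    return order

def strip_to_triangles(order):
    """Expand the strip order into explicit triangles, alternating the winding."""
    tris = []
    for i in range(len(order) - 2):
        a, b, c = order[i], order[i + 1], order[i + 2]
        tris.append((a, b, c) if i % 2 == 0 else (a, c, b))
    return tris

order = contour_strip_indices(4)
print(order)                 # [0, 4, 1, 5, 2, 6, 3, 7, 0, 4]
print(strip_to_triangles(order))
```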

Real-Time Animation of Realistic Virtual Humans The combined method --- step two: manipulating skin contours Transform a complicated 3D operation into a 2D operation by manipulating the cross-section contours. By setting the orientation and position of the plane in which every contour lies, we can achieve a smooth deformation of the skin. (Image from: Daniel Thalmann, Jianhua Shen, Eric Chauvineau, 1996.)

Real-Time Animation of Realistic Virtual Humans The combined method --- step two (cont.) Every joint lies in the plane of its contour when the skeleton is in the at-rest posture. Consider two segments of the arm whose directions are L1 and L2; Nu, N0, and Nl are the normal vectors of the cross-section planes, and Oi and Ni are the center and normal, respectively, of the i-th cross-section plane. Since we obtain Ni by interpolation, we can compute each vertex belonging to the i-th contour. (Image from: Prem Kalra, Nadia Magnenat-Thalmann, Laurent Moccozet, and Gael Sannier, 1998.)
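
A hedged sketch of this contour re-orientation: the i-th contour's plane normal Ni is interpolated between the limb directions L1 and L2, and the rest-pose contour points are rotated into that plane around the contour center Oi. The rotation construction (Rodrigues' formula) is an assumed simplification of the original computation.

```python
import numpy as np

def rotation_between(a, b):
    """Rotation matrix taking unit vector a onto unit vector b (Rodrigues' formula;
    the antiparallel case is not handled in this sketch)."""
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    v, c = np.cross(a, b), float(np.dot(a, b))
    if np.allclose(v, 0):
        return np.eye(3)
    k = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + k + k @ k * (1.0 / (1.0 + c))

def deform_contour(rest_points, rest_normal, O_i, L1, L2, t):
    """Re-orient one contour: its plane normal N_i is interpolated between the two limb
    directions L1 and L2 (t in [0,1]); the rest-pose points rotate around the center O_i."""
    N_i = (1.0 - t) * L1 + t * L2
    N_i /= np.linalg.norm(N_i)
    R = rotation_between(rest_normal, N_i)
    return O_i + (rest_points - O_i) @ R.T

# A square contour at rest, normal along x (the upper-arm direction L1).
contour = np.array([[0, 0.1, 0], [0, 0, 0.1], [0, -0.1, 0], [0, 0, -0.1]], dtype=float)
L1, L2 = np.array([1.0, 0, 0]), np.array([0.0, 1.0, 0])   # forearm bent 90 degrees
print(deform_contour(contour, L1, np.zeros(3), L1, L2, t=0.5))
```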

Real-Time Animation of Realistic Virtual Humans The combined method --- step two (conclusion) The contours above the elbow remain parallel: because our eyes naturally go to the areas surrounding major joints like the elbows or knees, only the contours near the joints need to be deformed. In practice, we determined the number of upper and lower contours to deform in a heuristic fashion; this saves rendering time with little visible degradation. (Image from: Prem Kalra, Nadia Magnenat-Thalmann, Laurent Moccozet, and Gael Sannier, 1998.)

Real-Time Animation of Realistic Virtual Humans Results The system uses a real-time-oriented 3D graphics toolkit called Performer. In the right image, the virtual human on the left is made up of 14,000 vertices and 13,500 textured triangles using deformation; the one on the right uses rigid meshes with 17,000 triangles. (Image and figure from: Prem Kalra, Nadia Magnenat-Thalmann, Laurent Moccozet, and Gael Sannier, 1998.)

Real-Time Animation of Realistic Virtual Humans Animation framework There is a close link between modeling and animation. The system is separated into three units: modeling, deformation, and motion control. – Modeling provides geometrical models for the body, hands, and face. – Deformations are performed separately on the different entities, based on the model used for each part. – Motion control generates and controls the movements of the different entities.
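
A schematic sketch of this three-unit split, with hypothetical class names and interfaces, just to show how modeling, motion control, and deformation cooperate once per rendered frame; it is not the original system's API.

```python
class ModelingUnit:
    def build(self):
        """Return geometrical models for the body, hands, and face."""
        return {"body": "body mesh", "hands": "hand meshes", "face": "face mesh"}

class MotionControlUnit:
    def next_state(self, t):
        """Produce skeleton joint angles and facial MPAs for time t
        (from capture, a predefined database, or procedural computation)."""
        return {"joints": {"elbow_flexion": 0.5 * t}, "mpas": {"open_mouth": 0.3}}

class DeformationUnit:
    def apply(self, models, state):
        """Deform each entity with its own technique (contours for the body,
        control lattice for the face), driven by the motion state."""
        return {name: (mesh, state) for name, mesh in models.items()}

models = ModelingUnit().build()                       # modeling runs once
for t in [0.0, 0.033, 0.066]:                         # then one step per rendered frame
    state = MotionControlUnit().next_state(t)
    deformed = DeformationUnit().apply(models, state)
    print(t, deformed["body"][1]["joints"])
```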

Real-Time Animation of Realistic Virtual Humans Animation framework (Figure from: Prem Kalra, Nadia Magnenat-Thalmann, Laurent Moccozet, and Gael Sannier, 1998.)

Real-Time Animation of Realistic Virtual Humans Case studies --- CyberTennis (Image from: Prem Kalra, Nadia Magnenat-Thalmann, Laurent Moccozet, and Gael Sannier, 1998.)

Real-Time Animation of Realistic Virtual Humans Case studies --- CyberDance (Image from: Prem Kalra, Nadia Magnenat-Thalmann, Laurent Moccozet, and Gael Sannier, 1998.)