
1 Contribution to the study of visual, auditory and haptic rendering of information of contact in virtual environments. 9/12/2008. Jean Sreng. Advisors: Claude Andriot, Anatole Lécuyer. Director: Bruno Arnaldi.

2 Introduction. Context: manipulation of solid objects in Virtual Reality. Example applications: industrial virtual assembly / disassembly / maintenance. Focus: perception, simulation and rendering of contacts between virtual objects.

3 Outline. State of the art on the perception, simulation and rendering of contact. Contributions: integrated 6DOF multimodal rendering approach; visual rendering of multiple contacts; spatialized haptic rendering of contact. Conclusion.

4 Human perception of contact. Visual perception of contact: stereoscopy (Hu et al. 2000), motion parallax (Wanger et al. 1992), shadows (Wanger et al. 1992, Hu et al. 2002). Auditory perception of contact: contact properties can be directly perceived (Gaver 1993); contact sounds convey information about shape and material (Klatzky 2000, Rocchesso 2001).

5 Haptic perception. Haptic perception provides an intuitive way to feel contact (Loomis et al. 1993): tactile perception (patterns at the surface of the skin) and kinesthetic perception (positions and forces). Physical properties can be perceived through contact (Klatzky et al. 2003): shape / texture / temperature / weight / contact forces. Contact features can also be perceived through vibrations: material (Okamura et al. 1998), texture (Lederman et al. 2001).

6 Multimodal perception of contact. Known interactions between modalities. Visual-auditory interaction, e.g. sound can shift the perception of impact (Sekuler et al. 1997). Visual-haptic interaction, e.g. pseudo-haptic feedback (Lécuyer et al. 2002). Auditory-haptic interaction, e.g. sound can modulate the perception of roughness (Peeva et al. 1997).

7 Simulation of contact. Multiple contact models (impact/friction): rigid (Newton/Huygens impact law, Coulomb/Amontons friction law) and locally deformable (Hunt-Crossley impact law, LuGre friction law). Multiple simulation methods: collision detection, e.g. VPS (McNeely 1999) and LMD (Johnson 2003); physical simulation, either constraint-based (Baraff 1989) or penalty-based (Moore 1988), as sketched below.
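As a small illustration of the penalty approach cited on this slide, the sketch below computes a reaction force proportional to the interpenetration depth; the stiffness and damping values are placeholder assumptions, not values from the thesis.

```python
# Illustrative penalty-based contact force (in the spirit of Moore 1988):
# the reaction is proportional to the interpenetration depth along the normal,
# with an optional damping term. Stiffness k and damping b are placeholders.

def penalty_force(penetration_depth, normal, approach_speed, k=2000.0, b=5.0):
    """Return a 3D reaction force; zero when the objects do not interpenetrate."""
    if penetration_depth <= 0.0:
        return [0.0, 0.0, 0.0]
    magnitude = k * penetration_depth + b * max(0.0, approach_speed)
    return [magnitude * n for n in normal]
```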

8 Visual rendering. Visual rendering of contact information. Proximity: color (McNeely et al. 2006). Contact: color (Kitamura et al. 1998), glyph (Redon et al. 2002). Force: glyph (Lécuyer et al. 2002).
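A minimal sketch of what a color-based proximity cue could look like, assuming a simple blue-to-red ramp and an arbitrary distance range (both are illustrative choices, not taken from the cited work):

```python
# Map a proximity distance to an RGB color: blue when far, red when touching.
# The 5 cm range and the color ramp are arbitrary choices for illustration.

def proximity_color(distance, d_max=0.05):
    """Return an (r, g, b) tuple in [0, 1]."""
    t = max(0.0, min(1.0, 1.0 - distance / d_max))  # 0 = far, 1 = in contact
    return (t, 0.0, 1.0 - t)

if __name__ == "__main__":
    for d in (0.05, 0.025, 0.0):
        print(d, proximity_color(d))
```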

9 Auditory rendering. Realistic rendering of contact sounds, specific to impact / friction / rolling, with different techniques: FEM (O'Brien 2003), modal synthesis (van den Doel 2001). Symbolic rendering (Richard et al. 1994, Massimino 1995, Lécuyer et al. 2002): associate a piece of information with a sound effect; information: distances / contact / forces; sound effect: amplitude / frequencies.
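To make the modal-synthesis idea concrete, here is a minimal sketch that sums exponentially damped sinusoids; the three modes and their parameters are made-up numbers, not taken from the cited models:

```python
import math

# Modal synthesis of an impact sound: a sum of exponentially damped sinusoids
# whose frequencies, dampings and gains describe the object's vibration modes.

MODES = [  # (frequency in Hz, damping in 1/s, gain) -- placeholder values
    (440.0, 8.0, 1.0),
    (1230.0, 15.0, 0.5),
    (2900.0, 40.0, 0.25),
]

def contact_sound(impact_force, duration=0.5, sample_rate=44100):
    """Synthesize an impact sound; the amplitude scales with the impact force."""
    samples = []
    for n in range(int(duration * sample_rate)):
        t = n / sample_rate
        s = sum(g * math.exp(-d * t) * math.sin(2 * math.pi * f * t)
                for f, d, g in MODES)
        samples.append(impact_force * s)
    return samples
```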

10 Haptic display of contact. Haptic devices (Burdea 1996): force feedback and tactile feedback. Haptic rendering of contact: closed loop (McNeely et al. 1999, Johnson 2003), with a tradeoff between stability and stiffness; open loop (Kuchenbecker et al. 2006), which improves the realism of impact.

11 Objectives of this thesis. Improve the simulation, rendering and perception of contacts in virtual environments: integrated 6DOF multimodal rendering, visual rendering of multiple contacts, spatialized haptic rendering. Protocol: an integrated 6DOF approach for multisensory rendering techniques, and a study of rendering techniques to improve the perception of contact position (hypothesis of improvement, experimental implementation, experimental evaluation).

12 Outline. Integrated 6DOF multimodal rendering. Visual rendering of multiple contacts. Spatialized haptic rendering.

13 Objectives. Multiplicity of techniques for contact simulation and sensory rendering: how can we integrate all these techniques seamlessly, independently of the simulation? Contribution / overview: a contact formulation (states / events) and examples of contact rendering based on this formulation (visual, auditory, tactile, force feedback).

14 Contact formulation. A simple contact formulation based on the proximity points a and b and the contact force f.

15 Contact formulation. From this formulation we derive contact states (free motion, contact) and the temporal evolution of contact states as events (impact, detachment): higher-level information, adapted to many specific rendering techniques.

16 Determination of states and events. The contact state is given by the contact condition on the proximity points; the impact and detachment events are defined from the transitions of this condition, using the local linear velocity and the contact normal (contact appearing while approaching gives an impact, contact disappearing gives a detachment).
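A minimal sketch of how such states and events could be derived from the (a, b, f) formulation; the proximity threshold and the sign convention on the normal velocity are assumptions, not the exact conditions of the thesis:

```python
from dataclasses import dataclass

@dataclass
class ContactInfo:
    gap: float              # distance between the proximity points a and b
    normal_velocity: float  # local linear velocity projected on the contact normal
    force: float            # magnitude of the contact force f

def update(previous_state, info, eps=1e-4):
    """Return (state, event): states are 'free' or 'contact',
    events are 'impact', 'detachment' or None."""
    state = "contact" if info.gap <= eps else "free"
    event = None
    if previous_state == "free" and state == "contact" and info.normal_velocity < 0:
        event = "impact"       # contact appears while the objects approach
    elif previous_state == "contact" and state == "free":
        event = "detachment"   # contact disappears
    return state, event
```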

17 Multimodal rendering architecture.

18 Multimodal rendering architecture. Superimpose state-based and event-based information:
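One way to read this superposition, sketched under the assumption that the continuous simulation force is combined with a short open-loop transient triggered by an impact event (the transient shape and its parameters are illustrative):

```python
import math

def haptic_output(simulation_force, time_since_impact,
                  amp=2.0, freq=300.0, damping=60.0):
    """Closed-loop force from the simulation plus an open-loop impact transient."""
    transient = 0.0
    if time_since_impact is not None:  # None when no impact event is active
        transient = (amp * math.exp(-damping * time_since_impact)
                     * math.sin(2 * math.pi * freq * time_since_impact))
    return simulation_force + transient
```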

19 Example of visual rendering (contact, impact, detachment).

20 Example of auditory rendering.

21 Multimodal rendering platform. Multimodal: visual, auditory, tactile, force feedback. (© Hubert Raguet / CNRS photothèque)

22 Preliminary conclusion. We proposed a contact formulation (proximity / force) with contact states and events. We developed a multimodal rendering architecture: visual (particles / pen), auditory (modal synthesis / spatialized), tactile (modal synthesis), 6DOF haptic enhancement (open loop).

23 Outline. Integrated 6DOF multimodal rendering. Visual rendering of multiple contacts. Spatialized haptic rendering.

24 Objectives. Context: complex-shaped objects, multiple contacts, difficult interaction; the goal is to help the user by providing position information. Contribution / overview: display the information of proximity / contact / forces using glyphs and lights, followed by a subjective evaluation.

25 Visualizing multiple proximity / contact / force positions (proximity coded from far to near, contact forces from low to high).

26 Visual rendering using glyphs: the glyphs encode proximity (far to near), contact, and contact forces (low to high).

27 Visual rendering using glyphs.

28–30 Glyph filtering. Reduce the number of displayed glyphs by determining a “relevance” based on the movement.

31 Glyph filtering. The relevance is determined by comparing the local velocity v with the local normal d.
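A possible reading of this criterion, sketched here: a glyph is considered relevant when the local velocity points against the local normal, i.e. the object is moving into that contact. The exact test and threshold used in the thesis may differ.

```python
def relevance(v, d):
    """Return a value in [0, 1]; high when the motion v pushes against the normal d."""
    dot = -(v[0] * d[0] + v[1] * d[1] + v[2] * d[2])
    norm_v = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    norm_d = (d[0] ** 2 + d[1] ** 2 + d[2] ** 2) ** 0.5
    if norm_v == 0.0 or norm_d == 0.0:
        return 0.0
    return max(0.0, dot / (norm_v * norm_d))

def filter_glyphs(glyphs, threshold=0.5):
    """Keep only the glyphs whose motion-based relevance exceeds the threshold."""
    return [g for g in glyphs if relevance(g["velocity"], g["normal"]) > threshold]
```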

32 Visual rendering using lights. Two types of lights: spherical lights and conical lights.

33 Visual rendering using lights.

34 Subjective evaluation. Objective: determine users' preferences among the different techniques. Procedure: 18 participants were asked to perform an industrial assembly operation, without visual cues and with each visual cue, and then to fill in a subjective questionnaire: which effect (glyph / light / color change / size change / deformation) for which information (forces / distances / blocking / attracting attention).

35 Results (mean ranking; lower is better). The visual effects were globally well appreciated. Significant effects: the glyph size effect was globally appreciated (distance / force); the glyph deformation effect was appreciated for providing force information; the light effect was appreciated for attracting visual attention.

36 Preliminary conclusion. We proposed a visual rendering technique to display multiple contact information (proximity / contact / force) using glyphs and lights. We presented a filtering technique to reduce the number of glyphs displayed. We conducted a subjective evaluation: glyph size for proximity / force, glyph deformation for force, lights to focus the visual attention.

37 Outline. Integrated 6DOF multimodal rendering. Visual rendering of multiple contacts. Spatialized haptic rendering.

38 Objectives. Context: complex-shaped objects; help the user by providing position information. Contact position information is provided using visual rendering (particles / glyphs / lights) and auditory rendering (spatialized sound): can we provide this position information using haptic rendering? Contribution / overview: a haptic rendering technique based on vibrations, with a perceptive evaluation in a 1DOF case; a 6DOF haptic rendering technique, with a perceptive evaluation to determine the rendering parameters and a subjective evaluation in a 6DOF case.

39 Haptic rendering of contact position. The impact between objects produces a contact reaction force and a high-frequency transient vibration. This high-frequency transient vibration depends on the object's material (Okamura et al. 1998), the object's geometry and the impact position. Is it possible to perceive the impact position using these vibrations?

40 Vibrations depending on impact position. Examine the vibrations produced by a simple object, a cantilever beam: the vibrations depend on the impact position.

41 Simulation of vibrations with the Euler-Bernoulli model (EB); general solution.
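For reference, the classical Euler-Bernoulli free-vibration equation and its general modal solution are recalled below (standard beam theory; the exact notation and boundary conditions used in the thesis may differ):

```latex
% Free vibration of a uniform Euler-Bernoulli beam: E Young's modulus, I second
% moment of area, rho density, A cross-section area, w(x,t) transverse deflection.
\[
  E I \,\frac{\partial^{4} w}{\partial x^{4}} + \rho A \,\frac{\partial^{2} w}{\partial t^{2}} = 0
\]
% General solution as a superposition of modes, with mode shapes \phi_n(x) and
% angular frequencies \omega_n determined by the (cantilever) boundary conditions:
\[
  w(x,t) = \sum_{n} \phi_{n}(x)\,\bigl(A_{n}\cos(\omega_{n} t) + B_{n}\sin(\omega_{n} t)\bigr)
\]
```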

42 Simplified vibration patterns. Simplified patterns based on the physical behavior: possibly easier to perceive, and simpler to compute. Chosen model: an exponentially damped sinusoid whose amplitude changes with the impact position, whose frequency changes with the impact position, or both.

43 Simplified vibration patterns: Am (amplitude varies with the impact position), Fr (frequency varies), AmFr (both vary, consistent), AmCFr (both vary, conflicting), illustrated for a near impact and a far impact.
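A sketch of these four patterns as exponentially damped sinusoids; how the amplitude and frequency actually vary with the impact position in the thesis is not given here, so the linear mappings and numeric values below are illustrative assumptions:

```python
import math

def damped_sinusoid(t, amplitude, frequency, damping=30.0):
    return amplitude * math.exp(-damping * t) * math.sin(2 * math.pi * frequency * t)

def vibration(model, t, position):
    """position in [0, 1]: 0 = impact near the hand, 1 = far impact."""
    a_near, a_far = 1.0, 0.2      # amplitude assumed to decrease with distance
    f_near, f_far = 400.0, 100.0  # frequency assumed to decrease with distance
    amp = a_near + (a_far - a_near) * position
    freq = f_near + (f_far - f_near) * position
    if model == "Am":    # amplitude varies, frequency fixed
        return damped_sinusoid(t, amp, f_near)
    if model == "Fr":    # frequency varies, amplitude fixed
        return damped_sinusoid(t, a_near, freq)
    if model == "AmFr":  # both vary consistently
        return damped_sinusoid(t, amp, freq)
    if model == "AmCFr": # conflicting: frequency increases while amplitude decreases
        return damped_sinusoid(t, amp, f_far + (f_near - f_far) * position)
    raise ValueError(model)
```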

44 Evaluation. Objective: determine whether it is possible to perceive the impact position using vibrations. Population: 15 subjects. Apparatus: Virtuose6D device, sound-blocking headphones.

45 Procedure. Task: produce two successive impacts; “between these two impacts, which one was the closest to the hand?” 6 models: 2 realistic models (Euler-Bernoulli: EB1, EB2) and 4 simplified models (Am, Fr, AmFr, AmCFr); 4 impact positions; 8 random repetitions; a total of 576 trials (40 min).

46 Results of the quantitative evaluation: how well was the subject able to determine the impact position by sensing the vibration? Overall performance: the ANOVA was significant (p < 0.007). Paired t-tests (p < 0.05) showed significant differences between Am and each of EB1, EB2 and AmCFr, and between Fr and each of EB1 and EB2 (realistic Euler-Bernoulli models vs. simplified models).

47 Results of the qualitative evaluation: how was the subjective feeling of realism? Subjects rated the impact realism; paired t-tests (p < 0.05) showed significant differences between several of the realistic (Euler-Bernoulli) and simplified models.

48 Quantitative evaluation and inversion. Many participants inverted the interpretation of the vibrations: the sensed cue was mapped to the perceived position either normally or in an inverted way.

49 Discussion. Globally weak inter-subject correlation: each subject seems to have his or her own interpretation (inverted or not). Strong intra-subject consistency: subjects seem to be very consistent within their own interpretation. Several strong inter-subject correlations between models: several models are interpreted the same way. Vibrations can be used to convey impact position information.

50 6DOF spatialized haptic rendering. Generalize the previous result to 6DOF manipulation: virtual beam model.
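A rough guess at how the virtual-beam idea could be applied in 6DOF, sketched below: the impact transient is emitted along the axis joining the manipulation point to the impact point, with amplitude and frequency encoding their distance. This is an illustrative structure, not the exact model of the thesis.

```python
import math

def vibration_force(manip_point, impact_point, t, damping=40.0):
    """3D force transient added to the haptic device output at time t after impact."""
    dx = [impact_point[i] - manip_point[i] for i in range(3)]
    dist = math.sqrt(sum(c * c for c in dx)) or 1e-9
    direction = [c / dist for c in dx]
    amplitude = 1.0 / (1.0 + dist)    # closer impacts assumed to feel stronger
    frequency = 400.0 / (1.0 + dist)  # and higher-pitched
    s = amplitude * math.exp(-damping * t) * math.sin(2 * math.pi * frequency * t)
    return [s * c for c in direction]
```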

51 Manipulation point and circle of confusion: different impact positions can generate the same haptic feedback.

52 Perceptive evaluation. Objective: is it possible to perceive such complex vibrations, and in particular the vibration direction? Determine the optimal amplitude / frequency parameters allowing good direction discrimination. Population: 10 subjects. Test among (4 amplitudes a) × (4 frequencies f).

53 Procedure and plan. “On which axis was the vibration applied?” 15 blocks of 4 × 4 × 3 = 48 vibrations, for a total of 720 trials (35 min).

54 Perceptive evaluation: results (4 amplitudes a × 4 frequencies f). Best performances: low frequencies and high amplitudes. Strategy: most participants relied on intuitive perception.

55 Subjective evaluation. Objective: evaluation in a real use case. Population: 10 subjects. Test without and with vibrations; subjective ratings of impact realism, impact position and comfort. This evaluation provides encouraging results.

56 Preliminary conclusion. We proposed a method to provide impact position information directly on the haptic channel, using vibrations based on a vibrating beam. We conducted a 1DOF study and perceptive evaluation (“realistic” vs. “simplified” models): the simplified models achieved better performance. We extended this method to 6DOF manipulation: a perceptive study on the perception of vibration direction (low frequencies and high amplitudes are better) and a subjective study on a “real case” (better subjective perception of the impact position).

57 Conclusion. Contributions: an integrated 6DOF approach for multisensory rendering techniques (integrated 6DOF multimodal rendering), and a study of rendering techniques to improve the perception of contact position: a visual rendering technique to display multiple contacts, and a haptic rendering technique to provide position information using vibrations (spatialized haptic rendering).

58 Perspectives and future work. Computer side: higher-level contact information (mobility); rendering improvements (visual, auditory, tactile, force feedback); deeper investigation of vibratory tactile rendering. Human side: perceptive studies (multimodal perceptive effects / pseudo-haptics); quantitative evaluation of the rendering techniques in 6DOF manipulations.

59 Publications.
Jean Sreng, Anatole Lécuyer, Christine Mégard, Claude Andriot. Using Visual Cues of Contact to Improve Interactive Manipulation of Virtual Objects in Industrial Assembly/Maintenance Simulations. IEEE Transactions on Visualization and Computer Graphics, 12(5):1013–1020, 2006.
Jean Sreng, Florian Bergez, Jérémie Legarrec, Anatole Lécuyer, Claude Andriot. Using an Event-Based Approach to Improve the Multimodal Rendering of 6DOF Virtual Contact. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, pages 165–173, 2007.
Jean Sreng, Anatole Lécuyer, Claude Andriot. Using Vibration Patterns to Provide Impact Position Information in Haptic Manipulation of Virtual Objects. In Proceedings of EuroHaptics, pages 589–598, 2008.
Jean Sreng, Anatole Lécuyer, Claude Andriot. Spatialized Haptic Rendering: Improving 6DOF Haptic Simulations with Virtual Impact Position Information. In Proceedings of IEEE Virtual Reality, 2009 (accepted paper).
Jean Sreng, Jérémie Legarrec, Anatole Lécuyer, Claude Andriot. Approche Evénementielle pour l'Amélioration du Rendu Multimodal 6DDL de Contact Virtuel. Actes des journées de l'Association Française de Réalité Virtuelle, pages 97–104, 2007.
Jean Sreng, Anatole Lécuyer. Perception tactile de la localisation spatiale des contacts. Sciences et Technologies pour le Handicap, 3(1), 2009 (invited paper).

60 Thank you. Questions?

