
1 Electrical and Computer Engineering Dept. VR PROGRAMMING

2 System architecture VR Toolkits

3 VR Programming Toolkits: Are extensible libraries of object-oriented functions designed to help the VR developer; support various common I/O devices used in VR (so drivers need not be written by the developer); allow import of CAD models (which saves time), editing of shapes, specification of object hierarchies, collision detection, multiple levels of detail, shading and texturing, and run-time management; have built-in networking functions for multi-user interaction, etc.

4 VR Toolkits can be classified by: whether they use text-based or graphical programming; the type of language used and the library size; the type of I/O devices supported; the type of rendering supported; whether they are general-purpose or application-specific; whether they are proprietary (more functionality, better documented) or public domain (free, but with less documentation and functionality).

5 VR Toolkits in the Early '90s:  RenderWare (Canon), VRT3/Superscape (Dimension Ltd.), Cyberspace Developer Kit (Autodesk), Cosmo Authoring Tool (SGI/Platinum/CA), Rend386 and others;  they allowed either text-based programming (RenderWare, CDK and Rend386) or graphical programming (Superscape and Cosmo);  they were platform-independent and generally did not require graphics acceleration hardware;  as a result they tended to use "low-end" I/O devices (mouse) and to support flat shading to maintain fast rendering.

6 Rend386 scene

7 VR Toolkits discussed in this chapter
Name | Application | Programming mode | Proprietary | Language
Java3D (Sun Microsystems) | General purpose | Text | No | Implemented in C, programmed in Java
Vizard Toolkit and PeoplePak (WorldViz) | General purpose | Text/graphical | Yes | OpenGL-based Python scripting language
GHOST (SensAble Technologies) | Haptics for the PHANToM | Text | Yes | C++
H3D | Haptics/graphics | Text | No | C++
PeopleShop (Boston Dynamics) | Military/civilian | Graphical | Yes | C/C++
Unity 3D | Game engine | Text/graphical | Yes | JavaScript, C#, and Python

8 The scene graph: Is a hierarchical organization of objects (visible or not) in the virtual world (or "universe"), together with the view of that world. Scene graphs are represented by a tree structure, with nodes connected by branches. Visible objects are represented by external nodes, which are called leaves (they have no children), for example nodes F, G, H, I. Internal nodes represent transformations, which apply to all their children. (Tree diagram: root node A; internal nodes B, C, D, E; external/leaf nodes F, G, H, I, J.)
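
The parent-to-child propagation of transforms is what makes a scene graph useful. Below is a minimal, toolkit-independent sketch in Java; the class and method names (SceneNode, render) are illustrative only, not taken from any particular API:

import java.util.ArrayList;
import java.util.List;

// Each internal node stores a transform (here just a translation) that applies
// to everything below it; external nodes (leaves) carry the visible objects.
class SceneNode {
    final String name;
    final double tx, ty, tz;                    // local translation of this node
    final List<SceneNode> children = new ArrayList<>();

    SceneNode(String name, double tx, double ty, double tz) {
        this.name = name; this.tx = tx; this.ty = ty; this.tz = tz;
    }

    void addChild(SceneNode child) { children.add(child); }

    // Traverse top-down, accumulating the parent transform along the way.
    void render(double px, double py, double pz) {
        double wx = px + tx, wy = py + ty, wz = pz + tz;
        if (children.isEmpty()) {               // leaf = visible object
            System.out.printf("draw %s at (%.1f, %.1f, %.1f)%n", name, wx, wy, wz);
        }
        for (SceneNode c : children) {
            c.render(wx, wy, wz);
        }
    }
}

public class SceneGraphDemo {
    public static void main(String[] args) {
        SceneNode root = new SceneNode("root", 0, 0, 0);
        SceneNode palm = new SceneNode("palm", 0, 1, 0);
        SceneNode ball = new SceneNode("ball", 0.2, 0, 0);
        root.addChild(palm);
        palm.addChild(ball);                    // moving the palm now also moves the ball
        root.render(0, 0, 0);
    }
}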

9 (Scene graph diagram: the ball is a child of the "scene" node, with the palm as a sibling.) Scene graphs are not static.

10 (Scene graph diagram: the graph has been modified so that the ball is now a child of the palm.) VC 6.1 on book CD.
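
In Java 3D, this kind of run-time reparenting works only if the right capability bits were set before the scene graph went live. A hedged sketch under that assumption (the group names are illustrative):

import javax.media.j3d.BranchGroup;
import javax.media.j3d.Group;

public class ReparentBall {
    // Capabilities must be set before the scene graph is compiled/attached,
    // since Java 3D forbids live edits otherwise.
    public static void setCapabilities(BranchGroup ballBranch, Group sceneGroup, Group palmGroup) {
        ballBranch.setCapability(BranchGroup.ALLOW_DETACH);
        sceneGroup.setCapability(Group.ALLOW_CHILDREN_WRITE);
        palmGroup.setCapability(Group.ALLOW_CHILDREN_EXTEND);
        palmGroup.setCapability(Group.ALLOW_CHILDREN_WRITE);
    }

    // Move the ball sub-graph from the scene to the palm at run time.
    public static void reparent(BranchGroup ballBranch, Group palmGroup) {
        ballBranch.detach();              // remove the ball sub-graph from its current parent
        palmGroup.addChild(ballBranch);   // the ball now inherits the palm's transforms
    }
}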

11 (Scene graph diagram: the scene node has a palm sub-graph with thumb, index, middle, ring and pinkie children, and a panel sub-graph with knobs 1-4 and a button.)

12 (The same hand-and-panel scene graph diagram, continued.)

13 Authoring (Modeling) Stages: model geometry → define and link sensors → define action functions → define scene graph → define networking.

14 Run-time loop: start simulation → read sensor data → update objects (from sensors and intelligent actions) → render scene (graphics, audio, haptics) → exit simulation; the loop repeats every frame.
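
A toolkit-independent Java sketch of this loop; readSensors, updateObjects and renderScene are illustrative placeholders, not functions of any particular library:

public class RunTimeLoop {
    private volatile boolean running = true;

    public void run() {
        while (running) {                        // repeats every frame
            double[] sensorData = readSensors(); // trackers, gloves, mouse...
            updateObjects(sensorData);           // simulation / intelligent actions
            renderScene();                       // graphics, audio, haptics
        }
    }

    public void exitSimulation() { running = false; }

    private double[] readSensors() { return new double[6]; }
    private void updateObjects(double[] data) { /* apply transforms, behaviors */ }
    private void renderScene() { /* draw frame, play audio, update haptics */ }
}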

15 VR Toolkits discussed in this chapter (the table from slide 7 repeated; next: Java3D).

16 Java and Java 3D. Java: an object-oriented programming language developed for network applications; platform independent; slower than C/C++. Java 3D: a Java hierarchy of classes that serves as an interface to 3D graphics and sound rendering systems; tightly integrated with Java; strong object-oriented architecture; powerful 3D graphics API.

17 Java 3D authoring stages: initiation → model geometry → setup sensors → define behaviors → define scene graph → networking.

18 Java 3D authoring stages (repeated; next: model geometry).

19 Java 3D geometry: Geometry can be imported from various file formats (e.g., 3DS, DXF, LWS, NFF, OBJ, VRT, VTK, WRL); it can be created as a primitive (e.g., sphere, cone, cylinder, ...); custom geometry is created by specifying the vertices, edges, normals, and texture coordinates using specially defined classes. Imported geometry: loader.load("Hand.wrl"). Geometry primitive: new Sphere(radius). Custom geometry: GeometryArray subclasses such as new LineArray(...), new QuadArray(...), new TriangleArray(...).
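
A hedged Java 3D sketch of the two creation routes listed above: a utility primitive and a custom TriangleArray built vertex by vertex. Package names follow the classic Java 3D 1.x layout:

import javax.media.j3d.GeometryArray;
import javax.media.j3d.Shape3D;
import javax.media.j3d.TriangleArray;
import javax.vecmath.Point3f;
import com.sun.j3d.utils.geometry.Sphere;

public class GeometryExamples {
    // A primitive sphere of the given radius (an appearance can be added later).
    public static Sphere makeSphere(float radius) {
        return new Sphere(radius);
    }

    // A single custom triangle specified vertex by vertex.
    public static Shape3D makeTriangle() {
        TriangleArray tri = new TriangleArray(3, GeometryArray.COORDINATES);
        tri.setCoordinate(0, new Point3f(0f, 0f, 0f));
        tri.setCoordinate(1, new Point3f(1f, 0f, 0f));
        tri.setCoordinate(2, new Point3f(0f, 1f, 0f));
        return new Shape3D(tri);      // wrap the geometry in a Shape3D leaf node
    }
}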

20 Java 3D object appearance: The appearance of a geometry is specified using an Appearance object; an Appearance-class object stores information about the material (diffuse, specular, shininess, opacity, ...) and texture.
Mat = new Material(); Mat.setDiffuseColor(r, g, b); Mat.setAmbientColor(r, g, b); Mat.setSpecularColor(r, g, b);
TexLd = new TextureLoader("checkered.jpg", ...); Tex = TexLd.getTexture();
Appr = new Appearance(); Appr.setMaterial(Mat); Appr.setTexture(Tex); Geom.setAppearance(Appr);
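
A more self-contained version of the fragment above, assuming the classic Java 3D 1.x utility packages; the texture file name is the one used on the slide:

import java.awt.Component;
import javax.media.j3d.Appearance;
import javax.media.j3d.Material;
import javax.media.j3d.Texture;
import javax.vecmath.Color3f;
import com.sun.j3d.utils.geometry.Primitive;
import com.sun.j3d.utils.geometry.Sphere;
import com.sun.j3d.utils.image.TextureLoader;

public class AppearanceExample {
    // Builds a lit, textured sphere; 'observer' is any AWT component used by
    // TextureLoader to track image loading.
    public static Sphere texturedSphere(float radius, Component observer) {
        Material mat = new Material();
        mat.setDiffuseColor(new Color3f(0.8f, 0.2f, 0.2f));
        mat.setAmbientColor(new Color3f(0.2f, 0.05f, 0.05f));
        mat.setSpecularColor(new Color3f(1.0f, 1.0f, 1.0f));

        Texture tex = new TextureLoader("checkered.jpg", observer).getTexture();

        Appearance app = new Appearance();
        app.setMaterial(mat);
        app.setTexture(tex);

        // Generate normals (for lighting) and texture coordinates (for the texture).
        return new Sphere(radius,
                Primitive.GENERATE_NORMALS | Primitive.GENERATE_TEXTURE_COORDS,
                app);
    }
}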

21 Java 3D authoring stages (repeated; next: the Java 3D scene graph).

22 Java3D node types (class hierarchy Node → Group / Leaf):
Group nodes: BranchGroup: compilable sub-graph; TransformGroup: transform + child nodes; Switch: selects which of its children are visible (useful for LOD).
Leaf nodes: Background: universe background, either a color or an image; Behavior: actions to be performed by the simulation; Fog: fog node; Light: light node, with derived classes AmbientLight, PointLight, DirectionalLight; Shape3D: geometry + appearance + bounding box.
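
A hedged sketch that combines several of these node types into one small scene graph, using the standard SimpleUniverse utility for the view side:

import javax.media.j3d.BoundingSphere;
import javax.media.j3d.BranchGroup;
import javax.media.j3d.DirectionalLight;
import javax.media.j3d.Transform3D;
import javax.media.j3d.TransformGroup;
import javax.vecmath.Color3f;
import javax.vecmath.Point3d;
import javax.vecmath.Vector3d;
import javax.vecmath.Vector3f;
import com.sun.j3d.utils.geometry.Sphere;
import com.sun.j3d.utils.universe.SimpleUniverse;

public class MiniScene {
    public static void main(String[] args) {
        BranchGroup root = new BranchGroup();              // compilable sub-graph

        Transform3D offset = new Transform3D();
        offset.setTranslation(new Vector3d(0, 0, -3));     // push the sphere away from the viewer
        TransformGroup tg = new TransformGroup(offset);
        tg.addChild(new Sphere(0.5f));                     // Shape3D leaf created by the utility
        root.addChild(tg);

        DirectionalLight light = new DirectionalLight(
                new Color3f(1f, 1f, 1f), new Vector3f(-1f, -1f, -1f));
        light.setInfluencingBounds(new BoundingSphere(new Point3d(), 100.0));
        root.addChild(light);                              // Light leaf node

        root.compile();                                    // optimize the sub-graph

        SimpleUniverse universe = new SimpleUniverse();    // creates the View side
        universe.getViewingPlatform().setNominalViewingTransform();
        universe.addBranchGraph(root);                     // attach the content branch
    }
}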

23 Java3D scene graph (node class hierarchy diagram).

24 Loading objects from files: Java3D offers built-in support for Lightwave and Wavefront model files; loaders for other file formats can be downloaded for free from the web. Loaders add the content of the read file to the scene graph as a single object; however, they provide functions to access the subparts individually. (Scene graph diagram: Universe → Root → Cube, Sphere, Hand; Hand → Thumb, Index, Middle, Ring, Small.)

25 Java3D model loading.
Adding the model to the scene graph:
Scene Sc = loader.load("Hand.wrl"); BranchGroup Bg = Sc.getSceneGroup(); RootNode.addChild(Bg);
Accessing subparts of the loaded model:
Scene Sc = loader.load("Hand.wrl"); BranchGroup Bg = Sc.getSceneGroup();
Thumb = Bg.getChild(0); Index = Bg.getChild(1); Middle = Bg.getChild(2); Ring = Bg.getChild(3); Small = Bg.getChild(4);

26 Java3D virtual hand loading:
Palm = loader.load("Palm.wrl").getSceneGroup();
ThumbProximal = loader.load("ThumbProximal.wrl").getSceneGroup(); ThumbDistal = loader.load("ThumbDistal.wrl").getSceneGroup();
IndexProximal = loader.load("IndexProximal.wrl").getSceneGroup(); IndexMiddle = loader.load("IndexMiddle.wrl").getSceneGroup(); IndexDistal = loader.load("IndexDistal.wrl").getSceneGroup();
MiddleProximal = loader.load("MiddleProximal.wrl").getSceneGroup(); MiddleMiddle = loader.load("MiddleMiddle.wrl").getSceneGroup(); MiddleDistal = loader.load("MiddleDistal.wrl").getSceneGroup();
RingProximal = loader.load("RingProximal.wrl").getSceneGroup(); RingMiddle = loader.load("RingMiddle.wrl").getSceneGroup(); RingDistal = loader.load("RingDistal.wrl").getSceneGroup();
SmallProximal = loader.load("SmallProximal.wrl").getSceneGroup(); SmallMiddle = loader.load("SmallMiddle.wrl").getSceneGroup(); SmallDistal = loader.load("SmallDistal.wrl").getSceneGroup();

27 Java3D virtual hand hierarchy:
Palm.addChild(ThumbProximal); ThumbProximal.addChild(ThumbDistal);
Palm.addChild(IndexProximal); IndexProximal.addChild(IndexMiddle); IndexMiddle.addChild(IndexDistal);
Palm.addChild(MiddleProximal); MiddleProximal.addChild(MiddleMiddle); MiddleMiddle.addChild(MiddleDistal);
Palm.addChild(RingProximal); RingProximal.addChild(RingMiddle); RingMiddle.addChild(RingDistal);
Palm.addChild(SmallProximal); SmallProximal.addChild(SmallMiddle); SmallMiddle.addChild(SmallDistal);

28 Java3D authoring stages (repeated; next: setup sensors).

29 Input devices in Java3D: The only input devices supported out of the box by Java3D are the mouse and the keyboard; the integration of the input devices currently used in VR applications (position sensors, trackballs, joysticks, sensing gloves, ...) relies entirely on the developer. Usually the drivers are written in C/C++: one either has to rewrite the driver in Java or use JNI (the Java Native Interface) to call the C/C++ version of the driver, and the latter solution is more desirable. Java3D provides a general-purpose input device interface that can be used to integrate sensors; however, developers often prefer custom-made approaches.
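
A minimal sketch of the JNI route, assuming a hypothetical native tracker driver compiled into a shared library named "trackerdriver"; the class and native method names are illustrative, not from any real product:

// Hypothetical JNI wrapper around a C/C++ tracker driver (illustrative only).
public class TrackerDriver {
    static {
        // Loads libtrackerdriver.so / trackerdriver.dll from java.library.path.
        System.loadLibrary("trackerdriver");
    }

    // Implemented in C/C++; returns x, y, z, yaw, pitch, roll for one sensor.
    public native double[] readSensor(int sensorIndex);

    public native boolean open();
    public native void close();

    public static void main(String[] args) {
        TrackerDriver tracker = new TrackerDriver();
        if (tracker.open()) {
            double[] pose = tracker.readSensor(0);
            System.out.printf("x=%.3f y=%.3f z=%.3f%n", pose[0], pose[1], pose[2]);
            tracker.close();
        }
    }
}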

30 Java3D general-purpose sensor interface: class PhysicalEnvironment stores information about all the input devices and sensors involved in the simulation; class InputDevice is the interface for an input device driver; class Sensor is the class for objects that provide real-time data. One input device can provide one or more sensors, and a Sensor object need not be associated with an input device (VRML-style sensors). (Diagram: PhysicalEnvironment aggregates InputDevices and Sensors.)

31 Java3D authoring stages (repeated, with the behavior stage shown as "animating the scene"; next: animating the simulation).

32 Java3D - Animating the simulation: Java3D offers Behavior objects for controlling the simulation; a Behavior object contains a set of actions performed when the object receives a stimulus; a stimulus is sent by a WakeupCondition object. Some wakeup classes: WakeupOnCollisionEntry, WakeupOnCollisionExit, WakeupOnCollisionMovement, WakeupOnElapsedFrames, WakeupOnElapsedTime, WakeupOnSensorEntry, WakeupOnSensorExit, WakeupOnViewPlatformEntry, WakeupOnViewPlatformExit.

33 Java3D - Behavior usage: we define a behavior Bhv that rotates the sphere by 1 degree, and we want this behavior to be called each frame so that the sphere keeps spinning: WakeupOnElapsedFrames Wup = new WakeupOnElapsedFrames(0); Bhv.wakeupOn(Wup); (VC 6.3 and VC 6.4 on book CD)
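
A hedged sketch of what a behavior like Bhv might look like: a Behavior subclass that rotates a TransformGroup by 1 degree per frame using WakeupOnElapsedFrames(0):

import java.util.Enumeration;
import javax.media.j3d.Behavior;
import javax.media.j3d.Transform3D;
import javax.media.j3d.TransformGroup;
import javax.media.j3d.WakeupOnElapsedFrames;

public class SpinBehavior extends Behavior {
    private final TransformGroup target;        // the sphere's TransformGroup
    private final Transform3D rotation = new Transform3D();
    private final WakeupOnElapsedFrames everyFrame = new WakeupOnElapsedFrames(0);
    private double angle = 0.0;

    public SpinBehavior(TransformGroup target) {
        this.target = target;
    }

    @Override
    public void initialize() {
        wakeupOn(everyFrame);                   // arm the first stimulus
    }

    @Override
    public void processStimulus(Enumeration criteria) {
        angle += Math.toRadians(1.0);           // 1 degree per frame
        rotation.rotY(angle);
        target.setTransform(rotation);
        wakeupOn(everyFrame);                   // re-arm for the next frame
    }
}
// Usage (the TransformGroup must allow writes and the behavior needs bounds):
//   tg.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
//   SpinBehavior bhv = new SpinBehavior(tg);
//   bhv.setSchedulingBounds(new BoundingSphere(new Point3d(), 100.0));
//   root.addChild(bhv);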

34 The Java 3D View object describes the graphics display used in the simulation, as well as the user's position relative to that display (given by the tracker). The View model separates the virtual world, represented by the ViewPlatform node, from the real I/O devices used in the interaction; this separation helps portability. Several tracked users can be mapped to the same location in the virtual world; this corresponds to several Views and a single ViewPlatform.

35 Conversely, a single user can control several ViewPlatforms; this corresponds to several Views, since each ViewPlatform has its own View. Thus a single user can have several Views into a virtual world and can "teleport" between them. (Diagram: view platform 1 and view platform 2.)
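
A small hedged sketch of the "teleport" idea: re-attaching the View to a different ViewPlatform (both platforms are assumed to already be part of the scene graph):

import javax.media.j3d.View;
import javax.media.j3d.ViewPlatform;

public class Teleporter {
    // Re-attaching the View to a different ViewPlatform instantly moves the
    // user's viewpoint to wherever that platform sits in the virtual world.
    public static void teleport(View view, ViewPlatform destination) {
        view.attachViewPlatform(destination);
    }
}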

36 Java3D authoring stages (repeated; next: networking).

37 Java3D - Networking: Java3D does not provide a built-in solution for networked virtual environments; however, its tight integration with the Java language allows the developer to use the networking features offered by Java. Java3D applications can run as stand-alone applications or as applets in a web browser. (Diagram: several Java3D simulations, running as Java applications or applets, communicating through a server.)
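
A hedged sketch of using plain java.net networking to share a user's position with another client; the packet layout (three doubles: x, y, z) is an assumption made for illustration, not part of Java3D:

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class PositionSender {
    // Sends one UDP packet containing the user's x, y, z position to a peer.
    public static void sendPosition(String host, int port,
                                    double x, double y, double z) throws Exception {
        ByteBuffer buf = ByteBuffer.allocate(3 * Double.BYTES);
        buf.putDouble(x).putDouble(y).putDouble(z);

        DatagramSocket socket = new DatagramSocket();
        DatagramPacket packet = new DatagramPacket(buf.array(), buf.array().length,
                InetAddress.getByName(host), port);
        socket.send(packet);    // the receiving client decodes and moves the remote avatar
        socket.close();
    }
}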

38 Java3D and VRML: VRML provides means for defining objects and animating them in a virtual world. Graphics APIs such as Java3D load only the static information from a VRML file, ignoring the sensors, routes, scripts, etc. The Java3D structure is general enough to make the import of sensors and routes possible, but currently we are not aware of any loader that does it. One of the most popular libraries of Java3D loaders is the NCSA Portfolio (http://www.ncsa.uiuc.edu/~srp/Java3D/portfolio/).

39 NCSA Portfolio: Offers loaders for several model file formats: 3D Studio (3DS), TrueSpace (COB), Java 3D Digital Elevation Map (DEM), AutoCAD (DXF), Imagine (IOB), Lightwave (LWS), Wavefront (OBJ), Protein Data Bank (PDB), Visualization Toolkit (VTK), and VRML97. It loads the following parts of VRML97 files: Appearance, Box, Coordinate, Collision (for grouping only), Group, IndexedFaceSet, IndexedLineSet, Material, Normal, Shape, Sphere, Transform.

40 Java3D on-line resources

41 Comparison between Java3D and WTK: A comparative study was done at Rutgers between Java3D (Version 1.3 Beta 1) and WTK (Release 9). The simulation ran on a dual Pentium III 933 MHz PC (Dell) with 512 MB RAM and a Wildcat 4110 graphics accelerator with 64 MB RAM. The I/O interfaces were a Polhemus Insidetrack tracker or the Rutgers Master II force feedback glove. The scene consisted of several 420-polygon spheres and a virtual hand with 2,270 polygons; the spheres rotated constantly around an arbitrary axis, while the hand was either rotating or driven by the user.

42 Java3D vs. WTK comparison: graphics scene used in the experiments.

43 Comparison between Java3D and WTK: the simulation variables used to judge performance were: graphics mode (monoscopic, stereoscopic); rendering mode (wireframe, Gouraud, textured); scene complexity (5,000 to 50,000 polygons); lighting (1, 5, or 10 light sources); interactivity (no interaction, hand input, force feedback).

44 Java3D vs. WTK comparison: Java3D is faster on average than WTK, but has higher variability.

45 Java3D vs. WTK comparison: Java3D Version 1.3 Beta 1 has lower system latency than WTK Release 9.

46 But Java3d has more variability in the scene rendering time

47 WTK does not have spikes in the scene rendering time

48 VR Toolkits discussed in this chapter (the table from slide 7 repeated; next: Vizard).

49 Vizard characteristics: Uses Python, which is scalable, cross-platform, object-oriented, and simple to integrate with C/C++; it runs on Unix, Windows, Mac and other platforms. Uses a 4-window "workbench" which allows programmers to write and execute code, inspect 3D models, drag and drop objects, and issue commands while the scene is being rendered: a stack of scripts (errors are highlighted as you type), an interactive window (input commands), a resource window (text list of world assets), and a 3D window (explore individual objects).

50 Workbench use: icon menu; scene exploration with the mouse; importing objects.

51 Vizard virtual hand:
import viz
import hand
viz.go()
# Identify the 5DT glove's port.
PORT_5DT_USB = 0
# Add the 5DT sensor.
sensor = viz.add('5dt.dls')
# Create a hand object from the data glove.
glove = hand.add(sensor, hand.GLOVE_5DT)
# Place the hand in front of the user.
glove.translate(0, 1, 5)
glove.rotate(180, -90, 180)
# Now when you run the script the glove should be moving.

52 Vizard multi-texturing:
import viz
viz.go()
# Add the Vizard logo and place it in front of the user.
logo = viz.add('logo.wrl')
logo.translate(0, 2, 4)
# Add two textures that will then be applied to the logo.
tex1 = viz.add('gb_noise.jpg')
tex2 = viz.add('brick.jpg')
logo.texture(tex1)          # applies the first texture
logo.texture(tex2, '', 1)   # applies the second texture to the logo
# Indicate how to blend the two textures.
blend = viz.add('multitexblend.fp')
logo.apply(blend)

53 Vizard Simulation Servers: More than one user can inhabit the same environment; each user needs to run Vizard. After the world is set up, each user sets up two "mailboxes"; one receives information from the other user once it has been given that user's network name. Messages come as a sequence: [0] is who sent it, [1] is what is sent, [2] and higher are the actual data. (Diagram: User 1 sends action information to User 2, with the ball as the object and its position as the property.)

54 Vizard networking example:
import viz
viz.go()
# Add the world that will be displayed on your computer.
# Create a ball object that will be controlled by the other user.
ball = viz.add('ball.wrl')
# Use a prompt to ask the other user the network name of his computer.
target_machine = viz.input('Enter the name of the other machine').upper()
# Add a mailbox from which to send messages to the other user. This is your outbox.
target_mailbox = viz.add(viz.NETWORK, target_machine)
# Add an id for the timer.
BROADCAST = 1
# Add the timer.
def mytimer(num):
    if num == BROADCAST:
        # Retrieve your current position.
        position = viz.get(viz.HEAD_POS)
        # Send the data to the target mailbox; the recipient will get your x, y and z coordinates.
        target_mailbox.send(position[0], position[1], position[2])

55 Vizard networking example (continued):
# This function will deal with incoming messages.
def mynetwork(message):
    # message[0] is who sent the message, message[1] is a description of what was sent,
    # and message[2] and greater are the data themselves.
    x = message[2]
    y = message[3]
    z = message[4]
    ball.translate(x, y, z)
# Register the network callback to await incoming messages.
viz.callback(viz.NETWORK_EVENT, mynetwork)
# Register the timer callback.
viz.callback(viz.TIMER_EVENT, mytimer)
# Start the timer.
viz.starttimer(BROADCAST, 0.01, -1)

56 VR Toolkits discussed in this chapter (the table from slide 7 repeated; next: GHOST).

57 GHOST Toolkit for the PHANToM: Provides realistic haptic interaction; provides an intuitive interface to haptics; provides a haptic scene graph aligned with 3D graphics APIs; provides an extensible environment for extending haptic interaction technologies; point haptic interaction with the PHANToM; geometry based on user-defined force models; geometry moves dynamically in response to forces. (Figure: PHANToM Desktop model with its x, y, z coordinate frame, origin at (0,0,0).)

58 GHOST – Application interaction (diagram adapted from the GHOST SDK Programmer's Guide, version 3.1): the application process handles scene creation, haptic state updates, scene rendering at about 30 fps, and clean-up, while the haptic process runs the haptic servo loop (collision detection, collision response, haptic rendering) at 1000 Hz; the diagram also shows a 100 Hz rate between the two.

59 The GHOST haptics scene graph (same application/haptic process diagram as slide 58, focusing on scene creation).

60 Haptic scene graph: Provides a structured way to construct a haptic scene, including geometry and dynamics; is traversed only from top to bottom (unlike WTK); each node is reachable by only one (unique) traversal from the root (a child node has only one parent node); each node has its own transform (there are no separate transform nodes); one cannot use pointers to repeat instances of the same object, since similar objects have different haptic interactions; Separator nodes create the hierarchy, and transforms on a Separator affect its sub-tree.

61 GHOST node classes (diagram: 3D support, static nodes, dynamic nodes, and utility classes).

62 GHOST nine node classes

63 GHOST nine node classes (continued)

64

65 Scene graph example (diagram: a body scene graph with body, torso, head, and left/right shoulder and elbow nodes, each with its own x, y, z frame).

66 Static scene graph – only separators and geometry nodes as leaves

67 GHOST code example:
// GHOST SDK headers (exact header names depend on the SDK version).
#include <gstScene.h>
#include <gstSeparator.h>
#include <gstSphere.h>
#include <gstPHANToM.h>

int main()
{
    gstScene *scene = new gstScene;
    gstSeparator *root = new gstSeparator;
    gstSphere *sphere = new gstSphere;
    sphere->setRadius(20);
    gstPHANToM *phantom = new gstPHANToM("PHANToM name");
    root->addChild(phantom);
    root->addChild(sphere);
    scene->setRoot(root);
    scene->startServoLoop();
    while (!scene->getDoneServoLoop()) {
        // application loop; end it by calling scene->stopServoLoop()
    }
    return 0;
}

68 Force calculation and dynamics (application/haptic process diagram repeated from slide 58).

69 Collision detection and response: The scene graph contains at least one representation of the haptic interface through a gstPHANToM node; there can be up to four such nodes (four haptic interfaces in one haptic scene). Collisions are detected between this node and the geometry nodes through the gstShape node, which goes from "untouched" to "touched". When a collision exists, the gstPHANToM_SCP (surface contact point) node is added to the scene graph; this node should be placed under the same parent as the gstPHANToM node.

70 Collision detection and response: normal force (depends on the spring and damper coefficients); friction force (depends on the static and dynamic friction coefficients). Forces are calculated following collision; collision response occurs through dynamic effects (movable nodes, solid body dynamics); the application is informed if needed (user-defined).

71 Dynamic nodes: The gstDynamic node adds movement ability to the geometry nodes beneath it; a subtree under a gstDynamic node represents one physically dynamic object. Forces generated by the gstPHANToM node colliding with one of the geometries of such an object are added to the state of the gstDynamic node. Transformations (rotations, translations) are always applied to the gstDynamic node, not its children. It has four derived classes: gstDial, gstButton, gstSlider and gstRigidBody.

72 Dynamic nodes (continued): When a gstDynamic node changes state, an event is generated which calls a user-defined callback function. Example: the application may quit if a gstButton changes state from pressed to released. (Figure: gstButton behavioral example.)

73 Updating the application (application/haptic process diagram repeated from slide 58).

74 Graphics and event callbacks: The user selects which nodes have callback functions, and what information needs to be sent back to the application. The application calls updateGraphics to have graphics information updated; nodes that have a graphics callback defined and have a new state since the last call to updateGraphics copy their current state to a defined data structure. Callbacks pass new state information of the haptic scene nodes from the GHOST haptics process to the application process. For example, the user can create a callback for the graphics representation of the position of the gstPHANToM node; this should change to a callback of gstPHANToM_SCP after collision, so the user can see the location of the contact point on the object.

75 Mapping the user to the haptic scene (diagram: the user workspace mapped onto the PHANToM workspace).

76 Camera mapping to the PHANToM workspace (diagram from the GHOST SDK Programmer's Guide, version 3.1: the phantomSep transform is built from a rotation, the camera transform, and a z-axis offset so that the camera frame is aligned with the PHANToM workspace).

77 Scaling camera and PHANToM workspaces (diagram: the camera workspace shown too large, too small, and appropriately sized relative to the PHANToM workspace).

78 Scaling camera and PHANToM workspaces: the scale factor depends on the distance D_xmax from the focal point to the frustum; the distance D_phantomxmax from the non-scaled PHANToM workspace center to its side limit must also be determined; the scale factor is then S_frustum = D_xmax / D_phantomxmax.

79 Scaling camera and PHANToM workspaces: to maintain haptic fidelity, the gstShape node physical properties (compliance and damping) need to be scaled too: SurfaceKspring_new = SurfaceKspring_current / S_frustum and SurfaceKdamping_new = SurfaceKdamping_current / S_frustum, where SurfaceKspring and SurfaceKdamping are the gstShape compliance and damping coefficients.

80 VR Toolkits discussed in this chapter (the table from slide 7 repeated; next: H3D).

81 SenseGraphics: Founded in 2004 in Stockholm; SenseGraphics represents over twenty years of experience in the haptics and graphics industry, and provides a high-performance application development platform which enables integration of haptics and 3D stereo visualization into multimodal software applications.

82 What is H3D API? A product of SenseGraphics; a software development platform for multi-sensory applications; uses the open standards X3D, OpenGL and SenseGraphics haptics in a unified scene graph, taking care of both the haptic and the graphic rendering.

83 What it does: Combines graphics and haptics into one platform; adds haptics to existing 3D models; enables rapid programming of haptic applications using X3D and Python; is easily extended with custom graphics/haptics features using C++.

84 Continued: Supports SensAble, Novint and MOOG FCS haptic devices; supports most 3D stereo display systems; runs on Windows, Linux and Mac.

85 H3DAPI Architecture

86 Some H3D nodes

87 Example Code

88 Results of code

89 Applications: Computer Assisted Radiology & Surgery, Switzerland (CARCAS).

90 Applications (continued): University of Virginia.

91 Other Applications and Projects

92 Why H3D API over GHOST? H3D is compatible with many scene-graph and 3D-environment-generating platforms (VRML, X3D, Java3D, OpenGL); it uses C++ and the Python scripting language; it supports haptic devices from several manufacturers; and it provides graphic rendering, while GHOST needs another program for that (e.g., Cortona 3D).

93 References

94 VR Toolkits discussed in this chapter (the table from slide 7 repeated; next: PeopleShop).

95 VR Toolkits discussed in this chapter
Name | Application | Proprietary | Language | Library size
Java3D (Sun Microsystems) | General purpose | No | Implemented in C, programmed in Java | 19 packages, 275 classes
Vizard and PeoplePak (WorldViz) | General purpose, avatar extension | Yes | OpenGL-based Python scripting language | -
GHOST (SensAble Technologies) | Haptics for the PHANToM | Yes | C++ | -
PeopleShop (Boston Dynamics) | Military/civilian | Yes | C/C++ | -
3DGame Studio | Game engine | Yes | C++ | -

96 BDI PeopleShop/DI-Guy characteristics  Provides a realistic way to simulate human characters in real-time scenes without using tracking suits;  Is a task-level programming environment combined with a menu-based GUI;  Tasks are mapped to pre-defined (stored) joint motions which are interpolated in real time;  Well-suited for Distributed Interactive Simulations (DIS) due to low bandwidth requirements and live reckoning;  Initially designed for the military, now extended to civilian applications, such as accident reenactment, architectural walk-through, driving simulators, police training, etc.

97 BDI PeopleShop/DI-Guy characteristics (continued):  Linkable object library that runs on SGI, Intel PCs, and other platforms;  the library has modules for run-time motion engines, graphics display, motion data, 3D graphics models, textures, and network interfaces for DIS;  runs under OpenGL, Direct3D, Mak Stealth and other packages;  recommended hardware is an Intel CPU over 200 MHz, 64 MB RAM, and a graphics accelerator (for OpenGL, OpenGVS or Direct3D).

98 PeopleShop authoring stages: initiation → scene geometry → define sensors → define behavior → define character path → define networking.

99 PeopleShop authoring stages (repeated; next: scene geometry and characters).

100 PeopleShop Characters:  Are articulated polygonal structures with 54 DOF and 11 links;  vehicles are also treated as characters;  different types of characters have different acceptable actions;  each type of character has different user-selectable appearances (e.g., a vehicle character can be a tank or a police car);  characters are textured to increase realism and reduce the polygon count.

101 Character selection Character type determines acceptable actions (menu selectable)

102 Appearance selection: character appearance is menu-selectable (e.g., Bob_shorts, Joe_blue, Bridget_skirt, Diane_teen).

103 BDI Toolkits: character level-of-detail segmentation based on the distance to the virtual camera improves real-time performance (up to about 100 characters can be in a scene); the character models range from 2,500 polygons down to 38 polygons. (Supplemental video: bdi.DI-Guy-LOD.mpg)

104 PeopleShop authoring stages (repeated; next: define character path).

105 PeopleShop path specification (diagram: a path with waypoints and slope adjusters; an initial path is extended by adding a waypoint).

106 BDI Toolkits (diagram: action beads placed along a path, including stacked action beads, the last action bead, and an end bead).

107 Editing actions VC 6.6 on book CD

108 VC 6.5 on book CD

109 PeopleShop authoring stages (repeated; next: define sensors).

110 BDI Toolkits (diagram): when soldier A enters the sensor volume, the system is notified; this triggers soldier B's shooting of A.

111 PeopleShop sensors: sensors are user-defined volumes in space that detect when a character enters them (PeopleShop User's Manual). (Supplemental video: bdi.farmhouse.mpg)

112 PeopleShop authoring stages (repeated; next: define behavior).

113 PeopleShop Behaviors:  Behaviors can be reflexes (based on signals received from sensors);  behaviors can also be specified with decision beads;  decision beads are placed on the character's path (colored red);  the two parameters characterizing a decision bead are distance and length;  distance specifies how far from the start of the path the decision bead is placed;  length indicates over what distance from the beginning of the decision region the decision bead is active;  decisions can be converted to script.

114 BDI Toolkits Decision clauses (IF/THEN/ELSE)

115 PeopleShop run-time (diagram: the PeopleShop initiation feeds three run-time modes, interactive training, immersive training, and scenario visualization, each with its own user or users).

116 PeopleShop run-time (diagram repeated; next: interactive training).

117 BDI PeopleShop Toolkit User is interacting in real time with the simulation using a joystick or mouse and menu. Limited control and immersion. Natural speeds should not be exceeded. (from Koechling et al., 1997) VC 6.7

118 PeopleShop run-time (diagram repeated; next: immersive training).

119 BDI PeopleShop – run-time modes: the user interacts in real time with the simulation using trackers and sensors; control is at the joint level and immersion is increased. Omni-directional treadmill and sensorized weapon (BDI, 1997).

120 PeopleShop authoring stages (repeated; next: define networking).

121 PeopleShop Networking:  Updating human figures in DIS is much more bandwidth-expensive than updating vehicles: vehicles have few degrees of freedom, while a human figure with 40 joints updated at 20 Hz requires 800 packets/sec.  Instead of updating every joint, PeopleShop only updates at the task level (action, position, velocity); it requires about two packets/sec to produce a smooth simulation.  This works well for a large number of participants, such as in dismounted infantry training.  It uses "live reckoning" vs. the dead reckoning used previously for vehicles.

122 BDI Toolkits Classical DIS using dead reckoning (from Koechling et al., 1997)

123 BDI Toolkits: DIS using live reckoning and a human-in-the-loop DI-Guy model; only task-level changes (action, orientation, velocity) are transmitted (from Koechling et al., 1997).

124 PeopleShop “Top Gun” (courtesy Boston Dynamics Inc.)

125 VR Toolkits discussed in this chapter (the table from slide 7 repeated; next: Unity 3D).

126 Game Engine Comparison
Criterion | Unity | UDK/Unreal | DX Studio | Java3D | jMonkeyEngine
Price | Free / $1500 | Free / $$$ | Free / $800 | Free | Free
Graphical editing | Yes | Yes | Yes | No | Minimal
Plugin required? | Web only | No | Yes | JVM | JVM
Language support | Mono (C#), JavaScript, Boo (Python) | UnrealScript | JavaScript | Java | Java
External library support | Yes | | | |
Home computer deployment | PC, Mac | | | PC, Mac, Linux | PC, Mac, Linux
Web deployment | Yes | No | Yes | WebStart | WebStart
Game console deployment (licenses required) | Xbox 360 + Arcade, Wii + WiiWare, PS3 | Xbox 360 + Arcade, PS3, Sony NGP | No | No | No
Mobile deployment | iOS / Android | Android | No | No | No

127 Why not Java3D? Advantages: open-source, cross-platform development; low-level control of the scene graph and objects; can be used with other Java and native libraries. Disadvantages: scene manipulation is done strictly through source, which leads to slow turnaround; higher-level control is up to the programmer; 3D sound is very buggy; community support only, no longer any commercial support.

128 Why Unity? (Unity column of the comparison table repeated.) The free edition offers a robust development environment, and educational licenses are available; multiple programming languages are supported to design and manipulate the scene; external library and .NET support allows seamless communication with additional hardware devices; the easy-to-use graphical interface allows live scene editing for efficient development and testing, with quick turnaround times; works on almost all available platforms.

129 Unity - Physics Engine: Unity uses NVIDIA's PhysX engine; streamlined physics modeling for rigid bodies, cars, character ragdolls, soft bodies, joints, and cloth; by simply attaching a rigid body to a game object and adding forces, realistic physical interactions can be created; objects with rigid bodies attached will interact with each other; colliders are used to control these object interactions and to trigger collision events.

130 Unity Gallery

131 Unity - GUI (screenshot: Inspector, Project Panel, Scene Hierarchy).

132 Unity - Project Panel: This panel shows all of the available game assets in the current project directory; game assets can include scripts, prefabs, 3D models, texture images, sound files, fonts, etc. New assets can be added by simply dragging them into the project panel or placing them into the project directory; these files can be edited and saved using external programs, and the scene will be updated automatically. Unity supports a number of 3D model formats and converts them to the Autodesk FBX format when they are added to the project.

133 Unity - Scene Hierarchy: Provides a means of adding new game objects and managing existing ones; game objects can contain a combination of transforms, graphics objects, physics colliders, materials, sounds, lights, and scripts. Each game object in the hierarchical tree represents a node in the scene graph; similarly to VRML and Java3D, the scene graph represents the relative spatial relationship of objects. Example: a finger connected to a hand will translate accordingly when the hand is moved.

134 Unity - Simple Hierarchy Example

135 Unity - Inspector: Shows the components attached to the currently selected game object and their properties; manual control over an object's transform allows precise placement of objects. Variables exposed by scripts attached to the object can be viewed and set through this panel, allowing parameters to be changed without the need to edit source; these changes can be made while the project is live, and the scene will update immediately.

136 Unity - Simple Game Object (screenshot callouts): defines spatial properties (transformation matrix); a script destroys the object after N collisions or after elapsed time and contains a particle emitter for an explosion effect; a sound is associated with this object; physics components control physics and physical interactions; the graphics mesh is what you will actually see.

137 Unity - GUI (screenshot: Console, Game Preview, Scene Editor).

138 Unity - Scene Editor: Allows graphical monitoring and manipulation of scene objects; switch between various orthogonal and perspective views; objects can be translated, rotated, and scaled graphically with the mouse. When live, the editor works like a sandbox in which you can play around with objects without actually modifying the scene; it shows "gizmos" for invisible scene objects, such as light sources, colliders, cameras, and sound effects.

139 Unity - Simple JavaScript Example (code shown as a screenshot): public variables are exposed to the editor, allowing monitoring and editing of the live scene; this also allows communication between objects. The Update() method is called every frame. In this example, every time the left mouse button is clicked (1), a copy of the input object is created and added to the scene in front of the camera (2), the cube counter is increased (3), a randomly colored material is applied (4), and a force is applied (5), giving the appearance that the object is being launched away from you.

140

141 Unity - Complex Scene

142 Unity - Asset Store: A marketplace to buy and sell assets used within Unity, including 3D models, textures, scripts, etc.; it can be used to drastically reduce development time, or to sell assets you have created.

143 Unity - Union Marketplace: Similar to Apple's App Store, this is a marketplace in which games can be sold for various platforms; it allows developers to reach markets that would otherwise be inaccessible; 70% of profits go to the developer while 30% goes to Union.

144 Unity - VR Applications: Unity is able to use .NET libraries and external shared libraries; this enables the use of nearly any hardware device within Unity applications; cameras can be used to create augmented reality.

145 Unity - AR on iPhone (video: Unity on iPhone).

