Sandstorm: A Dynamic Multi-contextual GPU-based Particle System That Uses Vector Fields for Particle Propagation. By: Michael Smith.


1 Sandstorm: A Dynamic Multi-contextual GPU-based Particle System That Uses Vector Fields for Particle Propagation By: Michael Smith

2 Acknowledgments

3 Overview Introduction Background Idea Software Engineering Prototype Results Conclusions and Future Work

4 Introduction The use of Virtual Reality (VR) to visualize scientific phenomena is quite common. VR allows scientists to immerse themselves in the phenomena they are studying.

5 Introduction Phenomena such as dust clouds or smoke need a particle system to visualize such fuzzy objects. Vector fields can be used to 'guide' particles according to real scientific data. This is not a new idea; see Vector Fields by Hilton and Egbert, c. 1994.

6 Introduction VR applications and simulations require a multi-context environment. A main context controls and updates multiple rendering contexts. This multi-contextual environment can cause problems with particle systems.

7 Introduction GPU offloading techniques have been shown to allow applications and simulations to offload work to the graphics hardware. This allows non-traditional graphics calculations to be accelerated. GPU offloading can be used to accelerate particle calculations.

8 Introduction Sandstorm  Dynamic  Multi-contextual  GPU-based  Particle System  Using Vector Fields for Particle Propagation

9 Background The Helicopter and Dust Simulation (Heli-Dust) is a scientific simulation that studies the effect of a helicopter's downdraft on the surrounding desert terrain. It is written using the Dust Framework, a framework which allows the developer to set up a scene using an XML file.

10 Background

11 Early prototypes for Heli-Dust implemented a very simple particle system. This particle system did not have a way to guide particles according to observed scientific data.

12 Background Virtual Reality is a technology which allows a user to interact with a computer-simulated environment, be it a real or imagined one. It immerses the user in an environment.

13 Background A Depth Cue is an indicator from which a human can perceive information regarding depth. Depth cues come in many shapes and sizes.  Monoscopic  Stereoscopic  Motion

14 Background Monoscopic depth cue  Information from a single eye or image is available.  Information can include: Position Size Brightness

15 Background Stereoscopic depth cue:  Information from two eyes.  This information is derived from the parallax between the different images received by each eye.  Parallax is the apparent displacement of objects viewed from different locations.

16 Background Motion depth cue  Motion parallax  The changing relative position between the head and the object being observed.  Objects in the distance move less than objects closer to the viewer.

17 Background Stereoscopic Displays 'trick' the user's eyes into thinking there is depth where no depth exists. They come in all shapes and sizes.

18 Background

19

20 Multiple Contexts  A main context which controls multiple rendering contexts.  Because of these multiple contexts, a Virtual Reality application developer needs to make sure that all context-sensitive information and algorithms are multiple-context safe.

21 Background There are many Virtual Reality toolkits and libraries. Such toolkits and libraries handle things such as:  Generating Stereoscopic Images  Setting up the VR environment  And some handle distribution methods.

22 Background Virtual Reality User Interface, or VRUI, is a virtual reality development toolkit. Developed by Oliver Kreylos at UC Davis. VRUI's main mission statement is to shield the developer from a particular configuration of a VR system.

23 Background VRUI tries to accomplish this mission through the abstraction of three main areas  Display abstraction  Distribution abstraction  Input abstraction Another feature of VRUI is its built-in menu system.

24 Background

25 FreeVR, developed and maintained by William Sherman, is an open-source virtual reality interface/integration library. FreeVR was designed to work on a diverse range of input and output hardware. FreeVR is currently designed to work on shared-memory systems.

26 Background In 1983, William T. Reeves wrote Particle Systems – A Technique for Modeling a Class of Fuzzy Objects. This paper introduced the particle system, a modeling method that models an object as a cloud of primitive particles that define its volume.

27 Background Reeves categorizes particle systems as "fuzzy" objects, in that they do not have smooth, well-defined, shiny surfaces. Instead, their surfaces are irregular, complex, and ill-defined. This particle system was used to create the Genesis Effect for the movie Star Trek II: The Wrath of Khan.

28 Background

29

30 Reeves described, in his paper, a particle system that had five steps.  Particle Generation  Particle Attributes Assignment  Particle Dynamics  Particle Extinction  Particle Rendering

31 Background Particle Generation  First the number of particles to be generated per time interval is calculated.  Then the particles are generated.
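
Reeves' per-frame particle count can be sketched as a mean rate plus a stochastic variance term. This is an illustrative CPU-side sketch; the function and parameter names are hypothetical, not taken from the paper:

```python
import random

def particles_to_generate(mean_parts, var_parts, rng=random.random):
    # Per-frame count: the mean plus a random variance in [-var, +var),
    # clamped so a negative draw produces zero particles.
    n = mean_parts + (2.0 * rng() - 1.0) * var_parts
    return max(0, int(n))
```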

32 Background Particle Attributes Assignment, whenever a particle is created, the particle system must determine values for the following attributes:  Initial position and velocity  Initial size, color and transparency  And initial shape and lifetime. Initial position of the particles is determined by a generation shape.

33 Background

34 Particle Dynamics, once all the particles have been created and assigned initial attributes, the positions and/or velocities are updated. Particle Extinction, once a particle has lived past its predetermined lifetime, measured in frames, the particle dies.

35 Background Particle Rendering, once the position and appearance of the particles were determined, the particles are rendered. Two assumptions were made  Particles do not intersect with other surface-based objects.  Particles were considered point light sources.

36 Background In recent years, graphics vendors have replaced areas of fixed functionality with areas of programmability. Two such areas are the Vertex and Fragment Processors.

37 Background The Vertex Processor is a programmable unit that operates on incoming vertex values. Some duties of the vertex processor are:  Vertex transformation  Normal transformation and normalization  Texture coordinate generation and transformation.

38 Background The Fragment Processor is a programmable unit that operates on incoming fragment values. Some duties of the fragment processor are:  Operations on interpolated values.  Texture access.  Texture application.  Fog

39 Background While a program, a shader, is running on one of these processors, the fixed functionality is disabled. Several programming languages were created to aid in the development of shaders; one such language is the OpenGL Shading Language (GLSL).

40 Background

41 Vertex and Fragment shaders can't create vertices; they only work on data passed to them. Geometry shaders can create any number of vertices. This can allow shaders to create geometry without having to be told to by the CPU.

42 Background Transform Feedback, allows a shader to specify the output buffer. The target output buffer can be the input buffer of another shader. Allows developers to create multi-pass shaders that do not relay information back to the CPU for the other passes.

43 Background ParticleGS is a Geometry Shader-based particle system that does the following:  Stores particle information in Vertex Buffer Objects.  Uses a Geometry shader to create particles, and stores them as vertex information in VBOs.  Uses Transform Feedback to send particle data between shaders.  Uses a Geometry shader to create billboards and point sprites to render particles.

44 Background In the days before shaders, the GPU was used just for rendering. But with the advent of shaders, GPUs can now be used to aid scientific computation. One can 'trick' the GPU into thinking that it is working on rendering information.

45 Background Uber Flow is a system for real-time animation and rendering of large particle sets using GPU computation. The Million Particle System is a GPU-based particle system that can render a large set of particles.

46 Background Both particle systems do the following  Store particle information in textures.  Use a series of vertex and fragment shaders to update the particle information.  Use the CPU to create and send rendering information.  And use a series of vertex and fragment shaders to render the information from the CPU.

47 Background

48 Idea Sandstorm  Dynamic  Multi-contextual  GPU-based  Particle System  That uses Vector Fields for Particle Propagation.

49 Idea Dynamic, Sandstorm should have the ability to change certain attributes on the fly.  Rate of emission  Size of particles  Lifetime of particles

50 Idea Multi-contextual, as previously stated, 3D VR environments use multiple contexts. Thus Sandstorm must be designed to handle these multiple contexts.  Random number generation  Between-screen consistency

51 Idea GPU-based, Sandstorm will be designed to leverage today's most powerful and advanced GPUs. Use Geometry shaders to create, update, and render particles. Use Transform Feedback to direct data between shaders.

52 Idea Vector Fields, in order to 'guide' particles according to observed scientific data, vector fields will be used in Sandstorm. However, Sandstorm should not be a vector field simulator; it should only take vector fields as input.

53 Software Engineering

54

55

56

57 Prototype GPU-Based Particle System, like most particle systems, Sandstorm has three main phases:  Creation and Destruction  Update  Rendering

58 Prototype Creating and Destroying Particles, traditional particle systems would create a particle and store it in a dynamically growing data structure. But GPU-based particle systems store the particles in a texture.

59 Prototype Textures need to describe a rectangular area that encompasses the entire area of the data. For example, if we had 19 members, the texture would need to cover an area of 20. Not so with VBOs: VBOs can fit the exact amount of data that is to be used.

60 Prototype Like the Million Particle System and Uber Flow, Sandstorm stores its particle information in a double-buffered approach. Each frame, one of the buffers is used as the read buffer and the other as the write buffer. Each buffer holds two VBOs: one for the particle positions, the other for the velocities.
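
The ping-pong layout can be sketched as follows. This is an illustrative CPU-side model, not Sandstorm's actual OpenGL code; each list element stands in for a (position, velocity) VBO pair:

```python
class ParticleBuffers:
    """Double-buffered storage: one side is read, the other written, swapped per frame."""

    def __init__(self, capacity):
        # Each slot stands in for a (position, velocity) pair held in a VBO.
        self._buffers = ([None] * capacity, [None] * capacity)
        self._read = 0

    @property
    def read(self):
        return self._buffers[self._read]

    @property
    def write(self):
        return self._buffers[1 - self._read]

    def swap(self):
        # At the end of a frame, the freshly written buffer becomes the read buffer.
        self._read = 1 - self._read
```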

61 Prototype Geometry shaders can emit one or more vertices. At the beginning of the Creation/Destruction phase, the read buffer is passed to the shader. The shader then determines if it is dealing with an emitter.

62 Prototype If the shader is dealing with an emitter, the following happens.  How many particles are to be generated is determined.  The initial information for the particles is determined.

63 Prototype Determining the number of particles to be emitted  Take the number of particles already emitted this cycle, a.  Subtract a from the number of particles to be emitted per second, p.  Divide the result by the amount of time left in that cycle; each cycle is a second.
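
The bookkeeping above can be sketched as follows (a CPU-side sketch with illustrative names; the shader keeps a and p per emitter, and scaling by the frame's delta time is an assumption):

```python
def particles_to_emit(already_emitted, per_second, time_left, dt):
    # Particles still owed for this one-second cycle: p - a.
    remaining = per_second - already_emitted
    if remaining <= 0 or time_left <= 0:
        return 0
    # Spread the remainder evenly over the time left in the cycle,
    # then scale by this frame's delta time to get a per-frame count.
    return int(remaining / time_left * dt)
```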

64 Prototype Determining the initial information of particles  A random-information texture is used.  The texture is translated, scaled, and rotated by a random amount, then sent to the shader.  Emitters have random numbers in their velocity information, which are used to do texture lookups.  The use of the texture makes sure the particles are consistent between contexts.

65 Prototype If the Creation/Destruction shader is passed a particle, something different happens. First the particle is determined to be alive or dead.  If alive, the particle is re-emitted into the buffer.  If dead, a blank particle is emitted into the buffer.

66 Prototype Updating Particles, once new particles are created and old ones destroyed, the particles are updated:  The delta time between frames is passed to the update shader.  A lookup in the vector field (a 3D texture) is done according to the particle's position.  The vector field velocity is added to the particle's velocity, and then the particle's position is updated.
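
On the CPU the same update would look roughly like this. It is a sketch: field_lookup stands in for the 3D-texture fetch, and scaling the field velocity by dt is an assumption rather than something the slides state:

```python
def update_particle(pos, vel, field_lookup, dt):
    # Sample the vector field at the particle's current position.
    fx, fy, fz = field_lookup(pos)
    # Add the field velocity to the particle's velocity (assumed scaled by dt).
    vel = (vel[0] + fx * dt, vel[1] + fy * dt, vel[2] + fz * dt)
    # Integrate the position by the updated velocity.
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt, pos[2] + vel[2] * dt)
    return pos, vel
```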

67 Prototype Once the particles have been updated, they are rendered. The particle positions are passed to the render shader using Transform Feedback. Particles are rendered as either  Textured, deferred-shaded billboards  Or points.

68 Prototype The particle position represents the center of the particle. So four points have to be determined to create a billboard. Information that is already known: the center of the particle and the vector pointing to the eye.

69 Prototype

70 Once the vectors are found, they can be added to the particle's position to get the four points.
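
A CPU-side sketch of the corner computation. The world-up vector, the half-size parameter, and the corner winding order are assumptions; the shader performs the same arithmetic on the particle center and the eye vector:

```python
import math

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _normalize(v):
    n = math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])
    return (v[0] / n, v[1] / n, v[2] / n)

def billboard_corners(center, to_eye, half_size, world_up=(0.0, 1.0, 0.0)):
    # Right and up vectors span the plane facing the eye.
    right = _normalize(_cross(world_up, to_eye))
    up = _normalize(_cross(to_eye, right))
    # Offset the center by +-right and +-up to get the four corners.
    return [tuple(center[i] + (sx * right[i] + sy * up[i]) * half_size
                  for i in range(3))
            for sx, sy in ((-1, -1), (1, -1), (1, 1), (-1, 1))]
```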

71 Prototype Once the points are found, they can be emitted to create the billboard. Once the billboard has been emitted, a texture is applied. Once the result of the billboard shader is determined, a deferred shading method can be applied.

72 Prototype Currently Sandstorm uses a deferred shading method to blend the particles together. The first step is to accumulate the particles per pixel, so that the more particles are behind a particular pixel, the denser it looks. Once that is done, the result can be textured onto a full-screen quad and rendered.

73 Prototype There is a catch, though. Using this method requires the user of Sandstorm to:  Render the scene into an off-screen buffer, with a depth buffer attached.  Give the deferred shader the depth buffer, so that it can blend with the scene, obscuring solid objects and also being obscured by solid objects.

74 Prototype Vector Fields in Sandstorm are represented as 3D textures. A texture was used instead of a VBO because of existing internal methods for dealing with 3D textures, such as wrapping, indexing, and interpolating.

75 Prototype When a lookup is done on the vector field to get information for a particle, the following happens:  The position of the particle is interpolated into the field  By dividing its components by the width, height, and depth of the vector field.
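
The mapping from particle position to texture coordinates can be sketched as follows. The wrapping behavior (via modulo) is an assumption based on the earlier mention of the texture's wrapping methods:

```python
def field_coord(pos, dims):
    # Divide each component by the field's width/height/depth to get
    # normalized [0, 1) texture coordinates; the modulo provides wrapping.
    return tuple((pos[i] / dims[i]) % 1.0 for i in range(3))
```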

76 Prototype As previously stated, when dealing with a multi-contextual environment, one must be careful to make context-sensitive data and algorithms conform to the multi-contextual environment. This is handled in Sandstorm by having a controlling class, which is stored on the main context of the VR environment, control multiple update/render classes.

77 Prototype

78 Dynamic Sandstorm has the ability to change some of its attributes, both at run time and at compile time.

79 Results A sample application was created to test out Sandstorm. The Heli-Dust application was used as a test bed. A basic vector field was used in the sample application.

80 Results Considering that Sandstorm is not a vector field simulator, a simple helicopter interaction model was made; as the helicopter throttle increases:  The rate of emission was increased.  The lifetime of the particles was increased.  And the maximum number of particles was increased.
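
A minimal sketch of such an interaction model, assuming a simple linear scaling; the base values and the exact mapping are hypothetical, not taken from the thesis:

```python
def emitter_params(throttle, base_rate=1000.0, base_lifetime=2.0, base_max=50000):
    # throttle in [0, 1]; all three emitter attributes grow linearly with it.
    scale = 1.0 + throttle
    return {
        "emission_rate": base_rate * scale,        # particles per second
        "particle_lifetime": base_lifetime * scale,  # seconds
        "max_particles": int(base_max * scale),
    }
```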

81 Results The sample application was run on the following system, which powered a four-sided CAVE environment.  A multi-core shared-memory machine with four quad-core chips  48 GB of RAM  An Nvidia Quadro Plex  Running Ubuntu 7.10 Linux

82 Results Rendered 300,000 deferred shaded particles at:  15-20 FPS while standing in the particle system  ~65 FPS while standing a good distance back from the particle system.

83 Results Show movies.

84 Conclusion and Future Work Vector fields can be used to 'guide' particles. Sandstorm can run in a multi-contextual environment. Sandstorm utilizes the latest GPU off-loading techniques. Sandstorm can render more than 100,000 particles at above 15 FPS.

85 Conclusion and Future Work Optimizations:  Currently both emitters and particles reside in the same buffers; separating them could limit branching in shaders.  Currently buffer sizes are static; allowing them to grow and shrink could increase the speed of updating and rendering.

86 Conclusion and Future Work Other improvements  Collisions between the particles and objects in the scene.  Soft Particles, Motion Blur, and Light Scattering could be used to give the particles more realism.  A shader-based physics model could be implemented to allow users to change the behavior of the particles.

87 Conclusion and Future Work Other work  A vector field simulator could be created to feed Sandstorm dynamically changing vector fields, so that particle motion acts more naturally.  A vector field creator/editor could be created to help scientists visualize vector fields before they are used in Sandstorm.

88 Questions/Comments?

