Presentation on theme: "The Art and Technology Behind Bioshock’s Special Effects"— Presentation transcript:
1 The Art and Technology Behind Bioshock’s Special Effects
Presented by Stephen Alexander (Effects Artist) and Jesse Johnson (Graphics Programmer)
2 Presentation Overview
Iterative Workflow Between Art and Tech
Lit Particles
Bioshock’s Water: An Artist’s Perspective
Technology Behind Interactive Waterfalls
Water Caustics
Interactive Rippling Water System
Optimization Case Study: Bioshock’s Fire
4The FX Closet Jesse Johnson (left) and Stephen Alexander (right) Our iterative workflow is something that developed naturally after initially working in close proximity to each other (illustrated above), but that wound up sticking even well after we were moved to opposite ends of the 2K Boston office.
5 Why Lit Particles?
High contrast lighting environments
Numerous atmospheric effects
When we began developing the effects for Bioshock, something we knew we would need almost immediately was lit particles. This was clear for a number of reasons. We knew that Bioshock was going to be a dark game with high contrast lighting to reinforce the feelings of tension and paranoia that we were aiming to create, where an ocean of darkness is broken by bright pools of light that often flicker from broken light fixtures. We also knew we wanted Bioshock to create a very atmospheric and immersive experience for the player, and general ambient effects are very effective in helping to make a computer generated rendering feel more organic and believable. Things like smoke, dust in the air, and mist all help make a given space feel less sterile; they introduce subtle movements and serve to draw the player in. These elements combined meant that it would be very difficult to create a convincing environment without particle systems that took the lighting of their environment into account. Even with lights that don't flicker, with our colorful lighting it would have been nearly impossible to custom tune unlit particle systems to our highly varied lighting setups. –Stephen Alexander
7 Lit particles (left) vs. unlit particles (right). While hard to see from these still shots, the particles on the left blend more naturally into the scene, helping to tie everything together. The unlit particles on the right, however, stand out and become jarring to look at. –Stephen Alexander
8 Technical Challenges With Lit Particles
Must be rendered in a single pass
Need to satisfy very high performance requirements
Potential for shader permutation issues
Creating a particle lighting system is challenging for a variety of reasons. Multi-pass lighting techniques can’t be used because a typical particle system will be rendered in a single draw and will be alpha blended and self overlapping. Lighting a self overlapping, alpha blended particle system in multiple passes would produce incorrect visual results. The trick is then to do all of the lighting in a single pass. With this constraint in mind, we run into a new set of challenges. Each particle emitter will have a variable number of lights affecting it, and must maintain high levels of performance. We also want to avoid having to compile and store large numbers of particle shader permutations based on the variable number of lights that could affect each system. Finally, we need to minimize our interpolator usage to prevent our particle system from becoming interpolator limited. –Jesse Johnson
9 Lit Particles: Iteration 1
Light attenuation done per vertex
Static branching used to accumulate multiple lights in vertex shader
Attenuated color interpolated via single float3
Per vertex approximation typically good for highly tessellated geometry
The initial system we created addressed all of these issues fairly elegantly. Lighting was calculated in the vertex shader for each particle emitter, using static branching to control the number of lights we processed. The static branching eliminated the need for a large number of shader permutations, and because it was done in our vertex shaders, which never showed up as a bottleneck, it satisfied our performance requirements. We actually accumulated attenuated light color in the vertex shader, and then used a single float3 interpolator to pass that down to the pixel shader. From there, it was a single operation in the pixel shader to apply the lighting. It’s important to point out here that light attenuation is typically not a linear function, so linearly interpolating attenuation in this case is really just a rough approximation of more robust light calculations. In the circumstances we used it under, where individual particles were relatively small and the light radius affecting them relatively large, the quality of the approximation was more than acceptable for the extra performance we got out of it. –Jesse Johnson
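The per-vertex accumulation described above can be sketched as follows. This is a minimal Python illustration, not the actual vertex shader: the linear falloff curve and the light tuple layout are assumptions for the sake of the example.

```python
def attenuate(light_color, light_pos, light_radius, vertex_pos):
    """Attenuate a light's color at one vertex (linear falloff is assumed)."""
    delta = [l - v for l, v in zip(light_pos, vertex_pos)]
    dist = sum(d * d for d in delta) ** 0.5
    falloff = max(0.0, 1.0 - dist / light_radius)
    return [c * falloff for c in light_color]

def vertex_light(vertex_pos, lights):
    """Accumulate attenuated color from every light at one particle vertex.

    lights: list of (color, position, radius) tuples.
    The result is what the talk passes down as a single float3 interpolator;
    the pixel stage then just multiplies it against the particle texture.
    """
    total = [0.0, 0.0, 0.0]
    for color, pos, radius in lights:
        contrib = attenuate(color, pos, radius, vertex_pos)
        total = [t + c for t, c in zip(total, contrib)]
    return total
```

Because the attenuation is evaluated only at the four corners of each particle quad and then linearly interpolated, it is exactly the rough approximation the talk describes: cheap, and good enough when particles are small relative to the light radius.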
10 Lit Particles: Iteration 2
Problems with iteration 1:
Particles too dark in total darkness
Particles too bright in bright environments
Solutions:
Used maximum of new ambient term and vertex lighting (not ambient + lighting)
Scale RGB down if any component is over 1.0:
newRGB = oldRGB / (1.0 + max(0.0, maxRGBChannelValue - 1.0))
We found, however, that relying on environment lighting alone had some failure cases, especially in terms of player feedback. When being shot at or shooting at something else in a dark space, often the hit spangs on concrete or wood would be invisible, since there was no light to hit them and the particles didn't take into account the baseline ambient of the scene. Because of this we found the need for the ability to introduce a low-end clamp option for the particle lighting, so that no matter how dark the area, the player would be able to see where their bullets were going. We also found that the opposite case could be a problem. Since our lights had the ability to go above a value of 1 for HDR lighting, we found that allowing particles to have their lighting increased by this amount created visual artifacts that we didn't like. Because of this we clamped particle brightness at 1. –Stephen Alexander
11Example of a hit reaction that is barely visible, due to particles not taking ambient lighting into account.
12 Hit reaction after particle ambient was added. It is now much clearer to the player what their bullets are hitting.
13 Lit Particles: Iteration 2
Problems with iteration 1:
Particles too dark in total darkness
Particles too bright in bright environments
Solutions:
Used maximum of new ambient term and vertex lighting (not ambient + lighting)
Scale RGB down if any component is over 1.0:
newRGB = oldRGB / (1.0 + max(0.0, maxRGBChannelValue - 1.0))
To clamp the particle brightness to 1, we scaled the entire color down by the inverse of its maximum component, whether that was red, green, or blue. We had to go this route because we couldn’t use a straightforward saturate/clamp without changing the appearance of the color itself. As for the minimum contribution, instead of adding in a constant ambient term, we instead used the greater of either the ambient or the lit color for the particle. This helped us avoid letting this constant minimum contribution ‘wash out’ the normal lighting, and made it come into play only in the situations where it was needed. –Jesse Johnson
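The clamp formula on the slide and the ambient floor can be combined into one small function. This is an illustrative Python sketch of the math, not shader source; applying the max per channel is an assumption about how "the greater of ambient or lit color" was taken.

```python
def clamp_particle_color(rgb, ambient):
    """Apply the iteration-2 fixes to a vertex-lit particle color.

    1. Floor: take the greater of the ambient term and the lit color
       (per channel here, which is an assumption), instead of adding them.
    2. Ceiling: if any channel exceeds 1.0, scale the whole color down by
       that channel, which preserves hue where a plain saturate would not.
       This is the slide's newRGB = oldRGB / (1 + max(0, maxChannel - 1)).
    """
    lit = [max(a, c) for a, c in zip(ambient, rgb)]
    max_channel = max(lit)
    scale = 1.0 / (1.0 + max(0.0, max_channel - 1.0))
    return [c * scale for c in lit]
```

Note how the floor only matters in near darkness (where ambient wins the max) and the ceiling only matters for HDR lights above 1.0, so normally lit particles pass through unchanged.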
14 Lit Particles: Iteration 3
Big need for per pixel lighting
Specular highlights
Avoid painting highlights into textures
First prototype: 1 per pixel light
Remaining lights used per vertex lighting
Two new interpolators: per pixel direction and per pixel color
After we had lit particles, it became clear to me that something that would be really useful for certain cases was the ability to have particles be per-pixel lit with specular highlighting. The initial reason for this was primarily for blood. I tried quite a number of different directions for the blood in Bioshock; in particular I was trying to find a way to make it appear to be wet without painting specular elements into the particle, since that had the effect of both looking artificial and making the individual sprites easier to identify, which is something I'm always trying to avoid. I was pretty certain that specular highlighting would have the effect of making every particle look different, since the highlights would change depending on the particle's rotation. I also thought we might be able to use it on smoke and, of course, water effects. –Stephen Alexander
Our initial implementation of this was very straightforward (but also highly problematic). We took the light source with the greatest contribution to the particle system we were rendering, and used that single light as our per pixel light source. We took the Up and Right vectors that were used in creating the corners of each quad based particle, along with their cross product, to generate a tangent basis onto which to map the light. The remaining lights used the previously discussed per vertex lighting scheme. We had to use an additional two interpolators to make this all work, one for the color of the per pixel light, and another for the direction of the per pixel light. –Jesse Johnson
17 Lit Particles: Iteration 4
Numerous issues with iteration 3:
Popping
Washed out lighting
Heavily interpolator limited
New solution: averaged per pixel lighting
Attenuated color same as before
Light direction for all sources accumulated per vertex
Weighted by:
Incident angle for each light
Magnitude of light’s color and attenuation
While this did accomplish some of the goals we had in mind, it had a few big shortcomings. First, as the particle system moved, grew, and shrunk in size, the light source with the maximum impact on the system would change, sometimes rapidly, causing unsightly popping issues. Next, when the per vertex system was used in conjunction with the per-pixel system, it had a tendency to make the per-pixel lighting look a little washed out. I worked with this system for a little while, but quickly decided it wasn’t going to be suitable for general use. –Stephen Alexander
So, we needed to somehow process more per pixel lights in a single pass, but we couldn’t add any more processing to the pixel shader or use any more interpolator units without crippling performance severely. Rowan Wyborn then proposed we use a variation on something Irrational Games had used on previous titles. This system calculated a single approximate light direction, per vertex, based on all of the lights affecting that vertex. It worked by accumulating direction vectors from each light source to each vertex, weighted by how much each light source affected a given particle. This accumulated vector was then interpolated to the pixel shader, and used in our standard lighting calculations along with our per vertex attenuated color. The tricky part to all of this is how those light vectors were weighted as they were accumulated. Each light vector was scaled not only by how much its incident angle affected the particle, but also by the magnitude of the light color and light attenuation at that vertex. –Jesse Johnson
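The weighted accumulation of a single approximate light direction can be sketched like this. Again a Python illustration rather than the shipped vertex shader; the falloff curve and the use of the max channel as the "magnitude" of the light color are assumptions.

```python
def averaged_light_dir(vertex_pos, particle_normal, lights):
    """Accumulate one approximate per-vertex light direction.

    Each light's direction vector is weighted by its incident angle against
    the particle normal and by the magnitude of its attenuated color, as the
    talk describes. lights: list of (color, position, radius).
    """
    acc = [0.0, 0.0, 0.0]
    for color, pos, radius in lights:
        delta = [p - v for p, v in zip(pos, vertex_pos)]
        dist = sum(x * x for x in delta) ** 0.5
        if dist == 0.0:
            continue
        direction = [x / dist for x in delta]
        atten = max(0.0, 1.0 - dist / radius)        # assumed falloff curve
        incident = max(0.0, sum(a * b for a, b in zip(direction, particle_normal)))
        weight = incident * atten * max(color)       # max channel as magnitude: assumption
        acc = [a + x * weight for a, x in zip(acc, direction)]
    length = sum(x * x for x in acc) ** 0.5
    # Normalize; fall back to the particle normal if nothing contributed.
    return [x / length for x in acc] if length > 0.0 else list(particle_normal)
```

Because the result is a smooth blend of all contributing lights rather than a hard pick of the strongest one, the popping from iteration 3 goes away: a light fading out simply loses weight instead of being swapped out.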
18 Per-Pixel Blood And Foam
This is the texture sequence we used for the blood, animated to create the illusion of thick blood breaking apart in flight. Shockingly, the initial implementation was not the last art request on this feature... –Stephen Alexander
19 I was happy with the way the effect looked on blood and water foam, but whenever the light was coming from behind the particle we had a problem similar to lit particles getting no light at all: the particle would become dark. In some cases this resulted in the particle flattening out (blood); in others it just looked completely wrong, as shown in the image above. The blood being dark was a problem for player feedback, since some of the most important information you get is whether or not you hit your target. What I felt I needed was the ability to make a fake light that would light the particles in the cases where they were getting too dark. –Stephen Alexander
20 Lit Particles: Iteration 5
Particles would be dark depending on lighting direction
Implemented fake light
And this ‘fake light’ was precisely the solution we used. I added the ability for Stephen to flag certain particle emitters that needed a fake light source added to their normal list of lights. This virtual light source was always placed at a fixed location relative to/offset from the player’s location. –Jesse Johnson
21The same water illustrated previously with the addition of the fake light.
22Blood without the fake light, looking pretty flat.
24 Lit Particles: Final Thoughts
Performance limitations
Performance benefits
Some of the intended uses for this system turned out to just be too expensive. Tests with smoke looked good, and definitely made the smoke look as if it had some actual depth, but it filled too much of the screen and was prone to making the framerate drop. In another area, it actually saved performance. In my initial investigations into water, the solution I came up with for a section of ceiling with drips raining down was to pan them down a cylinder, since I really wanted them to be a part of the scene and sparkle in the actual scene lighting. Visually this worked pretty well, but it could also be expensive, since in the right spot you could find yourself looking through two full screen passes of a pretty expensive alpha blended shader. This was workable, but with per-pixel particles there was a much better solution, which was to just make them a particle system with a special light to create the sparkling that I felt was required. –Stephen Alexander
28 Water: An Artist’s Perspective
Underwater game? Probably need to make water look ok.
Began using existing tech
Limitations spawned a variety of art requests
Obviously, something we knew was going to be important and that we absolutely had to get right was the way water was depicted in the game. The game takes place at the bottom of the ocean, so you're bound to run into it somewhere, was the thinking. –Stephen Alexander
29This was the test space where I experimented on a number of different water behaviors based on reference images and video as well as general observation. –Stephen Alexander
30More of that same test space. – Stephen Alexander
31 Iteration 1
Texture
These are the falling cascades of water that you encounter throughout Rapture. On the art side, the process for these was mostly about first faking the desired result with the available tools, then talking with Jesse about how we could adjust the tech to make it more of a self contained system. Initially I just placed two planes next to each other with different panning rates, and was able to get the approximate motion I was looking for. This led to the request that the water shader support two panners for diffuse, specular, and opacity, which looked pretty much identical to the two plane method. –Stephen Alexander
33Final Implementation Textures We then found that while this was effective, it lacked the randomness that you would see in nature, so I requested the ability to have another mapping channel exposed for a mask that could be panned independently as well. This independent UV channel was useful for all the various water elements. – Stephen Alexander
35 (Non) Interactive Waterfalls
One of our initial high level goals was to make the water in Rapture really react to the player and to the environment around it. Our cascading waterfalls were our first real step in that direction from the tech side of things. We wanted any object in the game which went underneath a waterfall to interact with it realistically, and not just have water pass right through it. –Jesse Johnson
36(Semi) Interactive Waterfalls which meant that we had to stop the water from falling through objects
37Interactive Waterfalls and that objects in the waterfalls had to have water splashing down on top of them, not below them
38 Interactive Waterfalls: Iteration 1
Analogous to 1D shadow mapping
Camera is placed on top of waterfall, pointed down
Waterfall treated like 2D plane, depth values rendered into 1D row of 2D texture
Use vertex texture fetch or ATI’s R2VB to displace particle splash effects
To accomplish this, we used a technique very similar to 1D shadow mapping. We treated each waterfall as a 2D plane. We then rendered a 1D depth map from the top of each waterfall looking downward using all of the objects affecting it. Later, when rendering the waterfall itself, we transformed each point on that waterfall into the local space of its corresponding 1D shadow map, and if the point on the waterfall was farther from the top of the waterfall than the depth map sample, we clipped that pixel. We also used a vertex texture fetch to offset each particle effect up to the top of the character it was interacting with. This worked reasonably well, but there was one glaring artifact. The instant any object began intersecting the plane of the waterfall, the water beneath that object would instantly disappear instead of falling down to the ground realistically. The second the object left the waterfall, the water would instantly reappear. Our really old E3 video footage had this artifact in it, so after it was released I checked around to see if anyone else had noticed it. Turns out, forum posters were quick to point out the flaw, sometimes quite brutally. My feelings on this were basically :’-( so I decided the artifact would have to be fixed with the next iteration. –Jesse Johnson
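The 1D depth-map test can be sketched on the CPU like this. A minimal Python illustration, assuming a waterfall-local coordinate system with y measured downward from the top; the resolution and occluder representation are made up for the example.

```python
WIDTH = 64  # texels across the waterfall plane (illustrative)

def build_depth_map(occluders):
    """Build the 1D depth map seen from the top of the waterfall looking down.

    occluders: list of (x_min, x_max, top_y) spans in waterfall-local space.
    Each texel records the depth of the nearest occluder top in its column,
    or infinity where the column is unobstructed.
    """
    depth = [float('inf')] * WIDTH
    for x_min, x_max, top_y in occluders:
        for x in range(max(0, x_min), min(WIDTH, x_max + 1)):
            depth[x] = min(depth[x], top_y)
    return depth

def water_visible(x, y, depth):
    """A waterfall pixel survives only if it is above the occluder top
    in its column; anything below the depth sample is clipped."""
    return y <= depth[x]
```

This also shows where the iteration-1 artifact comes from: the map stores only the top of each occluder, so everything below it in that column is clipped in the same frame the occluder appears, with no falling transition.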
39 E3 Footage. Tough to see due to the resolution here, but note the lack of water under the Big Daddy’s left arm.
40 E3 Footage. One frame later, the water has completely reappeared.
41 Interactive Waterfalls: Iteration 2
Added a second 1D shadow map, as seen from the bottom of the waterfall upward
Allowed outlines to be scissored into the waterfall
Added a simulation step: gravity!
To fix the issues I did two things. I introduced a simulation element to the system, which simulated the effect of gravity. I also added another 1D depth map, as would be seen from the bottom of the waterfall looking up, which allowed full shapes to be scissored into the water and not just the upper part of a shape with the lower part completely missing. It would have been nice to add an acceleration component to this system, but after the major popping issue was solved we decided that would be enough, and moved on. –Jesse Johnson
42 Note the mid air splash effect to the left of the Big Daddy. It’s being pulled back down to the ground by the gravity simulation.
43 Water Caustics
Light that bounces off of water surfaces
Important in creating water atmospherics
Another element of the water was caustic light reflections into the environment. This is the rippling pattern that you see when light passes through water or reflects off the surface onto its surroundings. –Stephen Alexander
44 Caustics: Iteration 1
Texture sequence
Projectors/Decals used to place effect
Our initial implementation was to create a 30 frame looping texture sequence and project it as a decal onto the walls. Since the caustics had to move and distort, this was the best solution we had available. However, it used too much memory to be used extensively and really didn't look all that great. –Stephen Alexander
46 Caustics: Iteration 2
1 diffuse texture, sampled twice: .uv sample + swizzled .vu sample
Diffuse texture UVs offset by separate normal map
After thinking for a while about a potentially cheap method for achieving the same results, I suggested a shader to Jesse where the same texture would have two independent panners and one of them would be swizzled. The resulting combination would then be distorted by a normal map that had its own independent panner to create the distorted rippling effect that we wanted. –Stephen Alexander
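The double-sample trick can be sketched as a pure function. This is an illustrative Python version, not the shader: the pan rates, the normal-map distortion interface, and combining the two samples by multiplication are all assumptions.

```python
def caustic(sample, u, v, t, normal_offset):
    """Evaluate the two-panner caustic at UV (u, v) and time t.

    sample(u, v)        -> scalar intensity from the single diffuse texture.
    normal_offset(u, v, t) -> (du, dv) distortion from the panning normal map.
    The second sample swizzles the coordinates (.vu), so the two layers never
    line up even though they come from the same texture.
    """
    du, dv = normal_offset(u, v, t)
    a = sample(u + 0.10 * t + du, v + dv)      # panner 1: .uv, assumed rate
    b = sample(v + dv, u - 0.07 * t + du)      # panner 2: swizzled .vu, assumed rate
    return a * b                               # multiply combine is an assumption
```

The key property is that one texture fetch target serves both layers, so the memory cost of the 30-frame sequence from iteration 1 disappears while the motion stays non-repeating.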
49 Caustics: Iteration 3
Use modulate blend mode
Simulates surface brightening
The other piece of this tech was the way it was drawn into the environment. Initially it was basically additive or a straight alpha blended overlay, which looked OK, but didn't really create the illusion that the surface was being brightened by light. Although you'd think that an additive texture would create the result you want, it actually ends up overpowering the detail on the surface you're projecting onto and replacing too much of the color. With a light you'd expect the surface details to become amplified and brightened. –Stephen Alexander
Originally when I implemented this I was doing a resolve/scene color buffer sample, then scaling that value up manually and overwriting it back into the scene. After getting this in, I went to Stephen to have our typical post feature wrap up talk and let him know the performance on these would still be a little worse than their additive counterpart. At this point he asked why I wasn’t just using a modulate blend mode combined with values higher than 1.0, which of course was the perfect solution here in terms of both quality and performance. This might seem like a minor oversight on my part, but I bring it up for a very good reason. I talk to Stephen about how every tech system I code works, down to the most minute details. Sometimes he’ll catch minor things I’m doing wrong or could do better, but more importantly he just brings a completely new perspective to the tech. It gives him the chance to make informed suggestions on everything from quick extensions to the tech to ideas for optimizations. This is something that was consistently a huge win for us. –Jesse Johnson
50Shader implementation of caustics with modulate blend mode.
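The difference between the two blend modes is easy to demonstrate numerically. A hedged Python sketch of per-channel framebuffer blending; the sample colors are made up, and real hardware blending would of course run per pixel on the GPU.

```python
def additive(dest, src):
    """Additive blend: dest + src, clamped. Bright sources swamp the
    surface color and flatten its detail."""
    return [min(1.0, d + s) for d, s in zip(dest, src)]

def modulate(dest, src):
    """Modulate blend: dest * src. With src values above 1.0 the surface
    is brightened while the ratios between its channels (its detail)
    are preserved."""
    return [d * s for d, s in zip(dest, src)]

# A dark textured wall under a bright caustic highlight:
wall = [0.20, 0.10, 0.05]       # illustrative surface color
highlight = [1.8, 1.8, 1.8]     # caustic value above 1.0
```

Running these, `additive(wall, highlight)` clamps every channel to flat white, while `modulate(wall, highlight)` brightens the wall but keeps its red:green:blue proportions intact, which is exactly the "amplified surface detail" effect the talk wants.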
51 Water = Lighting
Controlling how light hits water surfaces is very important
Special lighting system extension gave artists precise control over light contributions
A relatively simple piece of tech turned out to be incredibly important in making the water look the way we wanted it to. In our version of Unreal, there is the ability to make an object or a light use the "Special Lit" channel, which means that flagged objects will only be affected by lights that also have the flag set. In order for water to read properly, probably the most important element is how it's lit. Ideally you want to emphasize the specular elements of the water as much as possible. A reflection map can be effective for this, but can sometimes make the water appear to glow in a dark environment. What we found is that by expanding the special lit system to support an arbitrary number of special lighting channels, we could pretty much provide a specific lighting setup for every water feature in the game. This made a huge difference in how convincing the water effects were. Any place there was a pool of water, drips, or a cascade, we'd set up a number of lights close to the feature, adjusted so as to reflect the most convincing specular highlight to as many viewpoints as possible. –Stephen Alexander
52 Interactive Rippling Water
Considered four basic approaches:
Exposing ripple locations to entire fluid surface shader
Dynamically tessellating fluid surfaces around ripple areas using Delaunay Triangulations
Dynamically tessellating fluid surfaces around ripple areas using real time 2D BSP trees
Deferred rendering approach
Another thing that we knew we wanted for Bioshock’s water right from the beginning was interactive pools of water. When characters walked through it, or objects splashed into it, we wanted the surface of the water to react naturally and visually display that interaction. In other words, we wanted real time interactive rippling. We came up with a lot of ideas for this that we didn’t end up using. –Jesse Johnson
53 First Approach
Bind an array of ripple locations to pixel shader
For each pixel:
For each ripple:
Calculate UVs, sample ripple normal map
Performance of every pixel bound by number of ripples on entire surface
We thought about adding some fixed number of ripple locations as an array of pixel shader parameters and sampling in ripples at these points, but under this scheme each pixel would have to check itself against each potential ripple location, and the performance of the entire surface would degrade relative to the current number of ripples anywhere on the water surface. –Jesse Johnson
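The scaling problem with this first approach is visible in a direct sketch of the per-pixel loop. An illustrative Python version, with the ripple representation and normal-map interface invented for the example.

```python
def shade_pixel(px, py, ripples, ripple_normal):
    """Per-pixel ripple accumulation as in the rejected first approach.

    ripples: list of (center_x, center_y, size) bound as shader parameters.
    ripple_normal(u, v) -> (dnx, dny) sample from the ripple normal map.
    Every pixel loops over every ripple, so cost is pixels * ripples even
    for pixels nowhere near any ripple -- the flaw the talk points out.
    """
    n = [0.0, 0.0]
    for cx, cy, size in ripples:
        # Map the pixel into this ripple's 0..1 UV space.
        u = (px - cx) / size + 0.5
        v = (py - cy) / size + 0.5
        if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
            dnx, dny = ripple_normal(u, v)
            n = [n[0] + dnx, n[1] + dny]
    return n
```

Even with the early-out UV test, the loop itself runs for every ripple at every pixel, which is why performance of the whole surface degrades with the total ripple count.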
54Tessellation Approach We also experimented with and even built two rough prototypes around the idea of dynamically tessellating water surfaces on a 2D plane around areas where ripples were occurring, then rendering sections of the water surface with different shaders and UV sets depending on how many ripples were affecting each given area. –Jesse Johnson
55 Delaunay Triangulation Method
Run Delaunay Triangulation using all ripple quad corners
For each triangle generated:
Generate UVs for all ripples the triangle intersects
Batch triangles according to how many ripples they were affected by
The first prototype was developed around Delaunay triangulations, but the tessellation with this technique was very unpredictable and could lead to sections of the water surface with no ripples on them being treated as though they were ripple sections. It was also very CPU intensive: 3ms+ for 80 ripples on our test platform. –Jesse Johnson
56Issues with Delaunay Method An example of how the Delaunay method could create ripple influenced triangles across areas where no ripples were present.
57 BSP Method
Build 2D BSP tree using edges of each ripple quad
Last edge added from ripple quad is flagged to indicate child leaves are affected by that ripple
Post build, tree could be traversed, adding new geometry for each BSP leaf
The next iteration on this involved creating a 2D BSP tree over the water surface, with the edges of each ripple used as the BSP split locations. After the tree was built, we could ‘walk’ it and determine how many ripples each leaf of the tree had on it via special flags marked on the split locations. The results of this actually weren’t too bad, 80 ripples could be processed in under 1ms, but it still had its limitations. We needed a new shader permutation/draw batch for each new number of ripples we wanted to be able to process simultaneously, and there were also concerns about how texture cache coherency was going to hold up. Also, < 1 ms is still a really long time when your entire frame has to execute in less than 33. From here we shifted to more of a purely GPU based effect. –Jesse Johnson
58 Deferred Rendering Approach
Render all ripple info into a ‘Ripple Buffer’
Same resolution as main color/depth buffers
Signed FLOAT16 buffer allows normal and diffuse to be stored without MRT
Store normals in local space to the water surface
Sample ripple buffer from fluid shaders in screen space
The final approach we came up with was loosely based on deferred rendering techniques. We used a screen sized ripple buffer to store ripple information in, and sampled from this buffer later as we rendered each water surface. We took our active ripples and rendered/accumulated their normal maps as well as bits of diffuse foam information to this buffer, putting the xy components of the normal map in the red and green channels of the ripple buffer, and a monochrome sample from a diffuse texture in the blue channel. We used a signed float16 buffer for this so we could store negative normal components easily and without issue. The ripple geometry was rendered every frame into this buffer in world space, but the normals themselves were stored in local space to the water surface they affected. Later, when rendering water surfaces with this rippling technology enabled, they would sample from this buffer in screen space, adding the stored normal map offsets to the ordinary normal of the water surface, as well as the monochrome diffuse foam value to the ordinary diffuse value of the surface, to complete the effect. –Jesse Johnson
At this point the technique was looking pretty good; however, from the art side it was a little too clean and perfect. All the various water surfaces in Rapture have some movement on them, and clean ripples moving across perturbed water just felt inorganic and artificial. Having recently had some good success distorting the caustics with a panning normal map, we felt that this would be a great way to introduce some slight random motion to the interactive ripples. Jesse added this, along with tools to allow us to tweak ripple speeds and sizes, which eventually gave us the kind of results we were looking for. –Stephen Alexander
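The ripple-buffer channel layout described above can be sketched as a pair of encode/apply helpers. The channel assignments (normal xy in R/G, monochrome foam in B) follow the talk; everything else, including the simple additive application, is an assumption for illustration.

```python
def encode_ripple(normal_xy, foam):
    """Pack one ripple texel: signed normal offsets in R/G, foam in B.
    A signed float16 target means negative offsets need no bias/scale."""
    r, g = normal_xy            # local-space normal offsets, may be negative
    return (r, g, foam)

def apply_ripple(surface_normal, surface_diffuse, ripple_texel):
    """What the water shader does when it samples the ripple buffer in
    screen space: add the stored normal offsets to the surface normal and
    the foam value to the surface diffuse color."""
    r, g, foam = ripple_texel
    n = (surface_normal[0] + r, surface_normal[1] + g, surface_normal[2])
    d = [c + foam for c in surface_diffuse]
    return n, d
```

Packing everything into one signed float16 target is what lets the technique avoid multiple render targets: a single fetch per water pixel recovers both the perturbed normal and the foam contribution.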
62–64 Visualization of the normals stored in the Ripple Buffer.
65 Fire Optimization
Having art and tech on the same page during optimizations is priceless!!
Fire optimizations:
Merged multiple particle emitters and lights into single emitters and lights
Scaled particle size and spawn rates depending on parent object size
Reduced particle spawn rates where possible
Another way that this process really pays off is when you come to the point in the project where you have to start making hard decisions about performance optimizations and some of the aesthetic tradeoffs those entail. The dynamic fire system was a really good example of this. In Bioshock, pretty much any dynamic object can catch on fire or be explicitly lit on fire by the player. The player can start standing fires in the world that can then spread to characters, props, etc. This has the potential to quickly get out of hand. These fires also have dynamic lighting components, since it would look strange for a fire in a world as dark as Rapture to emit no light. It became very clear that to make sure the game ran smoothly in even the worst fire related scenarios, we would have to come up with a number of ways to optimize the effects. This was where the close communication between tech and art was very important. Because Jesse kept me informed of the general functionality of our various systems and exactly where performance was going, I was able to make adjustments to effects where they would see the least visual degradation for the biggest performance gain. In addition, I could make suggestions for things that I felt might preserve the look while addressing some of tech’s concerns. The result was a game that ran smoothly in even some of the most insane circumstances. –Stephen Alexander