1 Frame Buffers Fall 2018 CS480/680

2 An OpenGL pipeline Retrospective Stuff we know in detail
Stuff we briefly talked about ???

3 What are Frame Buffers?

4 Frame Buffers
They hold data about the current frame. While a Frame Buffer is called a buffer, it is actually a collection of buffers that store the actual fragment data (like a folder of images). There are three main types of buffers stored in the Frame Buffer:
Color Buffer
Stencil Buffer
Depth Buffer

5 Buffer Types
Color Buffer: stores the color of each pixel in a frame; usually stored as 4 bytes (RGBA).
Depth Buffer: stores the depth of each pixel in a frame; usually stored as a float, ranging from 0.0 (the near plane) to 1.0 (the far plane).
Stencil Buffer: stores whether or not the pixel should be rendered; while stored as a byte, the stencil buffer is effectively used as a boolean value.
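All three buffers are typically cleared at the start of each frame. A minimal sketch of how that looks, assuming an OpenGL context and function loader are already set up:

```cpp
// Clear all three buffers of the currently bound framebuffer.
glClearColor(0.0f, 0.0f, 0.0f, 1.0f); // color buffer reset value (RGBA)
glClearDepth(1.0);                    // depth buffer reset value (the far plane)
glClearStencil(0);                    // stencil buffer reset value
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
```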

6 Frame Buffers in our engine
While there are no references to a frame buffer in our code, we are implicitly writing to the most important Frame Buffer: the screen. Imagine an invisible glBindFramebuffer(GL_FRAMEBUFFER, 0) being called every time we call our render function. All the code in Window.cpp is what sets up Framebuffer 0 (the back buffer for the screen).
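That "invisible" call, made explicit, would look like this (a sketch; framebuffer 0 is created by the windowing system, not by us):

```cpp
// Framebuffer 0 is the default framebuffer: the window's back buffer.
// It cannot be created or deleted with glGenFramebuffers; it exists
// as long as the window does.
glBindFramebuffer(GL_FRAMEBUFFER, 0);
// ...draw calls here now write to the screen's back buffer...
```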

7 What can we use frame buffers for?
While OpenGL has a data type for creating buffers that won't be displayed (Render Buffers), we can also back a Frame Buffer's attachments with Textures. Since the data is stored like a texture, we can sample it like a texture. This allows us to do effects originally impossible in our current engine, such as mirrors, shadows, refraction, and portals.
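Sampling the rendered scene works just like sampling any other texture. A sketch of a fragment shader that does so, embedded as a C++ string; the uniform name screenTexture and the varying texCoord are hypothetical names for illustration:

```cpp
// Fragment shader that samples a previous render pass as a texture.
const char* postFragSrc = R"(
#version 330 core
in vec2 texCoord;                    // passed from the vertex shader
out vec4 fragColor;
uniform sampler2D screenTexture;     // the framebuffer's color attachment
void main() {
    // Read the already-rendered scene exactly like an ordinary texture.
    fragColor = texture(screenTexture, texCoord);
}
)";
```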

8 Example in Industry: Half-Life 2
The Source Engine (HL2's engine) uses frame buffers to generate textures from a camera in the world, known as "Render Targets".
Screenshots of Half-Life 2, courtesy of Valve Corporation. All rights reserved.

9 Post-Processing Shaders

10 Why Post-Processing Shaders?
Problem: for each fragment in the fragment shader, I want to sample the pixels around me.
Issue: the GPU calculates everything in parallel, and we don't know when each calculation is finished. Each fragment is treated as an independent calculation; we don't even know if the fragments around us have finished rendering!
Solution: store the finished fragments into a buffer and wait until all fragments have finished, then use another pass to modify the finished fragments.

11 High Level Overview
Render the scene to an offscreen texture. Then draw a quad that covers the entire screen, using the rendered scene as a texture input.
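The two passes above can be sketched as a render loop. Names like sceneFBO, sceneColorTexture, renderScene(), and drawFullscreenQuad() are hypothetical stand-ins, not part of our engine:

```cpp
// Pass 1: render the scene into an offscreen framebuffer.
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
renderScene();

// Pass 2: render to the screen, sampling pass 1's color texture.
glBindFramebuffer(GL_FRAMEBUFFER, 0);           // back to the screen
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glBindTexture(GL_TEXTURE_2D, sceneColorTexture); // pass 1's output
drawFullscreenQuad();                            // runs the post shader
```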

12 Why this works: Screen Coordinates
If we skip the model, view, and projection matrices, vertex positions are interpreted directly as screen (normalized device) coordinates, which span -1 to 1 on each axis. Normally this would leave everything stretched and out of place, but we can use it to our advantage: a quad from (-1, -1) to (1, 1) exactly fits the screen.
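A screen-fitting quad is then just two triangles in normalized device coordinates, passed through the vertex shader with no matrices applied (a common layout; the interleaved position + texture-coordinate format is one choice among several):

```cpp
// Fullscreen quad: two triangles covering NDC space (-1..1 on each axis).
// Layout per vertex: x, y, u, v (position + texture coordinate).
float quadVertices[] = {
    // positions   // texcoords
    -1.0f,  1.0f,  0.0f, 1.0f,  // top-left
    -1.0f, -1.0f,  0.0f, 0.0f,  // bottom-left
     1.0f, -1.0f,  1.0f, 0.0f,  // bottom-right

    -1.0f,  1.0f,  0.0f, 1.0f,  // top-left
     1.0f, -1.0f,  1.0f, 0.0f,  // bottom-right
     1.0f,  1.0f,  1.0f, 1.0f   // top-right
};
```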

13 High Level Overview: Creating a Frame Buffer
In the same fashion as creating other buffers in OpenGL, we have to call:
glGenFramebuffers to allocate a framebuffer object on the GPU
glBindFramebuffer to use the Frame Buffer
glFramebufferTexture / glFramebufferRenderbuffer to attach our data
Because a Frame Buffer is a container for buffers, we first need to initialize the Textures and Render Buffers we attach to it.
NOTE: a Frame Buffer must have at least one Color buffer attached to be usable.
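Putting those steps together, a sketch of creating a complete framebuffer with a color texture (so we can sample it later) and a depth/stencil renderbuffer (which we never sample). Assumes a GL 3.3+ context; the width and height are illustrative:

```cpp
GLuint fbo, colorTex, depthStencilRbo;
const int width = 800, height = 600;   // match the window size

// 1. Allocate and bind the framebuffer object.
glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

// 2. Color attachment as a texture, so later passes can sample it.
glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

// 3. Depth + stencil as a renderbuffer, since we never read them back.
glGenRenderbuffers(1, &depthStencilRbo);
glBindRenderbuffer(GL_RENDERBUFFER, depthStencilRbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH24_STENCIL8, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_STENCIL_ATTACHMENT,
                          GL_RENDERBUFFER, depthStencilRbo);

// 4. Verify completeness before using the framebuffer.
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    // handle the error: the framebuffer cannot be rendered to
}
glBindFramebuffer(GL_FRAMEBUFFER, 0);  // back to the screen
```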

14 Learning By Example

15 Questions?

