
Week 3 - Monday

 What did we talk about last time?
 Graphics rendering pipeline
 Rasterizer Stage

 An effect says how things should be rendered on the screen
 We can specify this in detail using shader programs
 The BasicEffect class gives you the ability to do effects without creating a shader program
 Less flexibility, but quick and easy
 The BasicEffect class has properties for:
 World transform
 View transform
 Projection transform
 Texture to be applied
 Lighting
 Fog

 Vertices can be stored in many different formats depending on the data you want to keep
 Position is pretty standard
 Normals are optional
 Color is optional
 We will commonly use the VertexPositionColor type to hold vertices with color

 The GPU holds vertices in a buffer that can be indexed into
 Because it is special-purpose hardware, it has to be accessed in special ways
 It seems cumbersome, but we will often create an array of vertices, create an appropriately sized vertex buffer, and then store the vertices into the buffer:

VertexPositionColor[] vertices = new VertexPositionColor[3]
{
    new VertexPositionColor(new Vector3(0, 1, 0), Color.Red),
    new VertexPositionColor(new Vector3(+0.5f, 0, 0), Color.Green),
    new VertexPositionColor(new Vector3(-0.5f, 0, 0), Color.Blue)
};

vertexBuffer = Buffer.New<VertexPositionColor>(GraphicsDevice, 3, BufferFlags.VertexBuffer);
vertexBuffer.SetData(vertices);
inputLayout = VertexInputLayout.New<VertexPositionColor>(0);

 In order to draw a vertex buffer, you have to:
 Set the basic effect to have the appropriate transformations
 Set the vertex buffer on the device as the current one being drawn
 Set the vertex input layout so that the device knows what's in the vertex buffer
 Loop over the passes in the basic effect, applying them
 Draw the appropriate kind of primitives

basicEffect.World = world;
basicEffect.View = view;
basicEffect.Projection = projection;

GraphicsDevice.SetVertexBuffer(vertexBuffer);
GraphicsDevice.SetVertexInputLayout(inputLayout);

foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.Draw(PrimitiveType.TriangleList, 3);
}

 Sometimes a mesh will repeat many vertices
 Instead of repeating those vertices, we can give a list of indexes into the vertex list

short[] indices = new short[3] { 0, 1, 2 };

indexBuffer = Buffer.New<short>(GraphicsDevice, 3, BufferFlags.IndexBuffer);
indexBuffer.SetData(indices);

 Once you have the index buffer, drawing with it is very similar to drawing without it
 You simply have to set it on the device
 The false means that the indexes are short values instead of int values
 Then call DrawIndexed() instead of Draw() on the device

basicEffect.World = world;
basicEffect.View = view;
basicEffect.Projection = projection;

GraphicsDevice.SetVertexBuffer(vertexBuffer);
GraphicsDevice.SetVertexInputLayout(inputLayout);
GraphicsDevice.SetIndexBuffer(indexBuffer, false);

foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawIndexed(PrimitiveType.TriangleList, 3);
}

 An icosahedron has 20 sides, but it only has 12 vertices
 By using an index buffer, we can use only 12 vertices and 60 indices
 Check out the XNA tutorial on RB Whitaker's site for the data
 There are some minor changes needed to make the code work

 It is very common to define primitives in terms of lists and strips
 A list gives all the vertex indices for each of the shapes drawn
 2n indices to draw n lines
 3n indices to draw n triangles
 A strip gives only the needed information to draw a series of connected primitives
 n + 1 indices to draw a connected series of n lines
 n + 2 indices to draw a connected series of n triangles
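As a concrete check of those counts, consider a square made of two triangles sharing an edge, using vertices numbered 0 through 3:

Triangle list:  0 1 2, 2 1 3   (3n = 6 indices for n = 2 triangles)
Triangle strip: 0 1 2 3        (n + 2 = 4 indices for n = 2 triangles)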

 GPU stands for graphics processing unit
 The term was coined by NVIDIA in 1999 to differentiate the GeForce 256 from chips that did not have hardware vertex processing
 Dedicated 3D hardware was just becoming the norm, and many enthusiasts used an add-on board such as the 3dfx Voodoo2 in addition to their normal 2D graphics card

 Modern GPUs are generally responsible for the geometry and rasterization stages of the overall rendering pipeline
 The following shows the color-coded functional stages inside those stages
 Red is fully programmable
 Purple is configurable
 Blue is not programmable at all

Vertex Shader (programmable) → Geometry Shader (programmable) → Clipping (configurable) → Screen Mapping (fixed) → Triangle Setup (fixed) → Triangle Traversal (fixed) → Pixel Shader (programmable) → Merger (configurable)

 You can do all kinds of interesting things with programmable shading, but the technology is still evolving
 Modern shader stages such as Shader Model 4.0 and 5.0 (DirectX 10 and 11) use a common-shader core
 Strange as it may seem, this means that vertex, pixel, and geometry shaders all use the same language

 They are generally C-like
 There aren't that many:
 HLSL: High Level Shading Language, developed by Microsoft and used for Shader Model 1.0 through 5.0
 Cg: C for Graphics, developed by NVIDIA and essentially the same as HLSL
 GLSL: OpenGL Shading Language, developed for OpenGL; shares some similarities with the other two
 These languages were developed so that you don't have to write assembly to program your graphics card
 There are even drag-and-drop applications like NVIDIA's Mental Mill

 To maximize compatibility across many different graphics cards, shader languages are thought of as targeting a virtual machine with certain capabilities
 This VM is assumed to have 4-way SIMD (single-instruction, multiple-data) parallelism
 Vectors of 4 things are very common in graphics:
 Positions: xyzw
 Colors: rgba
 The vectors are commonly of float values
 Swizzling and masking (duplicating or ignoring) vector values are supported (kind of like bitwise operations)
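As a quick illustration of swizzling and masking, here is a small HLSL sketch (the variable names are made up for the example):

float4 pos   = float4(1.0, 2.0, 3.0, 1.0);
float4 color = float4(0.2, 0.4, 0.6, 1.0);

float3 xyz  = pos.xyz;        // select the first three components
float4 bgra = color.bgra;     // swizzle: reorder components
float4 gray = color.rrra;     // swizzle: duplicate one component
pos.xy = float2(0.0, 0.0);    // mask: write only x and y, leave z and w alone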

 A programmable shader stage has two types of inputs
 Uniform inputs that stay constant during draw calls
 ▪ Held in constant registers or constant buffers
 Varying inputs, which are different for each vertex or pixel
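In HLSL terms, the distinction looks roughly like this (a minimal sketch with made-up names): the uniforms are the global variables set once per draw call, while the varying data arrives through the semantic-tagged struct fields.

// Uniform inputs: the same for every vertex or pixel in a draw call
float4x4 World;
float4 TintColor;

// Varying inputs: different for each vertex the shader processes
struct VSInput
{
    float4 Position : POSITION0;
    float4 Color    : COLOR0;
};

float4 ToWorldSpace(VSInput input)
{
    // World is uniform across the draw call; input.Position varies per vertex
    return mul(input.Position, World);
}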

 Fast operations: scalar and vector multiplications, additions, and combinations of the two
 Well supported (and still relatively fast): reciprocal, square root, trig functions, exponentiation, and log
 Standard operators apply: + and *
 Other operations come through intrinsic functions that do not require headers or libraries: atan(), dot(), log()
 Flow control is done through "normal" if, switch, while, and for (but long loops are unusual)
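A short HLSL sketch of these kinds of operations (the values are arbitrary):

float3 n = normalize(float3(0.3, 0.9, 0.1));   // intrinsic: no header or library needed
float3 l = float3(0.0, 1.0, 0.0);
float  d = dot(n, l);                          // fast vector operation
float  falloff = exp(-2.0 * d);                // exponentiation intrinsic

float4 result = float4(0, 0, 0, 1);
if (d > 0)                                     // "normal" flow control
{
    for (int i = 0; i < 4; i++)                // short loops like this are fine; long loops are unusual
    {
        result.rgb += d * falloff * 0.25;
    }
}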

 In 1984, Cook came up with the idea of shade trees: a tree of operations combined to compute the color of a pixel
 This example shows the shader-language equivalent of a shade tree
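The slide's figure isn't reproduced in this transcript, but as a rough, hand-made illustration of the idea: a shade tree that combines an ambient leaf and a diffuse leaf is just a nested expression, which in a modern shader language might look something like this (all names are illustrative):

float4 Shade(float3 normal, float3 lightDir, float4 surfaceColor)
{
    float4 ambient = 0.1 * surfaceColor;                               // one leaf of the tree
    float4 diffuse = saturate(dot(normal, lightDir)) * surfaceColor;   // another leaf
    return ambient + diffuse;                                          // the root combines the branches
}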

 There are three shaders you can program
 Vertex shader
 Useful, but boring, mostly about doing transforms and getting normals
 Geometry shader
 Optional, allows you to create vertices from nowhere in hardware
 Pixel shader
 Where all the color data gets decided on
 Also where we'll focus

 The following, taken from RB Whitaker's Wiki, shows a shader for ambient lighting
 We start with declarations:

float4x4 World;
float4x4 View;
float4x4 Projection;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

struct VertexShaderInput
{
    float4 Position : POSITION0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    return AmbientColor * AmbientIntensity;
}

technique Ambient
{
    pass Pass1
    {
        VertexShader = compile vs_1_1 VertexShaderFunction();
        PixelShader = compile ps_1_1 PixelShaderFunction();
    }
}

The result, applied to a helicopter model:

 The following, taken from RB Whitaker's Wiki, shows a shader for diffuse lighting

float4x4 World;
float4x4 View;
float4x4 Projection;

float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;

float4x4 WorldInverseTranspose;

float3 DiffuseLightDirection = float3(1, 0, 0);
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 1.0;

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float4 Normal : NORMAL0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    float4 normal = mul(input.Normal, WorldInverseTranspose);
    float lightIntensity = dot(normal, DiffuseLightDirection);
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);

    return output;
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    return saturate(input.Color + AmbientColor * AmbientIntensity);
}

technique Diffuse
{
    pass Pass1
    {
        VertexShader = compile vs_1_1 VertexShaderFunction();
        PixelShader = compile ps_1_1 PixelShaderFunction();
    }
}

The result, applied to a helicopter model:

 GPU architecture
 Vertex shading
 Geometry shading
 Pixel shading

 Keep reading Chapter 3
 Keep working on Assignment 1, due this Friday by 11:59
 Keep working on Project 1, due next Friday, February 6 by 11:59