Week 2 - Friday CS361.

Last time
What did we talk about last time?
Graphics rendering pipeline
Geometry Stage

Questions?

Project 1

Assignment 1

Let's see those matrices in MonoGame again

Backface culling
I did not properly describe an important optimization done in the Geometry Stage: backface culling. Backface culling removes all polygons that are not facing toward the screen. A simple dot product is all that is needed. This step is done in hardware in MonoGame and OpenGL; you just have to turn it on. Beware: if you screw up your normals, polygons could vanish.
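A minimal sketch of that dot-product test in Python (the function names, the counter-clockwise winding convention, and the view-direction setup are illustrative assumptions, not the MonoGame or OpenGL API):

```python
# Sketch of backface culling: a triangle is kept only when its normal
# points back toward the viewer.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def is_front_facing(v0, v1, v2, view_dir):
    """True if the triangle faces the viewer.

    Assumes counter-clockwise winding when seen from the front;
    view_dir points from the camera toward the triangle.
    """
    edge1 = tuple(b - a for a, b in zip(v0, v1))
    edge2 = tuple(b - a for a, b in zip(v0, v2))
    normal = cross(edge1, edge2)
    # Negative dot product: the normal points back toward the camera.
    return dot(normal, view_dir) < 0
```

In MonoGame or OpenGL you would just enable the built-in culling state rather than write this yourself; the sketch only shows why consistent normals matter. Swap two vertices (reversing the winding) and the same triangle is silently culled.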

Graphics rendering pipeline
For API design, practical top-down problem solving, hardware design, and efficiency, rendering is described as a pipeline. This pipeline contains three conceptual stages:
Application: produces material to be rendered
Geometry: decides what, how, and where to render
Rasterizer: renders the final image

Student Lecture: Rasterizer Stage

Rasterizer Stage

Rasterizer Stage
The goal of the Rasterizer Stage is to take all the transformed geometric data and set colors for all the pixels in screen space. Doing so is called rasterization or scan conversion. Note that the word pixel is actually a portmanteau of "picture element."

More pipelines
As you should expect, the Rasterizer Stage is also divided into a pipeline of several functional stages:
Triangle Setup
Triangle Traversal
Pixel Shading
Merging

Triangle Setup
Data for each triangle is computed here. This could include normals. This stage is boring anyway because fixed-operation (non-customizable) hardware does all the work.

Triangle Traversal
Each pixel whose center is overlapped by a triangle must have a fragment generated for the part of the triangle that overlaps the pixel. The properties of this fragment are created by interpolating data from the vertices. Again, boring, fixed-operation hardware does this.
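The traversal-and-interpolation step can be sketched with edge functions and barycentric weights (a simplified Python illustration; real hardware traverses tiles and applies shared-edge fill rules that this sketch ignores):

```python
def edge(ax, ay, bx, by, px, py):
    # Signed edge function: positive when (px, py) is to the left of a->b
    # for a counter-clockwise triangle.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def fragment_at(tri, px, py):
    """Return the interpolated color at pixel center (px, py), or None
    if the center is outside the triangle.

    tri is three (x, y, color) vertices in counter-clockwise order;
    the color is interpolated with barycentric weights.
    """
    (x0, y0, c0), (x1, y1, c1), (x2, y2, c2) = tri
    area = edge(x0, y0, x1, y1, x2, y2)
    w0 = edge(x1, y1, x2, y2, px, py) / area
    w1 = edge(x2, y2, x0, y0, px, py) / area
    w2 = edge(x0, y0, x1, y1, px, py) / area
    if min(w0, w1, w2) < 0:
        return None  # pixel center outside the triangle: no fragment
    return tuple(w0*a + w1*b + w2*c for a, b, c in zip(c0, c1, c2))
```

The same weights that decide coverage also interpolate every vertex attribute (color here, but equally depth, normals, or texture coordinates).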

Pixel Shading
This is where the magic happens. Given the data from the other stages, per-pixel shading (coloring) happens here. This stage is programmable, allowing for many different shading effects to be applied. Perhaps the most important effect is texturing or texture mapping.

Texturing
Texturing is gluing a (usually) 2D image onto a polygon. To do so, we map texture coordinates onto polygon coordinates. Pixels in a texture are called texels. Texturing is fully supported in hardware, and multiple textures can be applied in some cases.
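A hypothetical nearest-neighbor texel lookup in Python shows the core mapping from texture coordinates to texels (real hardware adds filtering, wrap modes, and mipmaps, all omitted here; the v = 0 at the top convention is an assumption):

```python
def sample_nearest(texture, u, v):
    """Nearest-neighbor texel lookup.

    texture is a list of rows of texels; (u, v) are texture coordinates
    in [0, 1], with v = 0 at the top row.
    """
    h = len(texture)
    w = len(texture[0])
    # Map the continuous coordinate to a texel index, clamping at the edge.
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]
```

A fragment shader would call a lookup like this with the (u, v) interpolated from the triangle's vertices and use the returned texel in its color computation.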

Merging
The final screen data containing the colors for each pixel is stored in the color buffer. The merging stage is responsible for merging the colors from each of the fragments from the pixel shading stage into a final color for a pixel. Deeply linked with merging is visibility: the final color of the pixel should be the one corresponding to a visible polygon (and not one behind it).

Z-buffer
To deal with the question of visibility, most modern systems use a Z-buffer or depth buffer. The Z-buffer keeps track of the z value for each pixel on the screen. As a fragment is rendered, its color is put into the color buffer only if its z value is closer than the current value in the Z-buffer (which is then updated). This is called a depth test.
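The depth test fits in a few lines of Python (a sketch assuming one common convention, smaller z means closer to the camera):

```python
def render_fragment(color_buf, z_buf, x, y, z, color):
    """Write a fragment only if it is closer than what is already stored.

    color_buf and z_buf are 2D grids indexed [y][x]; the z-buffer
    starts out filled with infinity (everything is "behind" it).
    """
    if z < z_buf[y][x]:
        z_buf[y][x] = z          # update the depth buffer
        color_buf[y][x] = color  # update the color buffer
        return True              # fragment passed the depth test
    return False                 # fragment hidden behind an earlier one
```

This is why opaque polygons can arrive in any order: a farther fragment drawn later simply fails the test, no matter when it shows up.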

Pros and cons of the Z-buffer
Pros:
Polygons can usually be rendered in any order
Universal hardware support is available
Cons:
Partially transparent objects must be rendered in back-to-front order (painter's algorithm)
Completely transparent values can mess up the Z-buffer unless they are checked
Z-fighting can occur when two polygons have the same (or nearly the same) z values

More buffers
A stencil buffer can be used to record a rendered polygon. It stores the part of the screen covered by the polygon and can be used for special effects. Frame buffer is a general term for the set of all buffers. Different images can be rendered to an accumulation buffer and then averaged together to achieve special effects like blurring or antialiasing. A back buffer allows us to render off screen to avoid popping and tearing.
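The accumulation-buffer idea, averaging several rendered images, might be sketched like this (single-channel images as plain nested lists, purely illustrative):

```python
def accumulate(images):
    """Average several rendered images per pixel, as an accumulation
    buffer would.

    images is a list of equal-sized 2D grids of gray values; the result
    is their per-pixel mean. Rendering the scene several times with the
    camera jittered slightly and averaging approximates antialiasing;
    jittering objects over time approximates motion blur.
    """
    n = len(images)
    h, w = len(images[0]), len(images[0][0])
    return [[sum(img[y][x] for img in images) / n for x in range(w)]
            for y in range(h)]
```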

Final words on the pipeline
This pipeline is focused on interactive graphics. Micropolygon pipelines are usually used for film production. Predictive rendering applications usually use ray tracing renderers. The old model was the fixed-function pipeline, which gave little control over the application of shading functions. The book focuses on programmable GPUs, which allow all kinds of tricks to be done in hardware.

Upcoming

Next time…
GPU architecture
Programmable shading

Reminders
Read Chapter 3
Start on Assignment 1, due next Friday, February 3 by 11:59
Keep working on Project 1, due Friday, February 10 by 11:59