Pipelines are for Wimps: Raycasting, Raytracing, and Hardcore Rendering


Ray Casting: Definition Time
There are two definitions of ray casting: the old and the new.
– The old relates to 3D games of the Wolfenstein / Doom 1 era, where gameplay actually took place on a 2D plane.
– The new definition: non-recursive ray tracing.

Ray Tracing Glass Ball

Rays from the Sun or from the Screen
Rays could be programmed to work in either direction. We choose to trace from the screen to the light:
– only X × Y pixels to trace.
From the light we would need to simulate millions of rays just to find the few thousand that reach the screen.

Our Rays

Center of Projection (0,0)

Viewport (diagram: screen at (0,0) and the clipping planes)

Into World Coordinates (diagram: screen at (0,0) and the clipping planes)

Getting Each Initial Ray
Origin = (0, 0, 0)
Direction = (screenX, screenY, zMin)
– screenX, screenY are the float locations of each pixel in projected world coordinates
– zMin is the plane on which the screen sits
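Purely as a sketch (the slides do not give code), constructing these initial rays in C++ might look like the following; the Vec3/Ray types and the assumption that the projected screen spans [-1, 1] in x and y are mine, not the lecture's:

```cpp
#include <cmath>

// Minimal vector helpers used by the sketches in these notes (illustrative only).
struct Vec3 {
    float x, y, z;
    Vec3(float x = 0, float y = 0, float z = 0) : x(x), y(y), z(z) {}
};
inline Vec3  operator+(Vec3 a, Vec3 b) { return Vec3(a.x + b.x, a.y + b.y, a.z + b.z); }
inline Vec3  operator-(Vec3 a, Vec3 b) { return Vec3(a.x - b.x, a.y - b.y, a.z - b.z); }
inline Vec3  operator*(float s, Vec3 v) { return Vec3(s * v.x, s * v.y, s * v.z); }
inline float dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }
inline Vec3  cross(Vec3 a, Vec3 b) {
    return Vec3(a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x);
}
inline Vec3  normalize(Vec3 v) { return (1.0f / std::sqrt(dot(v, v))) * v; }

struct Ray { Vec3 origin, dir; };

// Build the ray for one pixel: origin at the camera (0,0,0), direction through
// the pixel's position on the screen plane at z = zMin.
Ray primaryRay(int px, int py, int screenW, int screenH, float zMin) {
    float screenX = ((px + 0.5f) / screenW) * 2.0f - 1.0f;
    float screenY = 1.0f - ((py + 0.5f) / screenH) * 2.0f;   // flip so +y is up
    return Ray{ Vec3(0, 0, 0), normalize(Vec3(screenX, screenY, zMin)) };
}
```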

Materials
Surfaces must have their material properties set:
– diffuse, reflective, emissive, and colour all need to be considered.
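A minimal illustrative record for those properties (field names are assumptions, not from the slides):

```cpp
// Illustrative per-surface material record covering the properties listed above.
struct Colour { float r, g, b; };

struct Material {
    Colour colour;        // base surface colour
    float  diffuse;       // how strongly the surface scatters light diffusely
    float  reflectivity;  // how mirror-like the surface is
    Colour emissive;      // light the surface emits on its own
};
```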

For Each Pixel (the main raytrace loop)

    for each pixel {
        construct ray from camera through pixel
        find first primitive hit by ray
        determine colour at intersection point
        draw colour to pixel buffer
    }
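A hedged C++ sketch of that loop, reusing the Vec3/Ray/Material helpers above; Hit, findClosestHit() and shade() are assumed placeholders standing in for the intersection and shading steps that follow:

```cpp
#include <vector>

// Assumed placeholders for the steps named in the pseudocode above.
struct Hit { Vec3 point, normal; Material material; float t; };
bool   findClosestHit(const Ray& ray, Hit& hit);   // nearest primitive along the ray
Colour shade(const Ray& ray, const Hit& hit);      // lighting at the intersection

void renderFrame(std::vector<Colour>& framebuffer, int screenW, int screenH, float zMin) {
    for (int py = 0; py < screenH; ++py) {
        for (int px = 0; px < screenW; ++px) {
            Ray ray = primaryRay(px, py, screenW, screenH, zMin); // ray through this pixel
            Hit hit;
            framebuffer[py * screenW + px] =
                findClosestHit(ray, hit) ? shade(ray, hit)        // colour at the hit point
                                         : Colour{0, 0, 0};       // background if nothing hit
        }
    }
}
```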

Intersections
The simplest way is to loop through all your primitives (all polygons):
– If the dot product of the polygon normal and the ray direction (cos θ) is greater than zero, the face points away from the ray, so ignore it (back-face culling).
– Otherwise, intersect the ray with the polygon,
– or intersect the ray with the polygon's plane.

Ray / Polygon Intersection
p0, p1 and p2 are the vertices of the triangle.
point(u, v) = (1 − u − v)·p0 + u·p1 + v·p2
The point lies inside the triangle when:
u ≥ 0, v ≥ 0, u + v ≤ 1.0

Line Representation
point(t) = p + t·d
– point(t) is any point on the line
– p is a known point on the line
– t is the line parameter
– d is the direction vector
Combined: p + t·d = (1 − u − v)·p0 + u·p1 + v·p2
That is, a point on the line (p + t·d) which is also part of the triangle [(1 − u − v)·p0 + u·p1 + v·p2].
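The slides defer the algebra to external references; one standard way of solving that combined system is the Möller–Trumbore test, sketched here with the Vec3/Ray helpers from the earlier sketch (this particular method is my choice of illustration, not necessarily the one used in the lecture):

```cpp
// Solve p + t*d = (1-u-v)*p0 + u*p1 + v*p2 for t, u, v (Moller-Trumbore style).
bool rayTriangle(const Ray& ray, Vec3 p0, Vec3 p1, Vec3 p2,
                 float& t, float& u, float& v) {
    const float EPS = 1e-7f;
    Vec3 e1 = p1 - p0;
    Vec3 e2 = p2 - p0;
    Vec3 pvec = cross(ray.dir, e2);
    float det = dot(e1, pvec);
    if (std::fabs(det) < EPS) return false;       // ray is parallel to the triangle plane
    float invDet = 1.0f / det;
    Vec3 tvec = ray.origin - p0;
    u = dot(tvec, pvec) * invDet;
    if (u < 0.0f || u > 1.0f) return false;       // fails the u >= 0 bound
    Vec3 qvec = cross(tvec, e1);
    v = dot(ray.dir, qvec) * invDet;
    if (v < 0.0f || u + v > 1.0f) return false;   // fails v >= 0 or u + v <= 1
    t = dot(e2, qvec) * invDet;
    return t > EPS;                               // only count hits in front of the origin
}
```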

Intersections Suck!
(Several online references cover line/plane and ray/triangle intersection in detail.)

Intersecting a Plane
A point on the plane: p1
Plane normal: n
Ray: p(t) = e + t·d, where
– p(t) is a point on the ray
– e is the origin
– d is the direction vector
Solving for the intersection:
t = ((p1 − e) · n) / (d · n)
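That formula translates almost directly into code; a sketch reusing the earlier Vec3/Ray helpers:

```cpp
// t = ((p1 - e) . n) / (d . n); fails when the ray is (nearly) parallel to the plane.
bool rayPlane(const Ray& ray, Vec3 p1 /* point on plane */, Vec3 n /* plane normal */,
              float& t) {
    float denom = dot(ray.dir, n);
    if (std::fabs(denom) < 1e-7f) return false;  // d . n ~ 0: no single intersection point
    t = dot(p1 - ray.origin, n) / denom;
    return t >= 0.0f;                            // only accept hits in front of the origin
}
```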

World / Object Coordinates
We need to transform the ray into object coordinates (or vice versa) to make this work.
Ray: p(t) = e + t·d
Object-space ray: p′(t) = M⁻¹·e + t·(M⁻¹·d), where M is the object→world matrix (the direction is transformed with w = 0, so translation does not apply to it).
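A sketch of that transform, assuming a hypothetical Mat4 type with inverse(), transformPoint() (w = 1) and transformDirection() (w = 0) helpers; none of these names come from the slides:

```cpp
// Move a world-space ray into an object's local coordinate frame so the
// object-space intersection tests above can be applied unchanged.
Ray toObjectSpace(const Ray& worldRay, const Mat4& objectToWorld) {
    Mat4 worldToObject = inverse(objectToWorld);
    Ray local;
    local.origin = transformPoint(worldToObject, worldRay.origin);   // w = 1: translation applied
    local.dir    = transformDirection(worldToObject, worldRay.dir);  // w = 0: rotation/scale only
    return local;
}
```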

After Finding the Intersecting Plane
You need a simple way to check for a hit or miss. If your object has a bounding box, this can be achieved with a line check against that box.

Miss Conditions

Hit Conditions

For Other Shaped Flat Polygons
Count how many times a line from the intersection point crosses the outside (the boundary) of the polygon:
– an even number of crossings means the point is outside the polygon: a miss;
– an odd number of crossings means it is inside: a hit.
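One common realisation of this even/odd rule is the 2D crossing-number (PNPOLY-style) test, applied after projecting the hit point and the polygon vertices onto the polygon's dominant plane; a sketch:

```cpp
#include <vector>

// Even/odd test: count boundary crossings along a line from the point; an even
// count means the point is outside the polygon (miss), odd means inside (hit).
bool pointInPolygon2D(const std::vector<float>& xs, const std::vector<float>& ys,
                      float px, float py) {
    bool inside = false;
    const std::size_t n = xs.size();
    for (std::size_t i = 0, j = n - 1; i < n; j = i++) {
        bool crossesScanline = (ys[i] > py) != (ys[j] > py);
        if (crossesScanline &&
            px < (xs[j] - xs[i]) * (py - ys[i]) / (ys[j] - ys[i]) + xs[i])
            inside = !inside;   // each crossing toggles the parity
    }
    return inside;
}
```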

Task List for Ray Casting
1) Create a vector for each pixel on the screen
   a) from the origin of the camera matrix (0, 0, 0)
   b) that intersects with a pixel on the screen.
2) Use this vector to create a trace through the world
   a) from the zMin to the zMax of the clipping volume
   b) un-projected into world coordinates.
3) Intersect the trace with every object in the world.

4) When the ray hits an object, we need to check how the pixel should be lit
   a) check if the ray has a direct view to each of the lights in the scene
   b) calculate the input from each light
   c) colour the pixel based on the lighting and surface properties.
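A sketch of step 4 in C++, reusing the earlier helpers; the Light record, anyHitCloserThan() and the simple Lambert-only lighting are assumptions for illustration:

```cpp
#include <algorithm>

struct Light { Vec3 position; Colour colour; };             // assumed point-light record
bool anyHitCloserThan(const Ray& shadowRay, float maxDist);  // assumed occlusion query

Colour shadePoint(const Hit& hit, const std::vector<Light>& lights) {
    Colour result = hit.material.emissive;                   // start with emitted light
    for (const Light& light : lights) {
        Vec3  toLight = light.position - hit.point;
        float dist    = std::sqrt(dot(toLight, toLight));
        Vec3  L       = (1.0f / dist) * toLight;

        // (a) does the ray have a direct view to this light?
        Ray shadowRay{ hit.point + 1e-4f * hit.normal, L };   // offset to avoid self-hits
        if (anyHitCloserThan(shadowRay, dist)) continue;      // light is blocked

        // (b) + (c) add this light's diffuse contribution to the pixel colour
        float k = hit.material.diffuse * std::max(0.0f, dot(hit.normal, L));
        result.r += k * light.colour.r * hit.material.colour.r;
        result.g += k * light.colour.g * hit.material.colour.g;
        result.b += k * light.colour.b * hit.material.colour.b;
    }
    return result;
}
```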

One Extra Task for Ray Casting
After the intersection, calculate the reflection vector
– from the dot product of the ray direction and the surface normal: r = d − 2(d·n)n
Then cast a new ray along r.
This continues in a recursive fashion until:
– a ray heads off into the universe,
– a ray hits a light, or
– we reach our maximum recursion level.
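A sketch of that recursion; reflect() is the standard mirror-reflection formula, and MAX_DEPTH / sceneLights are assumed globals for illustration:

```cpp
// r = d - 2 (d . n) n, with d the incoming direction and n the unit surface normal.
Vec3 reflect(Vec3 d, Vec3 n) { return d - (2.0f * dot(d, n)) * n; }

const int MAX_DEPTH = 4;                    // assumed recursion limit
extern std::vector<Light> sceneLights;      // assumed scene light list

Colour traceRay(const Ray& ray, int depth) {
    if (depth > MAX_DEPTH) return Colour{0, 0, 0};            // reached the recursion limit
    Hit hit;
    if (!findClosestHit(ray, hit)) return Colour{0, 0, 0};     // ray left the scene
    Colour local = shadePoint(hit, sceneLights);               // direct lighting (sketch above)
    Ray bounce{ hit.point + 1e-4f * hit.normal, reflect(ray.dir, hit.normal) };
    Colour bounced = traceRay(bounce, depth + 1);              // follow the reflected ray
    float k = hit.material.reflectivity;                       // blend by reflectivity
    return Colour{ (1 - k) * local.r + k * bounced.r,
                   (1 - k) * local.g + k * bounced.g,
                   (1 - k) * local.b + k * bounced.b };
}
```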

How we would like to be able to calculate light

Conservation of Energy
A physics-based approach to lighting:
– surfaces will absorb some light and reflect some light;
– any surface may also emit light;
– building one large simultaneous equation system can solve the light distribution (and I mean LARGE);
– the light leaving a point is the light it emits plus the sum of all the light it reflects.

Don’t Scream (loudly)

The Rendering Equation
– Light leaving point x in direction ω
– Light emitted by point x in direction ω
– Integral over the input hemisphere
– Bidirectional reflectance distribution function (BRDF) toward direction ω from direction ω′
– Light arriving at point x from direction ω′
– Attenuation of inward light related to the incidence angle
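Reconstructed in the usual notation (the equation image and its symbols were lost from this transcript), the formula those bullets annotate is:

```latex
% L_o : light leaving x in direction \omega        L_e : light emitted at x
% f_r : the BRDF                                    L_i : light arriving from \omega'
% (\omega' \cdot n) : attenuation by the incidence angle, integrated over the hemisphere \Omega
L_o(x,\omega) \;=\; L_e(x,\omega)
   \;+\; \int_{\Omega} f_r(x,\,\omega',\,\omega)\, L_i(x,\,\omega')\,(\omega'\cdot n)\, d\omega'
```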

The Monte Carlo Method
– Repeated random sampling.
– Deterministic algorithms may be infeasibly complex (as is the case for light transport).
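A sketch of the idea applied to the reflection integral above: replace the hemisphere integral with an average over N random sample directions. sampleHemisphere(), brdf() and incomingLight() are assumed placeholders, and uniform hemisphere sampling (pdf = 1/2π) is assumed for simplicity:

```cpp
// Uniform-hemisphere Monte Carlo estimate of the reflected-light integral.
// With a uniform pdf of 1/(2*pi), each sample is weighted by cos(theta) * 2*pi.
Vec3   sampleHemisphere(Vec3 normal);               // assumed: random direction above the surface
Colour brdf(const Hit& hit, Vec3 in, Vec3 out);     // assumed: surface BRDF
Colour incomingLight(Vec3 point, Vec3 dir);         // assumed: radiance arriving from 'dir'

Colour estimateReflected(const Hit& hit, Vec3 outDir, int numSamples) {
    const float TWO_PI = 6.2831853f;
    Colour sum{0, 0, 0};
    for (int i = 0; i < numSamples; ++i) {
        Vec3  inDir    = sampleHemisphere(hit.normal);
        float cosTheta = std::max(0.0f, dot(inDir, hit.normal));
        Colour f  = brdf(hit, inDir, outDir);
        Colour Li = incomingLight(hit.point, inDir);
        float  w  = cosTheta * TWO_PI;               // cos(theta) / pdf
        sum.r += f.r * Li.r * w;
        sum.g += f.g * Li.g * w;
        sum.b += f.b * Li.b * w;
    }
    return Colour{ sum.r / numSamples, sum.g / numSamples, sum.b / numSamples };
}
```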

Metropolis Light Transport
A directed approach to evaluating the BRDF integral; it is still considered a Monte Carlo method. It steers the random sampling so that more samples come from directions with a higher impact on the point being assessed.

BRDF Tracing

Metropolis Light Transport

Radiosity
Simplifies the rendering equation by treating all surfaces as perfectly diffuse reflectors. This simplifies the BRDF.

Parallel Rendering (Render Farms)
There are three major types:
– Sort-First
– Sort-Middle
– Sort-Last
These are just outlines; in reality things need to be customised based on technical limitations and requirements.

Sort-Middle (diagram: Application → parallel Geometry / Vertex Shading units → Sort → parallel Fragment / Pixel Shading units → Display)

Sort-Middle
Pros:
– the number of vertex processors is independent of the number of pixel processors.
Cons:
– normal maps may mess with polygons in overlap areas;
– aliasing between display tiles must be corrected (RenderMan??);
– requires specific hardware;
– rendering may not be balanced between display tiles.

Sort-Last (diagram: Application → parallel Geometry / Vertex Shading units → parallel Fragment / Pixel Shading units → Composite → Display)

Sort-Last
Pros:
– can easily be built from networked PCs.
Cons:
– each vertex processor requires a pixel processor;
– unsorted geometry means each pixel processor must carry a full-size frame buffer, which limits scalability;
– composing the image requires merging X frame buffers using X Z-buffers.

Sort-Last
Compositing can be done more efficiently (in terms of memory requirements) using a binary-tree approach
– but this may leave processors idle.
Another approach is a binary-swap architecture
– at the cost of heavy data-bus usage.

Sort-First (diagram: Application → Sort → parallel Geometry / Vertex Shading units → parallel Fragment / Pixel Shading units → Display)

Sort-First
Pros:
– pixel processors only need a tile of the display buffer;
– can be built from commodity PC hardware;
– infinitely scalable.
Cons:
– we are sorting primitives BEFORE they are transformed into projected space, which requires some overhead;
– polygons crossing tile boundaries will be sent to both pipelines;
– as an error backup, a bus could move incorrectly sorted polygons to the correct render queue (transparency causes issues here!);
– aliasing between display tiles must be corrected;
– rendering may not be balanced between display tiles.

Parallel Processing Techniques
In conclusion:
– Sort-Middle needs expensive, specialised hardware;
– Sort-Last is limited in scalability;
– Sort-First requires careful consideration in its implementation.
Sort-First / Sort-Last COULD be run on a cloud:
– bandwidth??
– security??
– what happens when you max out the cloud??

Image-Based Rendering
Geometric upscaling! The process of extracting 3D information from 2D image(s)
– far outside our scope, but an interesting research area.

RenderMan

RenderMan / Reyes
Reyes stands for "Renders Everything You Ever Saw".
– Reyes was developed by staff at Lucasfilm's Computer Graphics Research Group, now known as Pixar!
– RenderMan is Pixar's current implementation of the Reyes architecture.

The Goals of Reyes
– Model complexity / diversity
– Shading complexity
– Minimal ray tracing
– Speed
– Image quality (artefacts are unacceptable)
– Flexibility: Reyes was designed so that new technology could be incorporated without an entire re-implementation.

The Functionality of Reyes / RenderMan
Objects (polygons and curves) are divided into micropolygons as needed.
– A micropolygon is typically smaller than a pixel.
– In Reyes, micropolygons are quadrilaterals.
– Flat-shading all of these quads gives an excellent representation of shading.
These quads allow Reyes to use a vector-based rendering approach
– which allows simple parallelism.

Bound – compute bounding boxes.
Split – cull geometry and split partially visible primitives.
Dice – dice polygons into grids of micropolygons.
Shade – apply shading functions to the micropolygons (the functions used are independent of Reyes).
Bust – do bounding and visibility checking on each micropolygon.
Sample (Hide) – generate the render from the remaining micropolygons.
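A skeleton of that stage order as C++ (every type and call is an assumed placeholder named after the stages above; real Reyes implementations bucket and interleave these stages far more carefully):

```cpp
#include <deque>
#include <vector>

// Assumed placeholder types and stage functions, named after the slides above.
struct Primitive        { /* geometry data omitted */ };
struct BoundingBox      { /* extents omitted */ };
struct MicroPolygonGrid { /* micropolygon quads omitted */ };

BoundingBox bound(const Primitive&);                       // Bound
bool visible(const BoundingBox&);
bool tooLarge(const BoundingBox&);
std::vector<Primitive> split(const Primitive&);            // Split
MicroPolygonGrid dice(const Primitive&);                   // Dice
void shade(MicroPolygonGrid&);                             // Shade
void cullHiddenMicroPolygons(MicroPolygonGrid&);           // Bust
void sampleIntoFramebuffer(const MicroPolygonGrid&);       // Sample (Hide)

void reyesRender(std::deque<Primitive> work) {
    while (!work.empty()) {
        Primitive prim = work.front(); work.pop_front();
        BoundingBox box = bound(prim);
        if (!visible(box)) continue;                       // cull the whole primitive
        if (tooLarge(box)) {                               // too big to dice: split and re-queue
            for (const Primitive& piece : split(prim)) work.push_back(piece);
            continue;
        }
        MicroPolygonGrid grid = dice(prim);                // grid of micropolygons
        shade(grid);                                       // user shading functions
        cullHiddenMicroPolygons(grid);                     // per-micropolygon bound/visibility
        sampleIntoFramebuffer(grid);                       // final samples into the image
    }
}
```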

The Reyes Pipeline

Interesting Facts
Some frames take 90 hours to render – for 1/24th of a second of footage!
On average a frame takes about 6 hours: 6 × 24 = 144 hours of rendering per second of footage
– well over a century of single-machine render time for a two-hour movie, hence the render farms!

Licensing
– US$3,500 per server
– US$2,000–3,500 per client machine
Far cheaper than I expected!
