Week 11 - Wednesday CS361.


Last time
 Image based effects
 Skyboxes
 Lightfields
 Sprites
 Billboards
 Particle systems

Questions?

Project 3

Impostors An impostor is a billboard created on the fly by rendering a complex object to a texture Then, the impostor can be rendered much more cheaply than the full object This technique is best used to speed up the rendering of far-away objects The resolution of the texture should be at least the screen resolution the object will cover when rendered

Displacement techniques An impostor that also has a depth map is called a depth sprite This depth information can be used to make a billboard that intersects with real-world objects realistically The depth stored in the image is compared against the z-buffer per pixel Another technique is to use the depth information to procedurally deform the billboard
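The per-pixel depth test a depth sprite performs can be sketched with NumPy arrays standing in for the z-buffer and the sprite's depth map (all buffer contents here are made-up toy values):

```python
import numpy as np

# Hypothetical 4x4 buffers: the scene's z-buffer and a depth sprite's
# depth map. Smaller depth = closer to the camera.
zbuffer = np.full((4, 4), 0.8)          # scene depths already rendered
sprite_depth = np.full((4, 4), 0.5)     # sprite is closer...
sprite_depth[0, 0] = 0.9                # ...except one texel behind the scene

scene_color = np.zeros((4, 4, 3))       # black background
sprite_color = np.ones((4, 4, 3))       # white sprite

# Per-pixel depth test: keep the sprite only where it is in front.
in_front = sprite_depth < zbuffer
composite = np.where(in_front[..., None], sprite_color, scene_color)
```

The texel at (0, 0) fails the depth test and shows the scene through, which is exactly the realistic intersection the slide describes.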

Billboard clouds Impostors have to be recomputed for different viewing angles Certain kinds of models (trees are a great example) can be approximated by a cloud of billboards Finding a visually realistic set of cutouts is one part of the problem; the rendering overhead of overdraw is another Billboards may need to be sorted if transparency effects are important

Image processing Image processing takes an input image and manipulates it Blurring, edge detection, and any number of amazing filters Much of this can be done on the GPU, and some of it needs to be (e.g., blurring shadow maps) Photoshop and other programs are GPU accelerated these days

Filtering kernels From Project 1, you guys know all about convolution filtering kernels I just wanted to point out that some kernels are separable Rows and columns can be done separately A separable m × m kernel on an n × n image can be run in 2mn² time instead of m²n² time
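A minimal sketch of why separability pays off, using the 3×3 box blur, whose 2-D kernel is the outer product of two 1-D kernels (the image contents are arbitrary toy values). The direct version does m² multiplies per pixel; the separable version does 2m:

```python
import numpy as np

# A separable kernel factors into an outer product of two 1-D kernels.
row = np.array([1.0, 1.0, 1.0]) / 3.0
col = np.array([1.0, 1.0, 1.0]) / 3.0
kernel2d = np.outer(col, row)           # the full 3x3 box-blur kernel

image = np.arange(25, dtype=float).reshape(5, 5)

def convolve_full_2d(img, k):
    """Direct m x m convolution: O(m^2) work per pixel ('valid' region only)."""
    m = k.shape[0]
    h, w = img.shape
    out = np.zeros((h - m + 1, w - m + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(img[y:y+m, x:x+m] * k)
    return out

def convolve_separable(img, col_k, row_k):
    """Two 1-D passes (rows, then columns): O(2m) work per pixel."""
    tmp = np.apply_along_axis(lambda r: np.convolve(r, row_k, mode='valid'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, col_k, mode='valid'), 0, tmp)

full = convolve_full_2d(image, kernel2d)
fast = convolve_separable(image, col, row)
assert np.allclose(full, fast)          # same result, fewer multiplies
```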

Color correction Color correction takes an existing image and converts each of its colors to some other color Important in photo and movie processing Special effects Time of day effects
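One common form of color correction is a per-pixel 3×3 matrix transform. A NumPy sketch follows; the matrix values are a well-known sepia-tone approximation chosen for illustration, not anything from the slides:

```python
import numpy as np

# Color correction as a per-pixel 3x3 matrix transform. The matrix below
# is a commonly quoted sepia-tone approximation (illustrative values).
sepia = np.array([[0.393, 0.769, 0.189],
                  [0.349, 0.686, 0.168],
                  [0.272, 0.534, 0.131]])

image = np.ones((2, 2, 3)) * 0.5        # flat mid-gray toy image

# Apply the matrix to every RGB triple, then clamp to the displayable range.
corrected = np.clip(image @ sepia.T, 0.0, 1.0)
```

Every output color is a fixed linear combination of the input channels, which is exactly the "convert each of its colors to some other color" idea from the slide.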

Tone mapping Tone mapping takes a wide range of luminances and maps them to the limited range of the computer's screen To properly simulate the relative luminances, some bright areas are made extremely bright and some dark areas extremely dark One simple approach is to look for the brightest pixel and scale everything on that basis
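The scale-by-brightest-pixel operator from the slide takes only a few lines (the HDR luminances are toy data; the Reinhard-style alternative at the end is an extra not covered on the slide):

```python
import numpy as np

# Toy luminances, some far outside the displayable [0, 1] range.
hdr = np.array([[0.5, 2.0],
                [8.0, 0.25]])

# The slide's approach: find the brightest pixel and scale everything by it.
ldr_scaled = hdr / hdr.max()

# A common alternative (Reinhard's operator) compresses highlights more
# gently, so dark regions are not crushed as badly by one very bright pixel.
ldr_reinhard = hdr / (1.0 + hdr)
```

Note the drawback of pure scaling: a single very bright pixel (8.0 here) pushes every other value close to black.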

Lens flare and bloom Lens flare is an effect caused by bright light scattering and reflecting inside a camera lens It happens less and less now that camera technology has improved Flares can be rendered with textures The bloom effect is where bright areas spill over into neighboring areas To render it, take the bright objects, blur them, then layer the result over the original scene
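The bright-pass / blur / composite recipe for bloom can be sketched like this (the threshold value and the tiny box blur are assumptions; a real renderer would use a Gaussian blur on the GPU):

```python
import numpy as np

def box_blur3(img):
    """Tiny 3x3 box blur with edge clamping (stand-in for a real Gaussian)."""
    padded = np.pad(img, 1, mode='edge')
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1+dy:1+dy+img.shape[0], 1+dx:1+dx+img.shape[1]]
    return out / 9.0

scene = np.zeros((5, 5))
scene[2, 2] = 4.0                       # one very bright pixel

# 1. Bright pass: keep only pixels above a threshold (assumed to be 1.0).
bright = np.where(scene > 1.0, scene, 0.0)
# 2. Blur the bright regions so the light spills outward.
halo = box_blur3(bright)
# 3. Layer the blurred result over the original scene.
bloomed = scene + halo
```

After the composite, pixels neighboring the bright one have picked up some of its energy, which is the visible "spill" of bloom.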

Depth of field In photography, some things are in focus and some things (either too near or too far) are blurry When desired, this effect can be achieved by rendering the image in a range of blurred states and then interpolating between them based on z-value
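Interpolating between pre-rendered blurred states by z-value might look like this sketch (the focal distance, falloff range, and single-channel toy buffers are all assumptions):

```python
import numpy as np

# Pre-rendered sharp and blurred versions of the same scene (toy data).
sharp = np.full((4, 4), 1.0)
blurred = np.full((4, 4), 0.4)

# Per-pixel depth plus an assumed focal distance and falloff range.
z = np.linspace(0.0, 1.0, 16).reshape(4, 4)
focus_z, focus_range = 0.5, 0.5

# Blend factor: 0 when in focus, 1 when fully blurred, based on the
# pixel's distance from the focal plane.
t = np.clip(np.abs(z - focus_z) / focus_range, 0.0, 1.0)
final = (1.0 - t) * sharp + t * blurred
```

Pixels at the focal depth stay sharp, while those too near or too far fade toward the blurred version, matching the slide's description.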

Motion blur Fast-moving objects exhibit motion blur in films It doesn't happen the same way in real life, but we have come to expect it nevertheless The effect can be reproduced by keeping a buffer of previously rendered frames, adding the latest frame and subtracting the oldest Other techniques can be used to do various kinds of selective, directional blurring
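The frame-history technique described above (add the newest frame to a running sum, subtract the oldest) can be sketched with a small class; the window size and the moving-pixel test data are made up:

```python
import numpy as np
from collections import deque

class MotionBlur:
    """Average of the last n frames, maintained incrementally."""
    def __init__(self, n, shape):
        self.n = n
        self.history = deque()
        self.running_sum = np.zeros(shape)

    def add_frame(self, frame):
        # Add the newest frame; once the window is full, subtract the oldest,
        # so the blur never re-sums the whole window.
        self.history.append(frame)
        self.running_sum += frame
        if len(self.history) > self.n:
            self.running_sum -= self.history.popleft()
        return self.running_sum / len(self.history)

blur = MotionBlur(4, (2, 2))
for i in range(6):
    frame = np.zeros((2, 2))
    frame.flat[i % 4] = 1.0             # a bright pixel moving across the image
    blurred = blur.add_frame(frame)
```

After six frames, the last four frames each lit a different pixel, so the moving point smears evenly across the whole image.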

Fog Fog effects can be used to give atmosphere or cue depth Fog is usually based on distance (z-buffer or true distance) from the viewer Different equations determine the effect (linear or exponential) of the fog

DirectX fog The BasicEffect makes fog easy with the following members
 FogEnabled: whether fog is on or off
 FogColor: color of the fog
 FogStart: distance (in world space) where fog starts to appear
 FogEnd: distance (in world space) where fog covers everything
It's not hard to write shader code that interpolates between fog color and regular color based on distance
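A CPU-side sketch of the linear fog interpolation the slide describes, using FogStart/FogEnd-style parameters (the specific distances and colors are made-up values):

```python
import numpy as np

# Linear fog in the style of BasicEffect's FogStart/FogEnd: fully clear
# before fog_start, fully fogged past fog_end, linear in between.
fog_start, fog_end = 10.0, 50.0
fog_color = np.array([0.7, 0.7, 0.7])   # light gray fog

def apply_fog(surface_color, distance):
    t = np.clip((distance - fog_start) / (fog_end - fog_start), 0.0, 1.0)
    return (1.0 - t) * surface_color + t * fog_color

red = np.array([1.0, 0.0, 0.0])
near = apply_fog(red, 5.0)              # closer than fog_start: unchanged
mid = apply_fog(red, 30.0)              # halfway: blend of red and fog
far = apply_fog(red, 100.0)             # past fog_end: pure fog color
```

A pixel shader would compute the same lerp per fragment, with `distance` coming from the z-buffer or the true view-space distance.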

Quiz

Upcoming

Next time… Review

Reminders
 Finish Project 3
 Due Friday before midnight
 Exam 2 is Monday in class
 Friday will be review
 Review chapters 5 – 10