Buffers, Textures and More Rendering. Paul Taylor & Barry, La Trobe University, 2009.


Clearing up the OpenGL Perspective Functions OpenGL uses a right-handed coordinate system, which means the Z axis is positive out of the screen. Unfortunately for our heads, the OpenGL functions glOrtho(left, right, bottom, top, near, far), gluPerspective(…) and glFrustum(…) all take near and far as positive distances forward from the COP (Center of Projection) at (0,0,0). What it all means, remembering that Z is negative into the screen: to create an orthographic clipping box that goes from z = -1 (1 unit in front of you) back to z = -10 (10 units in front of you), call glOrtho(left, right, bottom, top, 1.0f, 10.0f).
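The sign convention above can be sanity-checked in a few lines of C. This is only a sketch: ortho_depth is a made-up helper reproducing the z row of the matrix glOrtho builds, not an OpenGL call.

```c
#include <assert.h>
#include <math.h>

/* The z row of the glOrtho matrix maps eye-space z to normalized
 * device z':  z' = (-2 / (far - near)) * z - (far + near) / (far - near)
 * where near and far are the POSITIVE distances passed to glOrtho. */
static double ortho_depth(double z, double near_d, double far_d) {
    return (-2.0 / (far_d - near_d)) * z
           - (far_d + near_d) / (far_d - near_d);
}
```

With glOrtho(..., 1.0, 10.0), eye-space z = -1 should land on the near plane (z' = -1) and z = -10 on the far plane (z' = +1), confirming that positive near/far values select negative eye-space z.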

Your OpenGL Objects Basically your OpenGL objects should still have a central point of (0,0,0) in object coordinates. Your render function should translate all objects in the MODELVIEW matrix, using a translation that pushes the objects backwards into the clipping volume: glTranslatef(x, y, -5.0f); // middle of a z = 0 to -10 clipping box
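As a tiny illustration of what that translation does to the object's centre (a hand-rolled stand-in for the MODELVIEW translation, not actual OpenGL):

```c
#include <assert.h>

typedef struct { float x, y, z; } Vec3;

/* What glTranslatef(tx, ty, tz) does to a point during the
 * MODELVIEW stage: object coordinates -> eye coordinates. */
static Vec3 translate(Vec3 p, float tx, float ty, float tz) {
    Vec3 r = { p.x + tx, p.y + ty, p.z + tz };
    return r;
}
```

Translating the object centre (0,0,0) by (x, y, -5) leaves it at eye-space z = -5, safely inside a clipping box built with glOrtho(..., 1.0f, 10.0f).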

Normal A normal is a vector perpendicular to the surface, pixel or vertex it relates to. Normals are extremely useful for many, many reasons in graphics. They are easily calculated as the cross product of 2 non-parallel sides of a polygon.
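The cross-product calculation mentioned above looks like this (a minimal sketch; edge vectors are assumed to already be computed as vertex differences):

```c
#include <assert.h>

typedef struct { float x, y, z; } Vec3;

/* Polygon normal: the cross product of two non-parallel edges. */
static Vec3 cross(Vec3 a, Vec3 b) {
    Vec3 n = { a.y * b.z - a.z * b.y,
               a.z * b.x - a.x * b.z,
               a.x * b.y - a.y * b.x };
    return n;
}
```

For a polygon in the XY plane with edges along +X and +Y, the normal comes out along +Z, matching the usual convention for counter-clockwise polygons.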

Unit Vector A unit vector is exactly 1 unit long! Any vector can be turned into a unit vector by dividing every component by its magnitude: v̂ = (x/|v|, y/|v|, z/|v|), where |v|² = x² + y² + z². Unit vectors are very useful as they carry direction without scale. In the matrix world, the dot product of 2 unit vectors is equal to the cosine of the angle between them.

Single Face Rendering

Finishing off your Basic Polygons – Clockwise polygons – Counter-clockwise polygons – Polygon normals – RHS polygons – LHS polygons. Positive Z is typically the polygon normal (in object coordinates).

So far our polygons have been 2-faced: identical from the front and the back.

This is the front and back view of a 1-faced polygon.

To make rendering more efficient we should only consider the front faces of polygons. This reduces the rendering load very quickly when you consider a complex 3D scene.

Fast Culling Remember the dot product of 2 unit vectors is the cosine of the angle between them. Combine the polygon normal A with the direction B from the polygon towards the viewer: if A·B < 0 the polygon is facing away and can be culled.
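The culling test is one dot product and one comparison. A sketch, assuming both vectors are unit length and B points from the polygon towards the viewer (the convention that makes the slide's A·B < 0 test correct):

```c
#include <assert.h>

typedef struct { float x, y, z; } Vec3;

static float dot(Vec3 a, Vec3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Back-face test: n is the polygon normal, to_viewer points from the
 * polygon towards the viewer.  If n . to_viewer < 0 the polygon faces
 * away and can be skipped before any per-pixel work is done. */
static int faces_away(Vec3 n, Vec3 to_viewer) {
    return dot(n, to_viewer) < 0.0f;
}
```

With the viewer out along +Z, a polygon whose normal points at +Z is kept and one whose normal points at -Z is culled.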

Basic Texturing There are 2 types of buffer in your video card*: color and depth buffers. Only color buffers can be drawn to your output device. So far we have been using 3 buffers: – A front buffer – A back buffer – A depth buffer (Z buffer) * According to OpenGL

Buffers and Formats A buffer is an X × Y array of values, where X and Y are the resolution of the screen – in this course the values will typically be 32-bit floats. Bits, pixels and bitplanes – firstly, in OpenGL a Bitmap is an X × Y array (or image) with a 1-bit depth! Bit Block Transfers (BitBlt) are covered in other subjects, so we'll skip them.

Pixel Maps These are much more like what most people would consider a 'Bitmap' (the Windows kind) – an X × Y array of pixels. Here is a 24-bit image / texture declaration: GLubyte stupidImage[512][512][3]; To create a 32-bit version we just extend the depth to 4 bytes per pixel.
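A pixel map like the one declared above is just a 3D byte array you can fill by hand. A small sketch (unsigned char stands in for GLubyte so no GL headers are needed, and a 16×16 image keeps it quick):

```c
#include <assert.h>

/* A 24-bit pixel map in the same layout as GLubyte image[H][W][3],
 * filled with an 8x8-texel checkerboard. */
#define W 16
#define H 16

static unsigned char image[H][W][3];

static void fill_checkerboard(void) {
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* alternate black / white every 8 texels */
            unsigned char c = (((x / 8) + (y / 8)) % 2) ? 255 : 0;
            image[y][x][0] = c;   /* R */
            image[y][x][1] = c;   /* G */
            image[y][x][2] = c;   /* B */
        }
    }
}
```

An array in exactly this layout is what you would later hand to OpenGL as RGB texture data.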

Using Images / Textures in OpenGL – Texture maps – Bump maps – Normal maps – Environment maps (reflection maps)

Texture Formats 1D, 2D or 3D textures can be used. Typically you will only be using 2D textures, as a 3D texture can create an incredible load on the graphics processor (and on video / hardware memory use).

Texture Mapping In OpenGL, texture coordinates range over [0.0f, 1.0f]. These are known as the parametric surface coordinates, and are commonly referred to as UV coordinates.
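A feel for how [0,1] UV coordinates reach actual texels: the sketch below does a clamped nearest-neighbour index lookup. This is only an illustration of the idea; real OpenGL sampling adds wrap modes, filtering and mipmaps.

```c
#include <assert.h>

/* Map a UV coordinate in [0,1] to a texel index in a texture of the
 * given size, clamping out-of-range coordinates to the edge. */
static int texel_index(float coord, int size) {
    if (coord < 0.0f) coord = 0.0f;       /* clamp to [0,1] */
    if (coord > 1.0f) coord = 1.0f;
    int i = (int)(coord * (float)size);   /* scale to texel space */
    return (i >= size) ? size - 1 : i;    /* u = 1.0 hits the last texel */
}
```

So u = 0 selects texel 0, u = 1 the last texel, and anything outside the range is clamped to the edge.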

Theoretically… OpenGL takes the XYZ coordinates of a vertex, then uses the related U,V coordinates of the texture to apply it to the object in world coordinates; this object is then rendered to the screen buffer. Actually, the texture is NOT mapped onto the object until the pixel buffer is written (after the perspective transformation).

For Simple Objects – Cubes (squares / rectangles) – Triangles – Cylinders – Spheres (less simple)

UV Mapping For Complex Objects

The End (for now)

Mip Maps

Texturing Odd shaped Geometry

Using Intermediate Objects

Using UV Maps

Aliasing Aliasing is when the data representing your 3D world becomes corrupted with sampling artifacts – jagged edges, shimmering textures and similar errors.