Game Hardware and Engines J. Scott Hofmann Bethesda Softworks, Inc. * * Now at Boeing - Autometric, Inc.


1 Game Hardware and Engines J. Scott Hofmann Bethesda Softworks, Inc. * shofmann@mindspring.com * Now at Boeing - Autometric, Inc.

2 Introduction  Game Hardware  History  Examples  Game Engines  Definition  Examples  Implementation  Game Engineering

3 Game History  Source: http://gamespot.com/gamespot/features/video/hov/p2_01.html  1889: Marufuku Company founded  Changes name to Nintendo in 1951  1947: Tokyo Telecommunications Engineering Company founded  Changes name to Sony in 1952  1958: “Tennis for Two” played on an oscilloscope  This later inspires Pong (in 1972)

4 Game History  1961: Spacewar developed for PDP-1  first game implemented in software  1971: First arcade game shipped  Nolan Bushnell at Nutting Associates  1972  Magnavox ships the Odyssey  first home game console

5 Game History  1972: Bushnell leaves Nutting to found Atari  Pong released  1977  Atari 2600 ships  Nintendo releases first arcade game  1979: Milton Bradley releases Microvision  first handheld programmable electronic game  1980: Activision founded  first third-party developer

6 Game History  1981: IBM ships the IBM PC  1985: Nintendo Entertainment System released  1989: Nintendo Game Boy released  1994  Sony PlayStation released  Entertainment Software Rating Board (ESRB) established

7 Game History  2000  PlayStation 2 released  2001  Nintendo Game Boy Advance released  Nintendo GameCube released  Microsoft Xbox released

8 Game Console Hardware  Sony PlayStation 2  development done on a PSTool  PSTool is a Linux-based workstation  Microsoft Xbox  development done on a PC connected to an Xbox Development Kit (XDK)  Nintendo GameCube  code-named Dolphin

9 PlayStation 2 Architecture

10 PS2 Overview  I/O Processor (IOP)  controllers, FireWire, and USB ports  Emotion Engine (EE)  Geometry calculation  Behavior/World simulation  some program control and housekeeping  Graphics Synthesizer (GS)  receives display lists from the EE  Graphics Interface (GIF) unit mediates EE-GS communications

11 Emotion Engine Architecture  CPU: 300MHz MIPS III derivative  VU: Vector Unit  GS: Graphics Synthesizer

12 EE overview  CPU + FPU: program control, housekeeping  CPU + FPU + VU0: AI, physics simulation  VU1: geometry processing  More info:  http://arstechnica.com/reviews/1q00/playstation2/ee-1.html

13 GS Overview  Massively parallel rendering engine  150MHz core processor  16 pixel engines  4MB DRAM (i.e. frame buffer)  75 million polygons per second  20 million textured lit zbuffered blended polygons per second  2.4 billion pixels per second  Designed for massively multipass rendering  each frame may have 20+ passes  NTSC TV is 30 frames per second

14 Microsoft Xbox Architecture  PC-like architecture  Intel Pentium III 733 MHz  NVIDIA graphics processing unit (GPU)  GeForce3 derivative  NVIDIA Media Communication Processor (MCP)  sound, video, network processor  64MB DDR SDRAM  Unified Memory Architecture (UMA) - No AGP or video memory!  10GB 5400RPM hard drive!!!

15 Developing for the Xbox  Xbox Development Kit (XDK)  Hardware: modified Xbox  includes SCSI card for DVD emulator (hard drive in PC)  Software: Libraries, tools for Visual Studio 6  XDK hardware connected to PC via ethernet  Xbox Debug Kit  XDK minus DVD emulator  used by QA teams (cheaper than XDK)  Parts of DirectX 8.0  Modified versions of Direct3D, DirectInput, DirectSound, Win32

16 GPU Architecture  GeForce3 derivative  250MHz clock rate  between Ti200 and Ti500 in the PC world  Vertex and Pixel Shaders  Apply a program to individual vertices or polygon fragments (“pixels”)  Provides much more hardware acceleration

17 NVIDIA MCP  Sound processing  256 voices over 64 channels  Dolby Digital encoding for surround sound  Video processing  MPEG-2 decoding  used also for DVD playback  Network processing  10/100-base-T ethernet

18 Comparing the PS2 and Xbox  PS2: Massively Multipass  Xbox: Only a few passes required  2-3 passes per frame vs 20-30 passes per frame  Xbox: 10 million polygons per second  125 million polys/sec theoretical peak  PS2: 20 million polys/sec  75 million polys/sec theoretical peak  Poly counts are per pass!  Xbox: 10M / 3 passes ≈ 3.3M effective polys/sec  PS2: 20M / 20 passes = 1M effective polys/sec

19 GameCube Architecture  485MHz IBM PowerPC 750CXe CPU  equivalent to a ~700MHz Pentium III  24MB fast 1T-SRAM  16MB slower DRAM  162MHz ArtX (now ATI) GPU  2MB 1T-SRAM as zbuffer  1MB 1T-SRAM as texture cache

20 GameCube MCP  Macronix DSP provides sound  Not powerful enough for Dolby Digital  Dolby Pro Logic II possible instead  Network adapter not included  optional peripheral  DVD playback not included  possible in Matsushita device

21 Game Engines  Control the presentation of game content  Geometry and Texture  Script  Sound  Simulation (i.e. Physics)  AI  Network Traffic

22 Engine Architecture Core Library Geometry Management Animation and Simulation Sound Script Interpreter Game Content Network

23 Core Library  Math  Range-checked Trigonometric functions  Linear Algebra  Vector, Matrix, Quaternion  Curves and Surfaces  Linear, Quadratic, and Cubic curves Bezier and Tension-Continuity-Bias (TCB) curves  Bezier and Subdivision Surfaces

24 Core Library  Bounding Volumes  Box and Sphere  Intersection testing  Memory Management  Leak detection  Reference counting and/or garbage collection  Error Reporting  Miscellaneous Types

25 Geometry Management  Move as many polygons as possible as quickly as possible  Apply Texture (usually multiple textures)  Lighting  Color and Blending  Depth sort (ZBuffer, BSP tree)  Bind shader programs (DX8)  Control each rendering pass  Cull out invisible geometry

26 Geometry Management  Meshes are exported from the art tool  Usually 3D Studio MAX or Maya  Some additional processing is done  Level-of-detail simplification  Polygons organized into engine-friendly structures  Vertex arrays  Triangle strips  Scene graph  Texture compression  Colors checked for “illegal” values  Important if the display is a TV

27 Vertex Arrays  Vertex Buffers (DX) / Vertex Arrays (OGL)  An array of per-vertex information  x, y, z: position  r, g, b: color  i, j, k: normal (for lighting and physics)  u, v, w: texture (w optional)  Multitexturing would append other (u, v, w) coordinates  Other stuff (tangent, binormal for bump mapping; other application-specific per-vertex information)

28 Vertex Arrays  Each array has a primitive type  Points  Lines  Triangles  Triangle Strip or Fan  Quadrilaterals  Quad strip  Polygons

29 Triangle and Quad Strips  Reduce the number of redundant vertex specifications  Reduces the function-call overhead  Vertices transformed only once  More efficient use of bus bandwidth

30 Triangle Stripping  Triangle strips can be created from an arbitrary triangle mesh  generating optimal strips is NP-hard  Usually done by a noninteractive tool

31 Render State Management  Minimize the number of state changes  Enable/disable lights  Change textures  Enable/disable blending  Geometry organized to batch similar polygons  Scene graph and/or display lists built

32 Scene Graph  Directed Graph structure  Used for scene-database management  includes culling and interaction support  graph traversal paramount for speed  OpenInventor, VRML, Fly3D (in book)

33 Scene Graph

34 Display Lists  Basic support for scene-database management  Store a list of render commands in an optimized fashion  Display lists are immutable once created  OpenGL: called display lists  Also compiled vertex arrays  Direct3D: part of the vertex buffer

35 Culling  Select what geometry should be drawn  Three kinds of culling  Frustum culling  Occlusion culling  Detail culling  Culling algorithms use bounding volumes (BVs)  The actual polygon mesh is too big  Also useful for collision detection

36 Bounding Volumes  Oriented Bounding Boxes (OBB)  slow, but usually the tightest bound  Axis-Aligned Bounding Boxes (AABB)  a simpler special case of OBBs  Bounding Spheres  fastest intersection tests  lots of wasted space, which causes false positives  Bounding Capsules  middle ground between OBBs and Bounding Spheres

37 Hierarchical Bounding Volumes  Tree of bounding volumes  Parent node’s volume encloses child node volumes  Leaf nodes are actual geometry  Usually built using boxes or spheres

38 Frustum Culling  Walk the bounding volume tree, testing each BV for intersection with the frustum  For static scenes, a spatial data structure is faster than the BV tree  octree or BSP tree works best  for dynamic scenes, data structure maintenance swamps the culling speedup

39 Backface Culling  A simple form of occlusion culling  remove all polygons which face away from the eye  Usually done by calculating the winding order of a polygon  Mirroring transforms can flip this order  Acceleration through normal masks  sort polygons into clusters by normal vector  test each cluster once for culling

40 Occlusion Culling  Remove geometry in the camera frustum but not visible  Difficult to do  Portals  Research still being done:  Hierarchical Z-Buffer  Occlusion Masks  Shadow volume culling

41 Portals  Divide indoor scene into convex cells connected by portals  Build cell graph from this representation

42 Portals  What is rendered is determined by querying the cell graph  the cell containing the eye point is identified and rendered  each of that cell's portals is projected onto the view plane  if a portal's projection is non-empty, the cell behind it is rendered recursively

43 Detail Culling  Select the appropriate number of polygons to display  view-independent criteria:  per-frame polygon count  arbitrary number picked by developer  view-dependent criteria:  size of screen projection  distance from eye

44 Static Level-of-Detail  Pre-generate a fixed number of reduced-polygon-count meshes  Switch between meshes at runtime  view-independent: polygon count caps  view-dependent: remove redundant detail  Morph between meshes  eliminates “popping” artifacts during the switch

45 Continuous Level-of-Detail  Precalculate a sequence of vertex splits or edge collapses to generate a LOD level at runtime

46 Image-based Rendering  Textures with pre-rendered images are used to replace overly complex geometry  Billboards  screen- or world-aligned polygons  used for trees, smoke, explosions, etc.  Impostors (aka sprites)  Polygons containing images of complex scenes  Regenerated when the eye moves beyond threshold

47 Animation  Rigid Body Animation  Controlling a mesh’s position and orientation  TCB curves used  Soft Body Deformation  Character Animation  3D Studio MAX’s Physique plugin  Freeform Deformation

48 Character Animation  Create a hierarchy of “bones”  Bones are usually scene-graph nodes  The bones are then animated  Rigid-body animation  Skin animation derived from the bone animation

49 Character Animation

50 Simulation  Physics Simulation  Behavior Simulation  Often part of AI  For each frame:  Calculate Motion  Apply Constraints (e.g. collision detection)  Apply Motion to Geometry  Draw Frame

51 Collision Detection  Necessary for “realistic” gameplay  Prevent walking through walls, other objects  Used by most dynamics algorithms  Two phases  Collision Detection  Does one object intersect another?  Collision Resolution  Given two intersecting objects, what should happen?  Bounding Volumes used here

52 Sound  Platform-specific APIs used  3D audio  Sounds are part of the scene graph  Each sound has a geometric location  Occlusion, attenuation, doppler shifting  Dolby Pro Logic and Dolby Digital  “5.1” channels:  2 main, center, 2 rear surround, subwoofer

53 Network  Manage network connections  Connect to game server  Battle.net, MSN Game Zone  LAN play  Peer-to-peer  Synchronize game clients  Game world is a distributed database  Scene graph derived from game world

54 Script Interpreter  Game content implemented through scripts  “When this door opens, summon monster”  UnrealScript, QuakeC  Scripts are usually interpreted  Scripts are compiled to bytecodes  More engines are now using the Java VM

55 Game Content  What the player sees and does  Art, sound, scripts  Usually not developed by the programmers  Programmers provide engine and technology

56 Game Development  Game Team Composition  Management  Programmers  Software Development  Content Developers  Game Play  Script Development  Artists  Graphics and Sound

57 Conclusion  Game hardware changes every 3-5 years  NES -> SNES -> N64 -> GameCube  PlayStation -> PlayStation 2  Usually no backwards compatibility  PC: software -> Voodoo -> GeForce -> GeForce3  When hardware changes, the engine must change  Hardware transformation & lighting (T&L)  Programmable geometry  Networks

