9.2. Other Notable AI Aspects / HLSL Intro
Common board game AI approaches and Strategic AI

Brief introduction to tactical and strategic AI.
(Diagram: AI model showing Execution Management, World Interface, Movement, Strategy, Decision Making, Animation and Physics.)

Tactical and strategic AI encompasses a wide range of algorithms that try to:
- derive a tactical assessment of some situation, possibly using incomplete or probabilistic information
- use tactical assessments to make decisions and coordinate the behaviour of multiple characters
Aside: Not every genre of game needs tactical and/or strategic forms of AI.

A waypoint is simply a position in the game world. Just as path-finding waypoints hold path-finding information (e.g. terrain cost), tactical waypoints hold tactical information, e.g.:
- Cover points
- Reconnaissance/sniper locations
- Shadowed locations
- Power-up spawn points
- Exposed locations
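As a concrete sketch (all names here are illustrative assumptions, not from the slides), a tactical waypoint in C#/XNA might be a position tagged with a set of scored tactical qualities:

using System.Collections.Generic;
using Microsoft.Xna.Framework;

// A tactical waypoint: a world position tagged with tactical qualities
// that AI queries can filter and score against.
public enum TacticalQuality
{
    CoverPoint,
    SniperLocation,
    ShadowedLocation,
    PowerUpSpawn,
    ExposedLocation
}

public class TacticalWaypoint
{
    public Vector3 Position;

    // A waypoint may offer several qualities at once (e.g. shadowed cover);
    // each quality carries a strength in [0..1], e.g. how good the cover is.
    public Dictionary<TacticalQuality, float> Qualities =
        new Dictionary<TacticalQuality, float>();
}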

Tactical locations can either be set by the designer or derived from game data using analytical algorithms. Tactical nodes can be combined with path-finding nodes to provide tactically aware path-finding.
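One common way to combine the two, sketched below under assumed names (the slides do not prescribe an implementation), is to fold a tactical 'danger' score into the path-finding cost function, so that dangerous nodes look more expensive to an ordinary A* search:

using Microsoft.Xna.Framework;

public class PathNode
{
    public Vector3 Position;
    public float TerrainCostMultiplier = 1.0f; // from the path-finding data
    public float Danger;                       // 0..1, from the tactical data
}

public static class TacticalPathfinding
{
    // Edge cost for a tactically aware search: distance scaled by terrain
    // cost, plus a tunable penalty for entering dangerous nodes. A larger
    // dangerWeight produces more cautious (but longer) routes.
    public static float EdgeCost(PathNode from, PathNode to, float dangerWeight)
    {
        float baseCost = Vector3.Distance(from.Position, to.Position)
                         * to.TerrainCostMultiplier;
        return baseCost + dangerWeight * to.Danger;
    }
}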

Influence mapping is widely used in strategy games to map the influence/strength of each side. The game world is split into chunks (a tile-based representation is common). Each chunk is assigned an influence score based on the combined balance of influence 'emitted' by the game objects that can affect that chunk. The influence map can be used to identify points of weakness and strength and, from this, drive strategic goal selection.

The influence exerted on a particular area can depend on the proximity of game objects (e.g. mobile units or stationary bases), the type of surrounding terrain (e.g. a mountain range may 'block' influence from passing), side-specific factors (e.g. the current financial or happiness state), etc. In most games, the influence emitted by a game object decays over distance (e.g. using a linear drop-off alongside a defined maximum influence range).
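A minimal influence-map sketch in C# (names and the two-sided sign convention are assumptions for illustration; terrain blocking and side-specific modifiers are omitted):

using System.Collections.Generic;
using Microsoft.Xna.Framework;

public class InfluenceSource
{
    public Vector2 Position;   // in tile coordinates
    public float Strength;     // influence at the source itself
    public float MaxRange;     // influence is zero beyond this range
    public bool Friendly;      // which side the source belongs to
}

public static class InfluenceMap
{
    // Builds a per-tile score: friendly sources add influence, enemy sources
    // subtract it, each with a linear drop-off out to a maximum range.
    public static float[,] Compute(IEnumerable<InfluenceSource> sources,
                                   int width, int height)
    {
        var map = new float[width, height];
        foreach (var source in sources)
            for (int x = 0; x < width; x++)
                for (int y = 0; y < height; y++)
                {
                    float distance = Vector2.Distance(source.Position,
                                                      new Vector2(x, y));
                    if (distance > source.MaxRange)
                        continue;  // beyond the defined maximum range

                    // Linear drop-off: full strength at the source, zero at MaxRange.
                    float influence = source.Strength
                                      * (1.0f - distance / source.MaxRange);
                    map[x, y] += source.Friendly ? influence : -influence;
                }
        return map;  // tiles near zero mark contested or weakly held areas
    }
}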

Overview of approaches enabling jumping in games.
(Diagram: AI model showing Execution Management, World Interface, Movement, Strategy, Decision Making, Animation and Physics.)

Unlike other forms of steering behaviour, jumps are inherently risky (i.e. they can fail, possibly with 'fatal' consequences). To jump, the character must be moving at the right speed and in the right direction, and must ensure the jump is executed at the right time. Also, steering behaviours typically re-evaluate their decisions several times per second, correcting small mistakes as they go. A jump, in contrast, is a one-time, single-event, fail-sensitive decision.

The simplest approach is to place jump points into the game level. If characters can move at different speeds, then the jump point also needs a minimum jump speed. The character can then seek towards the jump pad, matching the specified speed, and jump whenever it is on the pad. (Figure: a jump pad annotated with its minimum jump velocity.)
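The trigger check itself is small; a sketch (types and thresholds are illustrative assumptions):

using Microsoft.Xna.Framework;

public static class JumpTrigger
{
    // Fire the one-shot jump only when the character is actually on the pad
    // and moving at least at the pad's minimum jump speed.
    public static bool ShouldJump(Vector3 position, Vector3 velocity,
                                  Vector3 padCentre, float padRadius,
                                  float minJumpSpeed)
    {
        bool onPad = Vector3.Distance(position, padCentre) <= padRadius;
        bool fastEnough = velocity.Length() >= minJumpSpeed;
        return onPad && fastEnough;
    }
}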

Some forms of jump require a defined target speed, or a defined direction/angle of approach to the jump pad. (Figures: precise jump speed needed; precise jump direction needed.) Additionally, some jumps may carry a higher 'price' of failure (e.g. 'death' vs. a short delay to climb back up). Aside: Such information can be incorporated into the jump point, but it is difficult to test extensively.

A good approach is to pair a jump pad with a landing pad. Doing so permits the game object to determine the needed speed and direction by solving the trajectory equations. This approach is more flexible (different characters can differ in their movement capabilities) and is less prone to error.
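Solving the trajectory equations might look like the following sketch (assuming a fixed vertical take-off speed and XNA's Vector3; this is not code from the slides):

using System;
using Microsoft.Xna.Framework;

public static class JumpSolver
{
    // Given a jump pad, a landing pad, a fixed vertical take-off speed and
    // gravity (positive magnitude, y-up), solve the projectile equations
    // for the full launch velocity. Returns false if the landing pad is
    // unreachable at this jump speed.
    public static bool TrySolveJump(Vector3 jumpPad, Vector3 landingPad,
                                    float verticalJumpSpeed, float gravity,
                                    out Vector3 launchVelocity)
    {
        launchVelocity = Vector3.Zero;
        Vector3 delta = landingPad - jumpPad;

        // Vertical motion: delta.Y = v*t - 0.5*g*t^2, a quadratic in t.
        float discriminant = verticalJumpSpeed * verticalJumpSpeed
                             - 2.0f * gravity * delta.Y;
        if (discriminant < 0.0f)
            return false;  // this jump speed cannot reach that height

        // Take the later root: landing on the descending part of the arc.
        float time = (verticalJumpSpeed + (float)Math.Sqrt(discriminant)) / gravity;
        if (time <= 0.0f)
            return false;

        // Horizontal velocity needed to cover the ground distance in that time.
        launchVelocity = new Vector3(delta.X / time,
                                     verticalJumpSpeed,
                                     delta.Z / time);
        return true;
    }
}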

Introduction to HLSL

Early versions of the DirectX and OpenGL APIs defined a number of fixed rendering stages. This forced all games to use the same approach with only a few parameters open to change.

As GPUs increased in capability it became possible to inject small programs (called shaders) into the pipeline, giving the application greater control. A list of vertices (points) is sent to the vertex shader. In the rasterisation stage, primitives are constructed from the output vertices. The primitives are then rasterized (i.e. the on-screen pixels they cover are determined) and vertex attributes are interpolated between those pixels. Finally, the pixel shader determines the on-screen colour of each pixel.
(Diagram: Application → Vertices → Vertex Shader → Rasterisation / Interpolation → Pixel Shader → Z-buffer test → Frame buffer → To screen.)

Shaders are small programs that run on the GPU. Different shader languages are available.
Vertex shader: can set/change rendered vertices, e.g. for object deformation, skeletal animation, particle motion, etc.
Pixel shader: sets the colour of each pixel, e.g. for per-pixel lighting and texturing. Can also be used to apply effects over an entire scene, e.g. bloom, depth-of-field blur, etc.
Aside: DirectX 10 also supports geometry shaders (not supported by XNA).

HLSL is a shading language developed by Microsoft for the Direct3D API. HLSL offers a number of intrinsic functions (mostly centred around branching control, math functions and texture access). Aside: See the MSDN library for a complete HLSL reference. To do: Decide if you want to explore this.

HLSL supports the usual scalar data types (e.g. bool, int, half, float, double). Note: vector and matrix forms can also be defined, e.g. float3, int2x2, double4x4, etc. HLSL also provides a sampler type (used to read, i.e. sample, textures): sampler, sampler1D, sampler2D, and sampler3D. The sampler type is defined using a number of different states, e.g. MinFilter, MagFilter and MipFilter controlling texture filtering, and AddressU, AddressV and AddressW controlling addressing modes:

texture textureName;
sampler2D textureSampler = sampler_state
{
    Texture = <textureName>;
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
    AddressU = Wrap;
    AddressV = Wrap;
    AddressW = Wrap;
};

Semantics are used to map input and output data to variables. All varying input data (from the application or between rendering stages) requires a semantic tag; e.g. all outputs from the vertex shader must be semantically tagged:

float4 vertexPosition : POSITION0;

Note: [n] is an optional integer providing support for multiple data of the same type, e.g. Texture0, Texture1, Texture2. Aside: The only valid semantic inputs to the pixel shader are Color[n] and Texture[n]; often, custom data (i.e. data not used for texture addressing) is passed using a Texture[n] semantic.

HLSL permits C-like functions to be specified. A shader must define at least one vertex function (operating on vertex information) and at least one pixel function (determining pixel colours). These functions must define their inputs and outputs with semantics. Intrinsic functions: HLSL offers a set of 'built-in' functions, mostly centred around flow control, math operations and texture access. For example:

// parallaxScale, parallaxOffset and HeightSampler are globals defined elsewhere
float2 CalculateParallaxOffset(float3 view, float2 texCoord)
{
    view = normalize(view);
    float height = parallaxScale * tex2D(HeightSampler, texCoord).r + parallaxOffset;
    float2 viewOffset = view.xy * height;
    return viewOffset;
}

The following is a simple example shader:

// Define the world-view-projection matrix
float4x4 wvpMatrix : WorldViewProjection;

// Input structure expected by the vertex shader
struct vertexShaderInput
{
    float4 vertexPosition : Position0;
};

// Input structure expected by the pixel shader (and output by the vertex shader)
struct pixelShaderInput
{
    float4 screenPosition : Position;
    float3 colour : Color0;
};

// Vertex shader function
pixelShaderInput SimpleVS(vertexShaderInput input)
{
    pixelShaderInput output;
    // Transform from model space to screen space
    output.screenPosition = mul(input.vertexPosition, wvpMatrix);
    output.colour = float3(1.0f, 1.0f, 1.0f);
    return output;
}

// Pixel shader function: output the pixel colour
float4 SimplePS(pixelShaderInput input) : Color0
{
    return float4(input.colour.rgb, 1.0f);
}

// Technique definition, specifying the VS and PS functions and compile targets
technique SimpleShader
{
    pass
    {
        VertexShader = compile vs_1_1 SimpleVS();
        PixelShader = compile ps_1_1 SimplePS();
    }
}

Effects in XNA are a type of game asset (alongside textures and models). The Effect class represents an effect, permitting effect parameter configuration, technique selection, and the actual rendering. An effect can be loaded and configured as shown:

// Load the effect
Effect effect = content.Load<Effect>("effectName");

// Select the desired technique
effect.CurrentTechnique = effect.Techniques["technique"];

// Define effect parameters
effect.Parameters["colour"].SetValue(Vector3.One);
effect.Parameters["tolerance"].SetValue(0.8f);

// Begin the effect and iterate over each pass
effect.Begin();
foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Begin();
    // Send vertex information to the effect, e.g.
    // graphicsDevice.DrawUserIndexedPrimitives(PrimitiveType.TriangleList, ...);
    // (vertex information can also be sent using other approaches)
    pass.End();
}
effect.End();

Aside: For better performance, effect.Parameters["name"] can be stored as an EffectParameter object (upon effect construction) and SetValue(...) called on that object directly.
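The caching suggested in the aside might look like this (a sketch; "colour" is the parameter name used in the example above):

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

// Resolve each named parameter once when the effect is loaded, then reuse
// the EffectParameter handle every frame instead of repeating the string
// lookup in effect.Parameters on every draw call.
public class CachedEffect
{
    private readonly EffectParameter colourParam;

    public CachedEffect(Effect effect)
    {
        colourParam = effect.Parameters["colour"];  // one-time lookup
    }

    public void SetColour(Vector3 colour)
    {
        colourParam.SetValue(colour);  // cheap per-frame update
    }
}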

To do:
- Complete the Question Clinic.
- Consider if this material is of use within your game.
- Complete the section in the project document on the Alpha hand-in.
- Round up work for the Alpha hand-in at the end of this week.

Today we explored:
- A brief intro to some types of strategic/tactical AI
- An overview of how jumps can be supported within games
- HLSL and effect usage in XNA