Week 2 - Monday COMP 4290
What did we talk about last time?
C# MonoGame
Questions?
Assignment 1
Project 1
Back to MonoGame
Review of MonoGame
Program creates a Game1 (or similar) object and starts it running
Game1 has: Initialize(), LoadContent(), Update(), Draw()
It runs an update-draw loop continuously until told to exit
Console game
We're used to interacting with programs from the command line (console)
MonoGame was not designed with this in mind
It has pretty easy ways to read from the keyboard, the mouse, and also Xbox controllers
But you'll need a console for Project 1 so that you can tell it which file to load and what kind of manipulations to perform on it
So that Console.Write() and Console.Read() work:
Go to the Properties page for your project
Go to the Application tab
Change Output Type to Console Application
More information:
You'll need a separate thread to read and write to the console if you don't want your game to freeze up (see the sketch below)
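A minimal sketch of that last point, assuming a background thread started from Initialize() inside Game1; the names lastCommand and StartConsoleThread are illustrative, not MonoGame API:

private volatile string lastCommand;   // illustrative field that Update() can poll

private void StartConsoleThread()
{
    var thread = new System.Threading.Thread(() =>
    {
        while (true)
        {
            Console.Write("> ");
            string line = Console.ReadLine();   // blocks this thread, not the game loop
            if (line == null || line == "exit")
            {
                Exit();                         // Game.Exit() ends the update-draw loop
                break;
            }
            lastCommand = line;                 // Update() can act on this next frame
        }
    });
    thread.IsBackground = true;   // don't keep the process alive after the game window closes
    thread.Start();
}

Call StartConsoleThread() from Initialize(), and have Update() check lastCommand each frame.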
Drawing a picture
To draw a picture on the screen, we need to load it first
Inside a MonoGame project, right-click the Content.mgcb file and choose Open with…, then select MonoGame Pipeline Tool
Choose Add and then Existing Item…, and find an image you want on your hard drive
Make sure the Build Action is Build
The Importer should be Texture Importer - MonoGame
Create a Texture2D member variable to hold it
Assume the member variable is called cat and the content file is called cat.jpg (the pipeline drops the file extension, so the asset name is just "cat")
In LoadContent(), add the line: cat = Content.Load<Texture2D>("cat");
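Putting those steps together, a minimal sketch of the relevant members of the template-generated Game1 class, keeping the field name cat from the slide:

private SpriteBatch spriteBatch;   // created by the project template
private Texture2D cat;             // member variable to hold the loaded image

protected override void LoadContent()
{
    spriteBatch = new SpriteBatch(GraphicsDevice);
    // the content pipeline drops the extension, so cat.jpg is loaded by the name "cat"
    cat = Content.Load<Texture2D>("cat");
}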
Drawing a picture continued
Now the variable cat contains a loaded 2D texture
Inside the Draw() method, add the following code:
spriteBatch.Begin();
spriteBatch.Draw(cat, new Vector2(x, y), Color.White);
spriteBatch.End();
This will draw cat at location (x, y)
All sprites need to be drawn between Begin() and End() spriteBatch calls
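For context, those three lines normally sit inside the Draw() override the template generates; a sketch (the clear color is arbitrary):

protected override void Draw(GameTime gameTime)
{
    GraphicsDevice.Clear(Color.CornflowerBlue);   // wipe the back buffer each frame

    spriteBatch.Begin();
    spriteBatch.Draw(cat, new Vector2(x, y), Color.White);   // top-left corner of cat lands at (x, y)
    spriteBatch.End();

    base.Draw(gameTime);
}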
Drawing text
Modern TrueType and OpenType fonts are vector descriptions of the shapes of characters
Vector descriptions are good for quality, but bad for speed
MonoGame allows us to take a vector-based font and turn it into a picture of characters that can be rendered as a texture
Just like everything else
Drawing text continued
Inside a MonoGame project, right-click the Content.mgcb file and choose Open with…, then select MonoGame Pipeline Tool
Right-click on Content in the tool, and select Add -> New Item…
Choose SpriteFont Description and give your new SpriteFont a name
Open the .spritefont file in a text editor like Notepad++
By default, the font is Arial at size 12
Edit the XML to pick the font, size, and spacing (see the sketch below)
You will need multiple SpriteFonts even for different sizes of the same font
Repeat the process to make more fonts
Note: fonts have complex licensing and distribution requirements
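For reference, the generated .spritefont description is plain XML; it looks roughly like this (comments trimmed), and FontName, Size, and Spacing are the usual things to change:

<?xml version="1.0" encoding="utf-8"?>
<XnaContent xmlns:Graphics="Microsoft.Xna.Framework.Content.Pipeline.Graphics">
  <Asset Type="Graphics:FontDescription">
    <FontName>Arial</FontName>
    <Size>12</Size>
    <Spacing>0</Spacing>
    <UseKerning>true</UseKerning>
    <Style>Regular</Style>
    <CharacterRegions>
      <CharacterRegion>
        <Start>&#32;</Start>
        <End>&#126;</End>
      </CharacterRegion>
    </CharacterRegions>
  </Asset>
</XnaContent>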
Drawing a font continued
Load the font similar to texture content:
font = Content.Load<SpriteFont>("Text");
Add a DrawString() call in the Draw() method:
spriteBatch.Begin();
spriteBatch.DrawString(font, "Hello, World!", new Vector2(100, 100), Color.Black);
spriteBatch.End();
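Since each size needs its own SpriteFont (previous slide), drawing the same text at two sizes takes two assets; a sketch with hypothetical asset names Arial12 and Arial24:

// In LoadContent(): one SpriteFont per size (hypothetical asset names)
SpriteFont arial12 = Content.Load<SpriteFont>("Arial12");
SpriteFont arial24 = Content.Load<SpriteFont>("Arial24");

// In Draw():
spriteBatch.Begin();
spriteBatch.DrawString(arial12, "Hello at size 12", new Vector2(100, 100), Color.Black);
spriteBatch.DrawString(arial24, "Hello at size 24", new Vector2(100, 130), Color.Black);
spriteBatch.End();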
Why are they called sprites?
They "float" above the background like fairies…
Multiple sprites are often stored on one texture
It's cheaper to store one big image than a lot of small ones
This is an idea borrowed from old video games that rendered characters as sprites
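That one-big-texture idea is what the sourceRectangle parameter on the next slide is for; a sketch, assuming a hypothetical sheet asset named "characters" laid out in 32x32-pixel frames:

Texture2D sheet = Content.Load<Texture2D>("characters");   // hypothetical sprite sheet asset
Rectangle frame = new Rectangle(2 * 32, 0, 32, 32);        // third frame in the top row

spriteBatch.Begin();
// draw only that 32x32 portion of the big texture at (x, y)
spriteBatch.Draw(sheet, new Vector2(x, y), frame, Color.White);
spriteBatch.End();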
Drawing sprites with rotation
It is possible to apply all kinds of 3D transformations to a sprite
A sprite can be used for billboarding or other image-based techniques in a fully 3D environment
But, we can also simply rotate them using an overloaded call to Draw():
spriteBatch.Draw(texture, location, sourceRectangle, Color.White, angle, origin, 1.0f, SpriteEffects.None, 1);
Let's unpack that
texture: Texture2D to draw
location: Location to draw it
sourceRectangle: Portion of the image to use
Color.White: Full brightness (no tint)
angle: Angle of rotation in radians
origin: Origin of rotation
1.0f: Scaling factor
SpriteEffects.None: No effects (such as flipping)
1: Layer depth (a float)
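Putting those parameters together, a sketch that rotates the cat sprite around its center; angle is assumed to be a float member updated each frame in Update():

Rectangle source = new Rectangle(0, 0, cat.Width, cat.Height);         // use the whole image
Vector2 origin = new Vector2(source.Width / 2f, source.Height / 2f);   // rotate about the center

spriteBatch.Begin();
spriteBatch.Draw(cat, new Vector2(x, y), source, Color.White,
    angle,                // radians, e.g. angle += (float)gameTime.ElapsedGameTime.TotalSeconds in Update()
    origin,               // the point the sprite rotates (and is positioned) around
    1.0f,                 // no scaling
    SpriteEffects.None,   // no flipping
    1);                   // layer depth
spriteBatch.End();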
Graphics rendering pipeline
For reasons of API design, practical top-down problem solving, hardware design, and efficiency, rendering is described as a pipeline
This pipeline contains three conceptual stages:
Application: produces material to be rendered
Geometry: decides what, how, and where to render
Rasterizer: renders the final image
Student Lecture: Geometry Stage
Geometry Stage
Model and View Transform
Geometry stage
The output of the Application Stage is polygons
The Geometry Stage processes these polygons using the following pipeline:
Model and View Transform
Vertex Shading
Projection
Clipping
Screen Mapping
Model Transform
Each 3D model has its own coordinate system called model space
When combining all the models in a scene together, the models must be converted from model space to world space
After that, we still have to account for the position of the camera
Model and View Transform
We transform the models into camera space or eye space with a view transform
Then, the camera will sit at (0,0,0), looking into negative z
The z-axis comes out of the screen in the book's examples and in MonoGame (but not in older DirectX)
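In MonoGame these transforms are typically built as matrices; a minimal sketch, with arbitrary numbers for the model's pose and the camera's position:

// Model (world) transform: model space -> world space
Matrix world = Matrix.CreateScale(2.0f)
             * Matrix.CreateRotationY(MathHelper.PiOver4)
             * Matrix.CreateTranslation(10, 0, -5);

// View transform: world space -> camera (eye) space
// Afterwards the camera sits at the origin looking down negative z
Matrix view = Matrix.CreateLookAt(
    new Vector3(0, 5, 20),   // camera position in world space
    Vector3.Zero,            // point the camera looks at
    Vector3.Up);             // which way is up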
Vertex Shading
Figuring out the effect of light on a material is called shading
This involves computing a (sometimes complex) shading equation at different points on an object
Typically, information is computed on a per-vertex basis and may include:
Location
Normals
Colors
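In MonoGame that per-vertex data shows up in the built-in vertex types; a small sketch using VertexPositionNormalTexture (VertexPositionColor is the analogous type that stores a color instead):

// One vertex carrying the kind of information the shading equation uses
VertexPositionNormalTexture v = new VertexPositionNormalTexture(
    new Vector3(0, 1, 0),       // location in model space
    Vector3.UnitZ,              // normal, used when lighting the surface
    new Vector2(0.5f, 0.0f));   // texture coordinate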
Projection
Projection transforms the view volume into a standardized unit cube
Vertices then have a 2D location and a z-value
There are two common forms of projection:
Orthographic: parallel lines stay parallel, and objects do not get smaller in the distance
Perspective: the farther away an object is, the smaller it appears
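MonoGame has helpers for both; a sketch, assuming a 45-degree field of view and arbitrary near and far planes:

float aspectRatio = GraphicsDevice.Viewport.AspectRatio;

// Perspective: distant objects appear smaller
Matrix perspective = Matrix.CreatePerspectiveFieldOfView(
    MathHelper.PiOver4,   // vertical field of view in radians
    aspectRatio,
    1.0f,                 // near plane
    1000.0f);             // far plane

// Orthographic: parallel lines stay parallel, no shrinking with distance
Matrix ortho = Matrix.CreateOrthographic(20.0f, 20.0f / aspectRatio, 1.0f, 1000.0f);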
Clipping
Clipping processes the polygons based on their location relative to the view volume
A polygon completely inside the view volume is unchanged
A polygon completely outside the view volume is ignored (not rendered)
A polygon partially inside is clipped: new vertices on the boundary of the volume are created
Since everything has been transformed into a unit cube, dedicated hardware can do the clipping in exactly the same way, every time
Screen mapping
Screen mapping transforms the x and y coordinates of each polygon from the unit cube to screen coordinates
A few oddities:
DirectX uses a coordinate system in which a pixel's location refers to the center of the pixel
DirectX conforms to the Windows standard of pixel (0,0) being in the upper left of the screen
OpenGL conforms to the Cartesian system with pixel (0,0) in the lower left of the screen
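As a concrete illustration (a hypothetical helper, not library code), mapping an x, y pair from the unit cube's [-1, 1] range to pixels is just a scale and an offset; flipping y is what separates the two corner conventions:

// Map normalized device coordinates to pixels for a screen with (0,0) in the upper left
static Vector2 ToScreen(float ndcX, float ndcY, int width, int height)
{
    float sx = (ndcX + 1) / 2 * width;    // [-1, 1] -> [0, width]
    float sy = (1 - ndcY) / 2 * height;   // flip y: +y is up in NDC but down on screen
    return new Vector2(sx, sy);
}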
Upcoming
Next time…
Rendering pipeline: Rasterizer stage
Reminders
Keep reading Chapter 2
Want a Williams-Sonoma internship? Visit
Interested in coaching 7-18 year old kids in programming? Consider working at theCoderSchool
For more information: Visit
Contact Kevin Choo at
Ask me!