Getting Started with VR in Unreal Engine 4

Presentation on theme: "Getting Started with VR in Unreal Engine 4"— Presentation transcript:

1 Getting Started with VR in Unreal Engine 4
Introduction: Introduce yourself and how you got started at Epic and into game design (very quick). Summary of what this talk will cover: terminology, new rendering features and techniques, best practices, how to optimize for VR, and resources. Show the VR Template, console commands, and how to search for them, if there's enough time. Let's jump right to something awesome! Andrew Hurley – Epic Games

2 Talk about the newest VR game by Epic Games and Oculus.
- An incredibly engaging and fun VR FPS! - Uses new forward rendering features, including MSAA

3 Talk about each image a little bit.
- Top Left: VR Showdown (Free on the Launcher) - Bottom Left: Bullet Train (Free on Launcher) - Top Right: VR Fun House (Modding) - Bottom Right: Robo Recall Mod Kit (Modding)

4 Epic Games Launcher Tons of Free Content! VR Mod Kits!
UE4 Marketplace! Helpful Documentation! AnswerHub and Forums! Talk about the different tabs (e.g. the Learn tab with Epic-created content), the Unreal Engine Marketplace driven by the community, and helpful support and documentation. Go to the Unreal Engine website to download the launcher.

5 Choosing your VR Platform
PC Mobile Console Native support for VR platforms, shipped with a nice abstraction layer so switching between or targeting different VR platforms is fairly simple. Google Daydream PSVR – PlayStation VR OSVR – Open Source VR OpenVR – Valve's VR Oculus – PC Vive – PC

6 New: Forward Renderer with MSAA
Recent Developments New: Forward Renderer with MSAA Full support for stationary lights, including dynamic shadows from movable objects which blend together with precomputed environment shadows. Multiple reflection captures blended together with parallax correction. Support for D-Buffer decals. Unshadowed movable lights. Supports capsule shadows. Compatible with instanced stereo rendering. Supports light functions for shadowed lights. Talk about being able to blend between static and dynamic shadows for stationary lights using cascaded shadow maps. Reflection captures are combined with planar reflections to create very accurate, parallax-corrected reflections. Planar reflections – Unlike Screen Space Reflections (SSR), which restrict reflections to what is already being rendered on screen, planar reflections can reflect off-screen objects by rendering the level again from the direction of the reflection.
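As a concrete reference, the forward renderer and MSAA are enabled through project settings that end up in Config/DefaultEngine.ini. A minimal sketch; the cvar names are as I recall them from around UE 4.14, so verify against your engine version:

```ini
[/Script/Engine.RendererSettings]
; Switch to the forward shading path (requires an editor restart)
r.ForwardShading=True
; 3 = MSAA in the DefaultFeature.AntiAliasing enum (0=off, 1=FXAA, 2=TAA)
r.DefaultFeature.AntiAliasing=3
; Number of MSAA samples per pixel
r.MSAACount=4
```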

7 Terminology Unreal Engine 4 VR Specific Visual Scripting | Blueprints
Unreal Motion Graphics (UMG) Realtime Cinematics | Sequencer Particle Editor | Cascade Renderers | Deferred vs Forward Anti-aliasing | MSAA, TAA, FXAA VR Specific Head Mounted Display (HMD) Motion Controllers | Oculus Touch, Vive Controllers, PS Move Tracking Systems | Vive Lighthouses, Oculus Sensors Tracking Types | Positional and Rotational Ocular Vestibular Mismatch | VR Sickness Software Development Kit | SDKs Unreal Engine 4 - Blueprints – gameplay scripting system based on the concept of a node-based interface. - UMG – visual UI authoring tool used to create UI elements such as in-game HUDs, menus, or other interface-related graphics you wish to present to your users. - Sequencer – gives users the ability to create in-game cinematics through a specialized multi-track editor. - Cascade – particle systems editor. - Anti-aliasing – a rendering method implemented to reduce the stair-step effect on curved surfaces and edges. - Deferred vs. Forward – the way geometry and lighting are stored and then rendered to the screen. Deferred collects the geometry first, uses a series of buffers for things like world normals and depth, and then applies the shading and lighting information. Deferred has problems with things like translucency sorting, while forward has a more difficult time with dynamic shadows; an example of the trade-offs to weigh when choosing. VR Specific - HMD – Head Mounted Display. - Motion Controllers – Oculus Touch, Vive controllers, PlayStation Move. - Tracking Systems – Vive Lighthouses, Oculus Sensors. These constantly track the player's position and rotation within the chaperone or grid created by the tracking system. - Tracking Types – positional and rotational. - Ocular Vestibular Mismatch – occurs when your eyes and inner ears are not in sync, which can induce the feeling of nausea; VR sickness. - SDK – Software Development Kit.

8 Instanced Stereo Rendering
The UE4 deferred renderer is a full-featured workhorse, but takes a bit of skill to fully leverage. Temporal Anti-Aliasing can limit the sharpness of edges through how it functions, blurring nearby pixels to smooth out the overall image. The UE4 forward renderer is a specialized renderer with fewer features at present, but a faster baseline. Multi-Sample Anti-Aliasing (MSAA) is the sharpest solution for anti-aliasing. Temporal AA vs. MSAA – Temporal AA blurs jagged edges and the stair-step effect by sampling frames and blurring their results in a post-processing pass. MSAA runs the pixel shader only once per pixel, which reduces overall cost, but it has issues when blending textures in the pixel shader. Instanced Stereo Previously, the engine rendered a stereoscopic image by drawing everything for the left eye, and then drawing everything for the right eye. With Instanced Stereo Rendering, we render both eyes at the same time, which significantly cuts down on the work done by the CPU and improves efficiency on the GPU. Here are the two techniques running side-by-side: instead of left then right eye, it is left and right eye; rendering simultaneously instead of sequentially. When we draw the left eye, we also instance the geometry for the right eye, and when we make the instanced draw call we bind both views' constant buffers.
The vertex shader can then sort out which eye we're rendering and use the correct constants, and we also ensure everything is moved and scaled correctly in screen space, since the viewport covers both eyes. Instanced Stereo Rendering lets us use a single draw call to draw both the left and right eyes simultaneously, saving CPU and some GPU time. Currently works on PC, and coming soon on PS4 and mobile platforms.
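The per-instance eye selection described above can be sketched in plain C++. This is a toy model, not the engine's actual shader code: the per-eye view/projection matrices are reduced to a horizontal offset, and the "vertex shader" is an ordinary function keyed on the instance id.

```cpp
#include <array>

// Toy model of instanced stereo: one "draw call" submits every vertex twice
// (instance 0 = left eye, instance 1 = right eye). The vertex-shader stage
// picks the per-eye constants from the instance id and shifts the result
// into that eye's half of the double-wide viewport.
struct Vec2 { float x, y; };

// Hypothetical per-eye constants, standing in for the two view constant
// buffers bound for the instanced draw call.
constexpr std::array<float, 2> kEyeOffset = { -0.5f, +0.5f };

Vec2 vertexShader(Vec2 v, int instanceId) {
    int eye = instanceId & 1;          // which eye this instance renders
    v.x += kEyeOffset[eye];            // apply that eye's "view transform"
    // Map into the correct half of the double-wide render target:
    // left eye -> [-1, 0], right eye -> [0, 1] in NDC x.
    v.x = v.x * 0.5f + (eye == 0 ? -0.5f : 0.5f);
    return v;
}
```

The point of the trick is that both instances come from a single draw call, so the CPU records the geometry once rather than once per eye.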

9 Hidden and Visible Area Mesh
We draw meshes to cull out geometry that you can't see, and only apply post processing where you can. For the deferred renderer, the visible area mesh is the bigger optimization! The Hidden Area Mask allows the renderer to reject pixels that won't be seen due to the HMD lens distortion, ultimately saving GPU time by reducing resource consumption.
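A back-of-the-envelope way to see why the mask pays off: count how many pixels of a square eye buffer actually survive a circular lens mask. A plain C++ sketch with made-up numbers, not data from any real HMD:

```cpp
// Rough model of the hidden area mask: pixels outside the lens circle are
// never visible after HMD distortion, so the renderer can stencil them out.
// Counts how many pixels of a size x size eye buffer fall inside a circle
// inscribed in the buffer (illustrative only).
int visiblePixels(int size) {
    const float r  = size * 0.5f;          // lens circle radius
    const float cx = r, cy = r;            // circle centre
    int count = 0;
    for (int y = 0; y < size; ++y)
        for (int x = 0; x < size; ++x) {
            float dx = x + 0.5f - cx, dy = y + 0.5f - cy;
            if (dx * dx + dy * dy <= r * r)  // inside the visible circle
                ++count;
        }
    return count;
}
```

An inscribed circle covers about pi/4 of the square, so roughly a fifth of the pixel shading work can be rejected before it happens.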

10 Monoscopic Far Field Rendering for Mobile VR (Experimental)
At a certain distance, the stereo rendering of distant objects is indistinguishable from a regular monoscopic rendering. Monoscopic far field rendering takes advantage of this by dividing the scene into two partitions with a clipping plane: near field and far field. Everything on the near field side of the clipping plane is rendered in stereo; everything on the far field side of the plane is rendered only once and then composited into the near field result.
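The saving is easy to see if you count passes per object. A minimal C++ sketch of the partition, assuming one pass per object per eye for the near field and one shared pass per far-field object (a simplification; the real implementation works on whole render passes, not object counts):

```cpp
#include <vector>

// Sketch of the near/far partition used by monoscopic far field rendering:
// objects nearer than the clipping plane are drawn once per eye (stereo);
// everything beyond it is drawn a single time and composited underneath.
int countDrawPasses(const std::vector<float>& objectDepths, float clipPlane) {
    int passes = 0;
    for (float d : objectDepths) {
        if (d < clipPlane)
            passes += 2;   // near field: rendered for left AND right eye
        else
            passes += 1;   // far field: rendered once, then composited
    }
    return passes;
}
```

For a scene that is mostly distant geometry (skylines, terrain), this approaches halving the per-object rendering work.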

11 Near Field Clipping Plane
Far Field Clipping Plane Final Composited Image

12 VR Editor The VR Editor enables you to design and build worlds in a virtual reality environment using the full capabilities of the editor toolset combined with interaction models designed specifically for VR world building. Working directly in VR provides the proper sense of scale necessary to create realistic, believable worlds, while the use of motion controllers means you can build environments with natural motions and interactions. Talk about how it is a natural way to build environments to scale, while maintaining creative flow when creating environments or an alpha blockout.

13 VR Mesh Editor The Mesh Editor mode is a new geometry editing toolset designed to enable designers to quickly create and modify Static Mesh geometry in the Level Editor viewport. Create simple primitives and organic shapes for alpha or whiteboxing, export to a third-party application to texture, modify, etc., then reimport and replace the existing mesh. In addition to a low-poly toolset, Mesh Editor mode adds the ability to work with subdivision surfaces, enabling artists and designers to create smooth, organic surfaces directly in Unreal Editor.

14 VR Template Project The template comes with a number of rendering optimizations for VR. Single Sample Shadow from Stationary Lights reduces shadow cost while keeping dynamic shadows. The scalability settings applied in the DefaultEngine.ini found within the template project are a good default starting point for VR development.
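For a feel of what those settings look like, here are a few of the kinds of VR-oriented overrides such a DefaultEngine.ini typically carries. The values below are illustrative, from memory rather than the template itself, so check the template's own file:

```ini
[SystemSettings]
; Illustrative VR-friendly defaults -- verify against the VR Template's
; actual DefaultEngine.ini before relying on them.
r.MotionBlurQuality=0
r.SeparateTranslucency=0
r.HZBOcclusion=0
r.FinishCurrentFrame=1
```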

15 VR Best Practices Avoid screen space effects (e.g. SSR and SSAO).
Don't rely on reprojection technology; aim for a high framerate! Consider lowering the resolution to increase frame rate. The player's head movements should be in control of the camera at all times. Avoid accelerations, as they can create ocular vestibular mismatch and cause VR sickness. Do not override the player's FOV, as this can cause discomfort when rotating. Scale - The best thing to do about the scale of the objects in your VR world is to mimic reality as closely as you can. Making objects bigger or smaller than their real-world counterparts could lead to confusion or simulation sickness. Missing Polygon Faces - In standard games, it is often acceptable (and preferred) to remove polygon faces from objects that cannot be seen by the player. However, in VR games, players have much more freedom to look around, and this practice can sometimes lead to players being able to see things that they're not supposed to see. Which Type of Lighting to Use - You should always use static lighting and lightmaps when making a VR project; this is the cheapest option to render. If you need to use dynamic lighting, limit the number of dynamic lights to as few as possible, and make sure that they never touch one another. If you have an outdoor scene, set your directional light to dynamic instead of stationary, and then turn on Cascaded Shadow Maps (CSM), adjusting the settings to be as simple as possible while still giving you shadows. VR & VFX - Some VFX tricks, like using SubUV textures to simulate fire or smoke, do not hold up very well when viewed in VR. In many cases you are going to need to use static meshes instead of 2D particles to simulate VFX like explosions or smoke trails. Near field effects, or effects that happen very close to the camera, work well in VR, but only when the effects are made up of Static Mesh particles.
VR & Transparency - In 3D graphics, rendering transparency is extremely costly, because transparency will generally have to be re-evaluated per frame to ensure that nothing has changed. Because of this re-evaluation, rendering transparency in VR can be so costly that its cost outweighs its benefits. However, to get around this issue, you can use the DitherTemporalAA Material Function. This Material Function allows a Material to look like it is using transparency, and it also helps you avoid common transparency issues, such as self-sorting.
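To make the idea behind DitherTemporalAA concrete, here is the ordered-dither half of the trick in plain C++. This is a sketch of the general screen-door technique, not the engine's material function: each pixel is either fully kept or fully discarded against a dither threshold, and temporal AA (omitted here) averages the pattern into apparent translucency.

```cpp
// 4x4 Bayer matrix; entries are ordered-dither thresholds in [0, 16).
constexpr int kBayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

// Screen-door "transparency": returns true if the pixel at (x, y) should
// be drawn opaque for the given opacity. No blending ever happens, so the
// surface still writes depth and sorts like an opaque object.
bool ditherKeep(int x, int y, float opacity) {
    float threshold = (kBayer4[y & 3][x & 3] + 0.5f) / 16.0f;
    return opacity > threshold;
}
```

Because the result is binary per pixel, the expensive per-frame translucency sorting described above never happens; at opacity 0.5, exactly half the pixels in each 4x4 tile survive.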

16 VR Optimization Real-Time GPU Profiler
Given the hardware requirements of virtual reality, as well as the stronger impact poor performance can have on your users' comfort, maintaining good VR project performance is key. Unreal Engine 4 contains a suite of tools that allow you to view the performance of your project on both the CPU and the GPU. Real-Time GPU Profiler. Generic Function Library – Unreal provides a number of generic functions common across multiple HMDs to make debugging and scripting easier. HTML console help output – a debugging tool to search for the various cvars and console commands. Buffer visualizations – embedded within the viewport.

17 GPU Profiling Console Commands
This tool allows you to look at where the GPU cost is going when looking for the bottleneck. Console Commands: Stat UnitGraph - gives you general game thread, draw thread, and GPU time, as well as overall frame timing, as text and a virtual graph. Stat GPU - this command, added in 4.14, gives similar stats to the GPU Profiler, but in a form that you can watch and monitor from in the game; great for checking quick costs on GPU work. GPU Profiler – a tool to visualize and break down the scene's cost of rendering a particular frame on the GPU in milliseconds. Realtime graph to visualize GPU cost.
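The profiler's basic mechanism is wrapping each rendering pass in a named timing scope. A rough CPU-side analogue in plain C++ (not engine code; the real GPU profiler inserts GPU timestamp queries around each pass rather than reading a CPU clock):

```cpp
#include <chrono>
#include <map>
#include <string>

// Minimal scoped timer: on destruction, adds the elapsed wall time in
// milliseconds to a per-scope-name accumulator, the way a profiler
// attributes frame cost to named passes like "BasePass" or "Shadows".
class ScopedTimer {
public:
    using Clock = std::chrono::steady_clock;
    ScopedTimer(std::string name, std::map<std::string, double>& out)
        : name_(std::move(name)), out_(out), start_(Clock::now()) {}
    ~ScopedTimer() {
        std::chrono::duration<double, std::milli> ms = Clock::now() - start_;
        out_[name_] += ms.count();   // accumulate milliseconds per scope
    }
private:
    std::string name_;
    std::map<std::string, double>& out_;
    Clock::time_point start_;
};
```

Usage is just a block-scoped local: `{ ScopedTimer t("BasePass", stats); /* work */ }`; when the block ends, the pass's cost lands in `stats["BasePass"]`.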

18 Resources Video: 2015 UE4 - VR and Unreal Engine
Making Bullet Train and Going off the Rails in VR
VR Bow and Arrow Tutorial w/ Ryan Brucks - Training Livestream
Training Livestream - Sam and Wes' VR Stream: Cameras, Multiplayer, Tips and Tricks!
Creating Interactions in VR with Motion Controllers 1-3
Setting Up VR Motion Controllers
VR Networking and 3D Menus
Up and Running with Gear VR
Developing for VR
Integrating the Oculus Rift into UE4
Presentations: UE4 VR - Niklas Smedberg
Lessons from Integrating the Oculus Rift into UE4
Going Off The Rails: The Making of Bullet Train
Links: Sam Deiter - 10 VR Tips for Unreal Engine
Tom Looman's - Getting Started with VR in Unreal Engine 4
Reiterate the importance of reverse engineering and utilizing all of the help, documentation, and free content out there to help you learn along the way. Ask if there are questions before opening the template project. Mention that I can share the resources from the presentation.

19 Starting Out: Oculus Quick Start | SteamVR Quick Start | Google VR Quick Start | Gear VR Quick Start
VR Platforms: Samsung Gear VR Development | Google VR Development | Oculus Rift Development | SteamVR Development
VR Topics: VR Cheat Sheets | VR Best Practices | Motion Controller Component Setup | VR Camera Refactor
VR Editor Starting Out: Activating VR Mode
VR Editor Guides: Setting Up VR Editor from GitHub | Navigating the World in VR Mode | Working with Actors in VR Mode
VR Editor Reference: VR Editor Controls | Quick Select Menu | Radial Menu | Transforming Actors in VR Editor | Windows in VR Mode

