Course Description

Modern video games employ a variety of sophisticated algorithms to produce ground-breaking 3D rendering that pushes the visual boundaries and interactive experience of rich environments. This course presents state-of-the-art, production-proven rendering techniques for fast, interactive rendering of the complex and engaging virtual worlds of video games.

 

This year the course includes speakers from several innovative game and technology companies, such as Guerrilla Games, Bungie, Studio Gobo, Activision, Infinity Ward, Microsoft and Unity Technologies. Topics range from the latest techniques for improved antialiasing, improvements to tiled and clustered rendering methods, and particle and FX system design for AAA game engines, to rendering shippable volumetric cloudscapes on the current generation of commodity hardware, approximations to spherical area lights, atmospheric scattering, checkerboard rendering methods for 4K resolutions, real-time ocean rendering, improvements to precomputed indirect illumination, and an accelerated technique for screen-space reflections on planar reflection receivers.

 

This is the course to attend if you are in the game development industry or want to learn the latest and greatest techniques in the real-time rendering domain!

 

Previous years’ Advances course slides: go here.

 


Syllabus

Advances in Real-Time Rendering in Games: Part I

Monday, 31 July, 9:00 am - 12:15 pm | Los Angeles Convention Center, Room 408AB

Advances in Real-Time Rendering in Games: Part II

 

Monday, 31 July, 2:00 pm - 5:15 pm | Los Angeles Convention Center, Room 408AB

 

Prerequisites

Working knowledge of modern real-time graphics APIs such as OpenGL or Direct3D and a solid grounding in commonly used graphics algorithms. Familiarity with the concepts of programmable shading and shading languages. Familiarity with the hardware and software capabilities of shipping game consoles is a plus, but not required.

Intended Audience

Technical practitioners and developers of graphics engines for visualization, games, or effects rendering who are interested in interactive rendering.

Advances in Real-Time Rendering in Games: Part I

 

9:00 am

Natalya Tatarchuk

Welcome and Introduction

 

9:10 am

Andrew Schneider (Guerrilla Games)

Nubis: Authoring Real-Time Volumetric Cloudscapes with the Decima Engine

 

9:55 am

Huw Bowles (Studio Gobo)

Crest: Novel Ocean Rendering Techniques in an Open Source Framework

10:35 am

Michal Iwanicki (Activision)

Peter-Pike Sloan (Activision)

Precomputed lighting in Call of Duty: Infinite Warfare

 

11:40 am

Jorge Jimenez (Activision)

Dynamic Temporal Antialiasing in Call of Duty: Infinite Warfare

12:15 pm

Closing Q&A

Advances in Real-Time Rendering in Games: Part II

 

2:00 pm

Tatarchuk

Welcome (and Welcome Back!)

2:05 pm
Brandon Whitley (Bungie)

The Destiny Particle Architecture

2:55 pm

Giliam de Carpentier (Guerrilla Games)

Kohei Ishiyama (Kojima Productions)

Decima Engine: Advances in Lighting and AA

 

3:45 pm

Adam Cichocki (Microsoft)

Optimized pixel-projected reflections for planar reflectors

4:10 pm

Michal Drobot (Infinity Ward)

Improved Culling for Tiled and Clustered Rendering

5:15 pm
Tatarchuk

Closing Remarks

 

 

Course Organizer

Natalya Tatarchuk (@mirror2mask) is a graphics engineer and a rendering enthusiast. As the Director of Global Graphics at Unity Technologies, she focuses on driving state-of-the-art rendering technology and graphics performance for the Unity engine. Previously she was the Graphics Lead and an Engineering Architect at Bungie, working on the innovative cross-platform rendering engine and game graphics for Bungie’s Destiny franchise, including leading graphics on the upcoming Destiny 2 title. Natalya also contributed graphics engineering to the Halo series, including Halo 3: ODST and Halo: Reach. Before moving into game development full-time, Natalya was a graphics software architect and a lead in the Game Computing Application Group at the AMD Graphics Products Group (Office of the CTO), where she pushed the boundaries of parallel computing while investigating advanced real-time graphics techniques. Natalya has been encouraging sharing in the games graphics community for many years, largely by organizing popular courses such as Advances in Real-Time Rendering and Open Problems in Real-Time Rendering at SIGGRAPH. She has also published papers and articles at various computer graphics conferences and in technical book series, and has presented her work at graphics and game developer conferences worldwide. Natalya is a member of multiple industry and hardware advisory boards. She holds an M.S. in Computer Science from Harvard University with a focus in Computer Graphics and B.A. degrees in Mathematics and Computer Science from Boston University.

 

 


 

Nubis: Authoring Real-Time Volumetric Cloudscapes with the Decima Engine

 

Abstract: In the 2015 Advances in Real-Time Rendering Course, we presented a prototype solution for real-time volumetric cloudscapes which produced a variety of cloud types in various lighting conditions and rendered in under 2 milliseconds on the PlayStation 4. However, many practical challenges remained in the way of it becoming a successful production tool for use in our game Horizon: Zero Dawn: authoring cloudscapes on a regional scale, animation and transitions, integration into our atmospheric system, further optimization to pay for these new features, and the task of creating a language and long term plan for what we want to achieve in the context of our game engine, Decima. Nubis is our solution to these challenges. This talk will explain how and why the Nubis system works and highlight some advances beyond the prototype that we presented in 2015, including changes to the lighting model. In addition, we will go a bit more in-depth for several topics that have garnered interest in the development community, namely Perlin-Worley noise generation and our weather simulation. Finally, we will offer a quick look ahead to where we are going next with Nubis.

Presenters:

Andrew Schneider (Guerrilla Games)

Bios:

Andrew Schneider is a Principal FX Artist at Guerrilla Games in Amsterdam. He spends his time working with developer Nathan Vos on the Nubis system for Guerrilla and creating assets and tools for the FX team. Previously, he worked as a Senior FX Technical Director at Blue Sky Studios, where he developed the volumetrics and clouds pipelines for the Rio and Ice Age animated movies. His interests include simulation, lighting, and volumetrics. He has previously given four talks at SIGGRAPH, from 2011 to 2015.

 

Materials:
(Updated: August 4th, 2017)


PowerPoint slides (808 MB), PDF slides (13 MB)

 

 

 


 

Crest: Novel Ocean Rendering Techniques in an Open Source Framework

 

Abstract: We present Crest, an open source ocean renderer implemented in Unity3D which showcases a number of novel techniques in mesh generation, shape representation and surface shading.

 

To generate the geometry, we extend clipmapping with continuous level of detail and a strip of skirting geometry that extends to the horizon. The surface is composed of non-overlapping tiles which support frustum culling. To maintain screen-space mesh density, we show how to scale the mesh horizontally when the camera changes height, without noticeable pops. We also implement a simple heuristic for placing the detail center in front of the viewer which is robust to all view positions and directions.
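
As a rough illustration of the height-based scaling (a minimal C++ sketch with invented names and thresholds, not the Crest implementation), one approach is to snap the horizontal grid scale to a power of two derived from the camera height and expose the fractional level so shaders can blend toward the next scale instead of popping:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Hypothetical helper: derive a horizontal scale for the ocean clipmap from the
// camera height. Snapping the scale to powers of two keeps vertices moving in
// lockstep with the LOD rings; the returned blend factor lets a shader fade
// toward the next (2x) scale so the transition is not a visible pop.
struct OceanScale {
    float scale;      // horizontal world-space scale applied to the clipmap mesh
    float lodBlend;   // 0..1 fraction toward the next (2x) scale
};

OceanScale ComputeOceanScale(float cameraHeight, float minScale = 4.0f) {
    float h = std::max(std::fabs(cameraHeight), minScale);
    float level = std::log2(h / minScale);          // continuous LOD level
    float snapped = std::floor(level);              // integer level -> power-of-two scale
    OceanScale result;
    result.scale = minScale * std::exp2(snapped);
    result.lodBlend = level - snapped;              // progress toward the next scale
    return result;
}

int main() {
    const float heights[] = {5.0f, 20.0f, 100.0f, 500.0f};
    for (float h : heights) {
        OceanScale s = ComputeOceanScale(h);
        std::printf("height %6.1f -> scale %7.1f, blend %.2f\n", h, s.scale, s.lodBlend);
    }
    return 0;
}
```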

 

We render the ocean shape on the GPU into displacement textures, arranged to match the density and placement of the geometry LODs. We omit Gerstner wavetrains whose wavelengths are too short for a given shape texture, to avoid aliasing and unnecessary work. We also use ping-pong render targets to simulate the wave equation, adding a dynamic layer of motion.
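
To illustrate the wavelength cutoff (a sketch with assumed parameters rather than the actual Crest code), each LOD's displacement texture has a texel size, and any wavetrain spanning fewer than a few texels at that resolution can simply be skipped:

```cpp
#include <cstdio>
#include <vector>

struct GerstnerWave {
    float wavelength;  // metres
    float amplitude;   // metres
    // direction, phase and steepness omitted for brevity
};

// Hypothetical per-LOD filter: a wave is only represented faithfully if its
// wavelength spans at least a few texels of the LOD's displacement texture
// (Nyquist plus a safety margin); shorter waves alias and waste work.
std::vector<GerstnerWave> FilterWavesForLod(const std::vector<GerstnerWave>& waves,
                                            float lodTexelSizeMetres,
                                            float minTexelsPerWavelength = 4.0f) {
    std::vector<GerstnerWave> kept;
    for (const GerstnerWave& w : waves) {
        if (w.wavelength >= minTexelsPerWavelength * lodTexelSizeMetres)
            kept.push_back(w);
    }
    return kept;
}

int main() {
    std::vector<GerstnerWave> waves = {{0.5f, 0.02f}, {2.0f, 0.1f}, {16.0f, 0.6f}, {64.0f, 1.5f}};
    // Texel sizes roughly double per LOD, matching the geometry rings.
    const float texelSizes[] = {0.25f, 1.0f, 4.0f};
    for (float texel : texelSizes) {
        size_t n = FilterWavesForLod(waves, texel).size();
        std::printf("texel %5.2f m -> %zu of %zu waves kept\n", texel, n, waves.size());
    }
    return 0;
}
```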

 

For shading, we show that normal map UVs can be scaled with the geometry LOD rings to achieve detail close to the viewer, prevent the glassy, flat appearance caused by insufficient detail in the mid to far distance, and eliminate visible repetition. For foam, we add additional render targets alongside the shape textures and use feedback render passes to emulate dissipation. Finally, we'll show results from applying depth peeling to achieve non-trivial light transport paths through multiple surfaces.
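
The foam feedback idea can be sketched as a simple decay-and-inject update (illustrative only; the buffer layout and dissipation constant below are assumptions, not values from the framework):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Hypothetical feedback pass for a foam render target: each frame the previous
// foam value is read back, faded by a dissipation rate, and new foam is injected
// wherever the simulation produces whitecaps. On the GPU this would be a ping-pong
// pair of render targets; here it is a flat array for illustration.
void FoamFeedbackStep(std::vector<float>& foam,
                      const std::vector<float>& injection,  // per-texel foam source this frame
                      float dt,
                      float dissipationRate = 0.6f) {
    const float fade = std::exp(-dissipationRate * dt);
    for (size_t i = 0; i < foam.size(); ++i)
        foam[i] = std::clamp(foam[i] * fade + injection[i] * dt, 0.0f, 1.0f);
}

int main() {
    std::vector<float> foam(4, 0.0f);
    std::vector<float> burst = {0.0f, 8.0f, 2.0f, 0.0f};   // inject foam on frame 0 only
    std::vector<float> none(4, 0.0f);

    FoamFeedbackStep(foam, burst, 1.0f / 60.0f);
    for (int frame = 0; frame < 120; ++frame)               // then let it dissipate for 2 s
        FoamFeedbackStep(foam, none, 1.0f / 60.0f);

    std::printf("foam after 2 s: %.3f %.3f %.3f %.3f\n", foam[0], foam[1], foam[2], foam[3]);
    return 0;
}
```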

 

Presenter:

Huw Bowles (Studio Gobo)

 

Bios:

Huw Bowles is a Lead Programmer at Studio Gobo, which in recent years has shipped three Disney Infinity playsets. He has diverse interests in game development and computer science and has been fortunate enough to work on a number of areas, including graphics, gameplay and physics. Prior to joining Gobo he worked on a number of graphics research projects at Black Rock Studio and Disney Research Zurich. He has an MSc from ETH Zurich and a BSc from Melbourne University.

 

Materials:
(Updated: August 4th, 2017)



PowerPoint slides

GitHub link

 


 

Precomputed lighting in Call of Duty: Infinite Warfare

 

Abstract: Indirect lighting is an important factor in creating a believable look for game worlds. Many of the solutions commonly used for rendering indirect illumination suffer from numerous problems: they are either expensive in both memory and runtime performance, making them unsuitable for use in a 60 Hz title, or they provide insufficient quality.

 

We present a complete description of the system used in Call of Duty: Infinite Warfare, from the initial motivations and assumptions, through the description of our baking pipeline, to the runtime components. We will show how various components of the illumination signal can be decoupled, which allows us to both speed up precomputation and achieve higher-quality results at runtime, while avoiding much of the memory cost.

 

We will describe the light grid system responsible for providing indirect lighting for dynamic entities, talk about our experiences with automatically generating indirect illumination probes, and share some interesting solutions for dealing with common problems that arise when working with precomputed lighting: heuristics for determining the validity of the samples used, various approaches for de-ringing spherical harmonic signals, and so on.
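
As background for the de-ringing discussion, one well-known approach (shown here as a generic sketch, not necessarily the variant shipped in Infinite Warfare) is to window the spherical-harmonic bands, damping the higher bands that cause negative lobes at the cost of a small amount of blur:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Generic de-ringing sketch: attenuate each spherical-harmonic band with a
// Lanczos-style sinc window. Higher bands, which are responsible for the ringing
// (negative lobes), are damped the most. Coefficients are stored band-major:
// band l contributes 2l+1 values.
void WindowSH(std::vector<float>& coeffs, int maxBand /* L, e.g. 2 for 9 coefficients */) {
    const float kPi = 3.14159265f;
    int idx = 0;
    for (int l = 0; l <= maxBand; ++l) {
        float x = float(l) / float(maxBand + 1);
        float sigma = (l == 0) ? 1.0f : std::sin(kPi * x) / (kPi * x);   // sinc window
        for (int m = -l; m <= l; ++m)
            coeffs[idx++] *= sigma;
    }
}

int main() {
    // 3-band (L = 2, 9-coefficient) SH vector, e.g. one color channel of a light probe.
    std::vector<float> sh = {1.0f, 0.2f, 0.5f, -0.1f, 0.05f, -0.3f, 0.4f, 0.1f, -0.2f};
    WindowSH(sh, 2);
    for (float c : sh) std::printf("%.3f ", c);
    std::printf("\n");
    return 0;
}
```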

 

Presenters:

Michal Iwanicki (Activision)
Peter-Pike Sloan (Activision)


Bios:

Michał Iwanicki has been working in the game industry for twelve years. He started in Poland at CD Projekt RED, working on The Witcher and laying the foundations for the in-house Red engine. Since then, he’s worked on Milo & Kate at Lionhead Studios and The Last of Us at Naughty Dog, contributing to graphics engine technology. He now serves as a Technical Director at Activision, helping the internal studios as a part of the Central Technology group.

 

Peter-Pike Sloan is a Technical Fellow at Activision, heading up a small graphics research group in Washington state. Prior to that he has worked at NVIDIA, Disney and Microsoft. His research has been used extensively in the games industry and he has published papers in animation, skinning, simulation and interactive rendering. His papers are available online at:
http://www.ppsloan.org/publications/

 

Materials:
(Updated: August 8th, 2017)


PowerPoint slides, PDF slides

 

 


Dynamic Temporal Antialiasing in Call of Duty: Infinite Warfare

Abstract: This talk covers the temporal supersampling techniques created during the development of Call of Duty: Infinite Warfare. First, we present a new temporal upsampling technique that upgrades dynamic resolution into dynamic antialiasing. Rather than having the resolution change under load, the AA quality is reduced instead, yielding full-resolution output frames regardless of engine load.
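
As a rough sketch of the idea (names and constants below are invented, not taken from the talk): the output resolution stays fixed, and only the internal, jittered render resolution is driven by the measured GPU frame time; the temporal upsampler then always reconstructs a full-resolution frame, so load spikes appear as temporarily reduced AA quality rather than a change in output size.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Rough illustration (hypothetical names and thresholds): the output stays at full
// resolution; only the internal render scale reacts to GPU load. The temporally
// upsampled result therefore keeps a constant size, and heavy frames simply mean
// fewer effective samples per output pixel, i.e. reduced AA quality.
struct DynamicAAController {
    float renderScale = 1.0f;   // fraction of full resolution on each axis

    void Update(float gpuFrameMs, float targetMs = 16.6f) {
        // Nudge the internal resolution toward the target cost, assuming cost is
        // roughly proportional to pixel count (renderScale squared).
        float error = targetMs / std::max(gpuFrameMs, 1e-3f);
        renderScale *= std::sqrt(std::clamp(error, 0.8f, 1.2f));   // damped response
        renderScale = std::clamp(renderScale, 0.5f, 1.0f);
    }
};

int main() {
    DynamicAAController ctrl;
    const int outputW = 1920, outputH = 1080;                // never changes
    const float frameTimes[] = {15.0f, 22.0f, 25.0f, 18.0f, 14.0f};
    for (float ms : frameTimes) {
        ctrl.Update(ms);
        std::printf("frame %5.1f ms -> render %4d x %4d, output %d x %d\n",
                    ms, int(outputW * ctrl.renderScale), int(outputH * ctrl.renderScale),
                    outputW, outputH);
    }
    return 0;
}
```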

 

Second, we discuss a differential blend operator working in both time and space domains. It is capable of reconstructing additional subpixel information or upsampling to higher resolutions.

 

Third, we present a novel resampling technique that achieves bicubic quality with a single sample.

 

Finally, we discuss additional engine features designed to compensate for performance spikes - including a shader and model Level of Detail system that allows linear performance scaling in Forward+ renderers.  

 

 

Presenters:

Jorge Jimenez (Activision)

Bio:

Jorge Jimenez is a Graphics R&D Technical Director at Activision Blizzard. He received his PhD degree in Real-Time Graphics from Universidad de Zaragoza (Spain) in 2012. His interests include photorealism, special effects and attention to detail. He has contributions in conferences, books, and journals, including SIGGRAPH and GDC, the GPU Pro series, Game Developer magazine, and the journal Transactions on Graphics. He co-organized the course "Filtering Approaches for Real-Time Anti-Aliasing" at SIGGRAPH 2011, declaring open war against the jaggies. At GDC 2013, he co-presented the talk “Next Generation Character Rendering” and collaborated on the Digital Ira project, which used this character rendering technology. Since then, he has worked on the Call of Duty franchise, including Advanced Warfare, Black Ops III and Infinite Warfare. Some of the key techniques he has been involved in include Jimenez's MLAA, SMAA, GTAO and separable subsurface scattering. More about him on his Twitter account: @iryoku1.

 

Materials:



 


 

The Destiny Particle Architecture

Abstract: The world of Destiny is filled with “Space Magic”, so we expect a lot from our FX. Our particles need to convey a wide variety of visual experiences, support sub-second iteration, and achieve high performance. This presentation will discuss how we tackled these problems in Destiny 2. We’ll see that particle systems in Destiny are represented by node graphs, where each node contains parameters – such as particle size and color – represented by expressions. We’ll discuss the techniques that allow these expressions to support sub-second iteration and high performance, including our expression-to-HLSL converter and our bytecode interpreter, which can execute on both the CPU and the GPU. We’ll provide an overview of the code architecture, including the relatively simple changes we made to support GPU particles. We’ll also show one of our features, the motion primitives, as a demonstration of this architecture. These are shape primitives, such as spheres, points, and planes, used to influence the motion of a particle.
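
As a toy illustration of the bytecode idea (the opcodes and layout below are invented for exposition, not Destiny's format), an authored expression such as size = 0.5 + 0.1 * age can compile to a small stack program; the same opcode stream could be translated to HLSL or interpreted in a compute shader, which is what allows one expression to run on both the CPU and the GPU:

```cpp
#include <cstdio>
#include <vector>

// Toy expression bytecode: a stack machine whose opcodes could just as easily be
// emitted as HLSL or interpreted on the GPU.
enum class Op { PushConst, PushInput, Add, Mul };

struct Instr {
    Op op;
    float value;   // constant for PushConst, input slot index for PushInput
};

float Evaluate(const std::vector<Instr>& program, const std::vector<float>& inputs) {
    std::vector<float> stack;
    for (const Instr& i : program) {
        switch (i.op) {
            case Op::PushConst: stack.push_back(i.value); break;
            case Op::PushInput: stack.push_back(inputs[size_t(i.value)]); break;
            case Op::Add: { float b = stack.back(); stack.pop_back(); stack.back() += b; } break;
            case Op::Mul: { float b = stack.back(); stack.pop_back(); stack.back() *= b; } break;
        }
    }
    return stack.back();
}

int main() {
    // size = 0.5 + 0.1 * age, with the particle's age in input slot 0.
    std::vector<Instr> sizeExpr = {
        {Op::PushConst, 0.5f}, {Op::PushConst, 0.1f}, {Op::PushInput, 0.0f}, {Op::Mul}, {Op::Add},
    };
    const float ages[] = {0.0f, 1.0f, 2.5f};
    for (float age : ages)
        std::printf("age %.1f -> size %.2f\n", age, Evaluate(sizeExpr, {age}));
    return 0;
}
```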

 

Presenter:

Brandon Whitley (Bungie)

Bio:

Brandon Whitley has been a graphics engineer at Bungie since 2011. Over his 10 years in AAA game development, he has been fortunate to work on a wide variety of real-time rendering problems, including the creation of the particle engine for Destiny. He earned a Master's in Computer Science from Georgia Tech and a Bachelor's in Computer Science from Pacific Lutheran University.

 

Materials:
(Updated: August 14th, 2017)

PowerPoint Slides (~485MB)

 


 

Decima Engine: Advances in Lighting and AA



Abstract: The Decima engine was originally developed for the Killzone series, and is now powering Horizon: Zero Dawn as well as Death Stranding (Kojima Productions). In this talk we’ll cover some of the rendering techniques we developed for these titles. Topics include an improved method for approximating spherical area lights by bending the light vector of a single point light, practical realistic atmospheric scattering with height fog, our 2-frame temporal anti-aliasing solution for 1080p, and finally our optimized 2160p checkerboard rendering and ‘tangram’ resolve strategy used on the PS4 Pro.
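
For readers unfamiliar with the first topic, the widely used representative-point approach gives a flavor of what “bending the light vector” means (this is a generic sketch, not necessarily the exact Decima formulation): the light vector used for specular shading is bent toward the point on the sphere closest to the reflection ray.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float length(Vec3 a) { return std::sqrt(dot(a, a)); }
static Vec3 normalize(Vec3 a) { return a * (1.0f / length(a)); }

// Representative-point approximation for a spherical light: find the point on the
// sphere closest to the specular reflection ray and use the direction to that point
// as the light vector for the punctual-light specular evaluation.
Vec3 BendLightVector(Vec3 toLightCenter,   // shading point -> light center (unnormalized)
                     Vec3 reflectionDir,   // R = reflect(-V, N), normalized
                     float lightRadius) {
    Vec3 centerToRay = reflectionDir * dot(toLightCenter, reflectionDir) - toLightCenter;
    float t = std::min(1.0f, lightRadius / std::max(length(centerToRay), 1e-5f));
    Vec3 closestPoint = toLightCenter + centerToRay * t;
    return normalize(closestPoint);        // bent light vector for the specular BRDF
}

int main() {
    Vec3 toLight = {2.0f, 3.0f, 0.0f};
    Vec3 r = normalize({1.0f, 0.2f, 0.0f});
    Vec3 L = BendLightVector(toLight, r, 0.5f);
    std::printf("bent L = (%.3f, %.3f, %.3f)\n", L.x, L.y, L.z);
    return 0;
}
```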

 

 

Presenters:

Giliam de Carpentier (Guerrilla Games)
Kohei Ishiyama (Kojima Productions)

Bio:

Giliam de Carpentier is a Principal Tech Engineer at Guerrilla Games in Amsterdam. He developed a number of the lighting and post processing techniques for Horizon: Zero Dawn, including its AA, HDR, reworked PBR, and checkerboard rendering. Before joining Guerrilla in 2015, he worked for 8 years at Force Field, developing graphics, AI, physics and back-end systems for many of their multi-platform games.

 

Kohei Ishiyama is a graphics programmer at Kojima Productions, with 6 years of previous experience as a VFX and physics programmer. On Death Stranding his main focus is on rendering programming, including the areas of physically based rendering as well as lighting. He has a strong passion for physics, and is particularly interested in ways in which the radiative transfer equation can be approximated for use in games.

 

Materials:
(Updated: August 3rd, 2017)

PowerPoint slides, PDF slides

 

 


 

Optimized pixel-projected reflections for planar reflectors    

 

Abstract: The talk will introduce a technique for generating real-time dynamic reflections in screen space. The technique, while limited to handling only sharp planar reflections, has a low runtime cost on current commodity hardware. Reflections are obtained from the rendered scene color buffer by a simple algorithm operating on artist-defined shapes that approximate the scene's reflective areas. With the depth buffer and this reflective-area approximation in hand, the method allows fast generation of dynamic screen-space reflections with high visual quality.
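
The core geometric ingredient (shown below as a generic sketch with assumed conventions, not code from the talk) is mirroring a world-space position across the reflector plane; the mirrored position can then be projected with the usual view-projection transform to find which screen pixel the reflected color maps to.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Plane { Vec3 n; float d; };   // points p on the plane satisfy dot(n, p) + d == 0, n normalized

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Mirror a world-space position across a planar reflector. In a pixel-projected
// reflection pass (conventions here are assumed), the mirrored position is then
// projected with the regular view-projection matrix to find which screen pixel
// should receive (or provide) the reflected color.
Vec3 MirrorAcrossPlane(Vec3 p, Plane plane) {
    float signedDist = dot(plane.n, p) + plane.d;
    return p - plane.n * (2.0f * signedDist);
}

int main() {
    Plane floorPlane = {{0.0f, 1.0f, 0.0f}, 0.0f};     // the y = 0 floor
    Vec3 scenePoint = {1.0f, 2.0f, 3.0f};
    Vec3 mirrored = MirrorAcrossPlane(scenePoint, floorPlane);
    std::printf("(%.1f, %.1f, %.1f) -> (%.1f, %.1f, %.1f)\n",
                scenePoint.x, scenePoint.y, scenePoint.z, mirrored.x, mirrored.y, mirrored.z);
    return 0;
}
```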

 

Presenter:

Adam Cichocki (Microsoft)

Bio:

Adam Cichocki is a Senior Software Engineer working for Microsoft on rendering for Windows Mixed Reality. Before joining Microsoft he spent ten years in the game development industry working as a graphics programmer. This includes six and a half years at CD Projekt Red, which he recently left as Lead Principal Graphics Programmer. He also worked as an Engine Programmer for People Can Fly, and was a co-founder of the mobile games company Demonual Studios.

 

Materials:
(Updated: August 9, 2017)

PowerPoint Slides, PDF Slides (No Notes), PDF Slides (With Notes)

 

 


 

Improved Culling for Tiled and Clustered Rendering

Abstract: This lecture covers two novel rendering algorithms used in Call of Duty: Infinite Warfare. The first, z-binning, significantly improves the quality and performance of volumetric entity-vs-geometry culling as compared to classic tiled and clustered techniques. The second, conservative proxy culling, utilizes a custom conservative rasterization approach to further improve the quality of culling, significantly improving scenarios with complex occlusion.

The lecture will present the design process, the resulting pipeline, and an in-depth implementation of both algorithms on AMD GCN-based GPUs. The target audience is experienced rendering engineers with prior knowledge of tiled or clustered rendering techniques, as well as a basic understanding of GCN intrinsics.
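
For readers new to z-binning, a highly simplified sketch of the idea follows (placeholder structures and sizes, considerably less sophisticated than the production version described in the slides): lights are sorted by view-space depth, each depth bin stores the min and max sorted-light index overlapping it, and a per-tile bitmask is intersected with that range to produce the final light list.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

// Simplified z-binning sketch. Lights are sorted by view-space depth; each depth bin
// keeps the min/max index of the sorted lights overlapping it. A screen tile keeps a
// bitmask over the sorted lights. Culling a pixel = intersect its tile's bitmask with
// its z-bin's index range.
struct Light { float zMin, zMax; };   // view-space depth extent of the light's volume

struct ZBin { uint32_t minIdx = UINT32_MAX, maxIdx = 0; };

std::vector<ZBin> BuildZBins(const std::vector<Light>& sortedLights,
                             int numBins, float zNear, float zFar) {
    std::vector<ZBin> bins(numBins);
    float binSize = (zFar - zNear) / numBins;
    for (uint32_t i = 0; i < sortedLights.size(); ++i) {
        int b0 = std::clamp(int((sortedLights[i].zMin - zNear) / binSize), 0, numBins - 1);
        int b1 = std::clamp(int((sortedLights[i].zMax - zNear) / binSize), 0, numBins - 1);
        for (int b = b0; b <= b1; ++b) {
            bins[b].minIdx = std::min(bins[b].minIdx, i);
            bins[b].maxIdx = std::max(bins[b].maxIdx, i);
        }
    }
    return bins;
}

int main() {
    // Lights pre-sorted by depth (here by zMin).
    std::vector<Light> lights = {{1, 3}, {2, 6}, {8, 12}, {30, 35}};
    std::vector<ZBin> bins = BuildZBins(lights, 8, 0.0f, 40.0f);

    uint32_t tileMask = 0b0110;                    // lights 1 and 2 touch this screen tile
    float pixelDepth = 9.0f;
    int bin = int(pixelDepth / (40.0f / 8));
    // Final list: lights whose index lies inside the bin range AND is set in the tile mask.
    for (uint32_t i = bins[bin].minIdx; i <= bins[bin].maxIdx && i < lights.size(); ++i)
        if (tileMask & (1u << i))
            std::printf("pixel at depth %.1f shades with light %u\n", pixelDepth, i);
    return 0;
}
```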

Presenter:

Michal Drobot (Infinity Ward)

Bio:

Michal Drobot is a Principal Rendering Engineer at Infinity Ward, Activision. Most recently, he worked on the rendering engine architecture of Call of Duty: Infinite Warfare. Before that he helped design and optimize the 3D renderer in Far Cry 4 at Ubisoft Montreal. Prior to that he worked at Guerrilla Games, designing and optimizing the rendering pipeline for the PlayStation 4 launch title Killzone: Shadow Fall. Michal specializes in rendering algorithms, render architectures, hardware architectures and low-level optimizations.

 

Materials:
(Updated: August 2nd, 2017)

PowerPoint Slides, PDF Slides (Without notes)

 

 

 

 

Contact: