After summarizing the Courses program, I’ll continue going over content in other SIGGRAPH 2011 programs which may be of interest to game developers or real-time rendering researchers. Next up is the Talks program; this post will also be a multi-parter, since there is a LOT of content to cover.
Update July 16, 2011: Added link to “Coherent Out-of-Core Point-Based Global Illumination” EGSR 2011 paper.
Talks (called “Sketches” until a few years ago) are short presentations – 20 minutes long (rare “long talks” are 40 minutes). Talks are a lot “leaner” than Technical Papers, which require detailed analysis, comprehensive citations of previous work, and comparisons to competing techniques. For this reason, SIGGRAPH Talks tend to be the venue of choice for industry practitioners, who often have limited time to spend on writing publications.
The SIGGRAPH Talks program has historically been dominated by talks from the fields of VFX and CG feature animation – many of these contain relevant information for game developers, but the game industry itself has been under-represented. SIGGRAPH 2011 has a record number of game industry Talks, but there is still a long way to go before we match the film people (I hope to get a lot closer in future years!).
I will now summarize relevant Talks regardless of speaker affiliation. Since Talks are scheduled in sessions of four, I will organize my summary along the same lines, skipping sessions without any relevant Talks and using the session order from the SIGGRAPH 2011 Talks page.
Pushing Production Data
This session contains four film talks, all of potential interest:
- Coherent Out-of-Core Point-Based Global Illumination describes a system used at DreamWorks Animation for computing global illumination and ambient occlusion; the details may be of interest to game developers working on “baking” precomputation systems. There is also an EGSR 2011 paper by the authors on this topic. (A minimal sketch of the basic point-based occlusion idea appears after this list.)
- Similarly, the information in Destroying Metro City: An Artist-Friendly and Efficient Demolition Pipeline for “Megamind” (also from DreamWorks Animation) could be relevant for precomputation of destroyed and fractured versions of game assets.
- The efficient digital acquisition of real-world props is a problem facing games as well as film; PhotoSpace: A Vision-Based Approach for Digitizing Props describes an interesting system used at Weta Digital for this.
- Games and film development are not “one size fits all” – individual games and films often require specific assets which can benefit from specialized authoring and rendering systems. Artistic Rendering of Feathers for Animated Films (yet another DreamWorks Animation talk) describes such a system.
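For readers who haven’t encountered point-based global illumination before, here is a minimal, brute-force sketch of the basic idea: the scene is pre-sampled into oriented disks (“surfels”), and each receiver point gathers the occlusion they subtend. This is not the DreamWorks system from the talk – a production implementation clusters surfels into an octree, gathers hierarchically, and (as the title says) streams the point cloud out of core – and the types and disk form factor below are only illustrative.

```cpp
// Brute-force point-based ambient occlusion sketch (illustrative only).
// The scene has been pre-sampled into "surfels": oriented disks with a
// position, normal, and area. Each receiver point gathers the approximate
// solid angle subtended by all surfels and treats it as occlusion.
#include <algorithm>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float length(Vec3 v)         { return std::sqrt(dot(v, v)); }
static Vec3  scale(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

struct Surfel { Vec3 position; Vec3 normal; float area; };

// Returns approximate occlusion in [0,1] at receiver point p with normal n.
float pointBasedOcclusion(Vec3 p, Vec3 n, const std::vector<Surfel>& surfels)
{
    const float pi = 3.14159265f;
    float occlusion = 0.0f;
    for (const Surfel& s : surfels) {
        Vec3 d = sub(s.position, p);
        float dist = length(d);
        if (dist < 1e-4f) continue;                       // skip the receiver's own surfel
        Vec3 dir = scale(d, 1.0f / dist);
        float cosReceiver = std::max(dot(n, dir), 0.0f);          // above the receiver's horizon?
        float cosEmitter  = std::max(-dot(s.normal, dir), 0.0f);  // disk facing the receiver?
        // Approximate form factor of a disk of area A seen at distance r.
        occlusion += s.area * cosEmitter * cosReceiver / (pi * dist * dist + s.area);
    }
    return std::min(occlusion, 1.0f);
}
```

The same gather can accumulate surfel radiance instead of a plain occlusion value, which is how this family of techniques extends from ambient occlusion to full color bleeding.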
Facing Hairy Production Problems
This session contains one game talk and three relevant film talks:
- Extensive use of geometry instancing is important in both games and film to save on asset authoring time and memory. The talk Kami Geometry Instancer: Putting the “Smurfy” in Smurf Village describes an instancing pipeline developed by Sony Pictures Imageworks, which allows for distinct deformation of individual instances.
- The talk Making Faces: Eve Online’s New Portrait Rendering describes the impressive new avatar portrait system developed by CCP Games for the Eve Online space MMO.
- SpeedFur: A GPU-Based Procedural Hair and Fur Modeling System describes a hair modeling system (developed by Fido). The procedural authoring system and the GPU-accelerated preview mode both appear relevant for hair and fur in games.
- The talk GPU Fluids in Production: A Compiler Approach to Parallelism details a specialized CPU/GPU parallel compiler for fluid simulation developed by Double Negative Visual Effects. New parallelism approaches are always interesting, and I suspect fluid simulation will be a major differentiating feature for games on the next generation of platforms.
Eye on the Road
Two of the talks in this session (one by game developers, and one by academic researchers) appear relevant:
- MotorStorm Apocalypse: Creating Urban Off-Road Racing – this talk by Evolution Studios presents rendering and tools advances which enabled adding large-scale dynamic events to MotorStorm Apocalypse, the latest entry in the MotorStorm racing game franchise (also showcased in The Sandbox).
- Facial scanning has been a topic of heightened interest in the game industry since its highly publicized use in L.A. Noire. The talk Facial Cartography: Interactive High-Resolution Scan Correspondence (by Paul Debevec’s graphics lab at the USC Institute for Creative Technologies) covers some interesting advances in this area.
Tiles and Textures and Faces Oh My!
This session contains talks by game developers, CG feature animation professionals, and hardware vendors; all four are relevant:
- Artist-guided procedural authoring systems can help with the asset creation issues faced by both game and film production. The talk Procedural Mosaic Arrangement In “Rio” details Blue Sky Studios’ art-directable procedural pipeline for sidewalk and street tile mosaics.
- Programmable tessellation is one of the primary features of DirectX 11, but authoring content for it can be challenging. NVIDIA’s talk Generating Displacement From Normal Map for Use in 3D Games describes one possible solution to this problem.
- The film industry has found the open-source Ptex (per-face texture mapping) technology developed by Walt Disney Animation extremely useful for getting rid of UV layout issues. The talk Per-Face Texture Mapping for Real-Time Rendering (jointly presented by an NVIDIA developer technology engineer and the first author of the original Ptex paper) presents a real-time implementation of this technology.
- Skinning is one of the most fundamental technologies in game rendering and has not changed much in the last twenty years. The talk Spherical Skinning With Dual Quaternions and QTangents presents some skinning improvements achieved by Crytek during development of the Crysis franchise. (A small sketch of the dual quaternion blending idea the talk builds on appears after this list.)
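As background for the Crytek talk, here is a small sketch of the dual quaternion blending idea it builds on (in the spirit of Kavan et al.). This is not code from the talk itself, and the names are illustrative.

```cpp
// Dual quaternion blending for skinning (illustrative sketch).
// Each bone transform is a unit dual quaternion: the real part encodes the
// rotation, the dual part is 0.5 * translation * rotation. Blending these and
// renormalizing keeps the result a rigid transform, which avoids the classic
// "candy wrapper" collapse that blending matrices produces.
#include <cmath>

struct Quat     { float w, x, y, z; };
struct DualQuat { Quat real, dual; };

static Quat  qScale(Quat q, float s) { return {q.w * s, q.x * s, q.y * s, q.z * s}; }
static Quat  qAdd(Quat a, Quat b)    { return {a.w + b.w, a.x + b.x, a.y + b.y, a.z + b.z}; }
static float qDot(Quat a, Quat b)    { return a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z; }

// Blend up to four bone transforms for one vertex.
DualQuat blendDualQuats(const DualQuat dq[4], const float weight[4])
{
    DualQuat result = { {0, 0, 0, 0}, {0, 0, 0, 0} };
    for (int i = 0; i < 4; ++i) {
        // Flip sign so every quaternion lies in the same hemisphere as bone 0;
        // otherwise the blend can pass near zero and the pose collapses.
        float sign = (qDot(dq[i].real, dq[0].real) < 0.0f) ? -1.0f : 1.0f;
        result.real = qAdd(result.real, qScale(dq[i].real, weight[i] * sign));
        result.dual = qAdd(result.dual, qScale(dq[i].dual, weight[i] * sign));
    }
    // Renormalize so the real part is a unit quaternion again.
    float invLen = 1.0f / std::sqrt(qDot(result.real, result.real));
    result.real = qScale(result.real, invLen);
    result.dual = qScale(result.dual, invLen);
    // To skin a vertex: rotate it by result.real, then translate by the vector
    // part of 2 * result.dual * conjugate(result.real).
    return result;
}
```

As I understand it, the “QTangents” part of the talk stores each vertex’s tangent frame as a quaternion as well, rather than as a full tangent/bitangent/normal basis, which pairs naturally with a quaternion skinning path and reduces vertex bandwidth.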
Let There Be Light
This session contains three CG feature animation case studies, all with interesting information for game developers:
- I find Rango to be an intriguing case of live-action and VFX methods being used by Industrial Light & Magic to make a CG animated feature with a unique photorealistic style. The talk “Rango”: A Case of Lighting and Compositing a CG-Animated Feature in an FX-Oriented Facility appears to have some interesting information on the methods used by the lighting and compositing artists.
- Ocean Mission on “Cars 2” – this talk describes how Pixar addressed several multi-disciplinary challenges involving the ocean in the opening sequence of Cars 2.
- Hair is another area where games lag noticeably behind film, so learning about film methods is valuable. The talk Untangling Hair Rendering at Disney details technology, tools and workflow advances adopted by Walt Disney Animation for the film Tangled.
Out Of Core
The four talks in this session are all relevant for game developers or real-time rendering researchers:
- Google Body: 3D Human Anatomy in the Browser – this talk describes how Google used the WebGL API in an innovative way to create an impressive in-browser application. Browser-based games are a rapidly growing market, making APIs such as WebGL important to many game developers. The Google Body application is also showcased in The Sandbox.
- As a possible future alternative to the traditional rendering pipeline, ray tracing sparse voxel octrees has attracted some interesting research work, including GigaVoxels (several publications on which can be found on Cyril Crassin’s webpage). The talk Interactive Indirect Illumination Using Voxel Cone Tracing: An Insight builds on the GigaVoxels work to compute indirect lighting and ambient occlusion for complex scenes in real time. A preview was presented as an I3D 2011 poster, and various materials relating to the SIGGRAPH Talk can be found on a dedicated web page. (A rough sketch of the cone tracing idea appears after this list.)
- Rendering the Interactive Dynamic Natural World of the Game: From Dust – in this talk, Ubisoft Montpellier discusses the simulation and rendering techniques used for the dynamic world of the game From Dust.
- Out-of-Core GPU Ray Tracing of Complex Scenes – this talk covers the CentiLeo GPU ray tracer (based on Kirill Garanzha’s PhD research at the Keldysh Institute of Applied Mathematics), which can render models composed of several hundred million polygons in real time. More information on CentiLeo (including a video of it in action) can be found here.
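For readers curious what “voxel cone tracing” looks like in practice, here is a rough sketch of a single cone traced through a pre-filtered voxel representation of the scene. This is not the authors’ octree-based implementation; sampleVoxels() is a hypothetical stand-in for looking up filtered occupancy (or radiance) at a given position and mip level, and the other names are illustrative too.

```cpp
// Single-cone trace through a mip-mapped voxel representation (illustrative).
// The key idea: as the cone widens with distance, sample coarser mip levels,
// so one pre-filtered lookup stands in for many individual rays.
#include <algorithm>
#include <cmath>

struct Vec3   { float x, y, z; };
struct Sample { float occlusion; };   // a real system would also carry radiance

static Vec3 add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 mul(Vec3 v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Hypothetical stub: a real implementation samples a 3D mip chain or a
// sparse voxel octree node at the requested level of detail.
Sample sampleVoxels(Vec3 /*position*/, float /*mipLevel*/) { return {0.0f}; }

// Accumulate occlusion along one cone, front to back.
float coneTraceOcclusion(Vec3 origin, Vec3 dir, float coneAngle,
                         float maxDistance, float voxelSize)
{
    float occlusion = 0.0f;
    float t = voxelSize;                                       // start one voxel away
    while (t < maxDistance && occlusion < 1.0f) {
        float diameter = 2.0f * t * std::tan(coneAngle * 0.5f); // cone footprint
        float mip = std::log2(std::max(diameter / voxelSize, 1.0f));
        Sample s = sampleVoxels(add(origin, mul(dir, t)), mip);
        occlusion += (1.0f - occlusion) * s.occlusion;          // front-to-back compositing
        t += std::max(diameter * 0.5f, voxelSize * 0.5f);       // step grows with footprint
    }
    return std::min(occlusion, 1.0f);
}
```

Tracing a handful of such cones over the hemisphere around each receiver point gives an approximation of ambient occlusion or diffuse indirect lighting; the pre-filtering of the voxel data is what makes a single coarse sample an acceptable stand-in for many rays.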
Hi Naty,
Regarding “Coherent Out-of-Core Point-Based Global Illumination”, Eric made some material here http://www.tabellion.org/et/paper11/index.html
Thanks François, I’ve added the link.