The full list of SIGGRAPH 2009 talks is finally up here.
Talks (formerly known as sketches) are one of my favorite parts of SIGGRAPH. They always have a lot of interesting techniques from film production (CG animation and visual effects), many of which can be adapted for real-time rendering. There are typically some research talks as well; most are “teasers” for papers from recent or upcoming conferences, and some are of interest for real-time rendering. This year, SIGGRAPH also has a few talks by game developers – hopefully next year will have even more. Unfortunately, talks have the least documentation of all SIGGRAPH programs (except perhaps panels) – only a one-page abstract is published, so if you didn’t attend the talk you are usually out of luck.
The Cameras and Imaging talk session has a talk on the cameras used in Pixar’s “Up” which may be relevant to developers of games with scripted cameras (such as God of War).
From Indie Jams to Professional Pipelines has two good game development talks. In Houdini in a Games Pipeline, Paulus Bannink of Guerrilla Games discusses how Houdini was used for procedural modeling during the development of Killzone 2. Although this type of procedural modeling is fairly common in films, it is not typically employed in game development; it is of particular interest since most developers are looking for ways to increase the productivity of their artists. In the talk Spore API: Accessing a Unique Database of Player Creativity, Shalin Shodhan, Dan Moskowitz and Michael Twardos discuss how the Spore team exposed a huge database of player-created assets to external applications via a public API.
The Splashing in Pipelines talk session has a talk by Ken Museth of Digital Domain about DB-Grid, an interesting data structure for volumetric effects; a GPU implementation of this could possibly be useful for real-time effects. Another talk from this session, Underground Cave Sequence for “Land of the Lost”, sounds like the kind of film talk that often has nuggets which can be adapted to real-time use.
Making it Move has another game development talk, Fight Night 4: Physics-Driven Animation and Visuals by Frank Vitz and Georges Taorres from Electronic Arts. Fight Night 4 is a game with extremely realistic visuals; the physics-based animation system described here is sure to be of interest to many game developers. The talk about rigging the “Bob” character from Monsters vs. Aliens also sounds interesting; the technical challenges behind the rig of such an amorphous – yet engaging – character must have been considerable.
Partly Cloudy was the short film accompanying Pixar’s 10th feature film, Up. Like all of Pixar’s short films, Partly Cloudy was a creative and technical triumph. The talk by the director, Peter Sohn, also includes a screening of the film.
Although film characters have more complex models, rigs, and shaders than game characters, there are many similarities in how a character translates from initial concept to the (big or small) screen. The session Taking Care of Your Pet has two talks discussing this process for characters from the movie Up. There is also a session dedicated to Character Animation and Rigging which may be of interest for similar reasons.
Another game development talk can be found in the Painterly Lighting session; Radially Symmetric Reflection Maps by Jonathan Stone of Double Fine Productions describes an intriguing twist on prefiltered environment maps, used in the game Brütal Legend. The two talks on stylized rendering methods (Applying Painterly Concepts in a CG Film and Painting with Polygons) also look interesting; the first of these discusses techniques used in the movie Bolt.
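The one-page abstract does not spell out the radially symmetric formulation, so as background only, here is a minimal sketch of a conventional prefiltered environment map lookup, where rougher surfaces fetch from blurrier mip levels. The cube map interface and the roughness-to-mip mapping below are illustrative assumptions, not anything from the talk.

```cpp
// Minimal sketch of a conventional prefiltered environment map lookup.
// This is the standard baseline technique, NOT the radially symmetric
// variant from the Double Fine talk; the roughness-to-mip mapping and the
// PrefilteredEnvMap interface are illustrative assumptions.
struct Vec3 { float x, y, z; };

Vec3 reflect(const Vec3& v, const Vec3& n) {
    // r = v - 2 (v . n) n, with v pointing toward the surface
    float d = v.x * n.x + v.y * n.y + v.z * n.z;
    return { v.x - 2.0f * d * n.x,
             v.y - 2.0f * d * n.y,
             v.z - 2.0f * d * n.z };
}

// Map a [0,1] roughness value to a mip level of the prefiltered cube map.
// Each mip is assumed to have been convolved offline with a progressively
// wider specular lobe.
float roughnessToMip(float roughness, int mipCount) {
    return roughness * static_cast<float>(mipCount - 1);
}

struct PrefilteredEnvMap {
    int mipCount = 8;
    // Placeholder: a real implementation would do a trilinear fetch from a
    // prefiltered cube map chain; here we just return a constant color.
    Vec3 sampleLod(const Vec3& /*dir*/, float /*mip*/) const {
        return { 0.5f, 0.5f, 0.5f };
    }
};

Vec3 specularIBL(const PrefilteredEnvMap& env,
                 const Vec3& viewDir, const Vec3& normal, float roughness) {
    Vec3 r = reflect(viewDir, normal);
    float mip = roughnessToMip(roughness, env.mipCount);
    return env.sampleLod(r, mip);
}
```

The offline convolution of each mip is the expensive part; the run-time cost is a single trilinear cube map fetch, which is why prefiltered environment maps are so widely used in games.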
Real-time rendering has long used techniques borrowed from film rendering. One way in which the field has “given back” is the increasing adoption of real-time pre-visualization techniques in film production. In their talk, Steve Sullivan and Michael Sanders from Industrial Light & Magic discuss various film visualization techniques.
The session Two Bolts and a Button has two film lighting talks that look interesting; one on HDRI-mapped area lights in The Curious Case of Benjamin Button, and one on lighting effects with point clouds in Bolt.
The Capture and Display session has two research talks from Paul Debevec’s group. As you would expect, they both deal with acquisition of computer models from real-world objects. One discusses tracking correspondences between facial expressions to aid in 2D parametrization (UV mapping), the other describes a method for capturing per-pixel specular roughness parameters (e.g. Phong cosine power) and is more fully described in an EGSR 2009 paper. Given the high cost of creating realistic and detailed art assets for games, model acquisition is important for game development and likely to become more so.
Flower is the second game from thatgamecompany (not a placeholder; that’s their real name), the creators of Flow. Flower is visually stunning and thematically unusual; the talk describing the creation of its impressionistic rendering style will be of interest to many.
Flower was one of two games selected for the new real-time rendering section of the Computer Animation Festival’s Evening Theater (which used to be called the Electronic Theater and was sorely missed when it was skipped at last year’s SIGGRAPH). Fight Night 4 was the other; these two are accompanied by real-time rendering demonstrations from AMD and Soka University. Several other games and real-time demos were selected for other parts of the Computer Animation Festival, including Epic Games’ Gears of War 2 and Disney Interactive’s Split Second. These are demonstrated (and discussed) by some of their creators in the Real Time Live talk session.
The Effects Omelette session has been presented at SIGGRAPH for a few years running; it traditionally has interesting film visual effects work. This year two of the talks look interesting for game developers: one on designing the character’s clothing in Up, and one on a modular pipeline used to collapse the Eiffel Tower in G.I. Joe: The Rise of Cobra.
Although most of the game content at SIGGRAPH is targeted at programmers and artists, there is at least one talk of interest to game designers: in Building Story in Games: No Cut Scenes Required Danny Bilson from THQ and Bob Nicoll from Electronic Arts discuss how interactive entertainment can be used to tell a story.
As one would expect, the Rendering session has at least one talk of interest to readers of this blog. Multi-Layer, Dual-Resolution Screen-Space Ambient Occlusion by Louis Bavoil and Miguel Sainz of NVIDIA uses multiple depth layers and resolutions to improve SSAO. Although not directly relevant to real-time rendering, I am also interested in the talk Practical Uses of a Ray Tracer for “Cloudy With a Chance of Meatballs” by Karl Herbst and Danny Dimian from Sony Pictures Imageworks. For years, animation and VFX houses used rasterization-based renderers almost exclusively (Blue Sky Studios, creators of the Ice Age series, being a notable exception). Recently, Sony Pictures Imageworks licensed the Arnold ray-tracing renderer and switched to using it for features; Cloudy with a Chance of Meatballs is the first result. Another talk from this session that I find interesting is Rendering Volumes With Microvoxels by Andrew Clinton and Mark Elendt from Side Effects Software, makers of the procedural modeling tool Houdini. The micropolygon-based REYES rendering system (on which Pixar’s PhotoRealistic RenderMan is based) has fascinated me for some time; this talk discusses how to add microvoxels to this engine to render volumetric effects.
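For readers who have not implemented screen-space ambient occlusion before, here is a minimal CPU-side reference of the basic single-layer approach over a linear depth buffer. The sample pattern, radius, and range cutoff are illustrative assumptions, and none of the multi-layer, dual-resolution refinements from Bavoil and Sainz’s talk appear here.

```cpp
// Minimal single-layer SSAO reference over a linear view-space depth buffer
// (larger depth = farther from the camera). Purely illustrative: sample
// count, radius, and falloff are assumptions, not values from the talk.
#include <vector>
#include <cmath>
#include <algorithm>
#include <cstdlib>

float ssaoAtPixel(const std::vector<float>& depth, int width, int height,
                  int x, int y, int numSamples = 16, float radiusPx = 8.0f,
                  float rangeCutoff = 1.0f) {
    float centerZ = depth[y * width + x];
    int occluded = 0;
    for (int i = 0; i < numSamples; ++i) {
        // Jittered samples on a screen-space disc around the pixel.
        float angle = 6.2831853f * (i + 0.5f) / numSamples;
        float r = radiusPx * std::sqrt((std::rand() % 1000) / 1000.0f);
        int sx = std::clamp(x + int(r * std::cos(angle)), 0, width - 1);
        int sy = std::clamp(y + int(r * std::sin(angle)), 0, height - 1);
        float sampleZ = depth[sy * width + sx];
        float diff = centerZ - sampleZ; // positive if the sample is closer to the camera
        if (diff > 0.01f && diff < rangeCutoff)
            ++occluded; // count it as an occluder, but only within the range cutoff
    }
    return 1.0f - float(occluded) / float(numSamples); // 1 = open, 0 = fully occluded
}
```

With a single depth layer, geometry hidden behind the nearest surface cannot contribute occlusion at all, which is one of the limitations a multi-layer approach aims to address.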
Above, I mentioned previsualization as one case where film rendering is informed by game rendering. A more direct example is shown in the talk Making a Feature-Length Animated Movie With a Game Engine (by Alexis Casas, Pierre Augeard and Ali Hamdan from Delacave), in the Doing it with Game Engines session (which I am chairing). They actually used a game engine to render their film, using it not as a real-time renderer, but as a very fast renderer enabling rapid art iteration times.
All of the talks in the Real Fast Rendering session are on the topic of real-time rendering, and are worth attending. One of these is by game developers: Normal Mapping With Low-Frequency Precomputed Visibility by Michal Iwanicki of CD Projekt RED and Peter-Pike Sloan of Disney Interactive Studios describes an interesting PRT-like technique which encodes precomputed visibility in spherical harmonics.
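As background, here is a minimal sketch of the general PRT-style idea behind spherical-harmonic visibility: store a low-order SH projection of the visibility function per vertex or texel, and at run time dot it with the lighting environment projected into the same basis. The 9-coefficient (order-2) basis and the shading function below are assumptions for illustration; they are not taken from the Iwanicki and Sloan talk.

```cpp
// Sketch of shading with low-frequency precomputed visibility stored in
// order-2 (9-coefficient) spherical harmonics. This shows only the generic
// PRT idea; the exact basis order and the combination with normal mapping
// used in the talk are not reproduced here.
#include <array>

using SH9 = std::array<float, 9>;

// Evaluate the 9 real SH basis functions in unit direction (x, y, z).
// Used offline to project visibility, and at run time to project lights.
SH9 shEvaluate(float x, float y, float z) {
    return {
        0.282095f,                          // l=0
        0.488603f * y,                      // l=1, m=-1
        0.488603f * z,                      // l=1, m= 0
        0.488603f * x,                      // l=1, m=+1
        1.092548f * x * y,                  // l=2, m=-2
        1.092548f * y * z,                  // l=2, m=-1
        0.315392f * (3.0f * z * z - 1.0f),  // l=2, m= 0
        1.092548f * x * z,                  // l=2, m=+1
        0.546274f * (x * x - y * y)         // l=2, m=+2
    };
}

// Per-texel or per-vertex visibility coefficients (baked offline by projecting
// hemispherical visibility into SH) are dotted with the lighting environment
// projected into the same basis.
float shadowedLight(const SH9& visibility, const SH9& lightEnv) {
    float result = 0.0f;
    for (int i = 0; i < 9; ++i)
        result += visibility[i] * lightEnv[i];
    return result;
}
```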
Finally, the Rendering and Visualization session has a particularly interesting talk: Beyond Triangles: GigaVoxels Effects In Video Games by Cyril Crassin, Fabrice Neyret and Sylvain Lefebvre from INRIA, Miguel Sainz from NVIDIA and Elmar Eisemann from MPI Informatik. Ray-casting into large voxel databases has aroused interest in the field since John Carmack made some intriguing comments on the topic (further borne out by Jon Olick’s presentation at SIGGRAPH last year). The speakers at this talk have shown interesting work at I3D this year, and I look forward to seeing their latest advances.
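GigaVoxels itself traverses a sparse, mipmapped octree of voxel bricks streamed to the GPU; as a much simpler point of reference, here is a sketch of ray casting through a dense voxel grid using a 3D DDA traversal in the style of Amanatides and Woo. The grid layout and occupancy test are illustrative assumptions, not anything from the talk.

```cpp
// Minimal dense-grid voxel ray cast using a 3D DDA traversal.
// Baseline illustration only; GigaVoxels instead streams and traverses a
// sparse, mipmapped octree of bricks on the GPU.
#include <vector>
#include <cmath>
#include <cstdint>

struct VoxelGrid {
    int nx, ny, nz;
    std::vector<uint8_t> occupancy; // 1 = solid voxel, 0 = empty
    bool solid(int x, int y, int z) const {
        return occupancy[(z * ny + y) * nx + x] != 0;
    }
};

// Casts a ray from 'origin' (assumed inside the grid, in voxel units) along
// the normalized direction 'dir'. Returns true and the hit voxel coordinates
// if a solid voxel is found within maxSteps.
bool castRay(const VoxelGrid& g, const float origin[3], const float dir[3],
             int hit[3], int maxSteps = 1024) {
    int v[3] = { int(std::floor(origin[0])),
                 int(std::floor(origin[1])),
                 int(std::floor(origin[2])) };
    int step[3];
    float tMax[3], tDelta[3];
    for (int i = 0; i < 3; ++i) {
        step[i]   = dir[i] > 0.0f ? 1 : -1;
        float inv = (dir[i] != 0.0f) ? 1.0f / dir[i] : 1e30f;
        float boundary = v[i] + (dir[i] > 0.0f ? 1.0f : 0.0f);
        tMax[i]   = (boundary - origin[i]) * inv; // distance to the first voxel boundary
        tDelta[i] = std::fabs(inv);               // distance between successive boundaries
    }
    int limit[3] = { g.nx, g.ny, g.nz };
    for (int s = 0; s < maxSteps; ++s) {
        if (v[0] < 0 || v[1] < 0 || v[2] < 0 ||
            v[0] >= limit[0] || v[1] >= limit[1] || v[2] >= limit[2])
            return false; // ray left the grid
        if (g.solid(v[0], v[1], v[2])) {
            hit[0] = v[0]; hit[1] = v[1]; hit[2] = v[2];
            return true;
        }
        // Advance along the axis with the nearest voxel boundary.
        int axis = (tMax[0] < tMax[1])
                 ? (tMax[0] < tMax[2] ? 0 : 2)
                 : (tMax[1] < tMax[2] ? 1 : 2);
        v[axis]    += step[axis];
        tMax[axis] += tDelta[axis];
    }
    return false;
}
```

Replacing the dense array with a hierarchical structure that can skip empty space at coarse levels is what lets approaches like GigaVoxels scale to very large voxel counts.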
As a slight correction, Arnold got picked up by Imageworks for Monster House, so Cloudy is the second feature done with Arnold.