Category Archives: Resources

CryEngine3 presentation

This detailed presentation on Crytek’s latest engine, given at the regional Triangle Game Conference, slipped completely under my radar, but Wolfgang Engel just pointed it out to me.  It’s on Crytek’s presentations page, which has a bunch of other good stuff on it as well.

The presentation includes lots of great information on their new deferred lighting system, which is timely since I am just working on a lengthy blog post on this very subject (hopefully to be finished soon).  They also tease about their new dynamic global illumination system, to be presented at SIGGRAPH 2009.

Odds and Ends

It’s 5/7/09, a nice odd sequence, so time for a few odds and ends I’ve collected.

OK, this is worth a few minutes of your life: the elevated demo is awe-inspiring. Terrain generation (be patient when you start it), fly-bys, and music, all in less than 4096 bytes. By way of comparison, an empty MS Word document is 9834 bytes. (thanks to Steve Worley)

Google has put out a browser-based low-level 3D graphics API called O3D. API here. Demos here. Some initial impressions here. It will be interesting to see if they succeed where so many others have failed.

There is a call for participation out for a new book series called “Game Engine Gems”, edited by Eric Lengyel. (thanks to Marwan Ansari)

The main things I look at on the SIGGRAPH exhibition floor are the book booths. Good books are such a ridiculous bargain: if a book like Geometric Tools saves a programmer 2 hours of time, it’s paid for itself. One new book that I want to see is Real-Time Cameras, by Mark Haigh-Hutchinson, which came out this April. Looking around for more info, I noticed this sad note. I never met Mark, but we corresponded a few times. He came up with a clever idea to avoid performing division when doing a point-in-polygon test; I folded this into the CrossingsMultiplyTest Graphics Gems code here, crediting him.
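
For those curious, here is a minimal sketch of the division-free comparison at the heart of that test (my own paraphrase of the idea in C++, not the Graphics Gems code verbatim):

    #include <cstddef>

    // Crossings test: count how many polygon edges a horizontal ray from the
    // test point crosses.  Rather than dividing to find each edge's exact x
    // intersection, cross-multiply and interpret the sign of the comparison
    // according to which way the edge crosses the ray.
    bool pointInPolygon(const double (*pgon)[2], size_t n, double tx, double ty)
    {
        bool inside = false;
        const double* v0 = pgon[n - 1];
        bool yflag0 = (v0[1] >= ty);
        for (size_t i = 0; i < n; ++i) {
            const double* v1 = pgon[i];
            bool yflag1 = (v1[1] >= ty);
            if (yflag0 != yflag1) {
                // Edge straddles the ray through (tx, ty); divide-free test.
                if ((((v1[1] - ty) * (v0[0] - v1[0])) >=
                     ((v1[0] - tx) * (v0[1] - v1[1]))) == yflag1)
                    inside = !inside;
            }
            yflag0 = yflag1;
            v0 = v1;
        }
        return inside;
    }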

I’ve been looking at GPU capabilities and benchmarking information lately. Some nice resources:

  • You probably know about the benchmarking group Futuremark. Me, I hadn’t realized they had useful stats at their site: see the Futuremark ORB links at the bottom of the page and start clicking.
  • Two applications that tell you a ton about your card’s capabilities: GPU-Z, which has a statistics page & cute map of downloads at its site, and GPU Caps, which also includes CUDA-related information and some nice little OpenGL benchmarks.
  • Chris Dragan has a web database that provides a fair amount of data on card support for DirectX capabilities and OpenGL extensions.
  • The Notebook Check site has way too much information about many laptop graphics accelerators.
  • nHancer is a utility for NVIDIA cards. It lets you get at all sorts of different capabilities on your GPU, on a per-game basis. There are also interesting antialiasing and anisotropic filtering comparison pages (click on the radio buttons). (thanks to Mauricio Vives)
Some interesting libraries I ran across lately:
  • GTS is an open-source mesh manipulation package.
  • Box2D is a 2D physics engine.
  • Touchlib is a multitouch development kit. (thanks to Morgan McGuire)

Coincidental world: it turns out there’s a different “Eric Haines” out there that made a well-received 3D graphics game for the iPhone, Realmaze 3D. I’m not sure how it compares to his The Magical Flying Pink Pony Game, which looks awesome. (thanks to Nikolai Sander)

I’ve seen similar real-world illusions, but still thought this one was pretty great. (Addendum: Morgan McGuire found this even-better video of the effect.)

Eurographics Workshop on Natural Phenomena 2009

EWNP has had interesting papers in recent years, but when it skipped 2008 I thought it was gone.  However, it came back in 2009 with five papers, all but one of which are available online:

Procedural Modeling of Leather Texture with Structural Elements:  Not currently available online, but judging from a previous paper by these authors, this appears to be about procedural modeling of the cracks and bumps in leather surfaces.  Most real-time applications will use photographed or manually created textures for this, so it is probably not of wide interest to real-time developers.

Interactive Modeling of Virtual Ecosystems: Automatic modeling of plants taking lighting, obstacles, etc. into account.  Might be useful as an automatic modeling tool.

A Geometric Algorithm for Snow Distribution in Virtual Scenes: What the title says; might be useful for automated scene modeling, but probably not for runtime use.

Corotated SPH for Deformable Solids: Smoothed Particle Hydrodynamics (SPH) is commonly used in film production for liquids, smoke, etc.  This paper discusses how to extend the technique to model deformable solids.  Probably not real-time anytime soon.

Real-Time Open Water Environments with Interacting Objects:  This combines the Tessendorf FFT-based method for ambient waves with a different method for interactive waves (waves interacting with dynamic objects).  This is the most relevant paper for real-time rendering; worth a read.

Tessendorf’s FFT method is the current gold standard for non-interactive ocean waves, and is widely used in game and film production.  A description of it can be found on his publication page, under Simulating Ocean Surface.  Tessendorf’s publication page has many more papers of interest, including an algorithm (called iWave) for interactive waves and reports on particle and volume rendering for film production.
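
For reference, the heart of the method (as I understand it from Tessendorf’s notes) is a height field synthesized each frame by an inverse FFT of a set of wave amplitudes, each animated with the deep-water dispersion relation:

    h(\mathbf{x}, t) = \sum_{\mathbf{k}} \tilde{h}(\mathbf{k}, t)\, e^{i \mathbf{k} \cdot \mathbf{x}},
    \qquad
    \tilde{h}(\mathbf{k}, t) = \tilde{h}_0(\mathbf{k})\, e^{i \omega(\mathbf{k}) t}
                             + \tilde{h}_0^*(-\mathbf{k})\, e^{-i \omega(\mathbf{k}) t},
    \qquad
    \omega(\mathbf{k}) = \sqrt{g\, |\mathbf{k}|}

where the initial amplitudes \tilde{h}_0(\mathbf{k}) are random Gaussian values shaped by an oceanographic spectrum (the Phillips spectrum in Tessendorf’s notes).  Don’t take my notation as gospel; the details are in the paper itself.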

Insomniac have a particularly efficient and flexible implementation of a variant of Tessendorf’s method, which they extended to support interactive waves as well.  This method was used in the game Resistance 2, and Insomniac Games have kindly published not just a white paper on the technique, but actual working code! This is part of their admirable Nocturnal Initiative for technology sharing.  The Nocturnal Initiative website is highly recommended, as it includes code which has been used in successful game projects by one of the most highly-regarded studios in the industry.

Another interesting approach to interactive waves is Wave Particles, which is described here.

Graphics Interface 2009 papers

The list of papers accepted to Graphics Interface 2009 (with abstracts) is now online.  Graphics Interface has had some pretty good real-time rendering papers: here is a handful of examples from the last few years.  Judging from this year’s abstracts, the following look particularly interesting:

Fast Visualization of Complex 3D Models Using Displacement Mapping: This looks like a combination of the “sparse voxel ray casting” approach popularized by id Software with “relief mapping” approaches.

Depth of Field Postprocessing for Layered Scenes Using Constant-Time Rectangle Spreading: This is closely related to one of my favorite I3D 2009 posters, “Faster Filter Spreading and Its Applications”.  The basic idea (which has also been discussed in this paper by Dan Piponi) is to “splat” rectangles in constant time (independent of the rectangle size!) by “splatting” just the corners into a buffer, from which a summed-area table is constructed (using existing fast methods), yielding the desired image.  This can be extended to more general splats.  Although there is no preprint yet, the tech report is available.
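
To make the trick concrete, here is a tiny CPU-side sketch (my own illustration, not code from the paper or poster): each rectangle deposits values at only its four corners, and a single summed-area-table pass afterwards produces the fully spread image.

    #include <vector>

    // Spread constant-valued rectangles into an image in O(1) per rectangle.
    struct SpreadBuffer {
        int w, h;
        std::vector<float> a;
        SpreadBuffer(int width, int height)
            : w(width), h(height), a(size_t(width) * height, 0.0f) {}

        // Add 'v' to every pixel in [x0,x1] x [y0,y1] (inclusive; the top-left
        // corner is assumed to lie inside the buffer) by marking the corners.
        void splatRect(int x0, int y0, int x1, int y1, float v) {
            add(x0,     y0,      v);
            add(x1 + 1, y0,     -v);
            add(x0,     y1 + 1, -v);
            add(x1 + 1, y1 + 1,  v);
        }

        // Build the summed-area table (a 2D prefix sum); afterwards each pixel
        // holds the sum of all rectangles that cover it.
        void resolve() {
            for (int y = 0; y < h; ++y)
                for (int x = 1; x < w; ++x)
                    a[y * w + x] += a[y * w + x - 1];
            for (int y = 1; y < h; ++y)
                for (int x = 0; x < w; ++x)
                    a[y * w + x] += a[(y - 1) * w + x];
        }

    private:
        void add(int x, int y, float v) {
            if (x < w && y < h) a[y * w + x] += v;  // corners past the edge are safely dropped
        }
    };

For depth of field, each source pixel spreads its color over a rectangle sized to its circle of confusion, so the cost is independent of how blurred the image gets.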

An Analytical Approach to Single Scattering for Anisotropic Media and Light Distributions:  A follow-on paper to one published in Eurographics 2009, it adds anisotropic phase functions and more general lighting.  The basic solution is similar to an earlier paper by Bo Sun and others, but using a slightly different approach that enables increased precision.

Rendering the Effect of Labradorescence: This is of interest to me as an optical reflectance geek, but I doubt anyone will be using it in a game anytime soon.  This paper presents a physically-based method of rendering a complex optical phenomenon that occurs in gems such as Labradorite and Spectrolite.

Ke-Sen Huang’s Graphics Interface 2009 page should be a good place to hunt for preprints of these papers as they appear.

Good list of classic graphics papers

Old graphics papers don’t get enough respect nowadays; for example, Porter and Duff’s original paper is still the best place to get a good understanding of alpha blending (which too many people still get wrong). There are many more gems to be found in papers from the 70s and 80s.  A while ago, I pointed out Pixar’s online paper library, which includes a lot of “golden oldies” (as well as good new stuff).  I just saw this great list of old papers on the codersnotes blog. I heartily concur with Kayamon’s assessment of the value of an ACM digital library subscription, though I wish ACM would find a way to go the Open Access route.  It’s not just a matter of expense; the registration wall adds a huge amount of friction to the process of finding information.

Exploiting coherence at GDC 2009

A few months back, I wrote a blog post discussing techniques which exploit coherence, either spatial (like multiresolution rendering) or temporal (like reprojection caching).

Both of these were represented at GDC this year.  Jeremy Shopf presented a talk on Mixed Resolution Rendering, and the ambient occlusion technique presented in the talk Rendering Techniques in Gears of War 2 (available on the GDC Vault site) made use of both methods.  The ambient occlusion factors were rendered at a downsampled resolution. In addition, reprojection caching was used to reduce temporal aliasing.  This is the first use I have seen of reprojection caching in a shipping game.
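
As a rough sketch of the general reprojection caching idea (my own illustration, not the Gears of War 2 implementation): reproject each pixel’s world position into the previous frame, reuse the cached value if it is still valid, and recompute otherwise.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    struct PrevFrame {
        int w, h;
        const float* value;   // e.g. last frame's ambient occlusion, one float per pixel
        const float* depth;   // last frame's post-projection depth buffer
        float viewProj[16];   // last frame's view-projection matrix, row-major
    };

    // Return a cached or freshly computed value for the surface at worldPos.
    float reprojectOrRecompute(const PrevFrame& p, Vec3 worldPos,
                               float (*recompute)(Vec3))
    {
        const float* m = p.viewProj;
        float cx = m[0] *worldPos.x + m[1] *worldPos.y + m[2] *worldPos.z + m[3];
        float cy = m[4] *worldPos.x + m[5] *worldPos.y + m[6] *worldPos.z + m[7];
        float cz = m[8] *worldPos.x + m[9] *worldPos.y + m[10]*worldPos.z + m[11];
        float cw = m[12]*worldPos.x + m[13]*worldPos.y + m[14]*worldPos.z + m[15];
        if (cw <= 0.0f) return recompute(worldPos);       // behind the previous camera

        int px = int((cx / cw * 0.5f + 0.5f) * p.w);      // previous-frame pixel coordinates
        int py = int((cy / cw * 0.5f + 0.5f) * p.h);
        if (px < 0 || px >= p.w || py < 0 || py >= p.h)
            return recompute(worldPos);                   // was off-screen last frame

        // Reject the cached sample if this surface was hidden last frame.
        if (std::fabs(p.depth[py * p.w + px] - cz / cw) > 0.01f)
            return recompute(worldPos);

        return p.value[py * p.w + px];                    // cache hit: reuse last frame's result
    }

In a real renderer this runs in a shader, and the new and cached values are typically blended rather than simply swapped, but the structure of the validity test is the same.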

In my previous blog post, I was skeptical of reprojection approaches, since it seemed to me that as an optimization method they did not address the worst case (where the camera angle changes abruptly).  Using such approaches to improve quality instead (as Epic did) makes more sense.

More GDC conference links

More material from GDC is coming online each day. We have already mentioned the tutorial slides, as well as Intel’s page.  GDC’s Vault site has video which is only available to registered attendees (except for sponsored sessions), but the slide decks are available to everyone.  NVIDIA recently put up a new page with their material – even the material previously available from GDC’s own sites is worth getting from here, since the versions on NVIDIA’s page are significantly more up to date.  The videos for NVIDIA’s sponsored sessions are free for everyone and are linked from the NVIDIA page as well.

Lots of OpenGL and OpenCL stuff is available on the Khronos web site, and Jeremy Shopf and Jim Tilander have their respective slides up as well. A Google search for ‘“GDC 2009” slides’ should turn up more as time goes by.

More With the Links

I love the movie sequel title “2 Fast 2 Furious”. How clever, and a great way to guarantee there will never be a third movie. Well, there was, but they had to go the colon route, “… : Tokyo Drift”.

Which is indicative of nothing, as I don’t think I’ve ever actually seen any of these movies. I was reminded of the title as my goal today is to whip through the backlog of 72 potential blog resource links I’ve been gathering on del.icio.us. [Well, as it turns out, I got through 39 of them (the fresher ones), 33 to go…]

ShaderX^7 has been published. We hope to give it an overview sometime soon (mine’s on backorder from Amazon.com).

From various sources I heard that OnLive got a bit of notice at GDC. Think: pure server-side computation of all graphics for a game, i.e., a cloud computing model. Now even your grandma’s computer or even a rigged-out TV can play Crysis, assuming the net bandwidth is there. Which of course makes me think: what about latency? Lag for how other players see your actions is always there, and causes mismatches (“how did I instantly die?”). But increasing lag for you seeing the consequences of your own actions seems like a non-starter for shooters, at least.

Mark DeLoura has a great two-part article on which game engines are being licensed for titles. The first part is a general survey, the second is about the technology involved. I found it interesting to see what people cared about, e.g. multicore is on people’s minds. Nothing too shocking here, but it’s fantastic to see what is getting used, and why, in this marketplace.

Related to this, I happened across a list of game engines on Wikipedia. Not massively useful (e.g. no sense of what’s popular), but a starting place.

John Ratcliff has a graphics math library available for download with an unrestrictive reuse license. He recently added best-fit methods for AABBs and OBBs.

I was interested to look at the open source, cross-platform (!) model viewer GLC. I’ve wanted something like this for doing some experiments with mesh manipulation. Not a bad viewer, but that’s all it is at this point, unfortunately: you can’t even export to a different 3D format. The search continues… If you know a reasonable open source 3D file viewer/converter out there, please tell me. I should probably bite the bullet and just use Blender, but this application is way overkill.

CUDA voxel rendering – pretty impressive!

I liked this post on optimization mainly because of the line “I went in and found out that some title bar was getting rendered 140 times every time you refreshed the screen”. I can entirely relate (though 140 must be some kind of record): too many times I’ve put in output debugging statements showing updates, only to see 2, 3, or 6 updates happening. I once started on a project and in the first few weeks increased performance by 100%, simply by noting the main draw path was being executed twice each frame.

Speaking of performance, there’s an article on volume rendering optimizations when using a ray-casting approach on the GPU.

Wolfenstein source code for the free iPhone version, along with Carmack’s documentation on the project, is available.

Software patents are only slightly dumber than business method patents, which are patently absurd. I hadn’t noticed until now, but there was recently a ruling on a business method patent, In re Bilski, which has been used to strike down software patents.

A detailed data and execution flow diagram for the new DirectX 11 pipeline front-end is available from Jolly Jeffers.

People are still making ray-tracing specific hardware; witness Caustic Graphics. They have a rather amazing claim: “The CausticOne, however, thrives in incoherent raytracing situations: encouraging the use of multiple secondary rays per pixel. Its level of performance is not affected by the degree of incoherence.” Good trick. That said, I can’t say I see any large customer base for such a product. This seems like a company designed for acquisition, similar to Ageia. Fine by me, best of luck to them.

I’m happy to learn that the Humus site now has a news blog. This is a great site for demos of advanced techniques, and for honest comments about strengths and limitations of various approaches.

Another blog: The Geeks of 3D. Tracks demos, APIs, SDKs, and graphics card releases. Handy – some of the links here I found there.

There was a nice little article on data alignment on Gamasutra. Proper alignment is a key element in getting high performance.

I was trying to find the name of the projection of equidistant latitude and longitude lines for a surrounding spherical environment. From this interesting page (click on the “Wall Maps of the World” text) I found it: Plate Carrée.
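
For concreteness, here is the mapping from a unit direction vector to equirectangular (Plate Carrée) texture coordinates, with axis conventions of my own choosing:

    #include <cmath>

    // Map a normalized direction to Plate Carree (equirectangular) UVs:
    // u is proportional to longitude and v to latitude, so the lat/long grid
    // is evenly spaced across the image.
    void directionToPlateCarree(float x, float y, float z, float& u, float& v)
    {
        const float kPi = 3.14159265358979f;
        float longitude = std::atan2(x, -z);   // assumes +y up, -z forward
        float latitude  = std::asin(y);
        u = longitude / (2.0f * kPi) + 0.5f;   // [0,1] wraps around 360 degrees
        v = 0.5f - latitude / kPi;             // 0 at the north pole, 1 at the south
    }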

Predicting the future is so much more interesting than predicting the past. I love this: MIPS per $1000. It’s entertaining to equate raw computing power with structured processing. By the same equivalence, I should be able to hook up 1700 mice in parallel to get a human brain.

A great line from a GPU review: “Nvidia’s new line of unbelievably expensive cards will block out the sun, and ray-trace its own shadow in real time.”

Faber College’s motto is “Knowledge is Good”. Learning about the idea of metamers would have saved this article from confusion. Coming back to this article now, I see all the comments have been removed, and an apologia trying to convert confusion into enlightenment added, but I think this still misses the point. Sure, there is a color associated with a single wavelength of light. But, my guess is that 99.99% of the colors we perceive arrive at any location on the eye as light with a spectral mix of wavelengths, not a single wavelength (Naty will correct me if I’m wrong). Unless you’re Dr. Evil and deal with sharks with frickin’ laser beams on their heads on a daily basis. Hmmm, I’m probably forgetting some other single-wavelength phenomena, like fluorescence. Anyway, the article did lead me to look up more information on metamers on Wikipedia, where I learnt about metameric failure, a term I hadn’t heard before. One more reason a simple RGB representation of color isn’t sufficient.

Cute thing: Snapily lets you turn some set of images or video into lenticular prints.

I don’t have a lot to say about what I do at Autodesk. Here’s a tidbit.

Art for the day, crayons as pixels.

ShaderX^8 CFP – proposals due May 17

Will there be a GPU Gems 4? I don’t know. But I do know there will be a ShaderX^7 and, with your help, a ShaderX^8. The timeline and information about this next volume are at the ShaderX^8 site. If you’re interested in submitting, one detail (currently) missing from this site is that an example ShaderX proposal, writing guidelines, and a FAQ can be downloaded from here. The key bit: proposals are due May 17th. I’m not currently associated with this series (though I was for volumes 3 & 4); I just like to see them get good submissions.

The existence of these book series – Game Programming Gems, ShaderX, GPU Gems – is a fascinating phenomenon. Conferences like SIGGRAPH are heavy on theory and cutting-edge research, light on practical advice. Books like ours can be more applied, but are survey-oriented by their nature, not spending a lot of time on any given topic. Code samples and white papers on the web from NVIDIA, AMD/ATI, etc., and from independents such as Humus are great stuff, but they are produced by particular groups of people with specific interests. Also, sometimes just finding relevant code samples on these sites can be a serious challenge (“search” sometimes works less well than I would like).

These book series fill the gap: they go through a review and editing process, improving quality and presentation. This in turn makes them of higher average interest to the reader, vs. a random article on the web of unknown quality. They won’t disappear if someone’s domain expires or interest wanes. They can be easily accessed years later, unlike material published in ephemeral venues such as Game Developer Magazine or GDC proceedings. The titles, at least, can be surveyed in one place by sites such as IntroGameDev (though this one appears to no longer be receiving updates, unfortunately; e.g., ShaderX^6 is not listed).

The major downside of these books is that they’re only available on paper, not as searchable PDFs (except the first few ShaderX books). Well, almost the entire GPU Gems series is, wonderfully, online for free, but is still not easily searchable. Now if someone could just figure out a Steam-like system that let people buy books in electronic form while protecting publishers’ monetary interests. Hmmm, maybe eye-implanted bar-code readers that check if you have access to a given piece of digital content, that’ll be non-intrusive… Anyway, this is the challenge ahead for publishers. Maybe the Kindle is the best solution, but I like the Steam games model better, where something you’ve purchased is available on any computer attached to the Internet.

Best of all for consumers is free & digital, of course, but this does trim back the pool of authors pretty drastically, as a royalty percentage of 0% is not much of an incentive (I’ve been reading too many popularized economics books of late, e.g. Naked Economics, so have been thinking more in economics-speak, like “incentives”). We wrote our book for the love of the subject, but I can’t complain about also, to my surprise, earning a bit of money (enough to allow me to, what else, upgrade my computer and graphics card on a regular basis). Enough rambling, but the subject of electronic publication is one that’s been on my mind for a few decades now. I expect a solution from you all by the end of the week; then let’s create a startup and we’ll sell out by next March and make a mint.