An Intel research group has put their papers and code up for download. I had asked Alexander Reshetov about his morphological antialiasing scheme (MLAA), as it sounded interesting – it was! He generously sent a preprint, answered my many questions, and even provided source code for a demo of the method. What I find most interesting about the algorithm is that it is entirely a post-process. Given an image full of jagged edges, it searches for these edges and blends across them. There are limits to such reconstruction, of course, but the idea is fascinating, and most of the time the resulting image looks much better. Anyway, read the paper.
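To give a feel for the "find jagged edges, then blend" idea, here is a toy sketch in Python/numpy. It is not Reshetov's algorithm: real MLAA classifies L-, Z-, and U-shaped silhouette patterns in both orientations and computes exact coverage areas, while this sketch handles only horizontal separation lines in a grayscale image and uses a crude triangular coverage approximation. All names here are my own invention.

```python
import numpy as np

def mlaa_horizontal(img):
    """Very simplified MLAA-style pass over a grayscale image.

    Finds horizontal separation lines (runs where the pixel above and
    below differ), then linearly blends the two sides across each run,
    approximating the coverage of the implied diagonal edge. A toy
    sketch only; real MLAA handles both orientations and L/Z/U shapes.
    """
    out = img.astype(np.float64)
    h, w = img.shape
    for y in range(h - 1):
        x = 0
        while x < w:
            if img[y, x] != img[y + 1, x]:
                start = x
                # Extend the run while the separation line continues.
                while (x < w and img[y, x] != img[y + 1, x]
                       and img[y, x] == img[y, start]):
                    x += 1
                length = x - start
                for i in range(length):
                    # Coverage weight grows along the run, peaking
                    # just under 0.5 (a triangle approximation).
                    a = (i + 0.5) / length * 0.5
                    top = float(img[y, start + i])
                    bot = float(img[y + 1, start + i])
                    out[y, start + i] = top * (1 - a) + bot * a
                    out[y + 1, start + i] = bot * (1 - a) + top * a
            else:
                x += 1
    return out
```

On a bitonal stairstep this pushes the edge pixels toward intermediate grays, which is exactly the effect visible in the smoothed images below.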
As an example, I took a public domain image from the web, converted it to a bitonal image so it would be jaggy, then applied MLAA to see how the reconstruction looked. The method works on full-color images (though it then faces more challenges in detecting edges); I'm showing a black-and-white version so that the effect is obvious. So, here's a zoomed-in view of the jaggy version:
And here are the two smoothed versions:
Which is which? It's actually pretty easy to figure out: the original, on the left, has some JPEG artifacts around the edges; the MLAA version, on the right, doesn't, since it was derived from the "clean" bitonal image. All in all, they both look good.
Here’s the original image, unzoomed:
The MLAA version:
For comparison, here's a 3×3 Gaussian blur of the jaggy image; blurring helps smooth edges (at a loss of overall crispness), but does not get rid of jaggies. Note that the horizontal vines in particular show poor quality:
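For reference, a 3×3 Gaussian blur is just a separable convolution with the [1, 2, 1]/4 kernel in each direction. A minimal numpy sketch (function name and edge handling are my own choices; I use edge replication at the borders):

```python
import numpy as np

def gaussian_blur_3x3(img):
    """3x3 Gaussian blur via the separable [1, 2, 1]/4 kernel.

    Borders are handled by edge replication. Blurring softens
    stairsteps uniformly but, unlike MLAA, it cannot infer and
    reconstruct the underlying edge, so jaggies remain visible.
    """
    k = np.array([1.0, 2.0, 1.0]) / 4.0
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    # Horizontal pass, then vertical pass (the kernel is separable).
    tmp = k[0] * padded[:, :-2] + k[1] * padded[:, 1:-1] + k[2] * padded[:, 2:]
    out = k[0] * tmp[:-2, :] + k[1] * tmp[1:-1, :] + k[2] * tmp[2:, :]
    return out
```

Because the blur treats every pixel alike, it merely spreads the staircase over a wider footprint, which is why the vines still look poor.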
Here’s the jaggy version derived from the original, before applying MLAA or the blur:
Thanks, Eric, that's a cool technique. Do you know if they have a GPU implementation? I only saw reference to the CPU one, but I guess that's not surprising as it's Intel.
They don’t have a GPU-based implementation yet, AFAIK, though I know there’s interest in making one.
Let's hope so – it would be very interesting to see how it compares to the other techniques out there, especially given the constraints on traditional antialiasing when using deferred shading.
A GPU implementation has been released. See the SIGGRAPH talk "Games and Real Time" and this webpage: http://igm.univ-mlv.fr/~biri/mlaa-gpu/
A faster GPU implementation can be found here:
http://www.iryokufx.com/mlaa/
Typical execution times are 3.79 ms on Xbox 360 and 0.44 ms on an NVIDIA GeForce 9800 GTX+, for a resolution of 720p. Memory footprint is 2x the size of the backbuffer on Xbox 360 and 1.5x on the 9800 GTX+. Meanwhile, 8x MSAA takes an average of 5 ms per image on the same GPU at the same resolution – over ten times longer (i.e. processing times differ by an order of magnitude).
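A quick sanity check on the quoted timings (taking the 5 ms and 0.44 ms figures at face value):

```python
# 8x MSAA vs. MLAA at 720p on a GeForce 9800 GTX+, per the comment above.
msaa_ms, mlaa_ms = 5.0, 0.44
ratio = msaa_ms / mlaa_ms               # ~11.4x
percent_longer = (ratio - 1.0) * 100.0  # ~1036% longer
```

So MSAA at these settings is roughly an order of magnitude slower than the MLAA pass.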