Fast Approximate Anti-Aliasing (FXAA)

Anti-aliasing has an intimidating name, but what it does for our computer displays is rather fundamental. Think of it this way -- a line has infinite resolution, but our digital displays do not. So when we "snap" a line to the pixel grid on our display, we can compensate by imagineering partial pixels along the line, pretending we have a much higher resolution display than we actually do. Like so:
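The "partial pixels" idea can be sketched numerically: supersample each pixel against an ideal diagonal edge and average the coverage. This is a made-up minimal example, not code from the post; `edge_coverage` and the 4x4 sample count are my own choices.

```python
# Minimal sketch of the "partial pixels" idea: test 4x4 subsamples per
# pixel against an ideal line and average the coverage. (Illustrative
# only; real renderers use smarter sample patterns and filters.)

def edge_coverage(px, py, samples=4):
    """Fraction of pixel (px, py) that lies below the line y = x."""
    inside = 0
    for sy in range(samples):
        for sx in range(samples):
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            if y <= x:  # subsample is on the "filled" side of the line
                inside += 1
    return inside / (samples * samples)

# Pixels the line passes through get a fractional value: the grey
# "partial pixels" that smooth out the staircase.
row = [round(edge_coverage(px, 2), 2) for px in range(5)]
print(row)  # → [0.0, 0.0, 0.62, 1.0, 1.0]
```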

This is a companion discussion topic for the original blog entry at:

SMAA is even better and faster than FXAA.

Also, there is software that automatically injects SMAA into your d3d9, d3d10 or d3d11 games:

Another example of how optimization for human perception can be extremely effective even if it’s not the most accurate.

See also photographic image compression (e.g., JPEG), video compression (e.g., MPEG-4), and audio compression (e.g., MP3).

Jeff, I am looking at your three screenshots and I have to make an effort to notice any difference between them. I mean, if you hadn’t included those big white captions explaining which ones have AA or not, I’d be hard-pressed to pick… Your three screenshots are an ode to the state of the art of PC gaming. Not to play down the technical achievement of FXAA, but maybe you should ignore your barely pixelated edges and enjoy the game!

Almost as interesting is how searching for “SMAA WebGL” confuses Google but not Bing.

It’s very possible that I’m being overly prickly about terminology, but I have to take exception to calling anti-aliasing in general a ‘hack’.

Aliasing is a well-understood artefact of sampling. If your continuous (‘infinite resolution’) image contains frequency components above the Nyquist frequency of your sampling, aliasing occurs. You remove aliasing by increasing your sampling frequency, or by removing those high-frequency components before sampling. (Or you sample at a higher rate, remove the high-frequency components, and then resample, which is SSAA.)
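To make the Nyquist point concrete, here is a small numeric sketch (my own assumed example, not from the comment above): a 7 Hz sine sampled at 10 Hz, where the Nyquist frequency is 5 Hz, produces exactly the same samples as a 3 Hz sine. The high frequency "aliases" down to a lower one.

```python
# A 7 Hz sine sampled at 10 Hz (Nyquist = 5 Hz) is indistinguishable
# from a 3 Hz sine: sin(2*pi*7*n/10) == sin(2*pi*(7-10)*n/10).
import math

fs = 10.0                        # sampling rate; Nyquist is fs / 2 = 5 Hz
t = [n / fs for n in range(10)]  # ten sample instants

hi = [math.sin(2 * math.pi * 7 * x) for x in t]   # 7 Hz, above Nyquist
lo = [math.sin(2 * math.pi * -3 * x) for x in t]  # its alias at 7 - 10 = -3 Hz

# The two sample sequences are identical (up to float rounding).
assert all(abs(a - b) < 1e-9 for a, b in zip(hi, lo))
```

The same arithmetic is why a fine checkerboard texture shimmers when rendered one sample per pixel: its spatial frequencies exceed the pixel grid's Nyquist limit.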

Anti-aliasing (in general) is an analytically rigorous solution to a well-understood problem. This is not the description of a ‘hack’.

(The things people do to fake anti-aliasing because the real thing is too expensive, on the other hand…)

Thanks for informing us about this. Hopefully, Nvidia makes this an option on their graphics cards. At least for my card, the Nvidia control panel gives you the option to set antialiasing for any program (so when you choose HDR in Oblivion, even if Oblivion disables antialiasing, the control panel enables it). Does there have to be support for the technique in the game?

Or I could try Mārtiņš’ suggestion, which enables SMAA in any d3d9-11 program.

FXAA looks less sharp, though. Just look at the tree leaves or wires.

This has got to be the most uninteresting thing you’ve ever blogged about (and yes, I did read about Bias Lighting). Of course, I’m just kidding. Keep up the good work.

I’m looking at the

  • slightly blurry palm fronds
  • details behind the smoke that have been wiped out
  • details on the tree trunk that have been wiped out
  • graffiti color that has been altered
  • far building that has been blurred
and not liking FXAA.

Interesting and somewhat more quantitative take on this here:
(jump to the “FXAA & Water” section; source is Nvidia, so it’s geared towards their stuff, but it applies to AMD as well)

As with so many things, there are tradeoffs, and the cost/value of those tradeoffs will vary from person to person. Personally, in the middle of the action and at high resolution I’m hard-pressed to notice a difference between MSAA, FXAA, and nothing.

That said, I always turn one of them on.

Hey, in addition to using FXAA, we should use JPG instead of PNG for screenshots!

After all, because JPG uses less resources it must be better, right?

I found it amusing you felt it necessary to spend a couple paragraphs defining what anti-aliasing is, but assumed the reader understood the intricacies of pixel shaders and fill rate. :slight_smile:

Seriously, though… great post-- and a great example of lateral thinking in software architecture.


Of course, doing AA as a render-buffer pass requires a loss of detail, while doing it at fragment shading time requires an increase in samples - FXAA might have lower requirements, but it’s objectively worse. Myself, I hate the overly even smoothness (or “blurriness”) and inconsistent element volume across movement, to the point that it’s sampling AA or nothing for me. This is easier when you have a 2560x1440 monitor - still only 110 dpi, but it’s getting there (come on, manufacturers, I want my 300 dpi monitor; then it doesn’t matter if I can’t render at that resolution or read unscaled text - re-sampling doesn’t look like crap!)

And yeah, don’t use .jpgs at non-integer zooms for images about image quality. And don’t have textures as a major image element - mipmaps neatly solve aliasing for them. And those wires are being drawn with a shader (a Half-Life 2 technique, I think), so they don’t alias as badly, even with no AA option enabled.
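For anyone curious why mipmaps solve texture aliasing: each mip level is a pre-filtered, half-resolution copy, so distant surfaces sample an already low-passed image instead of skipping texels. A rough sketch, assumed and simplified to greyscale 2x2 box filtering (`next_mip` is my own name; real GPUs build the chain per level at texture load time):

```python
# Hypothetical mipmap-chain step: halve a square greyscale image by
# averaging each 2x2 block (a box filter, the simplest low-pass).

def next_mip(img):
    """Return the next (half-size) mip level of a square image."""
    n = len(img)
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(n // 2)]
            for y in range(n // 2)]

# A 4x4 checkerboard (the worst case for shimmer) averages down to
# flat grey, which is exactly what a distant checkerboard should be.
tex = [[255 if (x + y) % 2 else 0 for x in range(4)] for y in range(4)]
print(next_mip(next_mip(tex)))  # → [[127.5]]
```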

I see EA still can’t get Arabic text right (and I assume Farsi too, Wikipedia tells me Battlefield 3 is set in Iraq and Iran), as seen on the shop signs on the left in the screenshot. For the millionth time, Arabic is written right-to-left, and the words must be written in cursive (it’s not optional). For comparison, if they butchered English text, the word “Best” would be written as “T S E B”. It’s terrible, unreadable, and distracting to Arabic speakers.

Honestly, why can’t they get it right?

My Commodore 64 had exceptional hardware supported high-performance full screen anti-aliasing on all games with no drop in frame rate.

It was called a TV set.

Perhaps running your monitor in a non-native resolution would have the same effect?

THANK YOU for explaining this. I never got off my lazy butt to look up what the FXAA option was in Skyrim, so I never enabled it, fearing it was some sort of Superhuman AA that would kill performance. But now that mothereffer is getting turned on all the way.

Seriously, wtf? They didn’t do that already? Even I did something similar back in the old days, when I was still interested in 3D graphics programming on my Atari ST. At that time an oversampling algorithm was out of the question for anti-aliasing, so the obvious approach was to interpolate the colours using the adjacent pixels. The downside was a blurring of the image, which was a little bit annoying at 320x200, but at 1920x1080 it should even add realism (i.e. make it look more like TV).
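That adjacent-pixel interpolation could be sketched as a naive 3x3 box blur; this is my own reconstruction of the idea, not the actual Atari ST code:

```python
# Hypothetical sketch of blending each pixel with its neighbours
# (a 3x3 box blur): jaggies soften, but so does everything else,
# which is exactly the blur trade-off the comment describes.

def blur(img):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

# A hard black/white edge gains intermediate values: softened jaggies.
edge = [[0, 0, 255, 255] for _ in range(3)]
print([round(v) for v in blur(edge)[1]])  # → [0, 85, 170, 255]
```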

You should take a look at MLAA. We shipped God of War 3 with it at the beginning of last year.

Mārtiņš, great tip on SMAA.
Noting Phil Wilkin’s comment, I see that it’s an implementation of MLAA.

This video comparing SMAA to FXAA and other techniques is impressive:

Has anyone tried InjectSMAA with DX?