16x MSAA vs 16x CSAA

Sooo, I was playing around with the settings in Counter-Strike: Source and noticed those two options under Antialiasing. I always scroll down to the last setting in games to set it to max, and I've always chosen 16x CSAA without hesitation... well, because it's the last option.
I've noticed these two options in several other games too, and I can't really see a difference between them.

So I ask those of you with a little more knowledge than me: what is the difference? Which one offers the best image quality, and which one offers the best performance?

OK, here are the basics. nVIDIA has LOADS of different AA settings that can be used through a program called nHancer. I'll just go over coverage sample AA, or CSAA.

You have 8xCSAA, 8xQ, 16xCSAA and 16xQ. 8xCSAA is basically 4xMSAA plus 4 extra "coverage" samples, for 8 coverage points per pixel in total. This means the quality at any given edge is at least 4xAA and can be as good as 8xAA, depending on how many of the coverage samples actually get used in the scene. 8xQ is simply 8xMSAA. 16xCSAA follows the same principle as 8xCSAA, except that it adds 12 coverage samples on top of the 4 multisamples, so edges can look as good as 16xAA. 16xQ isn't 16xMSAA but rather a higher-quality version of 16xCSAA: it uses 8 multisamples (so 8xAA is the minimum in any given scene) plus 8 coverage samples.
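To make that coverage/color split a bit more concrete, here's a rough sketch of how a developer would request those modes on NVIDIA hardware through OpenGL's NV_framebuffer_multisample_coverage extension (using GLEW). This isn't anything the game itself exposes, and the function name, parameters and fallback here are just for illustration; it assumes a GL context is current and glewInit() has already run.

```c
/*
 * Sketch (not from the original post): asking for CSAA via the
 * NV_framebuffer_multisample_coverage extension. Assumed mode mapping:
 *
 *   Mode      coverage samples   color (multi)samples
 *   8xCSAA           8                   4
 *   8xQ              8                   8
 *   16xCSAA         16                   4
 *   16xQ            16                   8
 */
#include <GL/glew.h>

GLuint make_aa_renderbuffer(GLsizei coverage, GLsizei color,
                            GLsizei width, GLsizei height)
{
    GLuint rb = 0;
    glGenRenderbuffers(1, &rb);
    glBindRenderbuffer(GL_RENDERBUFFER, rb);

    if (GLEW_NV_framebuffer_multisample_coverage) {
        /* CSAA: store 'color' full samples per pixel, but test geometry
         * coverage at 'coverage' points per pixel. */
        glRenderbufferStorageMultisampleCoverageNV(GL_RENDERBUFFER,
                                                   coverage, color,
                                                   GL_RGBA8, width, height);
    } else {
        /* No CSAA support: fall back to plain MSAA at the color-sample count. */
        glRenderbufferStorageMultisample(GL_RENDERBUFFER, color,
                                         GL_RGBA8, width, height);
    }
    return rb;
}

/* e.g. 16xCSAA: make_aa_renderbuffer(16, 4, 1920, 1080);
 *      16xQ:    make_aa_renderbuffer(16, 8, 1920, 1080); */
```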

So basically, CSAA is a very efficient form of AA in terms of memory and performance: 16xCSAA is usually only 10~20% slower than 4xMSAA.

Shamelessly pinched from Google.

Thanks ^^

 
