Sunday, April 11, 2010

1600x1200 AA vs noAA

Does anyone have any screenshots comparing games at 1600x1200 with no AA, and then with 4xAA?
No, but at 1920x1200 I can't see past 4x in some games; 8x is normally what I have AA set to, unless I can't tell the difference between 4x and 8x.
How much of a difference does 4xAA make at 1600x1200?
[QUOTE=''Manly-manly-man'']How much of a difference does 4xAA make at 1600x1200?[/QUOTE]

I'll look tomorrow, but since my native resolution is 1920x1200 it might be hard... what games are we talking about, though? I have pretty much all of the newer ones.
Still jaggy at 1600x1200, but less noticeable. I play on a 19" monitor at 1600x1200 just fine.
Hm... still, if you could take some comparison shots, that would be great.
There is a big difference between 4x and 8x, but I see no difference when I bring it up to 16x.
Found it on this site: http://www.gamespot.com/features/6168650/p-2.html
Increasing the resolution doesn't eliminate aliasing. It simply makes the jaggies smaller. There is a big difference up to 4x, a small difference between 4x and 8x, and almost no noticeable difference beyond 8x.
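For context, this is roughly what requesting a given MSAA level looks like on the game side in that era's API (Direct3D 9). It's only a minimal sketch: the helper name and the X8R8G8B8 back-buffer format are illustrative assumptions, not anything from the thread.

[code]
// Minimal sketch: ask Direct3D 9 whether the default adapter supports 4x MSAA
// on a typical X8R8G8B8 back buffer, before the device is created.
#include <d3d9.h>

bool Supports4xMSAA(IDirect3D9* d3d, DWORD* qualityLevels)
{
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, FALSE /* fullscreen */,
        D3DMULTISAMPLE_4_SAMPLES, qualityLevels);
    return SUCCEEDED(hr) && *qualityLevels > 0;
}

// If it is supported, the present parameters then request it at device creation
// (the swap effect must be DISCARD when multisampling is enabled):
//   d3dpp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
//   d3dpp.MultiSampleQuality = 0;  // any value below the reported quality levels
//   d3dpp.SwapEffect         = D3DSWAPEFFECT_DISCARD;
[/code]

The same check with D3DMULTISAMPLE_8_SAMPLES or D3DMULTISAMPLE_16_SAMPLES is how a game decides which of the higher AA levels to even offer in its settings menu.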
I'll try to get one. It makes a difference mostly in the distance; in Battlefield 2, for example, people far away are clearer.
You should be able to find screenshots out there with Google... [url]http://www.pcper.com/article.php?aid=319&type=expert&pid=6[/url]
I know it won't eliminate the aliasing, but if it isn't bad enough to degrade the visuals too much, it would help me decide between the 8800GTS and the HD2900XT. From those screenshots 4xAA makes a big difference, but only over the game with no AA or CSAA. The HD2900XT, and correct me if I am wrong, has its own type of CSAA (wide and narrow tent) that rivals MSAA without taking a hit on performance (or only a small one). Edit: I worded that badly. What I meant, in short, is that at 2xAA you can enable CFAA without a performance hit and make it look as good as 4xAA, with better framerates. That is good for me, because it means the HD2900XT is a better deal: at 1600x1200 and just 2xAA, it performs better than the 8800GTS.
[QUOTE=''Manly-manly-man'']I know it won't eliminate the aliasing, but if it isn't bad enough to degrade the visuals too much, it would help me decide between the 8800GTS and the HD2900XT. From those screenshots 4xAA makes a big difference, but only over the game with no AA or CSAA. The HD2900XT, and correct me if I am wrong, has its own type of CSAA (wide and narrow tent) that rivals MSAA without taking a hit on performance (or only a small one).[/QUOTE]I believe the new type of AA is Custom Filtering.
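Roughly what a ''tent'' custom filter does at resolve time: each pixel's final colour is a weighted average of its own MSAA samples plus samples borrowed from neighbouring pixels, with the weight falling off linearly with distance from the pixel centre. The sketch below is only a CPU-side illustration; ATI's actual CFAA kernel weights and sample layout aren't public, so the radius and sample structure here are assumptions.

[code]
// Illustrative tent-filter resolve for one pixel: weight every sample
// (the pixel's own MSAA samples and samples from neighbouring pixels)
// by a triangle function of its distance from the pixel centre.
#include <cmath>
#include <vector>

struct Sample { float x, y, r, g, b; };  // position relative to pixel centre + colour

// Tent (triangle) weight: 1 at the pixel centre, falling linearly to 0 at `radius`.
float TentWeight(float dx, float dy, float radius)
{
    float d = std::sqrt(dx * dx + dy * dy);
    return d >= radius ? 0.0f : 1.0f - d / radius;
}

// Resolve a single pixel from all samples inside the filter footprint.
void ResolvePixel(const std::vector<Sample>& samples, float radius,
                  float& outR, float& outG, float& outB)
{
    float wSum = 0.0f, r = 0.0f, g = 0.0f, b = 0.0f;
    for (const Sample& s : samples) {
        float w = TentWeight(s.x, s.y, radius);
        r += w * s.r;  g += w * s.g;  b += w * s.b;
        wSum += w;
    }
    if (wSum > 0.0f) { outR = r / wSum; outG = g / wSum; outB = b / wSum; }
}
[/code]

In these terms, a ''narrow tent'' corresponds to a smaller radius (neighbouring samples barely contribute) and a ''wide tent'' to a larger one, which blends in more of the neighbours' samples from the same 2x or 4x MSAA buffer rather than taking more samples per pixel.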
From HardOCP... if you want to see more image-quality screenshots comparing the two cards, just read up on their reviews.

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfM181X2wucG5n
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfM180X2wucG5n

The difference between the lower 25% grass setting and the half grass setting can clearly be seen in the screenshots above. As you zoom out farther in third-person view, the grass disappears quicker with less grass distance. At 50% grass, the grass is all visible at full zoom out.

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfMl80X2wuanBn
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfMl81X2wuanBn

The above screenshots illustrate the superior image quality of 16X Transparency Supersampling in Battlefield 2142.

http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfNF82X2wucG5n
http://www.hardocp.com/image.html?image=MTE4MTE4Mjc5N3pDem9Ieko5REpfNF81X2wucG5n

The above screenshots illustrate the difference between full dynamic lighting and objects dynamic rendering. Shadow detail is greatly reduced with all objects that cast shadows, including trees. In the second screenshot you can really see the difference between lightmaps and real-time dynamic lighting.

Conclusion: Overall Performance Summary

In Oblivion we found the 320 MB and 640 MB GeForce 8800 GTS based video cards to perform faster than the ATI Radeon HD 2900 XT. Oddly enough, they were able to handle higher grass distance settings in Oblivion despite the Radeon HD 2900 XT having much higher memory bandwidth.

Battlefield 2142 had a large difference in the gameplay experience between the ATI Radeon HD 2900 XT and both GeForce 8800 GTS video cards. Even with the much less expensive 320 MB GeForce 8800 GTS we were able to play the game smoothly at 16X Transparency Supersampling at 1600x1200, with no problems at all in intense gun fights with massive explosions. The more expensive ATI Radeon HD 2900 XT could not handle anything higher than 4X Performance Adaptive AA at 1600x1200.

S.T.A.L.K.E.R. also proved to separate these video cards by performance. The ATI Radeon HD 2900 XT was clearly the weaker performing video card. We had to lower the rendering quality to ''Objects Dynamic Lighting'' and run at 1280x1024 to receive playable performance. Unfortunately this does diminish the gameplay experience compared to the GeForce 8800 GTS based video cards. We were able to take the game up to full rendering quality and play at 1600x1200 with NVIDIA based cards. With the 320 MB version we had to drop the AF level to 4X and grass density to 50%.

Lost Planet is a fun game, plain and simple; we had a blast playing through the demo. If this is the future of gaming then we are very happy. There is no question that next-generation titles will require fast hardware to keep up with the intense detail. This demo presented some interesting results for us. We found that the ATI Radeon HD 2900 XT really does take a large performance hit when enabling AA, to the point where it just isn't a viable option right now. The GeForce 8800 GTS based video cards, on the other hand, don't take as great a hit, and some gamers may find 2X AA or more playable depending on what framerates you are comfortable with. In Lost Planet's outdoor areas the ATI Radeon HD 2900 XT, without AA, performs slightly better than both GeForce 8800 GTS based video cards. However, in the one indoor area of the performance test called ''Cave'' we saw the framerates suffer and perform slower than the GeForce 8800 GTS based video cards. We cannot wait until the full version of the game is released so we can test all the levels and see how the video cards really compare throughout the entire game.
LordEC is going to rip you for linking HardOCP... XD
[QUOTE=''Wesker776'']LordEC is going to rip you for linking HardOCP... XD[/QUOTE] What's wrong with HardOCP?
[QUOTE=''SDxSnOOpZ''][QUOTE=''Wesker776'']LordEC is going to rip you for linking HardOCP... XD[/QUOTE] What's wrong with HardOCP?[/QUOTE]LoL, I posted in the other thread.
I also think I have discussed this with you a few months ago.
[QUOTE=''SDxSnOOpZ''][QUOTE=''Wesker776'']LordEC is going to rip you for linking HardOCP... XD[/QUOTE] What's wrong with HardOCP?[/QUOTE]

Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately puts the HD2900 into the high-AA-huge-FPS-drop area that it's been well known for.



Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P



I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.
You know, they should have done more in-depth testing. At 2xAA with full CFAA the HD2900XT does perform better than the GTS 640MB in most games, and it looks just as good as 4xMSAA.
[QUOTE=''Makari''][QUOTE=''SDxSnOOpZ''][QUOTE=''Wesker776'']LordEC is going to rip you for linking HardOCP... XD[/QUOTE] What's wrong with HardOCP?[/QUOTE] Their review methodology happens to hit on exactly what the weak point of the HD2900XT is... they pretty much ignore base benchmarks (i.e. they will completely skip no AA) and go after max playable settings, which immediately puts the HD2900 into the high-AA-huge-FPS-drop area that it's been well known for. Some people like to credit that to them being nVidia fanboys, despite Brent's history running ATi websites before he went to [H]. :P I think it's just a really nasty coincidence that these guys are set on doing things a certain way, and that certain way is exactly what the R6x-gen cards are doing badly at.[/QUOTE]I see nothing wrong with testing a $400+ card at the maximum settings possible, with full AA as far as it goes. If you buy a $400 card you probably want the highest possible settings for quite some time, and probably as much AA as you can get too. If you don't want AA and the highest possible settings, then you can just go for a $200 card.
