I understand all 8800 cards are going to have the same basic image quality, but that's not what I was getting at. I was referring to the fact that there may easily have been differences in the default settings between the nVidia and AMD/ATI drivers, and that could be the cause of an unfair difference in image quality. I know the AA was buggy and didn't work; I was just using AA and AF as an example of an image quality comparison (one in which both sides are evenly matched and the settings are equal). And my comments were only referring to the CoJ testing, not the Lost Planet demo.
When I said "apples to apples" I meant comparing one card from each camp marketed and targeted at the same consumer segment, with the most similar features and price range. I know it's not possible to get an exact comparison since they are two totally different GPUs made by two different companies, but I'm referring to keeping things in perspective. It's fine to test two different products like that (a $300 current-generation DX10 video card vs. a $500 current-generation DX10 video card), but just be fair to each side when drawing conclusions. I personally feel it is unfair to say that nVidia has better DX10 performance than AMD when comparing differently marketed products, especially since it looks like the ATI card came out on top in the 1600x1200 testing. Instead, I would say something like:
"We can see 2900XT and 8800GTX are actually very close performers in this preliminary benchmark. Both cards show a noticeable performance hit when running in a DirectX 10 environment compared to DirectX 9 seeing as how playable framerates were barely even achieved at 1024x768 resolution. But worth noting is that the performance of the $300 2900XT compared to that of the $500 8800GTX in the 1600x1200 benchmark where the less expensive card was actually able to slightly best the 8800GTX! So from this run of preliminary tests, we can expect DX10 games to be much more taxing on both nVidia and AMD video cards than DX9 games are today."
As for the image quality rant, I just wanted to raise the possibility that the default settings of the nVidia and AMD/ATI driver control panels are not equal. I feel that manually setting the 3D graphics options for each card to equal values could have given a more accurate representation of the two cards' DX10 image quality (AA or not). Only then would there be cause to directly compare a single image quality screenshot between nVidia and AMD to determine a "winner".
That is just my opinion on the whole deal, and by no means am I trying to tell you how to do your job. Since your forums are a public place to discuss thoughts and ideas with everyone else, I decided I would post my thoughts on this unique and interesting article.
But since these demo benchmarks are not even final releases and AMD's hardware is so new, any conclusions drawn from the situation are just a preview and not something that should be set in stone. That said, your articles are an interesting snapshot in time, showcasing the current DirectX 10 capabilities of both sides using what DirectX 10 software is even available (final release or not). And for that, I have to thank you for taking the time to do the testing and write the articles.
