ATI versus nVidia -- DirectX 10 Benchmarking

A place to give your thoughts on our reviews!
Post Reply
Apoptosis
Site Admin
Posts: 33922
Joined: Sun Oct 05, 2003 8:45 pm
Location: St. Louis, Missouri
Contact:

ATI versus nVidia -- DirectX 10 Benchmarking

Post by Apoptosis » Mon May 14, 2007 7:32 am

The day has come for ATI and NVIDIA to go head to head in the first DirectX 10 video game benchmark! Call of Juarez is going to be one of the first PC games to take advantage of DirectX 10, so we took the NVIDIA GeForce 8800 GTX and ATI Radeon HD 2900 XT video cards out for a test drive. Don't expect high frame rates on this one, though! Read on to get the low-down.

Article Title: ATI versus nVidia -- DirectX 10 Benchmarking
Article URL: http://www.legitreviews.com/article/504/1/

dgood
Legit Extremist
Posts: 1190
Joined: Mon Nov 21, 2005 8:54 pm
Location: Lynnwood, WA

Post by dgood » Mon May 14, 2007 8:39 am

This could have been said on either review, but how much of a difference do you expect to see in performance when it's no longer an engineering sample, and when they release better drivers? You can see that the release of a new NVIDIA driver made all the difference on the Call of Juarez test. I'm mildly disappointed with the cards' performance on the two tests so far, since NVIDIA will soon have the 8900 GTX or something along those lines, and this only beats the 8800 GTS and sometimes the 8800 GTX on X3 and Rainbow Six. Though Unreal Engine is my choice more often than not. Nice job guys, on both reviews.

Apoptosis
Site Admin
Posts: 33922
Joined: Sun Oct 05, 2003 8:45 pm
Location: St. Louis, Missouri
Contact:

Post by Apoptosis » Mon May 14, 2007 8:43 am

dgood wrote:This could have been said on either review, but how much of a difference do you expect to see in performance when it's no longer an engineering sample, and when they release better drivers? You can see that the release of a new NVIDIA driver made all the difference on the Call of Juarez test. I'm mildly disappointed with the cards' performance on the two tests so far, since NVIDIA will soon have the 8900 GTX or something along those lines, and this only beats the 8800 GTS and sometimes the 8800 GTX on X3 and Rainbow Six. Though Unreal Engine is my choice more often than not. Nice job guys, on both reviews.
Our card is based on the final PCB... other than the sticker on the back it's a retail card ;) That said, retail cards might have different rated memory ICs on them, but that won't impact stock performance, and companies change ICs all the time.

srgess
Legit User
Posts: 5
Joined: Sat Apr 21, 2007 7:41 am

Post by srgess » Mon May 14, 2007 6:25 pm

Is it possible to run the Call of Juarez DX10 test with the GeForce 8500 GT, 8600 GT, and 8600 GTS?

Apoptosis
Site Admin
Posts: 33922
Joined: Sun Oct 05, 2003 8:45 pm
Location: St. Louis, Missouri
Contact:

Post by Apoptosis » Mon May 14, 2007 8:21 pm

I'm not the video card reviewer -- Brian Wallace is the man to talk to for that... If video cards were free we would be able to do that, but that's not the case. I wanted to get some DX10 numbers up, so I worked up the CoJ benchmarking that you read here on the only cards I have.

Alathald
Legit Extremist
Posts: 1632
Joined: Sun Dec 17, 2006 11:55 pm
Location: Southern Ohio
Contact:

Post by Alathald » Mon May 14, 2007 8:41 pm

Wow, with that translucent cover on the graphics card and the neon light on the HSF, that pic looks quite surreal...

BTW nice review, as usual. :shock:

santiagodraco
Legit Little One
Posts: 1
Joined: Mon May 14, 2007 9:13 pm

Post by santiagodraco » Mon May 14, 2007 9:18 pm

I find comments like "oh Snap! That didn't go as planned I wouldn't think." to be strange in reviews. First off, the reviewer compared a card that is $200 more than the competitor's card; secondly, just because ATI has an agreement with the game publisher doesn't mean the game runs better on their hardware.

Personally, I don't think this reflects well at all on Nvidia, considering that in the higher-quality, higher-res benchmark ATI won... with a significantly lower-priced card that also has much less mature drivers.

As another poster said, this should have been a comparison between the 2900 and a GTS anyway, at which point I think we'd have seen an even more drastic cost/performance difference.

Apoptosis
Site Admin
Posts: 33922
Joined: Sun Oct 05, 2003 8:45 pm
Location: St. Louis, Missouri
Contact:

Post by Apoptosis » Mon May 14, 2007 9:24 pm

Thanks for the comments (I wrote the review). I wish I were able to use more video cards, but since I'm not the main video card reviewer I don't have access to all the cards that I wish I did.

As for the game running better on one company's hardware versus the other...

This is the official info from AMD. It looks like a new version will appear within a couple of weeks, according to them...
"The ATI Radeon HD X2900 XT reviewers guide shows an "N/A" score for the Nvidia GeForce 8800 GTS 640MB when testing DX10 Call of Juarez under Balanced Mode. We have found that this was caused by an application issue when MSAA is enabled. The benchmark can be run on both ATI Radeon and Nvidia hardware with MSAA disabled, so at present we would encourage you to perform any direct comparisons based on running the benchmark with MSAA disabled. This application issue will be fixed in an upcoming patch available in the next couple of weeks. The full DirectX 10 version of Call of Juarez will be available to the public sometime in June with full functionality for all DirectX 10 capable graphics cards."
The CEO of Techland (the game developer) e-mailed me this today:
The last official QA-tested version is the one you have. There might be a newer internal build, but it is not QA tested as far as I know, and probably for that reason it was not sent out by AMD yet. These are just my speculations, as this benchmark is done in exclusive cooperation with AMD and they are the ones who do most of the testing and decide when it is ready to send out.

You really need to contact them regarding that.
As one person said to me... "Hey, they don't QA their own title? AMD decides when their game is 'done'. WTF?" What the heck is right... I'm not sure what is going on with this benchmark, but something doesn't sit well with many in the industry. The DX10 demo of Lost Planet for the PC comes out tomorrow... It sounds more fair than Call of Juarez, if you ask me.

Hope this helped clear things up and again sorry I didn't have an 8800 GTS 640MB video card to compare it to.

gvblake22
Legit Extremist
Posts: 1111
Joined: Thu Feb 17, 2005 9:39 am
Location: Northern Michigan
Contact:

Post by gvblake22 » Tue May 15, 2007 3:52 pm

Very interesting situation you have uncovered there, Nate! :lol:

As for your articles, I'm glad you took the time to compare two high-end DX10-capable video cards, qualitatively and quantitatively, but I am having a hard time with your conclusions. Like santiagodraco mentioned, you are comparing a ~$400 card with a ~$600 card. That is fine and dandy since it is the only thing available from the AMD camp at the moment, but don't write conclusive findings based on this hardware and software setup. Writing a conclusion that assumed the cards were of relatively equal performance, and saying that the nVidia card (8800 GTX) having better performance than the ATI card (2900 XT) was "planned", isn't fair at all. Now please don't get me wrong, I have no issues with you comparing the two cards, but don't make blanket statements like this that lead readers to believe that nVidia is absolutely better than AMD/ATI. There are a lot of shades of gray in the graphics market, so I was just a little frustrated not to see any mention of the fact that you aren't comparing apples to apples here. Sure, it's AMD's best card and (basically) nVidia's best card, but they are still on different levels and not directly comparable for making blanket statements.

The other issue I have is with your image quality conclusions. You said you just left the driver settings at their default values and ran the tests. Well, that may give you an idea of what you can expect out of the box, but it's still not a fair enough comparison to say that nVidia has better DX10 image quality than AMD. Like every other review done (whether it be by Legit Reviews or anyone else), you can only make conclusions like that if you know exactly what the quality settings are! Only when you are comparing (for example) 4xAA and 16xAF on both cards is it a fair fight. So saying nVidia has better image quality than AMD when no 3D setting was ever touched in the driver control panel is pretty useless. I don't know of too many (if any) people who would ever give such a comparison any weight.
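For what it's worth, a screenshot comparison doesn't have to be eyeballed at all: even a trivial per-pixel difference metric can tell you whether two captures differ and roughly by how much. Here is a minimal sketch; the function name and the tiny sample "images" are made up for illustration and assume equal-size grayscale captures given as lists of pixel rows:

```python
def mean_abs_diff(img_a, img_b):
    """Mean absolute per-pixel difference between two equal-size
    grayscale images, given as lists of pixel rows (0-255 values)."""
    total, count = 0, 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += abs(pa - pb)
            count += 1
    return total / count

# Identical captures score 0.0; any rendering difference pushes it up.
a = [[10, 20], [30, 40]]
b = [[10, 22], [30, 44]]
print(mean_abs_diff(a, b))  # → 1.5
```

A metric like this only says the outputs differ, not which looks better, but it would at least confirm whether default driver settings produce different frames before anyone argues about quality.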

But not to end on a negative note, thanks for taking the time to run some preliminary benchmarks and post up some screenshots of DirectX 10 hardware and software. It is definitely going to be an interesting and bumpy road when the first DX10 titles start appearing on the shelves. I think we are all going to be greeted with a whole new round of incompatibilities and bugs no matter how recent the game's build is. Vista is young and DX10 hardware and software is still very young; and we all know what to expect from immature hardware and software! :lol: :rolleyes:

Apoptosis
Site Admin
Posts: 33922
Joined: Sun Oct 05, 2003 8:45 pm
Location: St. Louis, Missouri
Contact:

Post by Apoptosis » Tue May 15, 2007 3:58 pm

gvblake22 wrote:Very interesting situation you have uncovered there, Nate! :lol:

As for your articles, I'm glad you took the time to compare two high-end DX10-capable video cards, qualitatively and quantitatively, but I am having a hard time with your conclusions. Like santiagodraco mentioned, you are comparing a ~$400 card with a ~$600 card. That is fine and dandy since it is the only thing available from the AMD camp at the moment, but don't write conclusive findings based on this hardware and software setup. Writing a conclusion that assumed the cards were of relatively equal performance, and saying that the nVidia card (8800 GTX) having better performance than the ATI card (2900 XT) was "planned", isn't fair at all. Now please don't get me wrong, I have no issues with you comparing the two cards, but don't make blanket statements like this that lead readers to believe that nVidia is absolutely better than AMD/ATI. There are a lot of shades of gray in the graphics market, so I was just a little frustrated not to see any mention of the fact that you aren't comparing apples to apples here. Sure, it's AMD's best card and (basically) nVidia's best card, but they are still on different levels and not directly comparable for making blanket statements.

The other issue I have is with your image quality conclusions. You said you just left the driver settings at their default values and ran the tests. Well, that may give you an idea of what you can expect out of the box, but it's still not a fair enough comparison to say that nVidia has better DX10 image quality than AMD. Like every other review done (whether it be by Legit Reviews or anyone else), you can only make conclusions like that if you know exactly what the quality settings are! Only when you are comparing (for example) 4xAA and 16xAF on both cards is it a fair fight. So saying nVidia has better image quality than AMD when no 3D setting was ever touched in the driver control panel is pretty useless. I don't know of too many (if any) people who would ever give such a comparison any weight.

But not to end on a negative note, thanks for taking the time to run some preliminary benchmarks and post up some screenshots of DirectX 10 hardware and software. It is definitely going to be an interesting and bumpy road when the first DX10 titles start appearing on the shelves. I think we are all going to be greeted with a whole new round of incompatibilities and bugs no matter how recent the game's build is. Vista is young and DX10 hardware and software is still very young; and we all know what to expect from immature hardware and software! :lol: :rolleyes:
ATI versus NVIDIA can never be apples to apples, and since you write articles you know that just like anyone else. If an 8800 GTS were used instead of an 8800 GTX, none of the image quality screenshots would be different, so what's the point? Lost Planet is an NVIDIA-backed title, so of course it 'should' run better on those cards... As for testing with AA... In CoJ, MSAA is broken for NVIDIA cards, and in Lost Planet the ATI cards aren't running correctly, so what's the point? The whole point of the article was just to show people where the companies are in terms of DX10 and their higher-end cards... This wasn't meant to be an HD 2900 XT versus 8800 GTX article. I think this article and the CoJ one do the job of showing consumers where both companies are at in performance and image quality with these two respective cards!

gvblake22
Legit Extremist
Posts: 1111
Joined: Thu Feb 17, 2005 9:39 am
Location: Northern Michigan
Contact:

Post by gvblake22 » Tue May 15, 2007 5:28 pm

I understand all 8800 cards are going to have the same basic image quality, but that's not what I was getting at. I was referring to the fact that there may easily have been differences in the default settings between the nVidia and AMD/ATI drivers, and that could be cause for an unfair difference in image quality. I know the AA was buggy and didn't work; I was just using AA and AF as an example of an image quality comparison (one in which both sides are evenly matched and the settings are equal). And my comments were only referring to the CoJ testing, not the Lost Planet demo.

When I said "apples to apples" I meant comparing two cards from each camp marketed and targeted at the same consumer market, with the most similar features and price ranges. I know it's not possible to get exact comparisons since they are two totally different GPUs made by two different companies, but I'm referring to keeping things in perspective. It's fine to test two different products like that (a $300 current-generation DX10 video card vs. a $500 current-generation DX10 video card), but just be fair to each side when drawing conclusions. I personally feel that it is unfair to say that nVidia has better DX10 performance than AMD when comparing differently marketed products. Especially since it looks like the ATI card came out on top in the 1600x1200 testing. Instead, say something like: "We can see the 2900XT and 8800GTX are actually very close performers in this preliminary benchmark. Both cards show a noticeable performance hit when running in a DirectX 10 environment compared to DirectX 9, seeing as how playable framerates were barely even achieved at 1024x768 resolution. But worth noting is the performance of the $300 2900XT compared to that of the $500 8800GTX in the 1600x1200 benchmark, where the less expensive card was actually able to slightly best the 8800GTX! So from this run of preliminary tests, we can expect DX10 games to be much more taxing on both nVidia and AMD video cards than DX9 games are today."
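One way to make the price/performance point concrete is to normalize frame rate by card price. A quick sketch, where the function name and all prices and frame rates are hypothetical placeholders, not the article's measured numbers:

```python
# Hypothetical price/performance comparison; none of these
# numbers come from the review itself.
cards = {
    "HD 2900 XT": {"price_usd": 400, "avg_fps": 25.0},
    "8800 GTX":   {"price_usd": 600, "avg_fps": 28.0},
}

def fps_per_100_dollars(card):
    """Frames per second delivered per $100 of card price."""
    return card["avg_fps"] / card["price_usd"] * 100

for name, card in cards.items():
    print(f"{name}: {fps_per_100_dollars(card):.2f} fps per $100")
```

With placeholder numbers like these, a card that loses the raw FPS race can still win on fps-per-dollar, which is the distinction being argued here.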

As for the image quality rant, I just wanted to raise the possibility that the default settings of the nVidia and AMD/ATI driver control panels may not be equal. I feel that manually setting the 3D graphics options for each card to equal values would have been a more accurate representation of the two cards' DX10 image quality (AA or not). I think that only then would there be cause to directly compare a single image quality screenshot between nVidia and AMD to determine a "winner".

That is just my opinion on the whole deal, and by no means am I trying to tell you how to do your job. Since your forums are a public place to discuss thoughts and ideas with everyone else, I just decided I would post my thoughts on this unique and interesting article.

But since these demo benchmarks are not even final releases and AMD's hardware is so new, almost any conclusions drawn from the situation are just a preview and not something that should be written in stone. That said, your articles are an interesting snapshot in time, showcasing the current DirectX 10 capabilities of both sides on what DirectX 10 software is even available (whether it be a final release or not). And for that, I have to thank you for taking the time to do the testing and write the articles. :)

brites
Legit Aficionado
Posts: 99
Joined: Wed Mar 14, 2007 7:53 am
Location: Matosinhos, Portugal

Post by brites » Wed May 16, 2007 3:01 pm

yes... the lack of good drivers is very :lame:

Apoptosis
Site Admin
Posts: 33922
Joined: Sun Oct 05, 2003 8:45 pm
Location: St. Louis, Missouri
Contact:

Post by Apoptosis » Wed May 16, 2007 7:59 pm

Due to popular demand, I have updated the benchmark results with a GeForce 8800 GTS OC 640MB video card!!!

I was able to get BFG Technologies to overnight one to me, so now I have one to benchmark. Enjoy! I also took the time to benchmark with the new NVIDIA 158.43 driver, and it seemed to improve CoJ performance on the GeForce 8800 GTS a bit. The difference between ForceWare 158.42 and 158.43 was not significant on either card, though.
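Driver-to-driver gains like this are easiest to see as a percent change between averaged runs. A small sketch of that arithmetic; the run numbers below are made-up placeholders, not the actual 158.42/158.43 results:

```python
# Hypothetical comparison of two driver versions; FPS values are
# placeholders, not the review's measured numbers.
def percent_change(old_fps, new_fps):
    """Signed percent change going from the old driver to the new one."""
    return (new_fps - old_fps) / old_fps * 100.0

runs_old = [24.1, 24.3, 23.9]  # e.g. three runs on the older driver
runs_new = [24.8, 25.0, 24.6]  # e.g. three runs on the newer driver

avg_old = sum(runs_old) / len(runs_old)
avg_new = sum(runs_new) / len(runs_new)
print(f"average change: {percent_change(avg_old, avg_new):+.1f}%")
```

Averaging multiple runs before taking the percentage keeps a single outlier run from looking like a real driver improvement.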

[Updated benchmark charts]

gvblake22
Legit Extremist
Posts: 1111
Joined: Thu Feb 17, 2005 9:39 am
Location: Northern Michigan
Contact:

Post by gvblake22 » Thu May 17, 2007 6:09 am

Oh wow, that's awesome man, nicely done! =D>

I find it very interesting to see how much the 2900XT gains back at the higher resolution! And just seeing the performance difference between some of those benchmarks due to drivers gives me hope that they just need to tweak some software in order to get DX10 performance up to par.

Post Reply