Company of Heroes Goes DirectX 10
- Apoptosis
- Site Admin
- Posts: 33941
- Joined: Sun Oct 05, 2003 8:45 pm
- Location: St. Louis, Missouri
- Contact:
Company of Heroes Goes DirectX 10
This past week THQ Inc. released a patch adding Microsoft DirectX 10 support to the PC game Company of Heroes. This makes Company of Heroes the first commercially available DirectX 10 Windows PC game, and we have been busy benchmarking it over the last couple of days to bring you performance numbers on the latest and greatest DX10 video cards from both ATI and NVIDIA.
Article Title: Company of Heroes Goes DirectX 10
Article URL: http://www.legitreviews.com/article/507/1/
- Apoptosis
- Site Admin
- Posts: 33941
- Joined: Sun Oct 05, 2003 8:45 pm
- Location: St. Louis, Missouri
- Contact:
Here is something interesting I just found out looking at the shader clocks...
The shader clocks on the two 768MB GDDR3 GeForce 8 Series cards are...
8800 Ultra - 1667 MHz
8800 GTX - 1350 MHz
The difference in the shader clocks between the Ultra and the GTX is 23.5%....
At 1600x1200 the difference in performance is 27% with no AA and 20% with 4X AA.... If you average those two scores the average performance improvement is 23.5%...
Average performance improvement at 16x12 is 23.5% and the shader clock difference is 23.5%. To me it is interesting that upcoming game titles won't be CPU limited or even GPU limited... "Shader Limited" looks like a term we might start to see more often. I mean, what are the chances that this came out to the same percentage?
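The arithmetic above is easy to check yourself; here is a quick sketch using the clock and FPS figures quoted in this post:

```python
# Shader clocks from the post (MHz)
ultra_shader_mhz = 1667  # 8800 Ultra
gtx_shader_mhz = 1350    # 8800 GTX

# Ultra's shader clock advantage over the GTX, as a percentage
clock_delta = (ultra_shader_mhz - gtx_shader_mhz) / gtx_shader_mhz * 100
print(f"Shader clock delta: {clock_delta:.1f}%")  # ~23.5%

# Measured performance deltas at 1600x1200 (no AA and 4X AA)
perf_deltas = [27.0, 20.0]
avg_perf_delta = sum(perf_deltas) / len(perf_deltas)
print(f"Average performance delta: {avg_perf_delta:.1f}%")  # 23.5%
```

Both work out to 23.5%, which is what suggests the game is scaling almost perfectly with shader clock rather than core clock or memory bandwidth.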
- Digital Puppy
- Moderator
- Posts: 4649
- Joined: Tue Apr 27, 2004 12:36 pm
- Location: LA LA Land, CA
- Contact:
Are these settings overclockable?
Apoptosis wrote: Here is something interesting I just found out looking at the shader clocks...
The shader clocks on the two 768MB GDDR3 GeForce 8 Series cards are...
8800 Ultra - 1667 MHz
8800 GTX - 1350 MHz
The difference in the shader clocks between the Ultra and the GTX is 23.5%....
At 1600x1200 the difference in performance is 27% with no AA and 20% with 4X AA.... If you average those two scores the average performance improvement is 23.5%...
Average performance improvement at 16x12 is 23.5% and the shader clock difference is 23.5% --- To me it is interesting to that upcoming game titles won't be CPU limited or even GPU limited... "Shader Limited" looks like a term we might start to see more often. I mean what are the chances that this came out to the same percentage..
What's the Shader clock frequency on the ATI?
Just a little puppy trying to make it in a big digital world.
http://www.legitreviews.com/article/507/3/
Kougar wrote: I didn't spot any mention of drivers used? ;) I assume NOT the just-released official WHQL Catalyst 7.5 drivers?
Just got a warranty OC'd Ultra, so I'm glad the extra shader performance will help in DX10 games. I do hope these numbers improve for both ATI and NVIDIA cards though.
Legit Reviews wrote: "Once the latest Company of Heroes patch was installed we could run the game with DirectX 10 enabled to benchmark both the ATI Radeon HD 2900 XT and the NVIDIA GeForce 8800 series video cards. The ATI Radeon 2900XT used the CATALYST 7.5 drivers that were released on May 31st, 2007 and all three of the NVIDIA cards used Forceware 145.45 drivers that were also released on May 31st, 2007."
- Apoptosis
- Site Admin
- Posts: 33941
- Joined: Sun Oct 05, 2003 8:45 pm
- Location: St. Louis, Missouri
- Contact:
srgess,
Those just happened to be the cards that I had on the test bench. That review was done prior to my leaving for Taiwan, so it was also done on a short time frame as I had a plane to catch! I'll be here (in Taiwan) for 8 days to cover Computex and then visit some factories in mainland China.
Hope you all enjoyed this article... I spent my time doing that instead of packing like I should have!
- maj0r_pawnage
- Legit Extremist
- Posts: 408
- Joined: Sat Feb 17, 2007 7:16 pm
- Location: Toronto Ontario
That's pretty sad. I would like to see how the 8600s do in those games. What's the point of putting out a DX10 card like that if it can't even play DX10 at any decent FPS?
E6300 @ 3.51GHz 1.392v
Asus Maximus Formula
Antec 550Watt Psu
Mushkin DDR-1206@ 5-5-5-15 / 2.28v
Zalman CPU Cooler CNPS7700
640 8800GTS (OC 691MHz Core/2.132GHz Mem)
2x80mm case fans +2x120mm case fans
Maybe this time software is ahead of hardware?
maj0r_pawnage wrote: thats pretty sad, i would like to see how the 8600s do vs those games, whats the point of putting out a dx10 card like that if it cant even play dx10 at any decent FPS
Also, the nVidia GeForce 8 (non-8800) cards don't do very well in anything. Even with DX10 support, I don't think you would get playable framerates, considering an overclocked 8800 Ultra can't even score 60 FPS at a not-so-high resolution (1280x1024). I know the tests were done with the highest graphical options, but still, an $800 card should, in my opinion, be doing better than that, especially since one of the main reasons behind G80 is DX10.
- maj0r_pawnage
- Legit Extremist
- Posts: 408
- Joined: Sat Feb 17, 2007 7:16 pm
- Location: Toronto Ontario
I guess I'm going to have to save up cash to order a 2nd 8800GTS 640; my first one hasn't even been shipped yet. It's pretty sad that you have to buy SLI of the supposed "high-end" models in order to get decent frame rates.
E6300 @ 3.51GHz 1.392v
Asus Maximus Formula
Antec 550Watt Psu
Mushkin DDR-1206@ 5-5-5-15 / 2.28v
Zalman CPU Cooler CNPS7700
640 8800GTS (OC 691MHz Core/2.132GHz Mem)
2x80mm case fans +2x120mm case fans
- Apoptosis
- Site Admin
- Posts: 33941
- Joined: Sun Oct 05, 2003 8:45 pm
- Location: St. Louis, Missouri
- Contact:
srgess,
srgess wrote: Oh, in those circumstances, yeah, you had no choice. But up to now I haven't seen any DX10 benchmarks of the 8500 or 8600 series. It was more a suggestion to be the first to do it.
Let's just say NVIDIA didn't even send us an 8500 series card, and no AIB partners have sent us one or even offered us one. That just goes to show that the performance is nothing to write home about. Brian Wallace, the video card guru here on LR, has the 8600 GT and the 8600 GTS and is an avid player of CoH, so maybe in the future he will include some DX10 numbers for all to see. The 8600 GTS shouldn't be too bad at resolutions under 16x12, I wouldn't think.