
Question about X800 series and pixel shader 3.0

Posted: Thu Feb 09, 2006 12:36 am
by dgood
So why is it that a chipset supports 3.0 or doesn't, and what in the actual GPU dictates that? I ask because I have my C3D X800 GTO thinking it's an X800 XT at 573MHz/1080MHz, and it has 16 pipes. However, a f****ng 6600GT 128MB will beat me in 3DMark06 because of its ability to do 3.0. I cannot render 3.0 on the X800 series. I was wondering if a driver would ever come out that would enable it on such a great GPU, or some mod that would enable it. I really just want to understand why it has that limitation. Can someone help? I only got 2177 in 3DMark06 with a 2.4GHz 4000+ and 2GB of RAM.

Posted: Thu Feb 09, 2006 8:43 am
by Topher
You're not one of those guys that Nvidia hired to troll forums and hype up their cards are you? ;-)

Just kidding.....

Posted: Thu Feb 09, 2006 11:46 am
by Bwall
The short answer is that your video card will never support SM 3.0 no matter what. The X1000 series are the first ATI cards to support SM 3.0.
As for differences between SM 2.0 and SM 3.0 give this link a look...

http://www.microsoft.com/whdc/winhec/pa ... VIDIA.mspx

Posted: Thu Feb 09, 2006 12:49 pm
by kenc51
Also, the GeForce 6 series cards don't do SM 3.0 very well either...

Remember, 3DMark is a synthetic benchmark......

Posted: Tue Feb 28, 2006 11:34 am
by Black Mesa Scientist
That's not true. The 6 series handle SM 3.0 fine.

Posted: Tue Feb 28, 2006 12:00 pm
by kenc51
Black Mesa Scientist wrote:That's not true. The 6 series handle SM 3.0 fine.
They do SM 3.0 --> but performance drops badly!!! They have 16-bit precision

Posted: Wed Mar 01, 2006 12:26 pm
by Black Mesa Scientist
Oh, you might be right. I thought it was 32-bit.

Posted: Thu Mar 23, 2006 3:41 am
by Immortal
The reason was that ATI didn't think SM3 support was required back then... and I don't *think* any game supports it yet... but the Unreal 3 Engine will, because it's DX10 :)