
Nvidia GPUs support DX10.1 features in Far Cry 2

Posted: Fri Oct 24, 2008 1:29 am
by sushrukh
A few days ago, we learned that Ubisoft's hugely anticipated free-roaming shooter Far Cry 2 would support DirectX 10.1 extensions for cards that support the latest-available version of Microsoft's API.

Today, we have gathered some more information from Ubisoft on the implementation, and it's quite an interesting one: the capabilities are also enabled on all Nvidia GeForce 8, 9 and GTX 200 GPUs, even though those chips don't comply with DX10.1's requirements.

"The Ubisoft team wanted to enhance the anti-aliasing through the reading of the multisampled depth Z-buffers, explained Vincent Greco, Worldwide Production Technical Coordinator at Ubisoft. "This feature was enabled by either using DX10.1 or using a DX10.0 extension supported by Nvidia DirectX 10 GPUs."
I don't understand how the heck DX10.1 features can work when the Nvidia hardware doesn't support them natively. It's like emulating in software an effect that should've been done by hardware.


Link :- http://www.bit-tech.net/news/2008/10/22 ... ar-cry-2/1

Re: Nvidia GPUs support DX10.1 features in Far Cry 2

Posted: Fri Oct 24, 2008 2:43 am
by DMB2000uk
For Nvidia cards to officially be labelled DX10.1, they would have had to implement ALL of the DX10.1 features. Nvidia didn't implement them all, but it did choose to implement some features that are part of the DX10.1 spec; because of how the DX10.1 classification works, those features just sit there unadvertised and the card is still classed as DX10 only.

So it could be that this was one of the features Nvidia did implement, and Ubisoft just had to hack around and access it through the DX10 API, along the lines of the sketch below.
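
For anyone wondering what "reading the multisampled depth Z-buffer" actually involves, here's a rough C++ sketch of the standard DX10.1 path. To be clear, this is my own illustration, not Ubisoft's code; the helper name is made up, and Nvidia's DX10 driver extension isn't publicly documented, so only the plain D3D10.1 route is shown.

[code]
// Rough illustration (not Ubisoft's code): a 4x MSAA depth buffer created
// so it can be both rendered to and read back in a shader. Reading a
// *multisampled* depth surface like this is the DX10.1 capability being
// discussed; a stock DX10.0 device is expected to reject it.
#include <d3d10_1.h>

// Hypothetical helper name, just for this example.
HRESULT CreateReadableMsaaDepth(ID3D10Device1* device, UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    // Typeless format lets one texture back both a depth-stencil view
    // and a shader resource view.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = 4;                         // 4x MSAA
    td.Usage            = D3D10_USAGE_DEFAULT;
    td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = device->CreateTexture2D(&td, nullptr, tex);
    if (FAILED(hr)) return hr;

    // View for writing depth during normal rendering.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // View for reading the depth samples back in a pixel shader, e.g.
    // during a custom anti-aliasing resolve pass.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*tex, &sd, srv);
}

// Shader model 4.1 read side, per sample:
//   Texture2DMS<float, 4> gDepth;
//   float depth = gDepth.Load(int2(screenPos.xy), sampleIndex);
[/code]

On GeForce 8, 9 and GTX 200 the DX10.1 runtime isn't available, so presumably the driver recognises an equivalent request made through the DX10 API and allows it anyway; that would be the "DX10.0 extension" Greco mentions.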

Dan