Nvidia GPUs support DX10.1 features in Far Cry 2

Forum for all the nVidia video cards from the past, present and future!
sushrukh
Legit Aficionado
Posts: 99
Joined: Thu Jan 11, 2007 3:53 am

Nvidia GPUs support DX10.1 features in Far Cry 2

Post by sushrukh »

A few days ago, we learned that Ubisoft's hugely anticipated free-roaming shooter Far Cry 2 would support DirectX 10.1 extensions for cards that support the latest-available version of Microsoft's API.

Today, we have gathered some more information from Ubisoft on the implementation and it's quite an interesting one because the capabilities are also enabled on all Nvidia GeForce 8, 9 and GTX 200 GPUs, even though they don't comply with DX10.1's requirements.

"The Ubisoft team wanted to enhance the anti-aliasing through the reading of the multisampled depth Z-buffers," explained Vincent Greco, Worldwide Production Technical Coordinator at Ubisoft. "This feature was enabled by either using DX10.1 or using a DX10.0 extension supported by Nvidia DirectX 10 GPUs."
I don't understand how the heck DX10.1 can work when the Nvidia hardware doesn't support it natively. It's like emulating in software an effect that should've been done by hardware.
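For what it's worth, the feature itself is simpler than it sounds: both the DX10.1 path and Nvidia's DX10 extension just let a shader read the individual depth samples of an MSAA buffer directly, instead of rendering depth in an extra pass. Here's a rough conceptual sketch in Python (not actual D3D/HLSL code; the depth values and the edge-detection threshold are made up for illustration):

```python
# Conceptual sketch only: in a real renderer this would be HLSL reading
# per-sample depth from a multisampled buffer. With 4x MSAA, each pixel
# carries 4 depth subsamples; reading them directly lets the shader spot
# geometry edges (where samples diverge) without a second depth pass.

def is_edge_pixel(subsample_depths, threshold=0.01):
    """Flag a pixel as a geometry edge if its depth samples diverge."""
    return max(subsample_depths) - min(subsample_depths) > threshold

# Made-up data: an interior pixel vs. a pixel straddling an object edge.
interior = [0.500, 0.501, 0.500, 0.501]
edge = [0.500, 0.900, 0.500, 0.900]

print(is_edge_pixel(interior))  # False
print(is_edge_pixel(edge))      # True
```

Whether that read goes through the DX10.1 API or a vendor extension, the result the game sees is the same, which would explain why Ubisoft could enable it on both camps' cards.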


Link :- http://www.bit-tech.net/news/2008/10/22 ... ar-cry-2/1
DMB2000uk
Site Admin
Posts: 7095
Joined: Mon Jul 18, 2005 5:36 pm
Location: UK

Re: Nvidia GPUs support DX10.1 features in Far Cry 2

Post by DMB2000uk »

For Nvidia cards to officially be labelled DX10.1, they would have had to implement ALL of the DX10.1 features. Nvidia didn't implement them all, but some of the features they did choose to implement are part of the DX10.1 spec. Because of how the DX10.1 classification works, those features are just there, and the card is still classed as DX10 only.

So it could be that this was one of the features Nvidia did implement, and Ubisoft just had to hack around and access it through the DX10 API.

Dan