I don't understand how the heck DX10.1 can work when the Nvidia hardware doesn't support it natively. It's like emulating in software an effect which should've been done by hardware.

A few days ago, we learned that Ubisoft's hugely anticipated free-roaming shooter Far Cry 2 would support DirectX 10.1 extensions for cards that support the latest-available version of Microsoft's API.
Today, we have gathered some more information from Ubisoft on the implementation and it's quite an interesting one because the capabilities are also enabled on all Nvidia GeForce 8, 9 and GTX 200 GPUs, even though they don't comply with DX10.1's requirements.
"The Ubisoft team wanted to enhance the anti-aliasing through the reading of the multisampled depth Z-buffers," explained Vincent Greco, Worldwide Production Technical Coordinator at Ubisoft. "This feature was enabled by either using DX10.1 or using a DX10.0 extension supported by Nvidia DirectX 10 GPUs."
Link: http://www.bit-tech.net/news/2008/10/22 ... ar-cry-2/1