Illuminati wrote: I'm not sure I follow your logic. NVIDIA is part of the development program at FutureMark... why would NVIDIA want sites to stop using a benchmark that they are spending a lot of money to look good in?
Sorry about parts of my post, as I had some typos and poor grammar. I wrote most of it during my lunch hour, so I was trying to multi-task with a ham sandwich and chips at the same time.
What I meant was that NV has done a complete 180 on 3dmark2k3. Remember that back at the launch of the GF3, NV had Futuremark launch 3dmark2001 on the same day. In fact, 3dmark2k1 was used during the launch event to show us the power of the GF3 PS engine with the Nature test. Then for the next 2 years it was all 3dmark2k1. Every web site had it. NV was fully behind it, as it showed NV in the top spot, untouchable by anyone else. NV even markets their PA that uses 3dmark scores. So everywhere you looked, 3dmark was da bomb. Did you see one paper from NV back then saying that 3dmark was "not representative of (actual) games, nor is it a good benchmark"? Nope. Everyone loved 3dmark2k1.
Then you had a few events that challenged this belief. The launch of the R300 shocked us all. Then came the utter failure that was the NV30 in Nov of 2002. In Feb of 2k3, Beyond3d showed us that the "8 pipe" NV30 is really a 4 pipe design (8 only with stencil ops). And reviews of the NV30 showed it to be inferior in most respects to the R9700Pro. Then on Feb 11 we had 3dmark2k3 officially launched, and once we saw the DX9 performance gap between the R9700Pro and the NV30... well, yes, it was that bad. Note that other DX9 tests written by other users showed this same piss-poor DX9 performance (ShaderMark, RightMark, etc.).
The same day as the 3dmark2k3 launch, we got this from NV:
http://www.extremetech.com/article2/0,3 ... 239,00.asp
Seems like good ol' NV does not like 3dmark anymore and had pulled out of the beta program back in Dec of 2002 (which is about when they were testing the first pass of their FX line). Their take:
"The reason that we're not all gung ho about it is that (3DMark'03) is not representative of (actual) games, nor is it a good benchmark," said Tony Tamasi, senior director of desktop product management at Nvidia. "That means Nvidia has to expend effort to make sure it runs well on our hardware. All that energy that we spend doesn't benefit the user. None. Zero. All that effort doesn't go to benefit any game, either. That's kind of depressing."
We have B3D and ExtremeTech showing how NV cheated at the time with clip planes and all sorts of BS. In May, FM responded with their take on this mess, using the word "cheat":
http://www.futuremark.com/news/?newsart ... 2003052308 along with a new patch 330 to get around those cheats.
NV claims:
Since NVIDIA is not part in the FutureMark beta program (a program which costs of hundreds of thousands of dollars to participate in) we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer. We don't know what they did but it looks like they have intentionally tried to create a scenario that makes our products look bad. This is obvious since our relative performance on games like Unreal Tournament 2003 and Doom3 shows that The GeForce FX 5900 is by far the fastest graphics on the market today.
Then, just 10 days later, NV and FM got all nice again and announced they'd made up:
http://www.futuremark.com/news/?newsart ... 2003060305 And only then did NV rejoin the 3dmark beta program.
Yea, "optimizations"? BS. There is no way in hell a static clip plane inserted into 3dmark can be anything but a blatant attempt to inflate the 3dmark score. I don't care if you don't like 3dmark at this point; we can all see that is just flat out wrong. Way wrong. Whether we hate 3dmark2k3 or not, we still have to realize that lots of OEMs and other people use it to make buying decisions. And misleading those people with cheats is, again, way wrong.
The issues that NV brought up on the day of the 3dmark2k3 launch are valid, but they have ALWAYS been valid with EVERY version of 3dmark. Was 3dmark2k1 any better? Heck no. For example, we had the K2 video card that got killed by 1000+ 3dmarks when compared to a GF2 MX card. Yet in DX7/DX8 games (Max Payne, UT2k3) the K2 was usually about 2x faster than that GF2 MX card. That's just one example of how "good" 3dmark2k1 was. The fact that it was based off a real engine is pointless. Look at Q3 scores. It's a real engine, right? Yet we know that every game based off the Q3 engine runs differently. Thus benchmarks from Q3 != benches from RtCW != SOFII != COD != JK2 != JK:A != Star Trek != Alice, etc. All of those games are based off the Q3 engine (some are modified Q3 engines), but if you have such wildly varying FPS scores on the SAME engine, then does it even count? No. Every game will have its own bottlenecks, slowdowns, and other issues. Thus all of NV's "issues" with 3dmark2k3 are moot because they have always been there. And it was not until NV said something about 3dmark2k3 that other web sites started to follow. I mean, why do we all hate 3dmark2k3 so much when we loved 3dmark2k1, when they have the same issues?
Well, there you have it. Somewhere in there I was trying to make a point, but I am not sure where now.
This is great and helpful to us, but why don't they work out the "cheating" issue with NVIDIA directly? Is it because they are on NVIDIA's payroll and don't want to risk losing funding? Is it because they can't? Or have they simply given up?
Mind you, I was not part of the talks, but I do have some inside contacts. $$$$$$ is one reason why. NV has how much cash on hand? FM has about 1/1000 of that. The other is legal action. Do not think for a minute the turnabout was not forced by legal action. Claiming "cheat" is a serious thing here in the States, and NV has lawyers. FM does not, per se. One of the production leads on 3Dmark quit over this issue. They have tried to work it out, but NV refuses to remove the cheats. What would you do in that case? You have seen NV's optimization guidelines, haven't you?
An optimization must produce the correct image.
An optimization must accelerate more than just a benchmark.
An optimization must not contain a pre-computed state.
And yet you have shown that with the latest drivers they go against their OWN guidelines. If NV is not gonna stick to THEIR own rules, how are they gonna follow anyone else's? I don't have an answer for that one.
Lastly, about stock timedemos vs. custom timedemos... interesting stuff, huh? In my upcoming Prescott review I used custom demos/utilities for UT2003 and Call of Duty just to avoid possible issues. Sad how far people will go to make something look better than it is.
Yep, it's sad. But until more sites stand up and take measures against IHVs that allow this practice, we will all have to live with it. Pity, eh? And the whole lack of trilinear filtering is an issue in my book, and a pity. I mean, good grief, trilinear is so 1999...
It's a canned benchmark that is worthless in the real world. (or am I missing a use for it?)
It's up to your style and what info you want to give. I still feel that all synthetics have their place. The goal of 3dmark was never to tell you what FPS you will get in next-gen games, but to give you a way to compare two different cards and see which one has the potential of being "better" in those games. We all know that the FX line has a lower-performing PS2.0 engine vs. the R3xx cards. How did we know this? Synthetics showed us this 9 months ago. Recently we have had developers say the same thing, Gabe at Valve and JohnC:
Hi John,
No doubt you heard about GeForce FX fiasco in Half-Life 2. In your opinion, are these results representative for future DX9 games (including Doom III) or is it just a special case of HL2 code preferring ATI features, as NVIDIA suggests?
"Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.
John Carmack"
And today's DX9 games show they tend to run a bit faster on the ATI cards, which now makes it complete. Synthetics told us one was faster a long time ago, developers said the same thing, and now DX9 games show the same thing. Games like the FarCry demo, for example, run PS2.0 on the R3xx cards, but if you try this on an FX card, it drops back to PS1.1.
What about your readers? Say you have JoeSixPack, who can only upgrade once every other year and has $200 to spend on a new card; which should he get? If he was in the market a few months ago, then all he had were DX7/8 games to look at, and we know both of the new cards run DX7/8 games just fine. However, JoeSixPack's card will have to survive some DX9 games, since he will have that card for 2 years. So don't you think it's your job as a reviewer to let him know that one card has a potential weakness in DX9? That that card might have to drop back to partial-precision hints, or fall back to DX8 (PS1.1) shaders, vs. running as a true DX9 card as advertised on the box? I think JoeSixPack has a right to know this. Don't you? Until we get more DX9 games out, you cannot use today's DX9 games as the sole indicator of DX9 performance. Nor can you use DX7/8 games to show DX9 performance. What else can you use but synthetics to give you a general idea of where the products are in terms of DX9 power?
I am not saying use the 3dmark score (which you might as well pipe to /dev/null, as it's useless). However, some of the other DX9 tests that 3dmark has, or the other synthetics, will help give you a clue to potential issues. The key is to know how and when to use them (as well as which are worth reporting). And last time I checked, giving more info was a good thing.