
How NVIDIA made the 9600 GT gain extra performance

Posted: Fri Feb 29, 2008 12:12 pm
by Apoptosis
Here is some interesting news on the NVIDIA GeForce 9600 GT and how it behaves when it comes to clock frequency.
When we first reviewed NVIDIA's new GeForce 9600 GT, we noticed a discrepancy between the core clock speed reported by the driver and the core clock speed read from the clock generator directly.
[Attachment: clocks.gif — clock readings comparison]
RivaTuner Overclocking and GPU-Z read the clocks from the NVIDIA driver, displaying whatever value the driver returns as the core clock. RivaTuner Monitoring, however, accesses the clock generator inside the GPU directly and gets its information from there. A PLL generates clocks as follows: it is fed a base frequency from a crystal oscillator, typically in the 13-27 MHz range, and then multiplies and divides this frequency by integer values to reach the final clock speed. For example, 630 MHz = 27 MHz * 70 / 3.
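The PLL arithmetic described above can be sketched like this (a hypothetical helper for illustration, not NVIDIA's or RivaTuner's actual code):

```python
def pll_frequency(crystal_mhz, multiplier, divider):
    """Compute a PLL output clock: the crystal's base frequency is
    multiplied and divided by integer values chosen by the driver."""
    return crystal_mhz * multiplier / divider

# The article's example: a 27 MHz crystal with multiplier 70 and divider 3
print(pll_frequency(27, 70, 3))  # 630.0 MHz
```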

The information about which crystal is used is stored in the GPU's strap registers, which are initialized from a resistor configuration on the PCB and from the VGA BIOS. In the case of the GeForce 9600 GT, the strap says the crystal frequency is 27 MHz, and RivaTuner Monitoring applies that in its clock-reading code, resulting in 783 MHz = 27 MHz * 29 / 1. The NVIDIA driver, however, uses 25 MHz for its calculation: 725 MHz = 25 MHz * 29 / 1.
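Plugging the two crystal assumptions into the same multiplier and divider shows exactly where the gap comes from (these are the numbers quoted in the post, not values read from hardware):

```python
multiplier, divider = 29, 1

# What the silicon actually produces: the strap says a 27 MHz crystal
actual_mhz = 27 * multiplier / divider

# What the driver reports, because it assumes a 25 MHz crystal
reported_mhz = 25 * multiplier / divider

print(actual_mhz, reported_mhz, actual_mhz - reported_mhz)  # 783.0 725.0 58.0
```

So a card "clocked at 725 MHz" according to the driver is actually running its core 58 MHz faster.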

This explains the clock difference, which can only be seen on the core frequency (the memory PLL always runs from the 27 MHz crystal).

We verified this personally on three 9600 GT cards from various manufacturers, and other users have confirmed it as well.
source

Re: How NVIDIA made the 9600 GT gain extra performance

Posted: Fri Feb 29, 2008 2:02 pm
by vicaphit
Why would this be done? Wouldn't they rather just say "clock speed is 780" to sell more units? Or are the drivers just too new to show the correct clocks?

Re: How NVIDIA made the 9600 GT gain extra performance

Posted: Fri Feb 29, 2008 4:13 pm
by ibleet
I agree, they would sell more units showing a higher clock speed. :-k

Re: How NVIDIA made the 9600 GT gain extra performance

Posted: Mon Mar 03, 2008 11:15 am
by DMB2000uk
If they did this the other way round I could see some people being upset, but as it is, I agree that they are missing out on marketing it as faster.

Dan

Re: How NVIDIA made the 9600 GT gain extra performance

Posted: Mon Mar 03, 2008 11:24 am
by Bio-Hazard
The shaders and memory also go in "steps", some larger, some smaller. What some programs show isn't exactly what they're clocked at... #-o

RivaTuner's monitoring will show you exactly what clocks you are getting. I thought everyone already knew this... :mrgreen:

Re: How NVIDIA made the 9600 GT gain extra performance

Posted: Mon Mar 03, 2008 2:18 pm
by vicaphit
I knew RivaTuner would show a few MHz of difference, but 58 MHz? There's no way the difference is that large! Maybe RivaTuner needs an update.

Re: How NVIDIA made the 9600 GT gain extra performance

Posted: Mon Mar 03, 2008 3:40 pm
by Bio-Hazard
It may, or I should say it should; last I checked, it didn't even officially support the drivers I'm using now, let alone the ones for a brand new card... :mrgreen:

I guess I should go check if there's a new version out yet.

Just checked: they posted an updated version (2.07) today.
RivaTuner v2.07
On this page you can download RivaTuner v2.07.
Version: RivaTuner 2.07
Publisher: Unwinder
Date added: 2008-03-04 04:55:07
File Size: 2.26 MB
OS Support: Windows 2000/XP/Vista x64 & x32
License: Freeware
http://downloads.guru3d.com/download.php?det=163