Update 5.1:

Still no word from NVIDIA. It seems we have to be patient.

Update 5:

NVIDIA told us they will give us an official response on Monday, not today.
We feel bad for GeForce FX 5800 owners who are currently living in fear of the
unknown. Our advice – don’t use screensavers and you should be fine. Also, forget
about computers and enjoy Easter.

NVIDIA was extremely tight-lipped. Still, we managed to hear/sense a few things:

  • NVIDIA claims 95-100C is well within the 5800’s safe operating range.
  • We have an impending sense of doom. Nah, just kidding! However, we do have
    a strong feeling that NVIDIA might play the card that the board we used
    for our testing is defective. God, we hope this is not the case and that
    our feeling is completely wrong!
  • Why? Because we feel it’s unimportant whether only our board can "make"
    artifacts and other "cool" effects. The essence of the problem
    is that the fan stops when it shouldn’t and that the card heats up to an extremely high temperature.

    And this has been reproduced by many. We feel there are not too many people
    who can be comfortable with having a piece of silicon in their system at 100C,
    radiating heat like there is no tomorrow.
  • And that’s it :(

Update 4:

The saga continues:

  • Well, we tried a bunch of old and new games and could not reproduce the
    effect of the fan stopping in them. However, we managed to get the FX 5800
    Ultra to overheat in Il2 Sturmovik even though the fan was spinning. The
    effect was exactly the same as in 3dMark – the game started stuttering
    until the temperature dropped, and then it continued.
  • We tried to simulate the problem on yet another system (EPoX 8RDA+, 2x 256MB
    Corsair XMS3500C2, Adaptec 29160N, 2 SCSI disks, Win XP, etc.) and we got
    a few BSODs from using the screensaver. The text on the BSOD was the stuff you
    usually get when your NVIDIA GPU-based card crashes after you overclock it
    too much. We did not overclock the FX 5800 we have – there is no need for
    additional heat producing capabilities – the card has enough "firepower"
    as it is.
  • Our findings were repeated by CHIP.DE
    – and they used a Canterwood based system. They also claim that they did not
    encounter the same problem when using the older Detonators 42.68. I also saw
    posts on various forums in which people recreated the problem. The key,
    as we have stated quite a few times, is NOT to run the screensavers in "Screen
    Saver Preview" mode, but in a regular way.
  • We think the bug is related to some miscommunication between Windows power
    management and the drivers. We asked NVIDIA to send us an explanation of how
    exactly their FX cooling system works, but so far we have received nothing.
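If our power-management hypothesis is right, the failure mode might look something like the toy model below. To be clear: every class name and event string in it is our invention – we have no access to NVIDIA’s driver internals – it only illustrates the guess that the fan is keyed to an "app is in 3D" flag that a real screensaver (unlike the preview) somehow clears:

```python
# Toy model of the bug as we suspect it: the driver's fan policy keys off an
# "app is in 3D" flag fed by session events, and never looks at the sensor.
# Every name and event below is hypothetical, for illustration only.

class FanPolicy:
    def __init__(self):
        self.fan_on = False
        self.app_3d_active = False

    def on_event(self, event):
        if event == "3d_app_start":
            self.app_3d_active = True
        elif event in ("3d_app_stop", "screensaver_desktop_switch"):
            # Our guess: the desktop switch a *real* screensaver performs is
            # reported like the 3D app going away, so the flag clears even
            # though the GPU keeps rendering in 3D. The preview does no such
            # switch, which would explain why the fan keeps spinning there.
            self.app_3d_active = False
        # The fan tracks only the flag -- temperature never enters into it.
        self.fan_on = self.app_3d_active

policy = FanPolicy()
policy.on_event("3d_app_start")                # screensaver begins rendering
print(policy.fan_on)                           # True: fan spins up, as we hear
policy.on_event("screensaver_desktop_switch")  # the desktop switch happens
print(policy.fan_on)                           # False: fan parks while still in 3D
```

Again, this is only a sketch of our guess; the real logic could just as easily live in the card BIOS.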

JUST IN – NVIDIA finally sent us an e-mail in which they say
they’ll give us a "full response for you by end of today, ready for communication
to your readers."

Update 3:

First thing – we’ve uploaded a new .mov file that shows what happens after
you exit the screensaver – when the temperature of the card is high – and then
run 3dMark 2001SE. To download it, click

Granted, the quality is not great (it’s bad, to be honest) but this is the best
we could do with our camera – it keeps refocusing, plus the monitor refresh effect
is also present. However, part of the poor quality also comes from the screen
distortion caused by heat. When 3dMark starts, it’s pretty easy to see what happens.
During this period the fan starts running and starts cooling the card. After
it cools down the card, 3dMark resumes as if nothing happened. This we did not
film as the movie would have been too long.

List of things we feel are worth mentioning:

  • The GeForce FX 5800 Ultra we are using is really a sturdy card. We’ve been
    torturing it quite a bit and it’s still alive and ticking. True, we tried not
    to overdo it with the heat, but still, it’s strong as nails.
  • We feel it’s really impressive that 3dMark (or any other 3D application that
    you start after the screensaver does its "magic") does not crash –
    instead it just hangs and then continues when the cooling does its job.
  • When the fan kicks in, it cools down the card fast. In other words, when
    it works, the cooling is excellent. Too bad it’s so noisy (and annoying – the
    sound is very high-pitched).
  • The effect is universal to both D3D and OpenGL screensavers. We haven’t
    been able to identify what makes the fan stop, but the most probable cause
    is some little glitch in the drivers. Other than drivers, it could be the
    BIOS of the video card or, perhaps, there is something going on between Windows
    and the drivers. Testing under some other OS would be a good thing.
  • Today we will test 3D plug-ins for various media players to see if they
    could reproduce the same effect (that the fan stops). We’ll also try as many
    different games, from older titles to new stuff to see if there is a title
    that makes the fan stop. We’ll also play with the things like turning off
    hardware T&L, etc.
  • The general problem is that there are very few GF FX 5800s around, so there
    aren’t too many people who could "play" with one. Furthermore, many
    who do have a 5800 Ultra use watercooling for it… I know that many will
    feel uncomfortable pushing their card to its limits, but we would really
    appreciate it if you could send us feedback.
  • We didn’t have much time to test different driver versions so currently
    we are only certain that Detonators 43.45 and 43.51 are affected by the same
    thing. We have received unconfirmed reports that it also applies to Detonators
    older than 43.45 but, as I said, these reports are unconfirmed.
  • At one point we thought that there was a relation between ambient temperature
    and the problem. After testing we concluded there isn’t any.
  • As I said, we hope that the fix for the problem will be easy (new drivers,
    etc.), but we also feel this issue is very serious. We are not fame seekers;
    we just feel that consumers have to be aware of this problem. That is why we
    published this information in English, even though we are a Croatian site that
    writes in Croatian – and it will stay that way. The good thing is that there
    aren’t that many people around who have 5800 cards. We are also not
    anti-NVIDIA or pro-ATi or whatever.
  • When we think, we think in Croatian. Our English isn’t bad, but we are not
    native speakers and sometimes we make mistakes. A good example is our claim
    that it was statistically impossible that we got the worst chip in the world
    on our card. It’s not impossible, it’s just highly unlikely (a reader
    from Canada pointed that out. Thanks!).
  • Our statement from Update 2 about oven testing and fundamental flaws was not
    the smartest thing we could say. What we originally meant (during the discussion
    within the team) was something different. Oven cooking and temperatures of
    100C are usually used to kill bugs in food (for example to pasteurize milk
    – I hope we didn’t use the wrong term here). So, if 100C kills bugs, too bad
    it didn’t "kill" the bug we found – in other words, we meant it
    as a joke. We are aware that heating things to high temperatures is a normal
    quality assurance procedure. We apologize for this.


(This is how you can nicely fry a GF FX 5800 Ultra card. The text is in English
because I can’t be bothered to do a translation (and it was written in English
because of Gainward and NVIDIA). Enjoy!)

Update 2: NVIDIA gave us a call and told us they were trying
to recreate the problem and that they were partially “successful”. Basically,
they can encounter everything BUT the screen corruption (artifacts and other beasts)
– i.e. the fan indeed stops, blah blah. They also told us that their chips are
tested in an oven to make sure they work without any problems at temperatures
around 100C. So, their current conclusion is that our board is a black sheep…
NVIDIA couldn’t get the DiveVisions screensaver to work “as advertised”, but OpenGL
Pipes and Matrix Reloaded worked fine.

Our comments:

  • The download/free version of DiveVisions indeed does not work. It seems we
    have a different version.
  • Even if we managed to get the absolutely worst GPU in the World on our card
    (statistically impossible) and are the only ones who can experience screen
    corruption, the fact that the cards heat up to 100C still remains. NVIDIA
    told us that only journalists touch cards with their fingers (!) and that therefore
    regular users are not in danger. But what about their whole systems? Without
    any problem we managed to simulate case ambient temperature of around 65C
    – in a full tower case with 3 fans (plus one on the PSU). Plug a few cards
    in your motherboard to restrict the airflow even further and just watch the
    temperature rise. Watch how the temperature of your other cards rises too.
  • Anything that has to be tested in an oven has serious fundamental flaws.
  • We are trying to film the screen corruption but are having difficulties getting
    it right, as the movie resolution of our camera is not good enough to show
    details like that. With a few more tries we might get it right.

Update: Last night we tried it on the following system:

  • P4 2.4GHz (533FSB)
  • Gigabyte 8GE667 Pro i845GE mobo
  • 1x 256MB Corsair XMS3500C2
  • CL Audigy! (drv update 31.12.2002)
  • and of course, GF FX 5800 Ultra (rest is same, DX9, Det 43.45s,…)

The same thing happened! We also tried the beta 43.51 Detonators, with the same
result. We made two .MOVs which show how the fan stops spinning and the temperature
on the cooler after the card heats up. To download these movies click

Screensaver of Death

How can a screensaver cook a GeForce FX 5800 Ultra card?
Version 1.00 by Ozren Zagmester, PC Ekspert team – www.pcekspert.com

It’s very very simple!

Test system (even though this will probably work on any system):
AthlonXP 2400+
Chaintech Zenith 7NJS nForce2 board, NVIDIA nForce 2.03 drivers
2x 256MB Infineon PC3200
Gainward GeForce FX 5800 Ultra, NVIDIA Detonator 43.45 drivers
Terratec Aeron Sky 5.1, latest XP drivers, too lazy to look up as it’s unimportant
DirectX9, Windows XP Pro SP1

To make a GeForce FX 5800 Ultra card suffer great pain, apart from Detonators
43.45 (we will test other versions of Detonators during the night) you need
a 3D screensaver. Even the OpenGL screensavers like Pipes that come with Windows
are OK, but for some serious “firepower” (to make the card extra hot in no time)
you need a screensaver like Matrix Reloaded from www.uselesscreations.com or,
the one we used the most, DiveVisions 2.0 from www.atlantis3d.com . This behavior
was first noticed by Denis Arunovic from our team, and after he called me in
the middle of the night, I joined the “hunt”.

Set up the screensaver to turn itself on after 1 minute, so you don’t have
to wait too long. After the screensaver starts, you’ll hear the fan on the GF
FX 5800 Ultra start spinning. Just wait around 5-10 secs and you’ll hear the
fan stop! The 3D screensaver will continue playing on the screen, and after
some time (depending on the screensaver) you’ll start seeing artifacts, or in
the case of DiveVisions, the screen will look like the monitor was having some
interference from a strong magnetic source. You can leave the screensaver running
as long as you feel it’s safe, but don’t overdo it. We tried this around 10 times
and were cautious enough not to let it run too long, as the temperature on the
card, according to the drivers, reached 95-100C. According to my finger, the
drivers were not lying (ouch!). When you end the screensaver, you’ll realize
that your 2D screen is messed up and wavy. At this point the fan still won’t
be running. To make it run, start something like 3dMark 2001. After you start
the demo/benchmark, you’ll finally hear the fan! It will seem as if 3dMark
has crashed, but just leave it unattended. You’ll see some bizarre things, but after
a while the card will resume working as normal – when the fan cools it down
(the cooling, when it works, is excellent, at least on the Gainward card).
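To give a feel for why a parked fan matters so much, here is a toy lumped thermal model (simple Newton-style cooling, one-second steps). The coefficients are pure invention, chosen only so the curve roughly mimics what we observed – a climb past 100C with the fan stopped, a far lower temperature with it running. They are not measurements of the FX 5800:

```python
# Toy thermal model: temperature gains a fixed amount per second from the 3D
# load and loses heat proportionally to how far it is above ambient. All
# numbers are made up for illustration; nothing here is measured data.

def simulate(seconds, fan_on, start_temp=40.0, ambient=30.0):
    heat_in = 0.8                          # degrees/s added by the load (toy value)
    cooling = 0.02 if fan_on else 0.006    # per-second loss coefficient (toy values)
    temp = start_temp
    for _ in range(seconds):
        temp += heat_in - cooling * (temp - ambient)
    return temp

# Two minutes of screensaver load:
print(round(simulate(120, fan_on=False)))  # fan parked: climbs past 100C
print(round(simulate(120, fan_on=True)))   # fan running: settles far cooler
```

The same mechanism also explains the fast recovery we saw: a running fan pulls the card back toward its (much lower) equilibrium temperature quickly.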

This behavior was confirmed by Croteam, who tried it with an NVIDIA reference
board they received from NVIDIA Developer Relations, and also by a friend of
mine with a MSI GF FX 5800 Ultra card. Also, if you try running the screensaver
in the preview mode, the fan will spin the whole time. For this to work, it
has to be done with a “real” screensaver, not with the preview of one. I am
off to try this on a different system (just to make sure one more time)
and then I’ll start trying different Detonators.

Why is this happening? Well, we think that for some reason, when
using a 3D screensaver, the card, or its drivers, decides the card is not running
in 3D mode and stops the fan. The problem, of course, is that regardless of the
temperature the card reaches, the fan won’t start running again (as long as
you don’t exit the screensaver and start running 3dMark or some game like Il2
Sturmovik, Quake 3, etc.). This means that NVIDIA has made a big oversight in
how the whole cooling-on-demand system (i.e. cool when you are in 3D, or when
you reach a high temperature) works. Basically, the whole thing obviously works
on the basis that the card/drivers will realize they are running in 3D, instead
of also taking the temperature into consideration. I mean, you have the bloody
thing that measures the temperature, why not use it as you should?

The answer might be that NVIDIA decided to minimize noise, and to do that decided
the fan should not spin in 2D (or when the card thinks it’s in 2D), because it
is very likely that the user will be doing something that does not produce much
noise (for example, writing a document in Word). When the user goes into 3D to
play a game, well, then the game will produce a lot of sound (explosions, music,
you name it) that will drown out the noise the cooler on a GF FX 5800 Ultra
card makes. I seriously hope this is not the case. We’ll never know, as I am sure
NVIDIA won’t admit such a thing even if it were true.
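For what it’s worth, the kind of temperature-aware policy we have in mind would be simple to express. This is only a sketch: the thresholds and the hysteresis band are our own made-up numbers, not anything NVIDIA has published. The point is only that the mode flag should never be able to override a hot sensor:

```python
# Sketch of the policy we think the drivers should use: keep the quiet
# mode-based rule, but let the temperature sensor override it. Thresholds
# are hypothetical values chosen for illustration.

FAN_ON_TEMP = 85.0    # hypothetical failsafe trip point (degrees C)
FAN_OFF_TEMP = 60.0   # hypothetical release point; the gap between the two
                      # keeps the fan from rattling on/off around one threshold

def fan_should_run(in_3d_mode, temp_c, fan_running):
    if in_3d_mode:
        return True               # current behavior: in 3D, the fan runs
    if temp_c >= FAN_ON_TEMP:
        return True               # failsafe: too hot, ignore the mode flag
    if fan_running and temp_c > FAN_OFF_TEMP:
        return True               # keep spinning until the card is actually cool
    return False                  # genuinely idle and cool: stay silent

# The buggy scenario: driver thinks we are in 2D, sensor reads 98C.
print(fan_should_run(in_3d_mode=False, temp_c=98.0, fan_running=False))  # True
```

With something like this in place, the worst a misdetected 2D mode could do is delay the spin-up until the failsafe trips – not cook the card.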