Quick note: I know this is a game development Q&A site, but I figure you are the people most likely to have first-hand experience with graphics cards, so I'm addressing this question to you. If you think it's completely off-topic, please point me to a more appropriate site or forum. Edit: Actually, it is gamedev-related: if bad code can cause a card to overheat or break, then game developers should be aware of that and make sure their applications don't do it.
This might seem like a weird or stupid question, but is it actually possible to write a graphics rendering application that breaks the graphics card (in any way)?
What prompted me to ask this question was (no surprise) my own broken graphics card. After I had it repaired, the serviceman said they had tested various apps (games) on it and it worked fine. But when I launched my own app (a deferred shading demo), the card heated up to over 100 degrees Celsius. So my card didn't turn out to be fixed after all, but what's important here is that the problem seemed to occur only when running my own app.
I've played various GPU-demanding games on it (like Crysis) and often pushed it to the limit and beyond (settings so high that the games ran at 5 FPS), plus some benchmarks. So I've given my card, many times, more work than it could keep up with (hence the low FPS), yet it never reached dangerous temperatures. But my own application managed to get it there (at least when v-sync was off). :P Since it only happened with my own app, I don't think a bad cooling system was the culprit.
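For reference, here is roughly what I mean by capping the frame rate myself when v-sync is off (just a minimal sketch; `updateScene` and `renderFrame` are placeholders for my demo's actual functions, and 60 FPS is an arbitrary target):

```cpp
#include <chrono>
#include <thread>

// Placeholders for the demo's real update/render functions.
void updateScene(float /*dt*/) { /* ... game logic ... */ }
void renderFrame()             { /* ... issue draw calls, swap buffers ... */ }

int main()
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> frameBudget(1.0 / 60.0); // target ~60 FPS

    while (true)
    {
        auto frameStart = clock::now();

        updateScene(1.0f / 60.0f);
        renderFrame();

        // If the frame finished early, sleep off the rest of the budget
        // so the GPU isn't driven flat out while v-sync is disabled.
        auto elapsed = clock::now() - frameStart;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);
    }
}
```

Without a cap like this (and without v-sync), the loop just renders as many frames as the GPU can produce, which is presumably what kept my card at full load.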
So I ask: do you think (or perhaps know) whether it is possible to break a graphics card (in any way, not just by overheating) with some malicious code?
Joe Swindell said that overheating may be the problem (well, it definitely can break the card). But shouldn’t a proper cooling system prevent that from happening (under any circumstances)?
Boreal pointed out another issue. If I understand correctly, FPS is bound by both the CPU and the GPU (is that right?). So low FPS might signal either high CPU load or high GPU load. But again, shouldn't a proper cooling system keep the GPU from overheating even if the card is "used at 100% all the time"?
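While testing this, I suppose I could watch the temperature and utilization myself. On an NVIDIA card, something along these lines should work (a rough sketch using NVML; error handling is mostly omitted, and I'm assuming the driver exposes these queries for the card):

```cpp
#include <cstdio>
#include <nvml.h>   // NVIDIA Management Library (link with -lnvidia-ml)

int main()
{
    if (nvmlInit() != NVML_SUCCESS)
        return 1;

    nvmlDevice_t device;
    if (nvmlDeviceGetHandleByIndex(0, &device) == NVML_SUCCESS)
    {
        unsigned int temp = 0;
        nvmlUtilization_t util = {};

        // Core temperature in degrees Celsius.
        nvmlDeviceGetTemperature(device, NVML_TEMPERATURE_GPU, &temp);
        // Percentage of time the GPU core and memory were busy
        // over the last sample period.
        nvmlDeviceGetUtilizationRates(device, &util);

        std::printf("GPU temp: %u C, GPU load: %u%%, memory load: %u%%\n",
                    temp, util.gpu, util.memory);
    }

    nvmlShutdown();
    return 0;
}
```

That would at least tell me whether the cooling keeps up when my app holds the card at 100% load.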