Old 10-14-2021, 11:25 PM   #4
master hoarder
momaka's Avatar
Join Date: May 2008
City & State: VA (NoVA)
My Country: U.S.A.
Line Voltage: 120 VAC, 60 Hz
I'm a: Hobbyist Tech
Posts: 10,862
Default Re: <play taps>RadeonHD 5770 dead at ... ~10?

Originally Posted by eccerr0r
Well... it was a good run after 10 years I think.
Yup, that's not bad at all, considering these aren't known to be very reliable.

I wonder, though, did you use it for any gaming or 3D work?
IME, these cards only last that long when they're not used/stressed much, or when run with water blocks or equally cool-running oversized heatsinks.

Originally Posted by eccerr0r
Looks like the GPU bought the farm.
Unfortunately. :\

So what, you're not gonna try and bake it (a.k.a. ghetto "reflow")?
I'm sure some of you will give me a shitstorm for suggesting that. But seriously, it's a dead GPU already. Might as well try cremating it if it's truly dead.

Originally Posted by eccerr0r
Time to get a new GPU just to go dual head or something?
Well, if you're looking for dual-DVI output, and that really is the only requirement, then there are tons of old cards that offer it. The question is, do you really need anything as powerful as that HD5770, or do you just need a display adapter? In the case of the latter, there are tons of inexpensive options. It's OK to go with an "abused used" GPU in some cases... but again, it depends on what card specs we're talking about.

The only things I'd seriously suggest steering clear of are the Radeon HD7k series (with the possible exception of the HD7570... which is dirt-cheap and actually not that much worse than your HD5770 spec-wise) and the R9 cards - those drop like flies. Also any Radeons with HBM memory... though those are older high-end stuff and still extremely pricey (if you can find a working one), so that will naturally keep you away from them.

In all honesty, maybe see if you can find a used GTX 460 or 560 (or 550 Ti if worse comes to worst) for like $20-25 shipped (you may have to look around for a while and/or snipe-bid to get that price)... and try to keep it as cool as possible. Then it might last a while, provided it wasn't mined on... which usually isn't the case with these, as they weren't great miners even a good few years back.

Originally Posted by Uranium-235
Buy something used on fleabay. Better yet, something nvidia.
You mean, noVideo?

Reliability-wise, they're the same turds as AMD.

Or rather, the problem really boils down to inadequate coolers (for the most part)... because why let a card run cool and "under-perform" when you can clock it up further and squeeze every last bit of performance out of it? It may not seem logical, but that's what AMD and nVidia have been doing for the past 2 decades while competing with each other. It's always been about who has the fastest card, followed by who can offer the most bang for your buck... with reliability not even a consideration. So because of that, GPU manufacturers have always pushed for the highest performance (read: highest heat output) a card's design (and cooling capacity) will allow. And that ultimately leads to hot GPU chips, which ultimately leads to shorter silicon life. There's no two ways about it really - just pure physics.

That nVidia or ATI/AMD tell you it's OK for your GPU to run at 70...80...90C is complete bullshit. Even 60C is too much in most cases, and that's not only due to RoHS solder or issues similar to the nVidia bumpgate stuff. It also has to do with the fact that silicon wears out faster (yes, exactly like a consumable) when pushed to higher temperatures. It's the same reason LED lights fail sooner / last less when run hot. Think about that too.
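To put a rough number on that last point: accelerated-life testing of semiconductors commonly uses the Arrhenius model, where wearout rate scales with exp(-Ea/kT). Here's a quick back-of-the-envelope sketch - the 0.7 eV activation energy is just a commonly assumed ballpark for silicon wearout mechanisms, not a measured figure for any specific GPU:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_acceleration(t_cool_c, t_hot_c, ea_ev=0.7):
    """Arrhenius acceleration factor: how much faster silicon
    degrades at t_hot_c than at t_cool_c (temps in Celsius).
    ea_ev is an ASSUMED activation energy - ~0.7 eV is a typical
    ballpark for silicon wearout, not a spec for any real chip.
    """
    t_cool_k = t_cool_c + 273.15  # convert to Kelvin
    t_hot_k = t_hot_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_cool_k - 1.0 / t_hot_k))

# A GPU sitting at 60C vs. one cooking at 90C under load:
af = arrhenius_acceleration(60, 90)
print(f"~{af:.1f}x faster wearout at 90C than at 60C")
```

With these assumed numbers, a chip held at 90C ages roughly 7-8x faster than one held at 60C - which lines up with the observation above that only lightly-used or well-cooled cards tend to make it to the 10-year mark.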

Last edited by momaka; 10-14-2021 at 11:49 PM..