> that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these
A modern TV may have an expected lifespan of five years. TVs from several decades ago had lifespans of... several decades. Quality has plummeted in that market.
5 years? Is that really true? I’m currently using an LG from 2017 and cannot imagine needing to change it. I would be shocked if it stopped working.
I don't think it is true at all.
There's nothing inside today's monitors or TVs that can't run for at least 10 years. Our main TV, 42" 720p LCD, is from 2008, and I have monitors that are just as old.
Yep. My TV, a 42" Panasonic plasma, dates from 2009 and is still working perfectly. I haven't replaced it, because why would I?
I have an LG OLED from 2017. It started getting really bad screen burn/pixel degradation just after the 6-year mark (6-year warranty). I did a quick search on YouTube, and lo and behold, a whole bunch of other people with the same model started having the same screen burn-in issues at the same age!
It covers the middle third of the screen, top to bottom, and the entire bottom quarter of the screen, with some odd spots as well. It's really distracting and essentially makes the TV useless (to me).
OLED screens are known for having burn-in problems like this. LCDs don't, though they probably have issues with backlights becoming dim with age.
I have an LG of about that vintage and it's starting to black out when playing 4K content. All the components upstream of it have been swapped out and are up to date on firmware. Restarting works: sometimes for a whole day, sometimes for a minute.
My other TV about the same vintage is starting to have stuck pixels in the corner.
Modern failure modes aren’t nearly as graceful.
But when it does, it will probably be the capacitors in the power supply that have dried out.
Is that really the case? Because if so, it seems like simply replacing the capacitors would save a lot of waste and unnecessary purchases of new TVs...
This is a very common fault, yes. Power supply issues in general. It is also not uncommon for people to replace e.g. Wifi routers because the wall warts fail.
It comes down to people not knowing a lot about it - and I'm not blaming anyone for that; we all have our interests, and most people have more than enough to do already without worrying about what goes on inside their stuff.
Also, electronics are, to a lot of people in a lot of places, so cheap that they would rather just curse a little and buy a new thing instead of bothering to take the old one to a shop. And of course a few hours of skilled labour in a big city in the West might cost almost as much as making a whole new TV in a factory in Asia plus shipping, so it might not even make economic sense.
> And of course a few hours of skilled labour in a big city ...
In many/most places, these repair shops don't even exist any more, because the products have gotten too complicated/integrated/parts-unavailable, and the economics are nonsensical.
Electrolytic capacitors are not solid state and are likely the #1 failure mode for most electronics. There are options for better (e.g. Al polymer) capacitors, but they are rather expensive - overall, good capacitors are 'expensive', e.g. more than a dollar apiece in some cases.
The 2nd most common failure mode has got to be MLCC (multi-layer ceramic capacitor) cracks/shorts.
How can I even know which capacitor is faulty?
If your model was popular, there's likely a recap kit for its power supply. It usually makes sense to swap all the capacitors in the kit, unless the kit instructions say otherwise.
You can look for physical signs of degradation (bulging, leaking, discoloration), but to really test a capacitor for capacitance you need to take it out of the circuit, at which point you may as well put a new, high-quality capacitor in.
The OEM capacitors likely have a voltage rating that's only just sufficient; a new one with a higher voltage rating (and the same capacitance, compatible type) may last longer in circuit as well.
> a new one with a higher voltage rating (and the same capacitance, compatible type) may last longer in circuit as well
That's not necessarily true; a higher voltage rating often means higher ESR, which means more heat.
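To put rough numbers on the heat part: ripple current through the cap's ESR is dissipated as I²·R. A quick sketch, with all values made up for illustration rather than taken from any datasheet:

```python
# Rough sketch: ripple current heating a capacitor through its ESR.
# All numbers are made up for illustration, not from any datasheet.

ripple_current_a = 1.5   # RMS ripple current through the cap, in amps (assumed)

for label, esr_ohm in [("low-ESR part", 0.03), ("higher-ESR part", 0.12)]:
    power_w = ripple_current_a ** 2 * esr_ohm   # P = I_rms^2 * ESR, dissipated as heat
    print(f"{label}: about {power_w * 1000:.0f} mW of self-heating")
```

Same ripple current, four times the ESR, four times the heat - and heat is what dries electrolytics out.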
That would require some experience, but the most common visual clue is 'bulging'. There are ways to measure ESR without desoldering, but they won't be reliable in all cases.
Measuring the ripple voltage, peak to peak, is a bit more work.
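Related to the heat point above: a common rule of thumb is that electrolytic capacitor life roughly doubles for every 10°C the part runs below its rated temperature. A back-of-the-envelope sketch, with made-up datasheet-style numbers:

```python
# Back-of-the-envelope electrolytic capacitor life estimate.
# Rule of thumb: rated life roughly doubles for every 10 degrees C
# the capacitor runs below its rated temperature.
# All numbers below are assumptions for illustration only.

rated_life_hours = 2000    # typical datasheet figure at rated temperature (assumed)
rated_temp_c = 105         # rated temperature (assumed)
actual_temp_c = 65         # what the cap might see inside a warm TV power supply (assumed)

estimated_life_hours = rated_life_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)
years_at_6h_per_day = estimated_life_hours / (6 * 365)

print(f"Estimated life: {estimated_life_hours:.0f} hours "
      f"(~{years_at_6h_per_day:.1f} years at 6 h/day)")
```

With those numbers you land in the 10-15 year range, which lines up with the failure ages people report in this thread; run the caps hotter (cheap design, cramped enclosure) and the estimate drops fast.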
A TV used to cost a few weeks' pay, and now you can get a TV for the equivalent of a few hours' pay. There just isn't much of a market for a $3000+ TV.
'A few' usually means 3-5 or so, and a half-decent TV would be at least half a grand. That's a rather high hourly pay rate.
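Quick sanity check on the "few hours of pay" claim, with hypothetical numbers:

```python
# Quick sanity check on the "few hours of pay" claim (hypothetical numbers).

tv_price_usd = 500           # half-decent TV, per the comment above

for hours in (3, 5):         # what "a few hours" might mean
    print(f"${tv_price_usd} TV in {hours} hours of pay implies "
          f"${tv_price_usd / hours:.0f}/hour")

# A $100 TV, like the one in the next comment, is a different story:
print(f"$100 TV in 4 hours of pay implies ${100 / 4:.0f}/hour")
```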
Explain to me why this TV for $100 [1] isn't a perfectly suitable replacement for a 2008 40" 1080p Samsung LCD with a fluorescent backlight that was a deal at $1000. Yeah, you could get something bigger and better. Yes, price comparison during a sale week is a bit unfair.
[1] https://www.bestbuy.com/site/tcl-40-class-s3-s-class-1080p-f...
Only one metric of 'quality' has plummeted.
A rock lasts billions of years, but its quality as a TV is rather questionable.