Apple should be required to do a recall for these motherboards.
If they do a recall, it will just say the boards should be discarded. Sony has a recall on all its Trinitron TVs made before the end of 1990, like this:
https://www.sony.jp/products/overseas/contents/support/infor...
This shouldn't be allowed at all: if the product was bad all along, they should be required to fix it, and shouldn't be able to say "well, it's old, so you should just trash it", which means they don't suffer any penalty whatsoever.
I don't think that's a reasonable expectation in general, and certainly not in this case. The affected TVs were all at least 20 years old - that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these. Nor is it clear what Sony could reasonably have done to repair them; even by 2010, a lot of the parts used in CRT TVs were out of production and unavailable.
Maybe you're too young to remember, but people used to keep TVs for much longer periods before HDTV and flat panels came out.
Also, these TVs are apparently fire hazards. It doesn't matter that they're 20 years old (at the point of the "recall" in 2010).
I doubt the parts necessary to fix them were out of production; you can still get parts for truly ancient electronics. Things like capacitors don't become obsolete. The recall doesn't specify exactly which component is problematic, but says it's age-related, which usually points to capacitors.
This. I’ve known a TV that was in more or less daily use for over 30 years. Not sure why we stopped expecting that from electronics.
>Not sure why we stopped expecting that from electronics.
For TVs specifically, the technology changed a lot. For a long time, everyone was stuck on the NTSC standard, which didn't change much. At first, everyone had B&W TVs, so once you had one, there was no reason to change. Then color TV came out, so suddenly people wanted those. After that, again no reason to change for a long time. Later, they got remote controls, so sometimes people would want one of those, or maybe a bigger screen, but generally a working color TV was good enough. Because TVs were glass CRTs, bigger screens cost a lot more than smaller ones, and there wasn't much change in cost here for a long time.
Then HDTV came out and now people wanted those, first in 720p, and later in 1080i/p. And flat screens came too, so people wanted those too. So in a relatively short amount of time, people went from old-style NTSC CRTs to seeing rapid improvements in resolution (480p->720p->1080p->4k), screen size (going from ~20" to 3x", 4x", 5x", 6x", now up to 85"), and also display/color quality (LCD, plasma, QLED, OLED), so there were valid reasons to upgrade. The media quality (I hate the word "content") changed too, with programs being shot in HD, and lately 4k/HDR, so the difference was quite noticeable to viewers.
Before long, the improvements are going to slow or stop. They already have 8k screens, but no one buys them because there's no media for them and they can't really see the difference from 4k. Even 1080p media looks great on a 4k screen with upscaling, and not that much different from 4k. The human eye is only capable of so much, so we're seeing diminishing returns.
So I predict that this rapid upgrade cycle might be slowing, and probably stopping before long with the coming economic crash and Great Depression of 2025. The main driver of new TV sales will be people's old TVs dying from component failure.
Great points. The TV I have today is approaching my platonic ideal screen. It’s as big as it can get without having to continually look around to see the whole screen. Sit in the first row of a movie theater to understand how that can be a bad thing. The pixels are smaller than I can see, it has great dynamic range, and the colors can be as saturated as I’d ever want. There’s not much that can be improved on it as a traditional flatscreen video monitor.
> The human eye is only capable of so much, so we're seeing diminishing returns.
Or not seeing diminishing returns. Which is the point.
> At first, everyone had B&W TVs, so once you had one, there was no reason to change
Televisions improved over time:
- screens got flatter
- screens got larger
- image quality improved
- image contrast increased (people used to close their curtains to watch tv)
- televisions got preset channels
My experience of ancient CRT devices is that the display gets gradually dimmer. I once had a TV that was only really usable after dark -- but that's the only time I wanted to use it anyway -- and a huge Sun monitor that was only just about readable in total darkness. We kept that one because we also had a Sun server that we didn't know how to connect to any other monitor, and we were worried that one day we wouldn't be able to SSH to it. In fact, the server never once failed.
> daily use for over 30 years
However, that doesn't imply TVs were that reliable.
Before the '90s, TV repairman was a regular job, and TVs often needed occasional expensive servicing. I remember a local TV repair place in the '90s which serviced "old" TVs.
> Not sure why we stopped expecting that from electronics.
Last year's model only does 4k, my eyes need 8k
32K ought to be enough for anybody.
Because electronics got so much better so much faster that the vast majority of customers did not want to use old hardware.
Especially if accepting shorter lifetimes allowed companies to lower prices.
There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.
Moreover, we're talking about televisions and old Macs. TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price), and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
Much older computers continue to be used because they run software that newer computers can't without emulation (which often introduces bugs) or have older physical interfaces compatible with other and often extremely expensive older hardware.
If people actually wanted to replace their hardware instead of fixing it then they'd not be complaining about the inability to fix it.
>There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.
It depends. Older computers usually guzzle power, especially if you look at the absolutely awful Pentium 4 systems. You're probably better off getting a RasPi or something, depending on what exactly you're trying to do. Newer systems have gotten much better with energy efficiency, so they'll pay for themselves quickly through lower electricity bills.
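As a rough sketch of the payback math -- every number below is an illustrative assumption, not a measurement:

    # Back-of-the-envelope payback estimate for swapping an old,
    # power-hungry box for an efficient one. All figures assumed.
    OLD_WATTS = 100        # assumed average draw of an old desktop (W)
    NEW_WATTS = 10         # assumed average draw of a RasPi-class box (W)
    HOURS_PER_DAY = 24     # always-on home-server scenario
    PRICE_PER_KWH = 0.30   # assumed electricity price (USD/kWh)
    NEW_BOX_COST = 80.0    # assumed cost of the replacement (USD)

    saved_kwh = (OLD_WATTS - NEW_WATTS) / 1000 * HOURS_PER_DAY * 365
    saved_usd = saved_kwh * PRICE_PER_KWH      # ~237 USD/year with these numbers
    payback_years = NEW_BOX_COST / saved_usd   # ~0.3 years with these numbers
    print(f"~{saved_usd:.0f} USD/year saved, payback in ~{payback_years:.1f} years")

For an always-on machine the savings add up fast; for a PC used a couple hours a day, the payback takes proportionally longer.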
>TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price)
We're already seeing a limit here. 8k TVs are here now, but not very popular. There's almost no media in that resolution, and people can't tell the difference from 4k.
For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.
>and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen. It's possible they might want newer smart TV features too: older sets have probably had software support dropped and don't support the latest streaming services, though usually you can just get an add-on device that plugs into the HDMI port, so this is probably less of a factor.
> Older computers usually guzzle power, especially if you look at the absolutely awful Pentium 4 systems.
https://en.wikipedia.org/wiki/List_of_Intel_Pentium_4_proces...
The Northwood chips were 50 to 70 W. HT chips and later Prescott chips were more like 80 to 90 W. Even the highest chips I see on the page are only 115 W.
But modern chips can use way more power than Pentium 4 chips:
https://en.wikipedia.org/wiki/Raptor_Lake
The i5-14600K has a base TDP of 125 W and turbo TDP of 181 W, and the high-end i9-14900KS is 150 W base/253 W turbo. For example, when encoding video, the mid-range 14600K pulls 146 W: https://www.tomshardware.com/news/intel-core-i9-14900k-cpu-r...
More recent processors can do more with the same power than older processors, but I think for the most part that doesn't matter. Most people don't keep their processor at 100% usage much of the time anyway.
As I said in a sister comment here, you can't compare CPUs by TDP. No one runs their CPU flat-out all the time on a PC. Idle power is the important metric.
> Older computers usually guzzle power, especially if you look at the absolutely awful Pentium 4 systems.
Even many Pentium 4-based systems would idle around 30 watts and peak at a little over 100, which is on par with a lot of modern desktops, and there were lower and higher power systems both then and now. The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel. Midrange then and now was ~65W. Also, the Pentium 4 is twenty-two years old.
And the Pentium 4 in particular was an atypically inefficient CPU. The contemporaneous Pentium M was so much better that Intel soon after dumped the P4 in favor of a desktop CPU based on that (Core 2 Duo).
Moreover, you're not going to be worried about electric bills for older phones or tablets with <5W CPUs, so why do those go out of support so fast? For plenty of people, the most demanding mobile workload is GPS navigation, which has been available since before the turn of the century and widely available for nearly two decades.
> For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.
Some people. Plenty of others who don't even care about 4k, and then why would they want to needlessly replace their existing TV?
> They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen.
That's the point. 1080p TVs and even some 720p TVs are still sold new, so anyone buying one isn't upgrading and has no real reason to want to replace their existing device unless it e.g. has a design flaw that causes it to catch fire. In which case they should do a proper recall.
>The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel.
You can't compare CPUs based on TDP; it's an almost entirely useless measurement. The only thing it's good for is making sure you have a sufficient heatsink and cooling system, because it tells you only the peak power consumption of the chip. No one runs their CPUs flat-out all the time unless it's some kind of data center or something; we're talking about PCs here.
What's important is idle CPU power consumption, and that's significantly better these days.
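To put rough numbers on why idle draw dominates for a typical PC -- all figures here are assumptions for illustration, not measurements:

    # Annual energy cost of a usage mix: mostly idle, occasionally loaded.
    # All figures are illustrative assumptions.
    PRICE_PER_KWH = 0.30   # assumed electricity price (USD/kWh)
    HOURS_PER_DAY = 8      # assumed daily on-time

    def annual_cost(idle_w, load_w, load_fraction):
        # Average draw for a machine that's mostly idle, sometimes loaded.
        avg_w = idle_w * (1 - load_fraction) + load_w * load_fraction
        return avg_w / 1000 * HOURS_PER_DAY * 365 * PRICE_PER_KWH

    # Assumed: P4-era box with high idle draw vs. a modern box with
    # low idle draw but a much higher peak, both at full load 5% of the time.
    old = annual_cost(idle_w=60, load_w=115, load_fraction=0.05)  # ~55 USD/yr
    new = annual_cost(idle_w=10, load_w=180, load_fraction=0.05)  # ~16 USD/yr
    print(f"old: ~{old:.0f} USD/yr, new: ~{new:.0f} USD/yr")

With a workload that's idle 95% of the time, the modern box comes out far cheaper to run even though its peak (turbo) power is higher than the P4's TDP.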
>older phones or tablets with <5W CPUs, so why do those go out of support so fast?
That's an entirely different situation because of the closed and vendor-controlled nature of those systems. They're not PCs; they're basically appliances. It's a shitty situation, but there's not much people can do about it, though many have tried (CyanogenMod, GrapheneOS, etc.).
>Plenty of others who don't even care about 4k
Not everyone cares about 4k, it's true (personally I like it but it's not that much better than 1080p). But if you can't tell the difference between 1080p and an NTSC TV, you're blind.
>1080p TVs and even some 720p TVs are still sold new
Yes, as I said before, we're seeing diminishing returns. (Or should I say "diminishing discernible improvements"?)
Also, the 720p stuff is only in very small (relatively) screens. You're not going to find a 75" TV with 720p or even 1080p; those are all 4k. The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.
For most videos, the difference between 1080p and 4k ain't that large.
But for certain video games on a large screen, I can definitely tell the difference between 1080p and 4k. Especially strategy games that present a lot of information.
Btw, as far as I can tell modern screens use significantly less power, especially per unit of area, than the CRTs of old; even if that CRT is still perfectly functional.
Suppose they did recall all the old TVs with known faults: could those be fixed to conform to today's quality and safety standards, while being full of old components whose characteristics have drifted beyond the original tolerances?
> that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these
A modern TV may have an expected lifespan of five years. TVs from several decades ago had lifespans of... several decades. Quality has plummeted in that market.
5 years? Is that really true? I’m currently using an LG from 2017 and cannot imagine needing to change it. I would be shocked if it stopped working.
I don't think it is true at all.
There's nothing inside today's monitors or TVs that can't run for at least 10 years. Our main TV, 42" 720p LCD, is from 2008, and I have monitors that are just as old.
Yep. My TV, a 42" Panasonic plasma, dates from 2009 and is still working perfectly. I haven't replaced it, because why would I?
I have an LG OLED from 2017. It started getting really bad screen burn/pixel degradation just after the 6 year mark (6 year warranty). I did a quick search on YouTube, and lo and behold, a whole bunch of other people with the same model started having the same screen burn-in issues at the same age!
It covers the middle third of the screen, top to bottom, and the entire bottom 1/4 of the screen, with some odd spots as well. It's really distracting and essentially makes the TV useless (to me).
OLED screens are known for having burn-in problems like this. LCDs don't, though they probably have issues with backlights becoming dim with age.
I have an LG of about that vintage and it's starting to black out when doing 4K content. All the components upstream of it have been swapped out and are up to date on firmware. Restarting works, sometimes for a whole day, sometimes for 1 minute.
My other TV about the same vintage is starting to have stuck pixels in the corner.
Modern failure modes aren’t nearly as graceful.
But when it does, it will probably be the capacitors in the power supply that have dried out.
Is that really the case? Because if so, it seems like simply replacing the capacitors would save a lot of waste and unnecessary purchases of new TVs...
This is a very common fault, yes. Power supply issues in general. It is also not uncommon for people to replace e.g. Wifi routers because the wall warts fail.
It comes down to only a few people knowing a lot about it - and I'm not blaming anyone for that; we all have our interests, and most people have more than enough to do already without worrying about what goes on inside their stuff.
Also, electronics are, to a lot of people in a lot of places, so cheap that they would rather just curse a little and buy a new thing, instead of bothering with taking the thing to a shop. And of course a few hours of skilled labour in a big city in the west might also be almost as expensive as making a whole new TV in a factory in Asia plus shipping, so it might not even make economic sense.
> And of course a few hours of skilled labour in a big city ...
In many/most places, these repair shops don't even exist any more, because the products have gotten too complicated/integrated/parts-unavailable, and the economics are nonsensical.
Electrolytic capacitors are not solid state and are likely the #1 failure mode for most electronics. There are options for better (e.g. Al-polymer) capacitors, but they are rather expensive - overall, good capacitors are 'expensive', e.g. more than a dollar a piece in some cases.
The 2nd most common failure mode has gotta be MLCC (multi-layer ceramic capacitor) cracks/shorts.
How can I even know which capacitor is faulty?
If your model was popular, there's likely a recap kit for its power supply. It usually makes sense to swap all the capacitors in the kit, unless the kit instructions say otherwise.
You can look for physical signs of degradation (bulgy, leaky, discolored), but to really test a capacitor for capacitance, you need to take it out of the circuit, at which point you may as well put a new, high quality capacitor in.
The OEM capacitors likely have a voltage rating that's only just sufficient; a new one with a higher voltage rating (and the same capacitance, compatible type) may last longer in circuit as well.
> a new one with a higher voltage rating (and the same capacitance, compatible type) may last longer in circuit as well.
That's not necessarily true: a higher voltage rating generally means higher ESR, which means more heat.
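The heat term is just ripple-current self-heating, P = I^2 * ESR. A quick illustration with assumed values (both the current and the ESR figures are made up for the example):

    # Ripple-current self-heating in an electrolytic cap: P = I^2 * ESR.
    # Both the ripple current and the ESR values are assumptions.
    ripple_current_a = 1.5   # assumed RMS ripple current (A)

    for label, esr_ohm in [("low-ESR part", 0.03), ("higher-ESR part", 0.10)]:
        watts = ripple_current_a ** 2 * esr_ohm
        print(f"{label}: {esr_ohm} ohm -> {watts * 1000:.0f} mW dissipated")
    # 1.5 A RMS: 2.25 * 0.03 = ~68 mW; 2.25 * 0.10 = ~225 mW.

Roughly 3x the ESR means roughly 3x the heat dissipated inside the can, and heat is what dries electrolytics out - so the concern is plausible, if the higher-voltage part really does have higher ESR (the datasheet is the place to check).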
That would require some experience, yet the most common visual clue would be 'bulging'. There are some ways to measure ESR w/o desoldering but they won't be reliable at all times.
Measuring voltages, peak to peak, is a bit more work.
A TV used to cost a few weeks pay and now you can get a TV for the equivalent of a few hours pay. There just isn't much of a market for a $3000+ TV.
"Few" usually means 3-5 or so, and a half-decent TV would be at least half a grand. That's a rather high hourly pay rate.
Explain to me why this TV for $100 [1] isn't perfectly suitable to replace a 2008 40" 1080p Samsung LCD with fluorescent backlight that was a deal at $1000. Yeah, you could get something bigger and better. Yes, price comparison during a sale week is a bit unfair.
[1] https://www.bestbuy.com/site/tcl-40-class-s3-s-class-1080p-f...
Only one metric of 'quality' has plummeted.
A rock lasts billions of years, but its quality as a TV is rather questionable.
"that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these"
People still run these Trinitron TVs to this day.
It is a legitimate business decision, to sell things that last less than 20 years. Fine, I think it is lame, but it is their choice.
But, we shouldn’t let companies get away with selling products that catch fire after working fine for 20 years.
> that's well beyond the expected useful lifespan of even a modern TV
What? That's nuts. Why bother buying a TV if you're immediately going to throw it in the trash?
My radial arm saw ended up getting a product recall for simply being too difficult for the average consumer to use safely. The "recall" amounted to them sending you instructions to cut off a critical power cord and mail it in to them, and they send you a $50 check.
That is completely unreasonable. Companies can't be expected to take in and repair devices that old.
They don't do recalls even on modern hardware. But soldering hacks are no longer possible; all parts are serialized.
Louis Rossmann made many videos on this.
What are you talking about? Capacitor technology hasn't changed substantially in decades, and it's just as possible to change caps with a soldering iron now as it was 20 years ago. I have no idea what you mean by "serialized".
Not capacitors, but more advanced components, like the camera, have serial numbers embedded in them, and the serial number needs to match, otherwise the phone won't accept the component. Components off a stolen device are put on a list and won't work in another phone, so stolen phones aren't even worth anything for parts, driving down the market for stolen phones. It also makes the job of repair shops harder, which is collateral damage in Apple's eyes, but is very much material for anyone running a repair shop.
The only reason this is an issue for repair shops is they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up. On top of that the "non genuine parts", some of which really are utterly dire, show up in the OS as being not genuine parts. Buying genuine parts, which are available from Apple, eats into the margins. There is very little honour in the repair market, despite the makeup applied to it by a couple of prominent youtubers and organisations.
The amount of horror stories I've seen over the years from independent repairers is just terrible. Just last year a friend had a screen hot snotted back on their Galaxy.
> they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up
Which represents a more efficient economy: the one where broken phones get reused for parts, or the one where you have to throw them away?
The economy that isn't backed with criminal activity and loss for customers.
If you think Apple's part pairing policy has anything to do with consumer benefit, I have a bridge in Arizona to sell you.
> The only reason this is an issue for repair shops is they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up.
This is just incredibly dishonest framing and completely ignoring what the right to repair and third party repair shop issue is all about.
> Buying genuine parts, which are available from Apple,
It is not a margin problem, it is an availability problem. Apple does not allow third party repair shops to stock common parts, such as batteries or displays for popular iPhones; ordering is only possible when providing the device's serial number. This effectively prevents third party repair shops from competing with Apple or Apple authorized service providers, because they have artificially inflated lead times.
Becoming Apple authorized isn't an option for actual repair shops, because that would effectively disallow them from doing actual repairs where possible, rather than playing Dr. Part Swap. Everything Apple does in the repair space essentially boils down to doing whatever they can to avoid having competition in that space.
> eats into the margins
Replacing a 45ct voltage regulator on a mainboard is cheaper than replacing the entire mainboard with everything soldered on, but doesn't allow for very nice margins.
> There is very little honour in the repair market
There is very little honour in any market. Honour does not get rewarded nowadays; people are in <insert market> to make money, and if you're lucky they still take a little pride in their work. Whether a repair shop offers good service or not should be up to the consumer to determine, not up to Apple (or any electronics manufacturer that employs the same tactics).
> makeup applied to it by a couple of prominent youtubers and organisations.
That is called marketing, which Apple is also pretty good at. They're also lying when they say they are environmentally conscious while having their Genius Bar employees recommend an entirely new screen assembly on a MacBook just because a backlight cable came loose.
> The amount of horror stories I've seen over the years from independent repairers is just terrible.
The amount of horror stories I have experienced with Apple is no joke either. Apple always takes the sledgehammer approach with their repairs. I had the pleasure myself of dealing with Apple repairs once, for my old 2019 MBP. It wouldn't take a charge anymore; I went to the Genius Bar and received a quote for a new mainboard costing well over 1000 EUR. Being familiar with some of the more technical videos from Rossmann etc., I found one electronics repair store that actually does board-level work and got it fixed for a fraction of the price (iirc it was ~200 EUR).
Even if Apple has room for improvement here, I think it’s still worth it to try to curb the market for stolen parts, because that’s going to exist even if Apple sold spare parts in bulk at-cost simply because there exist unscrupulous repair shops that have no qualms with charging you OEM part prices while using gray market parts that cost a fraction as much on eBay, Aliexpress, etc.
For instance, maybe Apple could supply parts in bulk to repair shops but require registration of those parts prior to usage. The repaired iPhone would function regardless but loudly alert the user that unregistered parts were used to repair it. Gray market parts naturally aren’t going to be able to be registered (either due to serial not existing in their system or having been parted out from stolen devices), and thus the user is given some level of assurance that they’re not paid for questionable repair services.
I see. Yes, that is a big problem for component swapping. I was just thinking of electronics with old/faulty caps; those will still be repairable.
Doesn’t Apple offer a way to re-pair components if they are genuine and not stolen (unregistered from the previous AppleId)?
and Apple will very happily charge you for that privilege
TBH, for such a critical piece of our modern lives, I would be more than fine paying extra to be 100% sure I am getting original parts, put in professionally and in a manner that keeps my personal data secure. I wish e.g. Samsung had such a service where I live.
We're talking about expensive premium phones to start with anyway, so relatively expensive after-warranty service is not shocking.
This may actually eventually sway me into the Apple camp. This, and what seems like much better theft discouragement.
I don't. Such mechanisms also disqualify 3rd party replacements. It is just a wasteful solution. Not that any smartphone would qualify as decent here.
But as a customer it will overall be more expensive for you.
There are things in life where the amount paid is far from the top priority, and a phone is one of them these days. With the sums we're talking about, I just don't care anymore, and the Samsung I have now is even more expensive and more wasteful.
Re wastefulness - a decent laptop causes 10x more pollution to manufacture than a phone, and a desktop PC 10x that. TVs. Cars. Clothing. Phones are very far down a very long list of higher-priority targets for an eco-friendly approach.
It is not about stolen phones, it is about the monetization of customer service. If stealing phones were legal, job descriptions in procurement/purchase departments would look different as well.