There are so many cases of this sort of stuff it's unreal. But it gets even stupider.
I found one a few years back when I repaired a linear power supply. This required me to reverse engineer it first because there was no service manual. I buzzed the whole thing out and found out that one of the electrolytic capacitors had both legs connected to ground. They must have shipped thousands of power supplies with that error in it and no one even noticed.
Name and shame!
Voltcraft. Can't remember the model number.
Well, today I learned to install one capacitor in reverse orientation on the PCB of a 34-year-old computer...
Definitely starting Wednesday off productively.
I actually have an LC III in storage, so I might actually be able to make use of this article.
I think this will allow me to classify today as productive.
Yeah, I have a Performa 450, which I believe is the exact same computer sold under a different name. So this is definitely important to know. I can go back to bed now, my job for today is done.
At least you made my Wednesday ;-)
Commodore had 3 capacitors mounted backwards on the A3640, the CPU board of the Amiga 4000 with 68040 processors: https://youtu.be/zhUpcBpJUzg?si=j6UFmIJzoC-UDS6u&t=945
Also mentioned here: https://amiga.resource.cx/exp/a3640
ZX Spectrum +2 shipped with transistors backwards: https://www.bitwrangler.uk/2022/07/23/zx-spectrum-2-video-fi... This even caused visible artifacts on the display, which was apparently not enough for the problem to be noticed at the factory.
Commodore just kept doing this. Just listing the shoddy craftsmanship would take forever, and then we get to the intentional bad decisions, like giving the A1200 a power supply that's both defective (capacitors, of course) and barely enough to support the basic configuration with no expansions - which is extra funny because the PSUs used with weaker models (A500) had greater output...
Classic Commodore Quality :P
They also had backwards caps on the CD32 and A4000
In the mid-'80s I was the head of the CS student chapter. We ran the computer rooms for the science faculty. We had a room with about 20 Mac 128Ks. I do not know where Apple sourced their capacitors from, but these were not A-tier. A Mac going up in a puff of white smoke was a weekly occurrence. We had a few in reserve just to cycle them in while the others were out to Apple for repair.
P.S. still my favorite Mac of all time was the IIcx. That one coupled with the 'full page display' was a dream.
On the other side: we had an intern at our (very small) company, and he used his own Mac. One time he had to debug a mains-powered device. He decided to try connecting it to both mains AND the programming dongle without an isolation transformer. He fried the dongle (it literally exploded; the plastic lid banging on the desk in a suddenly silent office is the most memorable thing), the company-provided monitor, and the device, but somehow his private Mac mini survived all this while sitting in the middle.
With things like the Mac 128k, reliability issues may be partly down to Steve Jobs's dislike of cooling fans.
To be honest, cooling fans never get the attention they deserve and end up whiny or buzzy.
That said, Apple did a really good job with the Mac Pro cooling fans, where the shroud spun with the blades.
I think it did better than even the best PC cooling fans, like Noctua's.
Apple should be mandated to issue a recall for these motherboards.
The author seems to misunderstand PCB design flow. This is neither a "factory component placement issue" nor a silkscreen error. The error is in the schematic.
The layout CAD is often done by a different team that follows the schematic provided by design engineering. Automated workflows are common. The silk screen is predefined in a QA'd library. It is not their job to double check engineering's schematic.
The components are placed per the layout data.
Both those teams did their jobs correctly, to incorrect specifications. In fact, the factory performing assembly often is denied access to the schematic as it is sensitive IP.
If you're going to cast blame on a 30-year-old computer, at least direct it at the correct group. It wasn't soldered incorrectly at the factory. They soldered it exactly how they were told to: backwards.
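To make the "error is upstream" point concrete, here's a toy sketch of the kind of polarity sanity check that would catch this at the schematic stage. The netlist format, net names, and reference designators here are all invented for illustration; the point is that layout and assembly just reproduce the schematic data faithfully, so the check has to happen on that data.

```python
# Toy illustration (hypothetical netlist data): a polarized cap whose
# positive pin is drawn on the wrong net in the schematic. Every
# downstream step (layout, silkscreen, assembly) copies this error.

# Nominal DC voltage per net, as an engineer might annotate them.
net_voltage = {"GND": 0.0, "VNEG": -5.0}

# Polarized caps as (positive-pin net, negative-pin net).
caps = {
    "C22": ("VNEG", "GND"),  # wrong: + pin tied to the -5 V rail
    "C23": ("GND", "VNEG"),  # correct orientation for a negative rail
}

def reversed_caps(caps, net_voltage):
    """Flag electrolytics whose + pin sits at a lower DC potential."""
    return [ref for ref, (pos, neg) in caps.items()
            if net_voltage[pos] < net_voltage[neg]]

print(reversed_caps(caps, net_voltage))  # → ['C22']
```

Modern EDA tools can run rule checks like this, but they only work if someone annotates the expected rail voltages - which is exactly the step a schematic error slips past.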
Apple should be required to do a recall for these motherboards.
If they do a recall, it will just say the boards should be discarded. Sony has a recall like this on all its Trinitron TVs made before the end of 1990:
https://www.sony.jp/products/overseas/contents/support/infor...
This shouldn't be allowed at all: if the product was bad all along, they should be required to fix it, and shouldn't be able to say "well, it's old, so you should just trash it", which means they don't suffer any penalty whatsoever.
I don't think that's a reasonable expectation in general, and certainly not in this case. The affected TVs were all at least 20 years old - that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these. Nor is it clear what Sony could reasonably have done to repair them; even by 2010, a lot of the parts used in CRT TVs were out of production and unavailable.
Maybe you're too young to remember, but people used to keep TVs for much longer periods before HDTV and flat panels came out.
Also, these TVs are apparently fire hazards. It doesn't matter that they're 20 years old (at the point of the "recall" in 2010).
I doubt the parts necessary to fix them were out of production; you can get parts for truly ancient electronics still. Things like capacitors don't become obsolete. The recall doesn't specify exactly which component is problematic, but says it's age-related, which usually points to capacitors.
This. I’ve known a TV that was in more or less daily use for over 30 years. Not sure why we stopped expecting that from electronics.
>Not sure why we stopped expecting that from electronics.
For TVs specifically, the technology changed a lot. For a long time, everyone was stuck on the NTSC standard, which didn't change much. At first, everyone had B&W TVs, so once you had one, there was no reason to change. Then color TV came out, so suddenly people wanted those. After that, again no reason to change for a long time. Later, they got remote controls, so sometimes people would want one of those, or maybe a bigger screen, but generally a working color TV was good enough. Because TVs were glass CRTs, bigger screens cost a lot more than smaller ones, and there wasn't much change in cost here for a long time.
Then HDTV came out and now people wanted those, first in 720p, and later in 1080i/p. And flat screens came too, so people wanted those too. So in a relatively short amount of time, people went from old-style NTSC CRTs to seeing rapid improvements in resolution (480p->720p->1080->4k), screen size (going from ~20" to 3x", 4x", 5x", 6x", now up to 85"), and also display/color quality (LCD, plasma, QLED, OLED), so there were valid reasons to upgrade. The media quality (I hate the word "content") changed too, with programs being shot in HD, and lately 4k/HDR, so the difference was quite noticeable to viewers.
Before long, the improvements are going to slow or stop. They already have 8k screens, but no one buys them because there's no media for them and they can't really see the difference from 4k. Even 1080p media looks great on a 4k screen with upscaling, and not that much different from 4k. The human eye is only capable of so much, so we're seeing diminishing returns.
So I predict that this rapid upgrade cycle might be slowing, and probably stopping before long with the coming economic crash and Great Depression of 2025. The main driver of new TV sales will be people's old TVs dying from component failure.
> Not sure why we stopped expecting that from electronics.
Last year's model only does 4K; my eyes need 8K.
Because electronics got so much better so much faster that the vast majority of customers did not want to keep using old hardware.
Especially when customers accepting shorter lifetimes allowed companies to lower prices.
There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.
Moreover, we're talking about televisions and old Macs. TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price), and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
Much older computers continue to be used because they run software that newer computers can't without emulation (which often introduces bugs) or have older physical interfaces compatible with other and often extremely expensive older hardware.
If people actually wanted to replace their hardware instead of fixing it then they'd not be complaining about the inability to fix it.
>There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.
It depends. Older computers usually guzzle power, especially if you look at the absolutely awful Pentium4 systems. You're probably better off getting a RasPi or something, depending on what exactly you're trying to do. Newer systems have gotten much better with energy efficiency, so they'll pay for themselves quickly through lower electricity bills.
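The "pays for itself" claim is easy to sanity-check with back-of-the-envelope arithmetic. The wattages and electricity price below are assumptions for illustration, not measurements:

```python
# Rough payback estimate for replacing an old desktop with a small SBC.
# All figures are assumptions: adjust for your actual hardware and tariff.
OLD_PC_W = 120.0       # assumed draw of a Pentium 4-era desktop
NEW_SBC_W = 5.0        # assumed draw of a Raspberry Pi-class board
PRICE_PER_KWH = 0.30   # assumed electricity price in $/kWh
HOURS_PER_YEAR = 24 * 365

saved_kwh = (OLD_PC_W - NEW_SBC_W) / 1000 * HOURS_PER_YEAR
saved_dollars = saved_kwh * PRICE_PER_KWH
print(f"~{saved_kwh:.0f} kWh/year, ~${saved_dollars:.0f}/year saved")
```

Under these assumptions a 24/7 workload saves roughly $300 a year, which is more than the cost of the replacement board; at a few hours a day the payback is correspondingly slower.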
>TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price)
We're already seeing a limit here. 8k TVs are here now, but not very popular. There's almost no media in that resolution, and people can't tell the difference from 4k.
For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.
>and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?
They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen. It's possible they might want newer smart TV features too: older sets probably have support dropped and don't support the latest streaming services, though usually you can just get an add-on device that plugs into the HDMI port so this is probably less of a factor.
A decade-old CPU would be a Haswell, not a Pentium 4.
> that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these
A modern TV may have an expected lifespan of five years. TVs from several decades ago had lifespans of... several decades. Quality has plummeted in that market.
5 years? Is that really true? I’m currently using an LG from 2017 and cannot imagine needing to change it. I would be shocked if it stopped working.
I don't think it is true at all.
There's nothing inside today's monitors or TVs that can't run for at least 10 years. Our main TV, a 42" 720p LCD, is from 2008, and I have monitors that are just as old.
Yep. My TV, a 42" Panasonic plasma, dates from 2009 and is still working perfectly. I haven't replaced it, because why would I?
But when it does, it will probably be the capacitors in the power supply that have dried out.
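There's a common rule of thumb for this: electrolytic capacitor life roughly doubles for every 10 °C below the part's rated temperature. A hedged sketch, with an assumed example part (real datasheets also factor in ripple current and applied voltage):

```python
# Rule-of-thumb estimate, not a datasheet calculation: electrolytic cap
# life approximately doubles per 10 degC below the rated temperature.
def estimated_life_hours(rated_hours, rated_temp_c, ambient_temp_c):
    return rated_hours * 2 ** ((rated_temp_c - ambient_temp_c) / 10)

# Assumed example part: 2000 h rated at 105 degC, running around 45 degC
# inside a TV power supply.
hours = estimated_life_hours(2000, 105, 45)
print(f"{hours:.0f} h ~= {hours / 8760:.1f} years of continuous use")
```

Under these assumptions the same part that's "only" rated for 2000 hours lasts well over a decade running cool - and correspondingly less in a hot, poorly ventilated enclosure, which is why power supply caps tend to go first.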
Is that really the case? Because if so, it seems like simply replacing the capacitors would save a lot of waste and unnecessary purchases of new TVs...
Only one metric of 'quality' has plummeted.
A rock lasts billions of years, but its quality as a TV is rather questionable.
"that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these"
People still run these Trinitron TVs to this day.
It is a legitimate business decision to sell things that last less than 20 years. Fine - I think it is lame, but it is their choice.
But we shouldn't let companies get away with selling products that catch fire after working fine for 20 years.
> that's well beyond the expected useful lifespan of even a modern TV
What? That's nuts. Why bother buying a TV if you're immediately going to throw it in the trash?
They don't do recalls even on modern hardware. But soldering hacks are no longer possible; all parts are serialized.
Louis Rossmann made many videos on this.
What are you talking about? Capacitor technology hasn't changed substantially in decades, and it's just as possible to change caps with a soldering iron now as it was 20 years ago. I have no idea what you mean by "serialized".
Not capacitors, but more advanced components, like the camera, have serial numbers embedded in them, and the serial number needs to match, otherwise the phone won't accept the component. Components off a stolen device are put on a list and won't work in another phone, so stolen phones aren't even worth anything for parts, driving down the market for stolen phones. It also makes the job of repair shops harder, which is collateral damage in Apple's eyes, but is very much material for anyone running a repair shop.
The only reason this is an issue for repair shops is that they can't sell you recycled stolen parts bought at bottom-of-market prices with a sky-high markup. On top of that, the "non-genuine parts", some of which really are utterly dire, show up in the OS as not being genuine. Buying genuine parts, which are available from Apple, eats into the margins. There is very little honour in the repair market, despite the makeup applied to it by a couple of prominent YouTubers and organisations.
The number of horror stories I've seen over the years from independent repairers is just terrible. Just last year a friend had a screen hot-snotted (hot-glued) back onto their Galaxy.
> they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up
Which represents a more efficient economy: the one where broken phones get reused for parts, or the one where you have to throw them away?
I see. Yes, that is a big problem for component swapping. I was just thinking of electronics with old/faulty caps; those will still be repairable.
Doesn't Apple offer a way to re-pair components if they are genuine and not stolen (unregistered from the previous Apple ID)?
and Apple will very happily charge you for that privilege
TBH, for such a critical piece of our modern lives, I would be more than fine paying extra to be 100% sure I am getting original parts, installed professionally and in a manner that keeps my personal data secure. I wish Samsung, for example, had such a service where I live.
We're talking about expensive premium phones to begin with, so relatively expensive out-of-warranty service is not shocking.
This may actually eventually sway me into the Apple camp. This, and what seems like much better theft deterrence.
For 1993 hardware?
I wonder if there were any bootleg boards that copied the silkscreen mistake, but didn't use those 16V capacitors, and ended up catching fire.
Does the -5V rail do anything other than power old RS-232 ports?
Macs have RS-422 ports, not RS-232. But, no.
Commodore struggled with the same mistakes on the negative rail in the audio section, but also, somehow, on a high-end, expensive CPU board.
https://wiki.console5.com/wiki/Amiga_CD32 C408 C811 "original may be installed backwards! Verify orientation against cap map"
A4000 https://wordpress.hertell.nu/?p=1438 C443 C433 "notice that the 2 capacitors that originally on A4000 have the wrong polarity"
Much worse is the Commodore A3640 68040 CPU board, aimed at the top-of-the-line A3000 and A4000: http://amiga.serveftp.net/A3640_capacitor.html https://forum.amiga.org/index.php?topic=73570.0 C105 C106 C107 silkscreen wrong; early revisions were built according to the bad silkscreen.
Typical Amiga fanboyism and Apple envy, if a Mac does something they have to prove the Amiga outdid it. “Only one model with a reverse polarity capacitor? With Commodore it was a systematic issue!”
They were probably expecting these to fail a few months after the warranty expired.