I'm running a box I put together in 2014 with an i5-4460 (3.2 GHz), 16 GB of RAM, a GeForce 750 Ti, a first-gen SSD, and an ASRock H97M Pro4 motherboard, plus a reasonable PSU, case, and a number of fans. All of that, parted out, cost about $700 at the time.
I've never been more fearful of components breaking than I am today. With GPU and now memory prices being crazy, I hope I never have to upgrade.
I don't know how, but the box is still great for everyday web development with heavy Docker usage, and for video recording/editing with a 4K monitor and a second 1440p monitor hooked up. Minor gaming is OK too; for example, I picked up Silksong last week and it runs very well at 2560x1440.
For general computer usage, SSDs really were a once-in-a-generation "holy shit, this upgrade makes a real difference" thing.
I agree with you on SSDs; that was the last upgrade that felt like flipping the “modern computer” switch overnight. Everything since has been incremental unless you’re doing ML or high-end gaming.
Am I crazy for thinking that anyone who uses a computer to do their job and make their income should have a $5k/year computer hardware budget at a minimum? I’m not saying to do what I do and buy a $7k laptop and a $15k desktop every year, but compared to revenue it seems silly to be worrying about a delta of a few thousand dollars per year.
I buy the best phones and desktops money can buy, and upgrade them often, because why take even the tiniest risk that old or outdated hardware slows down my revenue generation, which is orders of magnitude greater than the cost to replace it?
Even if you don’t go the overkill route like me, we’re talking about maybe $250/month to have an absolutely top-spec machine, which you can then use to go and earn 100x that.
Spend at least 1% of your gross revenue on your tools used to make that revenue.
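For a rough sense of the numbers, here's a back-of-the-envelope sketch of that 1% rule; the revenue figure is purely hypothetical:

    # Back-of-the-envelope for the 1% rule (all figures hypothetical).
    gross_revenue = 300_000                 # assumed annual gross revenue, USD
    tool_budget = 0.01 * gross_revenue      # the "at least 1%" floor
    print(f"Annual tool budget: ${tool_budget:,.0f}")  # $3,000
    print(f"Monthly: ${tool_budget / 12:,.0f}")        # $250

At an assumed $300k gross, the 1% floor works out to the same ~$250/month figure mentioned above.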
I don't spend money on my computers from a work or "revenue-generating" perspective because my work buys me a computer to work on. Different story if you freelance/consult ofc.
To be fair, Samsung's divisions having guns pointed at each other is nothing new. This is the same conglomerate that makes their own silicon division fight for placement in their own phones, constantly flip-flopping between using Samsung chips, Qualcomm chips, or both in different versions of the same device.
To be honest, this actually sounds kinda healthy.
Sears would like to have a word about how healthy intra-company competition is.
> two versions of the same phone with different processors
That's hilarious, which phone is this?
Basically every Galaxy phone comes in two versions: one with Exynos and one with Snapdragon. It's regional, though. The US always gets the Snapdragon phones, while Europe and most of Asia get the Exynos version.
My understanding is that the Exynos is inferior in a lot of ways, but also cheaper.
Not one phone; they did this all over the place. Their flagship line did this starting with the Galaxy S7 all the way up to the Galaxy S24. Only the most recent Galaxy S25 is Qualcomm Snapdragon only, supposedly because their own Exynos couldn't hit volume production fast enough.
The S23, too, was Snapdragon-only, allegedly to let the Exynos team catch their breath and come up with something competitive for the following generation. Which they partly did, as the Exynos S24 is about on par with its Snapdragon sibling: a bit worse on photo and gaming performance, a bit better at web browsing.
This was the case as recently as the S24: phones can come with Exynos or Snapdragon, with the Exynos usually featuring worse performance and battery life.
Several high-end Galaxy S's, AFAIK.
The manufacturers are willing to quadruple prices for the foreseeable future but not to change their manufacturing quotas one bit.
So much for open markets; somebody should check their books and manufacturing schedules.
Most of the things people say about efficient markets assume low barriers to entry. When it takes years and tens of billions of dollars to add capacity, it makes more sense to sit back and enjoy the margins. Especially if you think there's a non-trivial possibility that the AI build out is a bubble.
In their defense, how many $20 billion fabs do you want to build in response to the AI ... (revolution|bubble|other words)? It seems very, very difficult to predict how long DRAM demand will remain this elevated.
It's dangerous for them in both directions: Overbuilding capacity if the boom busts vs. leaving themselves vulnerable to a competitor who builds out if the boom is sustained. Glad I don't have to make that decision. :)
If it’s an AI bubble, it would be stupid to open new manufacturing capacity right now. Spend years and billions spinning up a new fab, only to have the bottom drop out of the market as soon as it comes online.
When RAM gets so expensive that even Samsung won’t buy Samsung from Samsung, you know the market has officially entered comic mode. At this rate their next quarterly report is just going to be one division sending the other an IOU.
Overleverage/debt and refusing to sell at a certain price are actually very different things, though. OpenAI might be a tire fire, but Samsung is the gold-pan seller here, and presumably has an excellent balance sheet.
AI companies must compensate us for this outrage.
A few hours ago I looked at RAM prices. I bought some DDR4, 32GB only, about a year or two ago. I kid you not: the local price here is now 2.5 times what it was back in 2023 or so, give or take.
I want my money back, OpenAI!
This is important to point out. All the talk about AI companies underpricing their services is mistaken. The costs have just been externalized onto consumers; the AI venture as a whole is so large that it simply distorts other markets in order to keep its economic reality intact. See also: the people whose electric bills have jumped due to increased demand from data centers.
I think we're going to regret this.
I am so glad I built my PC back in April. My 2x16GB DDR5 sticks cost $105 all in then; now it’s $480 on Amazon. That is ridiculous!
Yup.
And even more outrageous are the power grid upgrades they are demanding.
If they need the power grid upgraded to handle the load for their data centers, they should pay 100% of the cost for EVERY part of every upgrade needed for the whole grid, just as a new building typically pays to upgrade the town road accessing it.
Making ordinary ratepayers pay even a cent for their upgrades is outrageous. I do not know why the regulators even allow it (yeah, we all do, but it is wrong).
This seems to be for chips put in phones in 2026? I thought these orders were booked further in advance, or is that only for processors?
I feel we have a RAM price surge every four years. The excuses change, but it's always when we see a switch to the next generation of DDR. Which makes me believe it's not AI, or graphics cards, or crypto, or gaming, or one of the billion other conceivable reasons, but price-gouging when new standards emerge and production capacity is still limited. That would be much harder to justify than 'the AI/crypto/gaming folks (whom no one likes) are sweeping the market...'
But we're not currently switching to a next gen of DDR. DDR5 has been around for several years, DDR6 won't be here before 2027. We're right in the middle of DDR5's life cycle.
That is not to say there is no price-fixing going on, just that I really can't see a correlation with DDR generations.
Regardless of whether it is Crypto/AI/etc., this would seem to be wake-up call #2. We're finding the strangle-points in our "economy"—will we do anything about it? A single fab in Phoenix would seem inadequate?
Micron is bringing up one in Boise, Idaho as well.
If 'the West' were half as smart as it claims to be, there would be many more fabs in friendly territory. Stick a couple in Australia and NZ too for good measure; it is just too critical a resource now.
What will we do with that fab in two years when nobody needs that excess RAM?
Sell it at lower prices. Demand is a function of price, not a scalar.
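To make that point concrete, here's a toy sketch; the curve and its coefficients are invented purely for illustration, not real market data:

    # Toy linear demand curve: quantity demanded rises as price falls.
    # The numbers below are hypothetical, chosen only to show the shape
    # of the argument that demand responds to price.
    def demand_gb(price_per_gb: float) -> float:
        """Hypothetical GB of RAM demanded per period at a given price."""
        return max(0.0, 1000.0 - 150.0 * price_per_gb)

    for price in (6.0, 4.0, 2.0):
        print(f"${price:.2f}/GB -> {demand_gb(price):,.0f} GB demanded")

Cut the price and the "excess" capacity finds buyers; that's the whole point of demand being a curve rather than a fixed number.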
Tax write-off donations to schools and non-profits, too.
There has never been 'an excess of RAM'; the market has always absorbed what was available.
Why is this downvoted? This is not the first time I've heard that opinion expressed, and every time it happens there is more evidence that maybe there is something to it. I've been following the DRAM market since the 4164 was the hot new thing and it cost, not kidding, $300 for eight of them, which would give you all of 64K of RAM. Over the years I've seen the price surge multiple times, and usually there was some kind of hard-to-verify reason attached to it, from flooded factories to problems with new nodes and a whole slew of other issues.
RAM being a staple of the computing industry, you have to wonder if there aren't people cleaning up on this; it would be super easy to create an artificial shortage given the low number of players in this market. In contrast, the price of, say, gasoline has been remarkably steady, with one notable outlier with a very easy-to-verify and direct cause.
This industry has a history of forming cartels.
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal