For the last two years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with less RAM (just 8GB), lower-end CPUs, and no dedicated GPUs.
16GB of RAM for PCs and 8GB for mobiles should have become the industry standard years ago, but instead it is as if the computing/IT industry is regressing.
New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6 chips and UFS 2.2 storage). Meanwhile, features that used to be offered in budget phones, e.g., wireless charging, NFC, and UFS 3.1, have silently been moved to the premium segment.
Meanwhile, OSes and software are becoming ever more complex, bloated, unstable (bugs), and insecure (security loopholes ready for exploits).
It is as if the industry has decided to focus on AI and nothing else.
This will be a huge setback for humanity, especially for students and the scientific community.
I'm reading this thread on an 11-year-old desktop with 8GB of RAM and not feeling any particular reason to upgrade, although I've priced it out a few times just to see.
Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.)
I wonder what we can do to preserve personal computing, where users, not vendors, control their computers? I’m tired of the control Microsoft, Apple, Google, OpenAI, and some other big players have over the entire industry. The software has increasingly become enshittified, and now we’re about to be priced out of hardware upgrades.
The problem is coming up with a viable business model for providing hardware and software that respect users’ ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don’t want to be funded by Big Tech?
Run a lightweight Linux distro on older hardware maybe?
This is it. Buy used Dell and HP hardware with 32 GB of RAM and swap the PCIe SSD for a 4 TB one.
Question: are SoCs with on-die memory affected by this?
Looks like the frame.work desktop with the 128GB Ryzen is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios.
Are snapdragon chips the same way?
> Question: are SoCs with on-die memory affected by this?

SoCs with on-die memory (which is, these days, exclusively SRAM, since I don't think IBM's eDRAM process for mixing DRAM with logic is still in production) will not be affected. SiPs with on-package DRAM, including Apple's A- and M-series SiPs and Qualcomm's Snapdragons, will be affected -- they use the same DRAM dice as everyone else.
The aforementioned Ryzen AI chip is exactly what you describe, with 128 GB on-package LPDDR5X. I have two of them.
To answer the original question: the Framework Desktop is indeed still at the (pretty inflated) price, but for example the Bosgame mini PC with the same chip has gone up in price.
https://en.wiktionary.org/wiki/die#Noun
"dice" is the plural for the object used as a source of randomness, but "dies" is the plural for other noun uses of "die".
Apple secured at least a year's worth of memory supply (not in actual chips, but in locked-in prices).
The bigger the company, the longer the contract.
However, it will eventually catch up even with Apple.
It isn't just prices rising with demand; manufacturing capacity is also being redirected from parts like the LPDDR in iPhones to HBM and the like for servers and GPUs.
I have a feeling every single supplier of DRAM is going to be far more interested in long-term contracts with Apple than with (for example) OpenAI, since there's basically zero possibility Apple goes kaput and reneges on their contracts to buy RAM.
Yes, but OpenAI wants $200 billion in RAM and Apple wants $10.
We’ve been able to hold the same price we had at launch because we had buffered enough component inventory before prices reached their latest highs. We will need to increase pricing to cover supplier cost increases though, as we recently did on DDR5 modules.
Note that the memory is on the board for Ryzen AI Max, not on the package (as it is for Intel’s Lunar Lake and Apple’s M-series processors) or on die (which would be SRAM). As noted in another comment, whether the memory is on the board, on a module, or on the processor package, they are all still coming from the same extremely constrained three memory die suppliers, so costs are going up for all of them.
Oh no, a manufactured thing with long lead times has more demand than forecast and might be in short supply for a few years, surely a greater disaster has never befallen mankind!
> has more demand than forecast
Nah. The demand is driven by corporations that hoard hardware in datacenters, starving the market and increasing prices in the process - otherwise known as scalping. Then they sell you back that same hardware, in the form of cloud services, for even higher prices.
More explanations here:
https://news.ycombinator.com/item?id=46416934
At the current pace, if "the electorate" doesn't see real benefits from any of this, 2028 is going to be a referendum on AI, unfortunately.
Whether you like it or not, AI right now is mostly
- high electricity prices
- crazy computer part prices
- phasing out of a lot of formerly high-paying jobs
and the benefits are mostly
- slop and ChatGPT
Unless OpenAI and co produce the machine god, which genuinely is possible. If most people's interactions with AI are its negative externalities, they'll quickly be wondering whether ChatGPT is worth the cost.
I hope they do. We live in a time of incredibly centralized wealth & power and AI and particularly "the machine god" has the potential to make things 100x worse and return us to a feudal system if the ownership and profits all go to a few capital owners.
For good measure, a bunch of this is funded through money taken directly from the electorate's taxes and given to a few select companies, whose leaders then graciously donate to the latest Ballroom grift. Micron, so greedy it thought nothing of shutting down its consumer brand even though keeping it cost nothing, got $6B in CHIPS Act money in 2024.
the first stages of the world being turned into computronium.
next stage is paving everything with solar panels.
Positive downstream effect: the way software is built will need to be rethought and improved to squeeze efficiency out of stagnating hardware. Think of how staggering the step from the start of a console generation to its end used to be. Natively compiled languages have made bounding leaps that might be worth pursuing again.
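As a toy illustration of the kind of headroom software-side optimisation still has (the example and function names are mine, not from the thread):

```python
# Same result computed two ways: one materialises an intermediate
# list of n integers, the other streams with O(1) extra memory.
def sum_squares_list(n):
    return sum([x * x for x in range(n)])  # allocates the whole list first

def sum_squares_stream(n):
    return sum(x * x for x in range(n))    # generator: no intermediate list

assert sum_squares_list(10_000) == sum_squares_stream(10_000)
```

The streaming version does the same work with a constant memory footprint, which is exactly the sort of cheap win that stops mattering when RAM is abundant and starts mattering again when it isn't.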
Alternatively, we'll see a drop in deployment diversity, with more and more functionality shifted to centralised providers that have economies of scale and the resources to optimise.
E.g. IDEs could continue to demand lots of CPU/RAM, and cloud providers are able to deliver that cheaper than a mostly idle desktop.
If that happens, more and more of the IDE's functionality will come to rely on low datacenter latencies, making desktop use less viable.
Who will realistically be optimising build times for use cases that don't have sub-ms access to build caches? And when those build caches are available, what will stop the median program from growing an even larger dependency graph?
I’d feel better about the RAM price spikes if they were caused by a natural disaster and not by Sam Altman buying up 40% of the raw wafer supply, other Big Tech companies buying up RAM, and the RAM oligopoly situation restricting supply.
This will only serve to increase the power of big players who can afford higher component prices (and who, thanks to their oligopoly status, can effectively set the market price for everyone else), while individuals and smaller institutions are forced to either spend more or work with less computing resources.
The optimistic take is that this will force software vendors into shipping more efficient software, but I also agree with this pessimistic take, that companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can’t afford tech at inflated prices.
I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.
Some Soviet humor will help you understand the true course of events:
A dad comes home and tells his kid, “Hey, vodka’s more expensive now.” “So you’re gonna drink less?” “Nope. You’re gonna eat less.”
"May rise"?
Prices are already through the roof...
https://www.tomsguide.com/news/live/ram-price-crisis-updates
Big companies secure long-term pricing (multi-year), so iPhones probably won't feel this in 2026 (or even 2027).
2028 is another story, depending on whether this frenzy continues and on new fabs being built (I don't know whether DRAM fabs are as hard to build as CPU fabs).
Asus is ramping up production of RAM...
So let's see if they might "save us".
Asus doesn't operate fabs and has denied the rumor
https://www.tomshardware.com/pc-components/dram/no-asus-isnt...
Asus doesn't make RAM. That's the whole problem: there are plenty of RAM retail brands, but they are all just selling products that originate from only a couple of actual fabs.
Three major ones: Micron, Samsung, SK Hynix
And a couple of smaller ones: CXMT (if you’re not afraid of the sanctions), Nanya, and a few others with older technology
Is this glofo's time to shine?
So far all I am seeing is an increase in prices, so any company claiming it will "ramp up production" here is, in my opinion, just lying for tactical reasons.
Governments need to intervene here. This is a mafia scheme now.
I purchased about three semi-cheap computers in the last ~5 years. Looking at RAM prices now, the very same units I bought (!) cost 2.5x as much as before (here I'm referring to my latest computer model, from two years ago). This is a mafia now. I also think these AI companies should be taxed extra, because they are causing us economic harm here.
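To put a figure like 2.5x in perspective, a quick back-of-the-envelope sketch of what a RAM price multiple does to a whole build (all numbers below are hypothetical assumptions, not from the comment above):

```python
def build_cost(base_cost, ram_share, ram_multiplier):
    # ram_share: fraction of the original build cost that was RAM (assumed)
    ram = base_cost * ram_share
    rest = base_cost - ram
    return rest + ram * ram_multiplier

# Hypothetical $800 build where RAM was 15% of the cost, RAM now at 2.5x:
print(build_cost(800, 0.15, 2.5))  # 980.0 -- roughly a 22% jump overall
```

The total-build impact scales linearly with the RAM share, which is why workstation and server configs with large memory footprints feel this far more than entry-level machines.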
I now consider this a mafia that aims to milk us for more money. This includes all AI companies but also manufacturers who happily benefit from this. It is a de-facto monopoly. Governments need to stop allowing this milking scheme to happen.
When it's more than one company working together in a monopoly-like fashion, the term is "oligopoly".
https://www.merriam-webster.com/dictionary/oligopoly
There is another word for it: cartel.
e.g., the Phoebus cartel https://en.wikipedia.org/wiki/Phoebus_cartel
"Monopoly" means one seller, so you can't say multiple X makes a monopoly and make sense. You probably mean collusion.
If demand exceeds supply, either prices rise or supply falls, causing shortages. Directly controlling sellers (prices) or buyers (rationing) results in black markets unless enforcement has enough strength and integrity. The required strength and integrity seems to scale exponentially with the value of the good, so it's typically effectively impossible to prevent out-of-spec behavior for anything not cheap.
If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.
The alternative is decreasing demand. Governments could hold bounty and incentive programs for building electronics that last a long time or are repairable or recyclable, but it's entirely possible the market will eventually do that.
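The first paragraph's point can be sketched numerically: with short-run supply fixed, a demand shock moves the clearing price rather than the quantity. The linear demand curve and all numbers here are illustrative assumptions:

```python
def clearing_price(supply, a, b):
    # Linear demand q = a - b * p; with inelastic supply q = supply,
    # the market clears at p = (a - supply) / b.
    return (a - supply) / b

base  = clearing_price(supply=100, a=200, b=10)  # before the demand shock
spike = clearing_price(supply=100, a=260, b=10)  # demand curve shifts out
print(base, spike)  # 10.0 16.0 -- same quantity sold, 60% higher price
```

Capping the price below 16.0 in this toy model leaves demand exceeding the fixed supply, which is the shortage-and-black-market outcome the comment describes.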
Why would government officials and politicians want to stop making money?
There is no monopoly in AI. I can name at least 10 big actors worldwide.
Technically it's a lot closer to a monopsony (Sam Altman/OAI cornering 40% of the DRAM market in a way that is clever for his interests but harms everyone else who wants to use it). I keep hoping that necessity will spur China to become the mother of invention here and supply product to serve the now lopsided, constrained market, but I just don't know how practical that will be.
AI needs data, and that data comes from consumer devices.
> She said the next new factory expected to come online is being built by Micron in Idaho. The company says it will be operational in 2027
Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.
Micron is exiting direct to consumer sales. That doesn't mean their chips couldn't end up in sticks or devices sold to consumers, just that the no-middleman Crucial brand is dead.
Also, even if no Micron RAM ever ended up in consumer hands, it would still reduce prices for consumers by increasing the supply to other segments of the market.
I've been ruminating on this for the past two years. Before AI, most compute stayed cheap and sat roughly 90% idle; we are finally getting to the point of using all of it. We will probably find more algorithms that improve the efficiency of all the matrix computations, and the AI bubble will likely end the way the telecom bubble did, with all that fiber optic capacity turning out to be drastically over-provisioned. Fascinating times!
I don't think any of this is "fascinating" - it is more of a racket scheme. They push the prices up. Governments failed the people here.
Isn't this more easily explained by supply-demand? Supply can't quickly scale, and so with increased demand there will be increased prices.
Well, thank the FSM that the article opens right up with "buy now!" No thanks, I'm kind of burnt out on mindless consumerism; I'll go pot some plants or something.
I didn't see any of that.
I highly recommend disabling JavaScript in your browser.
Yes, it makes many sites "look funny", or maybe you have to scroll past a bunch of screen-sized "faceplant", "twitverse", and "instamonetize" icons, but there are far fewer ads (like none).
And of course some sites won't work at all. That's OK too; I just don't read them. If it's a news article, it's almost always available on another site that doesn't require JavaScript.
I would not be able to handle that due to video streaming, web clients for things like email, etc. And some sites I trust (including HN) provide useful functionality with JS (while degrading gracefully).
But I use NoScript and it is definitely a big help.
I wholeheartedly agree with your recommendation and join in encouraging more adopters of this philosophy and practice.
Life online without JavaScript is just better. I've noticed an increase in sites that are useful (readable) with JavaScript disabled. Better than 10 years ago, when broken sites were rampant. Though there are still the lazy ones that are just blank pages without their JavaScript crutch.
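To make the "blank page without JavaScript" test concrete, here's a rough heuristic one could run; the function names and tag-stripping regexes are mine, not from the thread, and this only approximates what a real no-JS browser renders:

```python
import re
from urllib.request import Request, urlopen

def visible_text(html):
    # Drop <script>/<style> bodies, then strip the remaining tags.
    html = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return " ".join(text.split())

def nojs_text_length(url):
    # How much readable text does the server send without running JS?
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    page = urlopen(req, timeout=10).read().decode("utf-8", "replace")
    return len(visible_text(page))

# A site that serves only a JS stub scores near zero:
stub = '<div id="root"></div><script src="app.js"></script>'
print(len(visible_text(stub)))  # 0
```

A score near zero is the "blank page with a javascript crutch" case; a server-rendered article scores in the thousands.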
Maybe the hardware/resource austerity that seems to be upon us now will result in people and projects refactoring, losing some glitter and glam, getting lean. We can resolve to slim down, drop a few megs of bloat, use less RAM and bandwidth. It's not a problem; it's an opportunity!
In any case, Happy New Year! [alpha preview release]
Using reader mode by default would probably be a less jarring experience (and you'll have an easy fallback).