> Manufacturers don't actually want too many extremely long term contracts because it would limit their ability to respond to market price changes.
I don't agree with this sentence. Why wouldn't the same advice apply to oil and gas contracts? If you look at the size and duration of oil and gas contracts for major energy importers, they often run 10 years or more. Some of the contracts in Japan and Korea are so large that heavy industrial / chemical customers will take an equity stake in the extraction site.
This exact price jump seems largely like a shock rather than a slow squeeze, but I think we're seeing some kind of reversal of the unique 20th-century pattern of "life gets better/cheaper/easier every generation".
I very much disagree that consumers holding more hardware capability than they need is a bad thing. Replace computing hardware with mechanical tools, because they are basically tools, and consider whether consumers would be better off if wrenches and saw blades and machine tools were held more exclusively by businesses and large corporations. Would corporations use them more often? Probably. And yet it seems pretty clear that it would hurt the capabilities of regular people to not be able to fix things themselves or innovate outside of a corporate-owned lab.
To me the #1 most important factor in maintaining a prosperous and modern society is common access to tools by the masses, and computing hardware is just the latest set of tools.
It's only meaningful if you have enough disposable income to invest that it eventually (and I don't mean in 50 years) makes a dent against your living expenses.
If you make $4k/mo and rent is $3k, it's pretty silly to state that it's a meaningful thing for someone to scrimp and invest $100/mo into a brokerage account.
They definitely should do this, but it's not going to have any meaningful impact on their life for decades at best. Save for a decade to get $12k in your brokerage account, say it doubles to $24k. If you then decide you can take a generous 5% withdrawal rate, you are talking $1,200/yr against rent that is now probably $3,500/mo or more. Plus you're killing your compounding.
It's good to have so emergencies don't sink you - but it's really an annoying talking point I hear a lot lately. Eye rolling when you are telling someone struggling this sort of thing.
It really only makes a major impact if you can dump large amounts of cash into an account early in life - or if you run into a windfall.
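To make the arithmetic above concrete, here is a quick back-of-the-envelope check in Python, using only the assumptions already stated in the comment ($100/month for a decade, the balance roughly doubling, a 5% withdrawal rate); the numbers are illustrative, not financial advice.

```python
# Back-of-the-envelope check using the assumptions stated above (illustrative only).
contributed = 100 * 12 * 10       # $100/month for ten years = $12,000 put in
balance = 2 * contributed         # assume growth roughly doubles it -> $24,000
annual_draw = 0.05 * balance      # a "generous" 5% withdrawal rate
print(contributed, balance, annual_draw, annual_draw / 12)
# 12000 24000 1200.0 100.0  -> roughly $100/month against ~$3,500/month rent
```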
Yes, electricity, generated by more short-sighted, dirty methods (remember when crypto clowns bought an old fossil plant just for coin mining?), and, more alarming, fresh water.
Actually, this seems to be mostly a spike in retail prices, not wholesale DRAM contracts that are only up 60% or so in the past few months according to Samsung. So we should most likely place at least some fraction of the blame on our fellow consumers for overreacting to the news and hoarding RAM at overinflated prices. DRAM sticks are the new toilet paper.
> Actually, this seems to be mostly a spike in retail prices, not wholesale DRAM contracts that are only up 60% or so in the past few months according to Samsung. So we should most likely place at least some fraction of the blame on our fellow consumers for overreacting to the news and hoarding RAM at overinflated prices. DRAM sticks are the new toilet paper.
What is your source on that? Moore's Law is Dead directly contradicts your claims by saying that OpenAI has purchased unfinished wafers to squeeze the market.
Note the consistent "up to 60% since September" figure in the above recent reports. That's for one module capacity, with others being up 30% to 50% - and it certainly isn't the 200% or more we're apparently seeing now in the retail market. That's pure panic hoarding, which is actually a very common overreaction to a sudden price spike.
In world history, the vast majority of abundance is downstream of conquest, not innovation. Plunder makes profit. Even in weird moments like today, where innovation is (or, at least, was) genuinely the driving force of abundance, that innovation would not have come about without the seed capital of Europe plundering Africa and the Americas.
Abundance isn't even the right framing. What most people actually want and need is a certain amount of resources - after which their needs are satiated and they move onto other endeavors. It's the elites that want abundance - i.e. infinite growth forever. The history of early agriculture is marked by hunter-gatherers outgrowing their natural limits, transitioning to farming, and then people figuring out that it's really fucking easy to just steal what others grow. Abundance came from making farmers overproduce to feed an unproductive elite. Subsistence farming gave way to farming practices that overtaxed the soil or risked crop failure.
The history of technology had, up until recently, bucked this trend. Computers got better and cheaper every 18 months because we had the time and money to exploit electricity and lithography to produce smaller computers that used less energy. This is abundance from innovation. The problem is, most people don't want abundance; the most gluttonous need for computational power can be satisfied with a $5000 gaming rig. So the tech industry has been dealing with declining demand, first with personal computers and then with smartphones.
AI fixes this problem, by being an endless demand for more and more compute with the economic returns to show for it. When AI people were talking about abundance, they were primarily telling their shareholders: We will build a machine that will make us kings of the new economy, and your equity shares will grant you seats in the new nobility. In this new economy, labor doesn't matter. We can automate away the entire working and middle classes, up to and including letting the new nobles hunt them down from helicopters for sport.
Ok, that's hyperbole. But assuming the AI bubble doesn't pop, I will agree that affordable CPUs are next on the chopping block. If that happens, modular / open computing is dead. The least restrictive computing environment normal people can afford will be a Macbook, solely because Apple has so much market power from iPhones that they can afford to keep the Mac around for vanity. We will get the dystopia RMS warned about, not from despotic control over computing, but from the fact that nobody will be able to afford to own their own computer anymore. Because abundance is very, very expensive.
> Where the f*k is all the abundance that AI was supposed to bring into the world? /rant
It may have been a bit self-deprecating, but I think your “rant” is a more than justified question that really should be expanded well beyond just this matter. It's related to a clear fraud that has been perpetrated upon the people of the western world in particular for many decades and generations now, in many different ways. We have been told, over and over, that “we have to plunder your money and debase it and give it to the rich that caused the {insert disaster caused by the ruling class} misery, and we have to do it without any kind of consequences for the perpetrators, and no, you don't get any kind of ownership or investment, and we have to do it now or the world will end”
I used to sell 64kbit (yes, bit) DRAM at $7 in 1982. A year later it was <$0.50.
The memory business is a pure commodity and brutally cyclic. Big profit => build a fab => wait 2 years => oh shit, everyone else did it => dump units at below cost. Repeat.
Then you have “acts of God” like that time when a flood or fire or something caused the only factory in the world that produced some obscure part of memory chips to stop production, and memory costs doubled almost overnight. I remember my 4 -> 32 MB upgrade back in the '90s cost a fortune because of this.
Same! And then they made new eDRAM for a hot minute as part of Crystal Well. It'd be fun to see them get back into the game in a serious way, but their on-again-off-again attitude toward dGPUs does not give me confidence in their ability to execute on such long-term plans.
Built my son's first gaming PC 2 months ago. Figured it would be cheaper around Black Friday, but the prices were reasonable enough that we didn't wait. Turned out to be a huge savings to buy that fast DDR5 in September.
This is really crazy. I built my first computer not too long ago; like I'm talking less than a month maybe, definitely less than 2. I paid $320 for 64GB Kit (CMK64GX5M2B5600C40) at Microcenter. It is now sold out in Chicago store and listed at $530.
Just went through this today for my daughter - struggled to find an i5 not on pre-order, and the RAM was crippling - ended up going Ryzen 7 for the first time and 2x8GB DDR5 6000 @ £119 - looking forward to building it with her!
I just got one of the last Beelink SER8s with 64GB for $750. They sold out by the time my order arrived. The newer ones are starting around $830 for a 32GB machine (admittedly with newer everything).
It will be interesting to see the knock on effect of some upcoming consumer electronics; for example Apple was rumored to be working on a cheaper MacBook that uses an iPad CPU, and Valve is working on a SteamOS based gaming machine. Both will likely live/die based on price.
It's way too early to assume these prices are permanent. It's a supply crunch meeting a demand spike. The market will find equilibrium.
Big manufacturers also order their DRAM in advance with contractually negotiated pricing. They're not paying these spot market prices for every computer they ship.
edit: looks like i had the wrong understanding, thanks to the comments below for explaining
~~Helps that Apple's SoC has the RAM on the main die itself. They're probably immune from these price hikes, but a lot of the PC/Windows vendors wouldn't be, which would only make Apple's position even stronger.~~
They're probably immune for a while because they're probably using a long term contract, but when it comes time to renew they'll have to offer close to market price to convince the manufacturers not to use that fab space for more profitable memory.
How does that make a difference? It's not like the price change is on DIMMs. The price change is on the DRAM, which is a commodity item. It's not like someone is going to discount it if you tell them "nah, I'm going to solder this one to my SoC".
If Apple is insulated it is likely because Apple signs big contracts for large supply and manufacturers would prefer to be insulated from short-term demand shocks and have some reliability that their fabs can keep running and producing profitable chips.
I also had that misunderstanding, so after seeing this comment I looked up some info. In this article you can see the X-ray of the M1 chip composited onto the photo of the chip, which has external memory components. You can also see in the architecture diagram that the memory is attached from outside the area where the Fabric, CPU, GPU, NPU, cache, and some other unlabeled things are located. https://www.macrumors.com/guide/m1/
Maybe if they expect to upgrade within a few years it would be fine. But when I built my current computer 11 years ago I also didn't expect to need 16 GB of RAM and only bought 8 GB. Five years later, 16 GB of memory was a requirement for both software and games I was playing. And now, 11 years later, 16 GB is not enough for fairly "simple" 3D modelling, and 32 GB is pretty close to the minimum requirement to fully utilize other modern hardware.
Speaking of 11 years old, I just put my 4790k out to pasture. It was a good CPU for a long time, but it got a little long in the tooth for modern workloads.
I think it's intended as a comparison of cost when building a gaming-capable computer vs. a console of somewhat equivalent power.
It used to be a general rule of thumb that you could build a computer of roughly equivalent power for the cost of a game console, or a little more — now the memory costs more than the whole console.
Thank you for mentioning this. Not knowing the specs of a PS5, I'd assumed that the comparison was made because the PS5 now sold for less than the RAM it contains, and scalpers were now hungrily buying up all available PlayStation inventory just to tear them open and feast on the RAM inside.
But since it's 16 GB, the comparison doesn't really make sense.
It still is a rule of thumb; you don't need DDR5 for a gaming computer, let alone 64GB. A low-end AM4 CPU + 16GB of DDR4 3600 and a decent GPU will beat a PS5 in performance and cost. I don't understand why the headline made this strange comparison.
It doesn't help that GPUs have also generally gone up over the past decade: there's more of a market for them besides gaming, they benefit from being hugely parallel (the larger you can make them, the better), and fabrication costs are shooting up. I think there was a GamersNexus video at the launch of one of the previous GPU generations that noted a move from "more for your money" each generation towards "more for more", i.e. keeping the value roughly static and increasing the amount they charged for a more capable product.
That's an analogy -- a literary technique the writer is using to show the correspondence between the price of a specific amount of DDR5 RAM and a fully integrated system, so the reader can follow the conclusions of the article more easily.
Lots of people are speculating that the price spike is AI related. But it might be more mundane:
I'd bet that a good chunk of the apparently sudden demand spike could be last month's Microsoft Windows 10 end-of-support finally happening, pushing companies and individuals to replace many years worth of older laptops and desktops all at once.
I worked in enterprise laptop repair two decades ago — I like your theory (and there's definitely meat there) but my experience was that if a system's OEM configuration wasn't enough to run modern software, we'd replace the entire system (to avoid bottlenecks elsewhere in the architecture).
I just checked how much I paid around 12 months ago for Crucial 96GB kit (2x48GB ddr5 5600 so-dimm). Was $224, same kit today I see listed at $592, wild :/
Same, bought in August for $250 (EU), now it's ~$840. I ended up returning the laptop I'd bought it for and thought 'why hold on to the RAM, it'll only depreciate in value,' so I returned that too. Better hold on to my PS5, I guess.
I did buy 384 GB worth of Samsung DDR5-4800 sticks for my homelab a few months ago. I was wondering at the time if I really needed it; well, I ended up using it anyway, and it turns out I dodged a bullet big time.
RAM has been cheap long enough and now no one remembers how to write efficient GUI apps.
I'm joking, but only kind of. It's not a domain that I do a lot of, but I haven't touched Qt in so long that it would basically be starting from scratch if I tried to write an app with it; I could write an Electron app in like an hour.
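For what it's worth, a native-toolkit GUI really doesn't take much code. Here is a minimal PyQt5 window as a sketch (assumes PyQt5 is installed; purely illustrative, not something from the thread):

```python
# A minimal PyQt5 window: one widget, one event loop.
import sys
from PyQt5.QtWidgets import QApplication, QLabel

app = QApplication(sys.argv)
label = QLabel("Hello from Qt")   # a single label acting as the whole window
label.resize(240, 80)
label.show()
sys.exit(app.exec_())             # start the event loop
```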
Never fully. Like with GPUs, it's a semi-cartel, and DRAM is in everything, including your high-performance SSD (as cache). They have a reason to keep prices super high for ~2 years; then they will come down, but only "somewhat". Let's say if the peak is >2x pricing, the price in 2027 will be ~1.5x-1.8x.
And because everything needs memory, expect all electronics to be ~20%-80% more expensive in 2027 compared to today; naturally this includes the profit margin.
And naturally, every regulation these companies don't like will supposedly be at fault for this (e.g. right to repair).
RAM prices are cyclical. We are in the under supply part of the cycle.
People just have to wait. As prices are sky high, production capacity will likely increase. Some AI companies will go bust. Demand will plummet and we will buy RAM for pennies while the market consolidates.
That's historically what happened when we had proper competition. Now we have a 3-party oligopoly and massive barriers to entry. At least 1 of the 3 is actively signalling that they're not going to spend 100s of billions to expand fab capacity that would lower their profits, because if one does it they'll all do it. It's a prisoner's dilemma, and they're co-operating. When they co-operate, we all lose.
The entry of Chinese DRAM into the market may spur increased competition. If not for competition's sake alone, for national security and domestic supply chain integrity concerns.
That is also somewhat true for GPUs, hard drives and SSDs. They all usually have different cycles but today AI is making them peak all at the same time.
It's due to every hyperscaler building out new AI datacenters. For example you have Google recently saying things like "Google tells employees it must double capacity every 6 months to meet AI demand", and that they need to increase capacity by 1000x within 4-5 years.
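As a quick sanity check that those two figures describe the same growth rate (one doubling per six months, compounded over a few years):

```python
# One doubling every 6 months -> 2 doublings per year -> 2**(2*years) growth.
for years in (4, 4.5, 5):
    print(years, "years ->", 2 ** (2 * years), "x capacity")
# 4 years -> 256x, 4.5 years -> 512x, 5 years -> 1024x (i.e. ~"1000x within 4-5 years")
```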
If you are a scifi author, it's a mistake to give any hard numbers in real-world units. You will, most likely, greatly underestimate. Even trying to greatly overestimate, you will underestimate.
Commander Data's specifications in the Star Trek TNG episode The Measure of a Man from 1989: 800 quadrillion bits of storage, computing at 60 trillion operations per second.
100 petabytes. That's a big machine. A very big machine. But supercomputers now have memories measured in petabytes.
They never used "bits" again in any Star Trek script. It was kiloquads and gigaquads from then on.
That's fun! To further prove your point I saw this and thought "yeah maybe 100 PB is more common these days but 60 trillion ops / second seems like a lot"
Then I did some googling and it turns out that a single 5090 GPU has a peak FP32 performance of over 100 TFLOPS!
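Putting the quoted numbers side by side (the 5090 figure is the roughly 100 TFLOPS FP32 peak mentioned above; "operations" and FLOPS aren't strictly the same unit, so treat this as loose napkin math):

```python
storage_bits = 800e15                      # "800 quadrillion bits"
print(storage_bits / 8 / 1e15, "PB")       # -> 100.0 petabytes

data_ops_per_s = 60e12                     # "60 trillion operations per second"
rtx5090_fp32 = 100e12                      # ~100 TFLOPS peak FP32 (approximate)
print(rtx5090_fp32 / data_ops_per_s)       # -> ~1.67x Commander Data's compute rate
```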
I picked up a PS5 today on a Black Friday deal for 350EUR. 32GB DDR5 is at around 280EUR at the moment.
I have a gaming PC, it runs Linux because (speaking as a Microsoft sysadmin with 10 years under my belt) I hate what Windows has become, but on commodity hardware it’s not quite there for me. Thought I’d play the PlayStation backlog while I wait for the Steam Machine.
I bought 32GB of DDR5 SODIMM last year for 108€ on Amazon. The exact same product that I bought back then is now 232€ on Amazon. I don't like this ride.
Yeah, similar for me. I bought 64 gigs of DDR5 laptop RAM about a year ago; it ended up costing about $190. Now the exact same listing is going for $470. https://a.co/d/fJH1GkW
I guess I'm glad I bought when I did; didn't realize how good of a deal I was getting.
The article references the original coverage which talks to this:
> Despite server-grade RDIMM memory and HBM being the main attractions for hardware manufacturers building AI servers, the entire memory industry, including DDR5, is being affected by price increases. The problem for consumers is that memory manufacturers are shifting production prioritization toward datacenter-focused memory types and producing less consumer-focused DDR5 memory as a result.
But I'm sure the hysteria around that isn't helping prices come back down either.
Is it really a shortage, rather than unfair order fulfillment, when it's just four companies buying up everything? There's plenty of RAM, it's just getting sold to the people who yell the loudest instead of going "sorry we have more customers than just you" and fulfilling orders to everyone.
Call me old-fashioned, but I shouldn't have to have a stock broker to buy a computer. Maybe we could re-organize society to be a bit less ridiculous. "Quick reminder that you could've been born rich instead of a povvo."
Potentially unpopular take: memory manufacturers have been operating on the margins of profitability for quite a while now. Their products are essentially an indistinguishable commodity. Memory from Samsung or Micron or another manufacturer may have slight differences in overclockability, but that matters little to folks who just want a stable system. Hopefully the shortage leads large purchasers to engage in long-term contracts with the memory manufacturers which give them the confidence to invest in new fabs and increased capacity. That would be great for everyone. Additionally, we're likely to see Chinese fab'd DRAM now, which they've been attempting since the '70s but never been competitive at. With these margins, any new manufacturer could gain a foothold.
If LLMs' utility continues to scale with size (which seems likely as we begin training embodied AI on a massive influx of robotic sensor data) then they will continue to gobble up memory for the near future. We may need both increased production capacity _and_ a period of more efficient software development techniques, as was the case when a new 512KB upgrade cost $1,000.
> Hopefully the shortage leads large purchasers to engage in long-term contracts with the memory manufacturers which give them the confidence to invest in new fabs and increased capacity.
Most DRAM is already purchased through contracts with manufacturers.
Manufacturers don't actually want too many extremely long term contracts because it would limit their ability to respond to market price changes.
Like most commodities, the price you see on places like Newegg follows the "spot price", meaning the price to purchase DRAM for shipment immediately. The big players don't buy their RAM through these channels, they arrange contracts with manufacturers.
The contracts with manufacturers will see higher prices in the future, but they're playing the long game and will try to delay or smooth out purchasing to minimize exposure to this spike.
> Additionally, we're likely to see Chinese fab'd DRAM now, which they've been attempting since the '70s but never been competitive at.
Companies like Samsung and SK Hynix have DRAM fabs in China already. This has been true for decades. You may have Chinese fab'd DRAM in the computer you're using right now.
Are you referring to complete home-grown DRAM designs? That, too, was already in the works.
Except silicon, power, and water (and a tiny amount of plastic/paper for packaging), what else does a fab that only produces DRAM need? If so, power is far and away the most variable input cost.
> Why wouldn't the same advice apply to oil and gas contracts?
Because oil & gas suppliers only ever sell one product, and memory fabs can dynamically switch product mix in response to supply & demand to optimize profits. The same sand, power and water can make DDR4, HBM or DDR5
Oil and gas suppliers have several products: gas, diesel, Jet A, propane, naphtha, asphalt, etc.
Aren't the proportions in those essentially static?
> Except silicon, power, and water
Various chemicals too, https://haz-map.com/Processes/97
> Are you referring to complete home-grown DRAM designs? That, too, was already in the works.
Yes, via CXMT, as discussed by Asianometry here: https://www.youtube.com/watch?v=mt-eDtFqKvk
As I mentioned, various groups within China have been working on China-native DRAM since the '70s. What's new are the margins and market demand to allow them to be profitable with DRAM which is still several years behind the competition.
Lots of low-end Android boxes use CXMT DDR4.
Well, what really prompted this crisis is AI, as well as Samsung shutting down some production (and I have to say I don't think they mind that the pricing has skyrocketed as a result!)
But yes we're going to need more fabs for sure
> Well, what really prompted this crisis is AI,
If the shortage of RAM is because of AI (so servers/data centers I presume?), wouldn't that mean the shortage should be localized to RDIMM rather than the much more common UDIMM that most gaming PCs use? But it seems to me like the pricing is going up more for UDIMM than RDIMM.
UDIMM and RDIMM use the same DRAM chips. And my understanding is that the fabs can switch between DDR5, LPDDR5, and maybe HBM as needed. This means high demand for one type can create a shortage of the others.
> This means high demand for one type can create a shortage of the others.
Wouldn't that mean that a shortage of DRAM chips should cause price differences in all of them? Not sure that'd explain why RDIMM prices aren't rising as sharply as UDIMM. That the fab and assembly lines have transitioned to making other stuff does explain why there'd be a difference, though, as bradfa mentioned in their reply.
It's a valid question if you're not familiar with the RAM market. Sorry you're getting downvoted for it.
The manufacturers make the individual chips, not the modules (DIMMs). (EDIT: Some companies that make chips may also have business units that sell DIMMs, to be pedantic.)
The R in RDIMM means registered, aka buffered. The register is a separate chip that buffers the command/address signals between the memory controller and the memory chips.
Even ECC modules use regular memory chips, but with extra chips added for the ECC capacity.
It can be confusing. The key thing to remember is that the price is driven by the price of the chips. The companies that make DIMMs are buying chips in bulk and integrating them onto PCBs.
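A toy illustration of that point, with made-up numbers (the chip count and overhead figure are assumptions, not market data): because the DRAM chips dominate the bill of materials, the module price tracks the chip price almost one-for-one.

```python
def module_cost(chip_price, chip_count, overhead):
    # overhead = PCB, SPD/PMIC or register chip, assembly, margin (all assumed)
    return chip_price * chip_count + overhead

# Hypothetical 32 GB stick built from sixteen 16 Gbit (2 GB) DRAM chips.
for chip_price in (3.0, 6.0, 9.0):   # pretend the per-chip spot price triples
    print(chip_price, module_cost(chip_price, 16, overhead=6.0))
# 3.0 -> 54.0, 6.0 -> 102.0, 9.0 -> 150.0: the module price nearly triples too
```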
> Even ECC modules use regular memory chips, but with extra chips added for the ECC capacity.
Quite a few unbuffered designs in the past had a "missing chip". If you ever wondered why a chip was missing on your stick, it's missing ECC. Don't know if it's still the case with DDR5 though.
I have not seen that yet with DDR5; I think the signal integrity requirements are too high now to even have unused pads open. Most sticks don’t appear to have many traces at all on the top/bottom sides, just big power/ground planes.
Also with DDR5 each stick is actually 2 channels so you get 2 extra dies.
There's some new half assed ECC type of RAM, not sure the name.
Was reading a series of displeased posts about it. Can't seem to find it now.
On-die ECC for DDR5, which corrects errors locally but does not signal the host or protect data in transit between the die and the CPU.
There are 9 bits in an ECC byte (72 bits stored for every 64-bit word).
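A rough sketch of where that 9-bits-per-byte figure comes from, assuming the classic Hamming SEC-DED layout used by conventional side-band ECC (64 data bits plus 8 check bits per bus word):

```python
def secded_check_bits(data_bits):
    # Hamming condition: 2**r must cover data bits + check bits + 1,
    # plus one extra overall parity bit for double-error detection.
    r = 0
    while (1 << r) < data_bits + r + 1:
        r += 1
    return r + 1

data = 64
check = secded_check_bits(data)                       # -> 8
print(data + check, "stored bits per 64-bit word")    # 72
print((data + check) / (data // 8), "bits per byte")  # 9.0
```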
Because manufacturers transitioned fab and assembly lines from low-margin DRAM to higher-margin products like HBM, hence reducing DRAM supply. But the demand for consumer-grade DRAM hasn’t changed much, so prices for it go up.
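A toy model of that shift, with invented margins and capacities (nothing here is real market data): when a fab fills high-margin HBM orders first, the leftover wafers for consumer DRAM shrink.

```python
wafers = 100_000                          # quarterly wafer capacity (assumed)
margin = {"HBM": 90.0, "DDR5": 20.0}      # profit per wafer, arbitrary units

def allocate(capacity, hbm_demand):
    hbm = min(capacity, hbm_demand)       # serve the high-margin product first
    return {"HBM": hbm, "DDR5": capacity - hbm}

for hbm_demand in (10_000, 40_000, 70_000):
    mix = allocate(wafers, hbm_demand)
    profit = sum(mix[p] * margin[p] for p in mix)
    print(hbm_demand, mix, profit)        # DDR5 wafer output falls as HBM demand rises
```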
The chips come from the same factory. And the difference between those two is... a buffer chip. And an extra RAM die for ECC.
Same chips in both, made in the same fabs. Any relative price difference is like the difference between regular and premium gas/petrol.
wait, are you saying that there's no difference between regular and premium gas?
The “regular” and “premium” label at the pump is misleading. The premium gas isn’t better. It’s just different. Unless your car specifically requires higher octane fuel, there is no benefit to paying for it. https://www.kbb.com/car-advice/gasoline-guide/
You get slightly better mpg on premium, just not enough to justify the cost.
Not unless you’re adjusting timing. Premium gas has lower energy per unit mass and per unit volume than standard gas.
> Not unless you’re adjusting timing.
Which, every modern ECU will do automatically based on output from the knock sensors.
This may surprise you, but gas stations typically only have two grades of fuel stored in tanks. Mid-grade gas is mixed at the pump from the other two.
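The blend ratio falls out of simple linear interpolation of octane ratings, which is the usual first-order approximation (real blending is a bit messier):

```python
def premium_fraction(target, regular=87.0, premium=93.0):
    # Fraction of premium needed so the linear blend hits the target octane.
    return (target - regular) / (premium - regular)

print(round(premium_fraction(89), 3))   # ~0.333: one part premium to two parts regular
```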
No, but they're made in the same place with mostly the same ingredients, just a different ratio to hit higher octane (and in some cases some extra additives).
They also vary a bit between winter and summer; basically, in winter they can get away with putting in more volatile compounds because it's colder.
It's a sad trend for "the rest of us" and history in general. The economic boom of the 80's thru the 2010s has been a vast democratization of computation - hardware became more powerful and affordable, and algorithms (at least broadly if not individually) became more efficient. We all had supercomputers in our pockets. This AI movement seems to move things in the opposite direction, in that us plebeians have less and less access to RAM, computing power and food and...uh...GPUs to play Cyberpunk; and are dependent on Altermanic aristocracy to dribble compute onto us at their leisure and for a hefty tithe.
I am hoping some of that Clayton Christensen disruption the tech theocracy keep preaching about comes along with some O(N) decrease in transformer/cDNN complexity that disrupts the massive server farms required for this AI boom/bubble thing.
One can see it that way, granted. When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc. All our beloved consumer tech started out as absurdly high priced niche stuff for them. We've been sold the overflow capacity and binned parts. And that seems to be a more-or-less natural consequence of large purchasers signing large checks and entering predictable contracts. Individual consumers are very price sensitive and fickle by comparison. From that perspective, anything that increases overall capacity should also increase the supply of binned parts and overflow. Which will eventually benefit consumers. Though the intervening market adjustment period may be painful (as we are seeing). Consumers have also benefited greatly from the shrinking of component sizes, as this has had the effect of increasing production capacity with fixed wafer volume.
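That last point is easy to see with the standard die-per-wafer approximation; the die sizes below are hypothetical, just to show how a shrink raises output from the same 300 mm wafer.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    # Gross dies minus an edge-loss term (the usual first-order estimate).
    return math.pi * r**2 / die_area_mm2 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)

for area in (80, 60, 40):   # hypothetical DRAM die sizes in mm^2
    print(area, "mm^2 ->", int(dies_per_wafer(area)), "dies per wafer")
# smaller dies -> substantially more chips from the same wafer
```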
> When I zoom all the way out, all of consumer computation has existed as sort of an addendum or ancillary organ to the big customers: government, large corporations, etc.
Perfectly stated. I think comments like the one above come from a mentality that the individual consumer should be the center of the computing universe and big purchasers should be forced to live with the leftovers.
What's really happening is the big companies are doing R&D at incredible rates and we're getting huge benefits by drafting along as consumers. We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
Advances in video cards and graphics tech were overwhelmingly driven by video games. John Carmack, for instance, was directly involved in these processes, and 'back in the day' it wasn't uncommon for games, particularly from him, to be developed to run on tech that did not yet exist, in collaboration with the hardware guys. Your desktop was outdated after a year and obsolete after 2, so it was a very different time from today, where your example is not only completely accurate but really understating it - a good computer from 10 years ago can still do 99.9% of what people need, and even things like high-end gaming are perfectly viable on dated cards.
The iPhone wasn't designed or marketed to large corporations. 3dfx didn't invent the voodoo for B2B sales. IBM didn't branch out from international business machines to the personal computer for business sales. The compact disc wasn't invented for corporate storage.
Computing didn't take off until it shrank from the giant, unreliable beasts of machines owned by a small number of big corporations to the home computers of the 70s.
There's a lot more of us than them.
There's a gold rush market for GPUs and DRAM. It won't last forever, but while it does high volume sales at high margins will dominate supply. GPUs are still inflated from the crypto rush, too.
> The iPhone wasn't designed or marketed to large corporations.
The iPhone isn't exactly a consumer computation device. From that perspective, it does less work at a higher cost.
> We wouldn't have incredible GPUs in our gaming systems and even cell phones if the primary market for these things was retail entertainment purchases that people make every 5 years.
Arguably we don't. Most of the improvements these days seem to be on the GPGPU side with very little gains in raster performance this decade.
Gaming drove the development of GPUs which led to the current AI boom. Smartphones drove small process nodes for power efficiency.
SGI and 3Dfx made high-end simulators for aerospace in the beginning. Gaming grew out of that. Even Intel's first GPU (the i740) came from GE Aerospace.
Wolfenstein 3d was released before 3DFx existed, was purely CPU rendered, and generally considered the father of modern 3d shooters. Even without the scientific computing angle, GPUs would have been developed for gaming simply because it was a good idea that clearly had a big market.
Flight simulators just had more cash for more advanced chips, but arcade hardware like the Sega Model 1 (Virtua Racing) was, via Virtua Fighter, an inspiration for the PlayStation, and before that there were crude games on both PC and Amiga.
Games were always going to go 3D sooner or later. The real pressure of the high-volume, competitive market got us more and more capable chips, until they were capable enough for the kind of computation needed for neural networks, faster than a slow-moving specialty market could have.
> Flight simulators just had more cash for more advanced chips
Yes. That is my point. The customers willing to pay the high initial R+D costs opened up the potential for wider adoption. This is always the case.
Even the gaming GPUs which have grown in popularity with consumers are derivatives of larger designs intended for research clusters, datacenters, aerospace, and military applications.
No question that chip companies are happy to take consumers' money. But I struggle to think of an example of a new technology which was invented and marketed to consumers first.
It's symbiotic, I suppose.
3dfx didn't. They had a subsidiary (spinoff?) Quantum3D that reused 3dfx commodity chips to build cards for simulators.
100%. We’ve seen crazy swings in RAM prices before.
A colleague who worked with me about 10 years ago on a VDI project ran some numbers and showed that if a Time Machine were available, we could have brought like 4 loaded MacBook Pros back and replaced a $1M HP 3PAR ssd array :)
> This AI movement seems to move things in the opposite direction, in that us plebeians have less and less access to RAM, computing power and food and...uh...GPUs to play Cyberpunk; and are dependent on Altermanic aristocracy to dribble compute onto us at their leisure and for a hefty tithe.
Compute is cheaper than ever. The ceiling is just higher for what you can buy.
Yes, we have $2000 GPUs now. You don't have to buy one. You probably shouldn't buy one. Most people would be more than fine with the $200-400 models, honestly. Yet the fact that you could buy a $2000 GPU makes some people irrationally angry.
This is like the guy I know who complains that pickup trucks are unfairly priced because a Ford F-150 has an MSRP of $80,000. It doesn't matter how many times you point out that the $80K price tag only applies to the luxury flagship model, he anchors his idea of how much a pickup truck costs to the highest number he can see.
Computing is cheaper than ever. The power level is increasing rapidly, too. The massive AI investments and datacenter advancements are pulling hardware development forward at an incredible rate and we're winning across the board as consumers. You don't have to buy that top of the line GPU nor do you have to max out the RAM on your computer.
Sometimes I think people with this mentality would be happier if the top of the line GPU models were never released. If nVidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away even though we're not actually better off with fewer options.
The thing about being annoyed about the top-of-the-range prices, for me, is that it feels like it drags the lower models' prices upwards.
But it’s not like the lower priced models are subsidizing the high-end models (probably the opposite; the high-end ones have greater margins).
> Sometimes I think people with this mentality would be happier if the top of the line GPU models were never released. If nVidia stopped at their mid-range cards and didn't offer anything more, the complaints would go away even though we're not actually better off with fewer options.
If the result was that games were made and optimised for mid-range cards, maybe regular folks actually would be better off.
I would take this argument more seriously if
-the whole reason why the GPU is $2000 is because of said AI bubble sucking up wafers at TSMC or elsewhere, with a soupçon of Jensen's perceived monopoly status...
-for a good part of the year, you could not actually buy said $2000 GPU (I assume you are referring to the 5090) also because of said AI bubble
(granted, while Jensen does not want to sell me his GPU, I would like to point out that Tim Cook has no problem taking my money).
on that point, I can go and buy a Ford F150 tomorrow. Apparently, per the article, I would have problems buying bog standard DDR5 DIMMS to build my computer.
Well put. Since the 1980s, the consumer has been driving the segment. Even supercomputers were built out of higher-end consumer hardware (or PlayStations, in one example).
The move to cloud computing and now AI mean that we're back in the mainframe days.
>We all had supercomputers in our pockets.
You still do. There is no "AI movement" you need to participate in. You can grab a copy of SICP and a banged up ten year old thinkpad and compute away, your brain will thank you. It's like when people complain that culture is unaffordable because the newest Marvel movie tickets cost 50 bucks, go to the library or standardebooks.org, the entire Western canon is free
It's not like you need 64GB to have "democratized computation". We used to have 64MB and that was plenty. Unfortunately, software got slower more quickly than hardware got quicker.
> Unfortunately, software got slower more quickly than hardware got quicker.
Hard disagree. A $600 Mac Mini with 16GB of RAM runs everything insanely faster than even my $5000 company-purchased developer laptops from 10 years ago. And yes, even when I run Slack, Visual Studio Code, Spotify, and a gazillion Chrome tabs.
The HN rhetoric about modern computing being slow is getting strangely disconnected from the real world. Cheap computers are super fast like they've never been before, even with modern software.
It is pretty important if you are doing things like 3d animation, video editing, or advanced CAD software. Plus software in general has ballooned its memory requirements and expectations. Even my 11 year old PC had to have a RAM upgrade a few years ago just because software updates suck up so much extra memory, and there is almost nothing consumers can do about it.
> We used to have 64MB and that was plenty.
Bullshit. It was cramped, and I wasn't able to do half of what I actually wanted to do. Maybe it was plenty for your use cases, but such a small amount of memory was weak for my needs in the late '90s and 2000s. 64MB desktops struggled to handle the photo manipulations I wanted to do with scanned images. Trying to do something like edit video on a home PC was near impossible with that limited amount of memory. I was so happy when we managed to get a 512MB machine a few years later; it made a lot of my home multimedia work much better.
"memory manufacturers have been operating on the margins of profitability for quite a while now."
That the manufacturers are scumbags is the more likely answer.
https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal
I wonder how long it will take for China to flood the market with state-of-the-art modules. It's a pretty decent opportunity for them. They probably can hasten the build of new fabs more than many other nations.
But my guess is that this shortage is short-lived (mostly because of the threat above). There's no OPEC for tech.
I don't disagree per-se, but this is the sort of thing which happens when only a few businesses exist in a commodity market with high entry costs. IOW, it's not great, but it is predictable. See: Oil.
Looking forward to the "Organization of Processor-Etching Corporations".
It's usually not only illegal, but also a crime.
Anyway, that's the kind of market that governments always need to act upon and either supply directly or regulate intensively.
It's not just predictable, it's illegal. Of course, that requires an executive that actually cares about enforcing the law.
You mean “capitalists”.
Maximizing profit is the only sane way to play a rigged game
https://videocardz.com/newz/chinese-cxmt-to-produce-ddr5-800...
Manufacturing is based on anticipating demand.
Unforeseen things like the pandemic hurt profits.
Letting things go this unmanaged with a three-year runway for AI demand seems a little hard to understand. In this case, not anticipating demand seems to create more profit.
I find it hard to believe the pandemic hurt the profits of computing hardware; demand went up from it, not down.
I'm not sure if profits were hurt, but the manufacturing did slow and stop and take some time to get going again.
I guess we'll just have to stop making computer memory if it ceases to be profitable. The market is so efficient.
So, like, we were already pretty much priced out of higher-end graphics cards, and now it's happening to RAM. All this while jobs are disappearing, layoffs are ongoing, and CEOs are touting AI's 'capabilities' left and right.
Next is probably CPUs. Even if AIs don't use them that much, manufacturers will shift production to something more profitable, then gouge prices so that only enterprises will pay for them.
What's next? Electricity?
Where the f*k is all the abundance that AI was supposed to bring into the world? /rant
Maybe that is the answer to how things are supposed to work if AI replaces everyone and no one can afford to buy their stuff.
Things being too cheap allows money to pool at the bottom in little people's hands in the form of things like "their homes" and "their computers" and "their cars".
You don't really want billions in computing hardware (say) being stashed down there in inefficient, illiquid physical form; you want it in a datacentre where it can be leveraged, traded, used as security, etc. If it has to be physically held down there, ideally it should be expensive, leased, and have a short lifespan. The higher echelons apparently think they can drive economic activity by cycling money at a higher level amongst themselves rather than looping in actual people.
This exact price jump seems largely like a shock rather than a slow squeeze, but I think we're seeing some kind of reversal of the unique 20th-century "life gets better/cheaper/easier every generation" trend.
I very much disagree that consumers holding more hardware capability than they need is a bad thing. Replace computing hardware with mechanical tools, because they are basically tools, and consider whether consumers would be better off if wrenches and saw blades and machine tools were held more exclusively by businesses and large corporations. Would corporations use them more often? Probably. And yet it seems pretty clear that it would hurt the capabilities of regular people to not be able to fix things themselves or innovate outside of a corporate-owned lab.
To me, the #1 most important factor in maintaining a prosperous and modern society is common access to tools by the masses, and computing hardware is just the latest set of tools.
I think you missed some sarcasm there.
Until I got the last part, I actually thought you were being serious.
> Where the f*k is all the abundance that AI was supposed to bring into the world?
In the hands of the owners of the AI, as a direct consequence of the economic system. It was never going to play out any other way.
> What's next? Electricity?
Yes. My electricity prices jumped 50% in 3 years.
Rooftop solar doesn’t get more expensive over time.
In the share prices. Hope you're rich, because that's the only thing the economy cares about.
Many apps support buying fractional shares. You don't have to be rich to buy shares in public companies.
It's only meaningful if you have enough disposable income to invest that it eventually (and I don't mean in 50 years) makes a dent against your living expenses.
If you make $4k/mo and rent is $3k, it's pretty silly to state that it's a meaningful thing for someone to scrimp and invest $100/mo into a brokerage account.
They definitely should do this, but it's not going to have any meaningful impact on their life for decades at best. Save for a decade to get $12k in your brokerage account, and say it doubles to $24k. If you then decide you can get a generous 5% withdrawal rate, you are talking roughly $1,200/yr against rent that is now probably $3,500/mo or more (rough arithmetic sketched below). Plus you're killing your compounding.
It's good to have savings so emergencies don't sink you, but it's really an annoying talking point I hear a lot lately. It's eye-rolling when you tell someone who's struggling this sort of thing.
It really only makes a major impact if you can dump large amounts of cash into an account early in life - or if you run into a windfall.
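A minimal sketch of that arithmetic, using only the assumed figures from the comment above ($100/mo for a decade, a balance that happens to double, a generous 5% withdrawal rate, $3,500/mo rent):

    # Rough sketch of the comment's arithmetic; every input is an assumption from above.
    monthly_contribution = 100                        # $/month scrimped into a brokerage account
    years = 10
    contributed = monthly_contribution * 12 * years   # $12,000 over a decade
    balance = contributed * 2                         # "say it doubles" -> $24,000
    annual_income = balance * 0.05                    # generous 5% withdrawal -> ~$1,200/yr
    annual_rent = 3500 * 12                           # $3,500/mo -> $42,000/yr
    print(f"${annual_income:,.0f}/yr covers {annual_income / annual_rent:.1%} of rent")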
It's not like the individual share price of e.g. VTI or FZROX is all that high to begin with.
RAM is a commodity; it has spikes in demand and shortages of supply which drive the price up like this.
I remember when there was a flood in Thailand in 2011 and the prices of hard disks went through the roof.
https://www.forbes.com/sites/tomcoughlin/2011/10/17/thailand...
Sounds like a business opportunity for someone to come in and scale better.
The abundance is there, it just isn't for you or other working class people
Electricity prices are already skyrocketing, it's present - not next.
Yes, electricity from more short-sighted, dirty methods (remember when crypto clowns bought an old fossil plant just for coin mining?), but more alarming: fresh water.
That can be a bigger problem for civilization.
I picked a really great time to give up PC Gaming for writing.
Actually, this seems to be mostly a spike in retail prices, not wholesale DRAM contracts that are only up 60% or so in the past few months according to Samsung. So we should most likely place at least some fraction of the blame on our fellow consumers for overreacting to the news and hoarding RAM at overinflated prices. DRAM sticks are the new toilet paper.
> Actually, this seems to be mostly a spike in retail prices, not wholesale DRAM contracts that are only up 60% or so in the past few months according to Samsung. So we should most likely place at least some fraction of the blame on our fellow consumers for overreacting to the news and hoarding RAM at overinflated prices. DRAM sticks are the new toilet paper.
What is your source on that? Moore's Law is Dead directly contradicts your claims by saying that OpenAI has purchased unfinished wafers to squeeze the market.
https://www.youtube.com/watch?v=BORRBce5TGw
Reuters: “Samsung hikes memory chip prices by up to 60% as shortage worsens, sources say,” Nov 14 2025. https://www.reuters.com/world/china/samsung-hikes-memory-chi...
Tom’s Hardware: “Samsung raises memory chip prices by up to 60% since September as AI data-center buildout strangles supply,” Nov 2025. https://www.tomshardware.com/tech-industry/samsung-raises-me...
Note the consistent "up to 60% since September" figure in the above recent reports. That's for one module capacity, with others being up 30% to 50% - and it certainly isn't the 200% or more we're apparently seeing now in the retail market. That's pure panic hoarding, which is actually a very common overreaction to a sudden price spike.
"only" ? Nobody is hoarding RAM (at least yet, consumers seem mostly blindsided by it), this is directly caused by industry thirst for AI
Lenovo Stockpiling PC Memory Due to 'Unprecedented' AI Squeeze - https://news.ycombinator.com/item?id=46041505
>Where the f*k is all the abundance that AI was supposed to bring into the world?
That'll come with the bubble bursting and the mass sell off.
"Where the f*k is all the abundance that AI was supposed to bring into the world?"
More money for shareholders. $5 trillion for Nvidia? More like a quadrillion for Nvidia's market cap.
In world history, the vast majority of abundance is downstream of conquest, not innovation. Plunder makes profit. Even in weird moments like today, where innovation is (or, at least, was) genuinely the driving force of abundance, that innovation would not have come about without the seed capital of Europe plundering Africa and the Americas.
Abundance isn't even the right framing. What most people actually want and need is a certain amount of resources - after which their needs are satiated and they move onto other endeavors. It's the elites that want abundance - i.e. infinite growth forever. The history of early agriculture is marked by hunter-gatherers outgrowing their natural limits, transitioning to farming, and then people figuring out that it's really fucking easy to just steal what others grow. Abundance came from making farmers overproduce to feed an unproductive elite. Subsistence farming gave way to farming practices that overtaxed the soil or risked crop failure.
The history of technology had, up until recently, bucked this trend. Computers got better and cheaper every 18 months because we had the time and money to exploit electricity and lithography to produce smaller computers that used less energy. This is abundance from innovation. The problem is, most people don't want abundance; the most gluttonous need for computational power can be satisfied with a $5000 gaming rig. So the tech industry has been dealing with declining demand, first with personal computers and then with smartphones.
AI fixes this problem, by being an endless demand for more and more compute with the economic returns to show for it. When AI people were talking about abundance, they were primarily telling their shareholders: We will build a machine that will make us kings of the new economy, and your equity shares will grant you seats in the new nobility. In this new economy, labor doesn't matter. We can automate away the entire working and middle classes, up to and including letting the new nobles hunt them down from helicopters for sport.
Ok, that's hyperbole. But assuming the AI bubble doesn't pop, I will agree that affordable CPUs are next on the chopping block. If that happens, modular / open computing is dead. The least restrictive computing environment normal people can afford will be a Macbook, solely because Apple has so much market power from iPhones that they can afford to keep the Mac around for vanity. We will get the dystopia RMS warned about, not from despotic control over computing, but from the fact that nobody will be able to afford to own their own computer anymore. Because abundance is very, very expensive.
It’s not hyperbole
https://www.bbc.co.uk/news/articles/c246pv2n25zo
> Where the f*k is all the abundance that AI was supposed to bring into the world? /rant
It may have been a bit self-deprecating, but I think your "rant" is a more than justified question that really should be expanded well beyond just this matter. It's related to a clear fraud that has been perpetrated upon the people of the Western world in particular for many decades and generations now, in many different ways. We have been told for decades and generations that "we have to plunder your money and debase it and give it to the rich who caused the {insert disaster caused by the ruling class} misery, and we have to do it without any kind of consequences for the perpetrators, and no, you don't get any kind of ownership or investment, and we have to do it now or the world will end."
I used to sell 64kbit (yes, bit) DRAM at $7 in 1982. 1 year later was <$0.50.
The memory business is a pure commodity and brutally cyclic. Big profit => build a fab => wait 2 years => oh shit, everyone else did it => dump units at below cost. Repeat.
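As a toy illustration of that cycle, here's a minimal cobweb-style sketch; every coefficient is made up, and the only point is that capacity decided at today's price arrives later and overshoots:

    # Hypothetical cobweb/pork-cycle sketch: capacity is chosen based on the
    # previous period's price, so supply chronically overshoots and undershoots.
    price = 1.0                                   # notional starting price
    for period in range(8):
        capacity = 0.5 + 0.8 * price              # fabs get built when prices look good...
        price = max(0.1, 2.0 - 1.2 * capacity)    # ...and the new supply then depresses prices
        print(f"period {period}: capacity={capacity:.2f}, price={price:.2f}")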
Then you have “acts of God”, like that time when a flood or fire or something caused the only factory in the world that produced some obscure part of memory chips to stop production, and memory costs doubled almost overnight. I remember my 4 MB -> 32 MB upgrade back in the '90s cost a fortune because of this.
A great example for the pork cycle.
https://en.wikipedia.org/wiki/Pork_cycle
Old enough here to remember Intel exiting the DRAM business.
Same! And then they made new eDRAM for a hot minute as part of Crystal Well. It'd be fun to see them get back into the game in a serious way, but their on-again-off-again attitude toward dGPUs does not give me confidence in their ability to execute on such long-term plans.
Old enough here to remember Intel entering the DRAM business :-)
Built my son's first gaming PC 2 months ago. Figured it would be cheaper around Black Friday, but the prices were reasonable enough that we didn't wait. Turned out to be a huge savings to buy that fast DDR5 in September.
This is really crazy. I built my first computer not too long ago; like I'm talking less than a month maybe, definitely less than 2. I paid $320 for 64GB Kit (CMK64GX5M2B5600C40) at Microcenter. It is now sold out in Chicago store and listed at $530.
$760 on NewEgg! Yeeezus Christ. Literally 50% of what I spent on the entire build just on RAM.
Just went through this today for my daughter. Struggled to find an i5 not on pre-order, and the RAM was crippling. Ended up going Ryzen 7 for the first time and 2x8GB DDR5 6000 @ £119. Looking forward to building it with her!
I wish I would have done what you did. Especially since I wanted 128GB. Now I am probably going to settle for 64GB or maybe 96GB.
The better play would've been to buy Bay Area real estate in the 1970s, but what're you gonna do? lights cigarette
I bought 32GB x 2 of G.SKILL a year ago and paid $204. Now it's $600. Insane
Just checked my invoice from last year, $316 CAD for 2x32GB 5600MHz DDR5 ECC UDIMMs.
I'm now seeing $480 CAD for a single stick.
I just got one of the last Beelink SER8s with 64GB for $750. They sold out by the time my order arrived. The newer ones start around $830 for a 32GB machine (admittedly with newer everything).
I bought 64GB of DDR5 two weeks ago. That same RAM is now twice the price.
It will be interesting to see the knock on effect of some upcoming consumer electronics; for example Apple was rumored to be working on a cheaper MacBook that uses an iPad CPU, and Valve is working on a SteamOS based gaming machine. Both will likely live/die based on price.
It's way too early to assume these prices are permanent. It's a supply crunch meeting a demand spike. The market will find equilibrium.
Big manufacturers also order their DRAM in advance with contractually negotiated pricing. They're not paying these spot market prices for every computer they ship.
edit: looks like i had the wrong understanding, thanks to the comments below for explaining
~~Helps that Apple's SoC has the RAM on the main die itself. They're probably immune from these price hikes, but a lot of the PC/Windows vendors wouldn't be, which would only make Apple's position even stronger~~
They're probably immune for a while because they're probably using a long term contract, but when it comes time to renew they'll have to offer close to market price to convince the manufacturers not to use that fab space for more profitable memory.
How does that make a difference? It's not like the price change is on DIMMs. The price change is on the DRAM, which is a commodity item. It's not like someone is going to discount it if you tell them "nah, I'm going to solder this one to my SoC".
If Apple is insulated it is likely because Apple signs big contracts for large supply and manufacturers would prefer to be insulated from short-term demand shocks and have some reliability that their fabs can keep running and producing profitable chips.
I also had that misunderstanding, so after seeing this comment I looked up more info. In this article you can see the X-ray of the M1 chip composited onto the photo of the chip, which has external memory components. You can also see in the architecture diagram that the memory is attached outside the area where the Fabric, CPU, GPU, NPU, cache, and some other unlabeled things are located. https://www.macrumors.com/guide/m1/
And in this article you can see a photo of the memory chips attached outside of the Apple component https://www.gizmochina.com/2020/11/19/apple-mac-mini-teardow...
Perhaps on the same package (stacked) but absolutely not on the same die.
Which Apple product is this? Memory dies and logic dies require entirely different factories to make, so I doubt this SoC exists.
In case anyone else wanted to check, PS5 has[1]:
> Memory: 16 GB GDDR6 SDRAM
So unless the RAM price jumps to 4x the price of a PS5, getting a PS5 is not the most cost efficient way to get to 64 GB of RAM.
In comparison, PS3 has been used to build cheap clusters[2].
[1]: https://en.wikipedia.org/wiki/PlayStation_5
[2]: https://en.wikipedia.org/wiki/PlayStation_3_cluster
Yes, stupid comparison really. Also 64GB is pretty high-end from a consumer perspective. Most would do just fine with 32 as 2x16GB.
Maybe if they expect to upgrade within a few years it would be fine. But when I built my current computer 11 years ago, I also didn't expect to need 16 GB of RAM and only bought 8. Five years later, 16 GB of memory was a requirement for both software and games I was playing. And now, 11 years later, 16 GB is not enough for fairly "simple" 3D modelling, and 32 GB is pretty close to the minimum requirement to fully utilize other modern hardware.
I bought a 16gb desktop for work in 2011, plenty for Visual Studio at the time. 8 is a bit skimpy for a desktop build even in 2014.
Speaking of 11 years old, I just put my 4790k out to pasture. It was a good CPU for a long time, but it got a little long in the tooth for modern workloads.
When did a PS5 become a unit of cost? For reference seems to be about 0.002 London buses
I think it's intended as a comparison of cost when building a gaming-capable computer vs. a console of somewhat equivalent power.
It used to be a general rule of thumb that you could build a computer of roughly equivalent power for the cost of a game console, or a little more — now the memory costs more than the whole console.
> I think it's intended as a comparison of cost when building a gaming-capable computer vs. a console of somewhat equivalent power.
The PS5 has 16GB of RAM. One can buy 16GB of RAM for ~$100 [1].
[1] https://pcpartpicker.com/product/9fgFf7/kingston-fury-beast-...
Thank you for mentioning this. Not knowing the specs of a PS5, I'd assumed that the comparison was made because the PS5 now sold for less than the RAM it contains, and scalpers were now hungrily buying up all available PlayStation inventory just to tear them open and feast on the RAM inside.
But since it's 16 GB, the comparison doesn't really make sense.
It still is a rule of thumb; you don't need DDR5 for a gaming computer, let alone 64GB. A low-end AM4 CPU + 16GB of DDR4 3600 and a decent GPU will beat a PS5 in performance and cost. I don't understand why the headline made this strange comparison.
It doesn't help that GPUs have also generally gone up in price over the past decade: there's more of a market for them besides gaming, they benefit from being hugely parallel (the larger you can make them, the better), and fabrication costs are shooting up. I think there was a GamersNexus video at the launch of one of the previous GPU generations that noted a move from "more for your money" each generation towards "more for more", i.e. keeping the value roughly static and increasing the amount charged for a more capable product.
To be fair, if this keeps up, expect the price of a PS5 to skyrocket too.
Hopefully Sony has long-term contracts for their components. I presume they have an idea of how many PS5s they're going to be making still.
Never.
That's an analogy: a literary technique the writer is using to show the correspondence between the price of a specific amount of DDR5 RAM and a fully integrated system, so the reader can follow the conclusions of the article more easily.
Give it a month or two and it might be cheaper to get the bus.
It's a more stable unit than US dollars.
> When did a PS5 become a unit of cost? For reference seems to be about 0.002 London buses
Gaming consoles are something people buy. Any parent or gamer has an idea what they cost.
People do not buy London Buses themself
Seems like an American thing. We measure distances in football fields and volumes in olympic pools, seems we now measure money in PS5s. It tracks...
Approximately the instant when a single component (RAM) of a comparable product (Gaming PC) became more expensive than the entirety of said product.
I wonder what you'd think if bus tires exploded in price and started costing .25 London busses per tire.
The tech equivalent to the Big Mac index? https://en.wikipedia.org/wiki/Big_Mac_Index
Lots of people are speculating that the price spike is AI related. But it might be more mundane:
I'd bet that a good chunk of the apparently sudden demand spike could be last month's Microsoft Windows 10 end-of-support finally happening, pushing companies and individuals to replace many years worth of older laptops and desktops all at once.
Perhaps the memory manufacturers have seen how much Apple gets away with charging for the memory on their laptops and have decided to copy them ;-)
It’s not speculation, but it could also be both.
I worked in enterprise laptop repair two decades ago — I like your theory (and there's definitely meat there) but my experience was that if a system's OEM configuration wasn't enough to run modern software, we'd replace the entire system (to avoid bottlenecks elsewhere in the architecture).
I just checked how much I paid around 12 months ago for Crucial 96GB kit (2x48GB ddr5 5600 so-dimm). Was $224, same kit today I see listed at $592, wild :/
This is insane!
I got 2 sticks of 16GB DDR4 SODIMM for €65.98 back in February. The same two sticks in the same store now cost €186
Same, bought in August for $250 (EU), now it's ~$840. I ended up returning the laptop I'd bought it for and thought 'why hold on to the RAM, it'll only depreciate in value,' so I returned that too. Better hold on to my PS5, I guess.
I bought 2x 32 GB DDR5 in August for $150. Now it's $440. I dodged a HUGE bullet.
96GB 6400, 380€ 2023-11
I did buy 384 GB worth of Samsung DDR5-4800 sticks for my homelab a few months ago. I was wondering at the time if I really needed it, well ended up using it anyway, and turns out, dodged a bullet big time.
The silver lining is that hopefully it’ll become too expensive to ship new Electron apps
You have too much faith in the software industry.
RAM has been cheap long enough and now no one remembers how to write efficient GUI apps.
I'm joking, but only kind of. It's not a domain I work in a lot, but I haven't touched Qt in so long that it would basically be starting from scratch if I tried to write an app with it; I could write an Electron app in like an hour.
Noticed SSDs went up too. There's a "black friday" sales price for a 4TB crucial external drive that's at its highest price in 90 days.
Bad time if you need to build a computer.
Article says:
> Looking at it optimistically, you're probably going to find DDR5 at bargain prices again in 2027.
When do you think prices will recede again?
Never fully. Like with GPUs, it's a semi-cartel, and memory is in everything, including your high-performance SSD (as cache). They have a reason for prices being super high for ~2 years; then they will go down, but only "somewhat". Let's say, if the peak is >2x today's pricing, the price in 2027 will be ~1.5x-1.8x.
And because everything needs memory, expect all electronics to be ~20%-80% more expensive in 2027 compared to today; naturally, this includes the profit margin.
And naturally, every regulation companies don't like (e.g. right to repair) will supposedly be at fault for this.
At least, that is wild speculation on my side.
I built 4 systems between Jan-May for myself and family, very fortuitous timing, because no way would I be doing it now.
Because some people up top loved the idea of the old Smoot-Hawley thing.
RAM prices are cyclical. We are in the under supply part of the cycle.
People just have to wait. As prices are sky high, production capacity will likely increase. Some AI companies will go bust. Demand will plummet and we will buy RAM for pennies while the market consolidates.
That's historically what happened when we had proper competition. Now we have a three-party oligopoly and massive barriers to entry. At least one of the three is actively signalling that they're not going to spend hundreds of billions to expand fab capacity that would lower their profits, because if one does it, they'll all do it. It's a prisoner's dilemma, and they're cooperating. When they cooperate, we all lose.
The entry of Chinese DRAM into the market may spur increased competition. If not for competition's sake alone, for national security and domestic supply chain integrity concerns.
That is also somewhat true for GPUs, hard drives and SSDs. They all usually have different cycles but today AI is making them peak all at the same time.
Great, we can eat the RAM when we're all unemployed.
Dude - I laughed out loud on this comment (I guess we're at the cry or laugh stage). Almost woke up my sleeping kiddo.
Crypto: GPUs
AI: RAM
Thanks for taking away years of affordable computing from people. Time is more valuable; there's no getting it back.
It's crazy how much RAM has inflated in the last month. I checked the price history of a few DDR5 kits and most have tripled since September.
Why specifically just now? It doesn't seem that much has materially changed very recently.
It's due to every hyperscaler building out new AI datacenters. For example, you have Google recently saying things like "Google tells employees it must double capacity every 6 months to meet AI demand", and that they need to increase capacity by 1000x within 4-5 years.
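Those two figures are at least self-consistent; a quick compounding check (a hypothetical projection that just restates the quoted pace):

    # Capacity after N years if it doubles every 6 months
    for years in (4, 5):
        doublings = int(years / 0.5)
        print(f"{years} years -> 2**{doublings} = {2 ** doublings}x")
    # 4 years -> 2**8  = 256x
    # 5 years -> 2**10 = 1024x, i.e. roughly the quoted ~1000x in 4-5 years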
The oft-snickered-at "smuggling 3mb of hot RAM" line from Neuromancer may have been prophetic after all.
If you are a scifi author, it's a mistake to give any hard numbers in real-world units. You will, most likely, greatly underestimate. Even trying to greatly overestimate, you will underestimate.
Commander Data's specifications in the Star Trek TNG episode The Measure of a Man from 1989: 800 quadrillion bits of storage, computing at 60 trillion operations per second.
100 petabytes. That's a big machine. A very big machine. But supercomputers now have memories measured in petabytes.
They never used "bits" again in any Star Trek script. It was kiloquads and gigaquads from then on.
That's fun! To further prove your point I saw this and thought "yeah maybe 100 PB is more common these days but 60 trillion ops / second seems like a lot"
Then I did some googling and it turns out that a single 5090 GPU has a peak FP32 performance of over 100 TFLOPS!
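A quick unit check on those numbers, assuming decimal prefixes and using the "over 100 TFLOPS" figure from the comment above:

    # Convert Data's quoted specs and compare against the GPU figure cited above
    storage_bits = 800e15                 # "800 quadrillion bits"
    storage_pb = storage_bits / 8 / 1e15  # = 100 petabytes
    data_tops = 60e12 / 1e12              # "60 trillion operations per second" = 60 Tops/s
    rtx5090_fp32_tflops = 100             # "over 100 TFLOPS" per the comment above
    print(f"storage: {storage_pb:.0f} PB")
    print(f"Data: {data_tops:.0f} Tops/s vs one RTX 5090: >{rtx5090_fp32_tflops} TFLOPS FP32")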
I thought a big factor with the AI hype is that hardware costs always go down. Is this not a huge red flag to investors?!
Everyone should unsub from this AI frenzy, this is ridiculous
Sign me up. Where do I turn it off?
In addition to the demand from AI data centres, I believe this also contributes to the high price. https://www.tomshardware.com/pc-components/ssds/intel-samsun...
I picked up a PS5 today on a Black Friday deal for 350EUR. 32GB DDR5 is at around 280EUR at the moment.
I have a gaming PC, it runs Linux because (speaking as a Microsoft sysadmin with 10 years under my belt) I hate what Windows has become, but on commodity hardware it’s not quite there for me. Thought I’d play the PlayStation backlog while I wait for the Steam Machine.
Don't forget to play Astro's Playroom, it comes with the system and it's a blast.
Thanks, will do!
Also, Spiderman 2 since it’s a Playstation exclusive. Incredible game worth every penny.
I bought 32GB of DDR5 SODIMM last year for 108€ on Amazon. The exact same product that I bought back then is now 232€ on Amazon. I don't like this ride.
Yeah, similar for me. I bought 64 gigs of DDR5 laptop RAM about a year ago; it ended up costing about $190. Now the exact same listing is going for $470. https://a.co/d/fJH1GkW
I guess I'm glad I bought when I did; didn't realize how good of a deal I was getting.
Ouch. Wondering if homelabs will be scavenged for unused RAM as even DDR4 is going up in price :(
I’ve been selling the DDR4 I had lying around. I'm also considering removing some from my desktop, since I don’t really use 64GB.
I'm personally waiting for the first DDR5 heist. Breaking into a computer store and taking all of the RAM that isn't soldered down.
Wow, I only paid $265 for 96GB of DDR5 back in April. Same brand (G.SKILL) as the kit in the article too.
"2026: Cost of manufacturing PC cases increases 60% due to increased demand from Optimus production line" or some other dumb shit
AI is a net negative for anyone not in on the grift.
Hang in there, abundance is on its way! /s
Abundance of PRs waiting for reviewers!
Holy shit, the 32GB DDR5 I bought in late October for $110 is now $300.
Felt like I overpaid at the time too. Wow
I'm waiting for the Apple TV 4K 4th gen. I think it might be one or two more years, on top of the now three years since the 3rd gen (2022).
AI/LLM companies will pay TSMC more than Apple is willing to further subsidize this neat little box.
This is pure price gouging, because these RAM kits are not ECC or server-grade.
The article references the original coverage which talks to this:
> Despite server-grade RDIMM memory and HBM being the main attractions for hardware manufacturers building AI servers, the entire memory industry, including DDR5, is being affected by price increases. The problem for consumers is that memory manufacturers are shifting production prioritization toward datacenter-focused memory types and producing less consumer-focused DDR5 memory as a result.
But I'm sure the hysteria around that isn't helping prices come back down either.
Except when you have datacenters also building racks with desktop hardware. I believe that was Hetzner?
for anyone looking for a deal, thank me later, buy asap
(no association, just trying to help, I am still using DDR4)
Is it really a shortage, rather than unfair order fulfillment, when it's just four companies buying up everything? There's plenty of RAM; it's just getting sold to the people who yell the loudest instead of going "sorry, we have more customers than just you" and fulfilling orders to everyone.
Quick reminder that DRAM futures have existed since the 1980s so you all could have protected your price with calls.
where?
Call me old-fashioned, but I shouldn't have to have a stock broker to buy a computer. Maybe we could re-organize society to be a bit less ridiculous. "Quick reminder that you could've been born rich instead of a povvo."
but can 64GB of DDR5 memory run Crysis? …I’ll see myself out