Hand-held devices for testing concrete properties would be more useful. Most concrete problems come from a bad mix - too much water, not enough cement, etc. Concrete testing usually involves cutting a core out of the poured slab and sending it to a lab. Something where you stick a probe in the mix and can reject it before pouring would help. Here are some on-site concrete testers.[1] They're heavy and a pain to use.
There should be an app for this. But that's so last-decade.
[1] https://store.forneyonline.com/concrete-testing-equipment/fr...
On-site, before pouring, they use the slump test: https://en.wikipedia.org/wiki/Concrete_slump_test
Glad to see someone pointed this out. The test consists of a bucket, plywood board & a stopwatch.
I'm surprised the ratios for a given situation aren't standardized by now. Is it just people cutting corners?
Working with multiple tons of material that dries out as you move it around is hard. There are a lot of steps between the concrete being mixed and when it finally reaches the pour.
Cutting out a piece of a slab and sending it to a lab is for post-pour validation in serious construction. There are pre-pour tests that are much simpler depending on the seriousness of what you’re building.
The slump test is rather simple, for example: https://en.wikipedia.org/wiki/Concrete_slump_test
It’s basically a cone with handles and a procedure that’s easy to learn.
They are standardized for a given mix. A mix design based on a trial batch is submitted to the SEOR prior to pouring anything. The mix design shows the ratios of ingredients (cementitious materials, fine and coarse aggregates, water, air, admixtures). But concrete is still a non-homogeneous material with lots of variation. Take aggregates, for instance: if it rained the last two weeks, the moisture content will be higher, but it may only be a layer on that pile. Same goes for gradation (particle size of the rock). Sometimes you get a batch with smaller rock. There are a hundred things that can go wrong and get you bad mud.
But yeah, there are concrete plants that cut corners and try to save on cement (the most expensive part of the mix), which, depending on the project, may bite them in the ass when they have to pay to fix it.
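The aggregate-moisture issue above has a standard arithmetic fix at the plant: measure each aggregate's total moisture, subtract its absorption to get the free moisture it carries, and reduce the batch water by that amount. A minimal sketch of the arithmetic (all mix quantities below are hypothetical, not from any real mix design):

```python
# Illustrative batch-water correction for aggregate moisture.
# Method: free moisture = total moisture - absorption (relative to the
# saturated-surface-dry condition); every made-up number is per m^3.

def corrected_batch_water(design_water, aggregates):
    """Adjust design water for free moisture carried by each aggregate.

    aggregates: list of (dry_mass_kg, total_moisture, absorption) tuples,
    with moistures expressed as fractions of dry mass.
    """
    water = design_water
    for dry_mass, total_moisture, absorption in aggregates:
        free_moisture = total_moisture - absorption  # water beyond SSD
        water -= dry_mass * free_moisture            # wet sand brings its own water
    return water

# 160 kg design water; wet sand (5% total moisture, 1% absorption) and
# coarse aggregate (1.5% total moisture, 0.8% absorption).
water = corrected_batch_water(160.0, [(700.0, 0.05, 0.01), (1100.0, 0.015, 0.008)])
print(round(water, 1))  # → 124.3 (sand contributes 28 kg of free water, rock 7.7 kg)
```

Skipping this correction after two weeks of rain is exactly how the effective water-cement ratio drifts high and strength comes in low.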
When you're making tons of something process variations get magnified.
I had to double check that this wasn't an April Fools joke. The GitHub project has commits from 2 weeks ago, so it's not.
Looking more closely though, this looks a lot like the Google "AI Cookie" from 2017, which also used Bayesian Optimization: https://blog.google/innovation-and-ai/technology/research/ma...
Google's "Smart Cookie" indeed also used techniques from Bayesian Optimization. For some technical detail, see their write-up here: https://static.googleusercontent.com/media/research.google.c...
Our work on concrete here differs in that the problem is both 1) an inherently time-varying, and 2) multi-objective. See our write-up here for details: https://arxiv.org/pdf/2310.18288
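For readers unfamiliar with the technique both write-ups use: Bayesian optimization fits a Gaussian process to the experiments run so far, then picks the next experiment by maximizing an acquisition function such as expected improvement. A minimal single-objective, numpy-only sketch (the toy objective, kernel length scale, and all other numbers are made up for illustration; the concrete system linked above is time-varying and multi-objective, which this sketch does not capture):

```python
import numpy as np
from math import erf

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(x_obs, y_obs, x_new, noise=1e-4):
    """Posterior mean and std of a zero-mean GP at the points x_new."""
    K_inv = np.linalg.inv(rbf(x_obs, x_obs) + noise * np.eye(len(x_obs)))
    K_s = rbf(x_obs, x_new)
    mu = K_s.T @ K_inv @ y_obs
    var = 1.0 - np.einsum("ij,ik,kj->j", K_s, K_inv, K_s)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """E[max(f - best, 0)] under a Gaussian posterior."""
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (mu - best) * cdf + sigma * pdf

def lab_test(x):
    """Toy stand-in for a slow physical test (e.g. strength vs. one mix knob)."""
    return -(x - 0.6) ** 2

x_obs = np.array([0.1, 0.5, 0.9])     # initial experiments
y_obs = lab_test(x_obs)
grid = np.linspace(0.0, 1.0, 201)     # candidate designs
for _ in range(5):                    # 5 sequential "lab rounds"
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y_obs.max()))]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, lab_test(x_next))
print(x_obs[np.argmax(y_obs)])        # homes in near the true optimum at 0.6
```

The point of the loop is sample efficiency: each `lab_test` call stands in for casting and breaking cylinders, so spending compute to choose the next batch well is cheap by comparison.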
> As a result, producers need a way to rapidly explore and validate new formulations without spending months in the lab.
How do you bypass the normal process of pouring test articles and testing them months and years after cure? This is fundamentally a research activity that needs to conduct verifiable science. Not something you can guess at with an LLM.
Hi, I developed the model. We are not bypassing the regular testing process, and are not using LLMs, but Gaussian processes with vetted test data. The predictions are used as recommendations for onsite testing, to accelerate finding mixtures with optimal strength-speed-sustainability trade-offs.
Somebody needs to coin a new term for the scattershot zero-thought AI griping that is pervasive in online comments these days. Meatslop?
Obviously it's going to be more productive for a manufacturer to run a years-long curing test on 100 likely candidates than on 100 random mixes. They already screen candidates through traditional methods, but if this AI technique improves accuracy, all the better.
I call it pseudo-critique — active stupidity in the name of critical thinking — but that’s too general.
https://danluu.com/cocktail-ideas/
hn discourse is not nearly as high-quality as people would like to believe.
It’s very bimodal.
just like everywhere else? reddit has fairly good wheat among the chaff just the same?
What part of move fast and break things did you not understand?
It doesn't use an LLM
What could possibly go wrong?
https://dailygalaxy.com/2026/03/rubber-used-in-undersea-tunn...
All the chemical companies do it. They pair it with testing, but still.
They have a new scapegoat to blame if things turn out badly.
Why do they need AI for that? Just create another LLC, manslaughter any number of people, then have that LLC declare bankruptcy. Zero consequences.
Emitting a shrug and "AI made me do it" is cheaper.
Awesome. People take concrete for granted. Even at small scales (e.g. your patio) with formulas provided on the cement bag, concrete can go wrong (crazing, scaling, cracks). There's a lot of unappreciated craft in the work, not only in the composition and mixing, which is what this research seems dedicated to, but also in the placing, leveling, curing, finishing.
^ This.
Civil Engineering is hard, and concrete is a perfect example of how something as "simple" as concrete in reality requires significant interdisciplinary collaboration with domain experts in ChemE, MatSE, Physics, Applied Math, and CS.
Some of the most robust HPC applications I saw back when I was an undergrad were done by Civil and Structural Engineers in the ONG space.
They sure are stretching to find a way to make this have something to do with being pro-America.
Increasing the quality and quantity of domestic cement output will provide a pretty clear national benefit.
Tangentially related, but there is a new generation of trucks that mix the concrete on-site. They can output small batches and change the mix on the fly. They solve a lot of headaches!
https://cementech.com/volumetric-technology/
This may work on a small scale, but not in most commercial use cases. A typical deck pour (400 cy) will pour at 70-80 cy/hr, and you get 9-10 cy/truck. That means you have 7 to 8 minutes to back the truck in, empty it into the hopper, and leave. You barely have time to add water to the mix. Most high-volume concrete plants are "dry-batch", which means all the ingredients get dumped into the drum and the concrete gets mixed while driving to the project site. Also, changing mixes on the fly will not "fly". No one is going to authorize the adjustment, because what happens when the mix doesn't meet specs? It will need to get chipped out.
Concrete mixer trucks are not new at all actually, they've been around for a long time.
Traditional trucks pick up cement from a facility and rotate it to keep it from setting. They don't mix it on the fly. Any extra is considered waste and is poured out.
> Meta’s AI for concrete model can help suppliers more quickly incorporate U.S. materials into their mixes through an approach called adaptive experimentation.
> Proposes high-potential candidates: The AI suggests new mixes most likely to meet target specifications and can compare performance between U.S.-made and foreign materials
US imports 22% of its cement
> In 2024, Portland and blended cement were produced in 99 plants in 34 U.S. states, led by Texas, Missouri, California, and Florida. Nevertheless, there was significant import reliance. Net imports were 22% of total consumption, with the major source countries being Turkey (32%), Canada (22%), and Vietnam (10%). U.S. exports of cement last year were negligible.
https://www.constructconnect.com/construction-economic-news/....
I'm assuming this isn't for national security reasons, probably more to help the domestic industry deal with tariffs. I hope Meta used their extensive connections to the government.
Wet cement is kind of sloppy, so this makes some sense.
I hate April Fools day so much. Is this a joke? I genuinely cannot tell.
It's not a joke - but it sure feels suspicious :D
Not nearly entertaining enough to be one.
The date on the article is March 30th.
I honestly thought this was going to be an April Fools gag.
First there was the rampocalypse. Then there was the cementpocalypse. Let's just hope the AI datacenters don't latch on to biofuel to supplement their energy requirements. It's just more profitable for farmers to sell calories to the AI overlords; the consumer food market is a low-margin grind.
Most large-scale DC projects I know of primarily leverage solar with grid batteries because of the low upfront cost and state incentives.
Apologies for the sarcasm. I appreciate the drive for renewables the current AI DC buildout brings with it.
I have real fears that building materials will experience the same inflationary pressures computer memory is currently experiencing. The U.S. TSMC and Intel fab construction alone in the last couple years has had an outsized impact on building costs.
The masons just showed up their involvement with AI and everything wrong in our times. The masks have fallen. /s
Jesus, I hope they do proper testing for these experimental mixes and don't trust whatever random garbage AI decides you should mix in. This is exactly the kind of thing AI is absolutely terrible at, because it has no logical skills, direct experience, or ability to test it. If your AI-coded stuff goes belly up, you get to try again. If your multi-million-dollar cement foundation turns out to be sub-par, that's multiple millions to tear it out and millions more to do it again right, and that's the best-case scenario. The alternative is people dying when their apartment building collapses.
We use Gaussian processes trained on vetted test data from academic and industry partners. We use these predictions to recommend mixes for onsite testing to accelerate finding mixtures with optimal strength-speed-sustainability trade-offs. None of the data and predictions go untested. The blog post goes into this in more detail.
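To illustrate what a strength-speed-sustainability trade-off means in practice: with several objectives there is usually no single best mix, only a Pareto front of non-dominated candidates, and the testing recommendations aim to map out that front. A small sketch with entirely made-up candidate scores (rescaled so higher is better on every axis):

```python
# Hypothetical mixes scored on three objectives, all "higher is better":
# 28-day strength (MPa), an early-strength speed score, and a
# sustainability score (e.g. negated, normalized CO2 per m^3).
candidates = {
    "mix_a": (40.0, 0.7, 0.5),
    "mix_b": (35.0, 0.9, 0.8),
    "mix_c": (38.0, 0.6, 0.4),  # beaten by mix_a on every axis
    "mix_d": (45.0, 0.5, 0.6),
}

def dominates(p, q):
    """p dominates q if p is at least as good everywhere and strictly better somewhere."""
    return all(a >= b for a, b in zip(p, q)) and any(a > b for a, b in zip(p, q))

def pareto_front(scores):
    """Names of candidates not dominated by any other candidate."""
    return {name for name, p in scores.items()
            if not any(dominates(q, p) for other, q in scores.items() if other != name)}

print(sorted(pareto_front(candidates)))  # → ['mix_a', 'mix_b', 'mix_d']
```

Everything on the front is a defensible recommendation; which point you pour depends on whether the project prizes strength, early form-stripping, or embodied carbon.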
AI isn’t just LLMs.
Can you at least read the article before criticizing them? They explicitly call out that they use Bayesian optimization (Gaussian processes) for this. It is "AI", but not the "LLM" you think it is.