But even if you have a working stellarator, that's still a very long way from an economically viable energy source. You've still got to a) figure out how to cheaply convert the released energy into electricity (and the baseline way of doing that in D-T fusion is... a steam turbine), and b) figure out materials that can survive the radiation bombardment for a sufficiently long time.
In sunny places (and I fully acknowledge that's not all of the world) it's still going to be hard to beat sticking bits of special glass out in a field and connecting wires to it.
But we should sure as heck keep tinkering away at it!
I don't think the point of this project is to be closer to economic viability, but to demonstrate an approach that could reach economic viability faster by allowing quicker iteration and evaluation of small-scale experimental designs.
In HN terms they are demonstrating a significantly faster REPL by keeping the project small and minimising use of esoteric or highly bespoke components.
It's the closest you can get to building your own stellarator by walking into a RadioShack. I think it's a pretty cool idea.
Yep, sure. And that's great.
Not sure why the steam turbine is written like it's some crazy reveal. It's the accepted way of doing it in most other fossil fuel plants, to the point that there's a saying for it - someone, somewhere is boiling water so you can boil water. There are other designs that have been proposed, but I'm not aware of anything in production that takes heat and turns it into electricity at power-plant scale. E.g. RTGs exist, but only if you need a computer power supply's worth of power for decades.
Perhaps because the steam part alone, even if powered by pixie dust and magic, can't compete on price with solar, and probably not with solar plus batteries either (it varies with latitude and time, but the cost is going down and the viable latitude band is getting wider).
I think OpenAI is investing into a fusion design that avoids steam for exactly this reason, so it's not just an anti-nuclear talking point.
Newer solar panels don't need full sun to function. It's economically viable to place them further north and in cloudier climates now. So the area where these alternatives are viable may be shrinking faster than you would expect.
Indeed, though in colder climates you do have the problem that peak electricity demand is precisely when you get minimum solar production.
But in sunnier, warmer parts of the world (which notably includes India, Pakistan, Bangladesh, Indonesia, Nigeria, Egypt, Ethiopia, Iran, Mexico, and Brazil, amongst others), over the next few decades it's hard to see anything much competing against solar and batteries for the bulk of energy usage.
That isn't true yet, because most cold places don't use electricity for heating; in Boston, for example, the norm is oil or gas furnaces. Europe typically uses gas for heating, as we all learned at the beginning of the Ukraine war. When I lived in the Boston area, our electricity use peaked in the summer during air-conditioning season. That is, until we got a heat pump; then our electricity use spiked like crazy in the winter. But we are a long way from having the majority of homeowners using heat pumps as their primary heat source. (Also, heat pumps are kinda awful compared to furnaces in my experience, so we might never get there.)
>heat pumps are kinda awful
Heat pumps work great under the right circumstances. In cold climates like Boston and north of it, ideally you want ground source, which means digging deep (or wide) holes to flow water through instead of just pulling heat from outside air. Alternatively, (or additionally, ideally) they're best used in homes with very tight air envelopes, good insulation, and heat/energy recovery ventilation (HRV/ERV) systems. Installed in radiant heating systems in well-built houses, heat pumps are fantastic and you'll be comfortable all year for pennies on the dollar.
The reality is that most (US) construction, especially older, is just terrible in terms of air seal and insulation. Couple that with a potentially undersized air-source heat pump, which gets inefficient as the outside temp gets near the low end of its operating range, and you will not have a great time. But even then they are a good way to supplement gas heating, so you can limit furnace use to the coldest weeks of the year and do the environment a favor, as well as cool in the summer.
There's a green trend in much of central and northern Europe to use electric heating, mainly by promoting air-source heat pumps.
> But in sunnier, warmer parts of the world
Though solar panels prefer it when it isn't too hot. Still, any such inefficiencies are easily outweighed by their low cost. In the Netherlands it is common practice (and advised) to install more Wp than the inverter can handle. Maybe that's also a solution if it gets too hot: make up for the losses by having more panels.
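A rough way to picture both effects (heat derating plus oversizing a 5 kW inverter with 6 kWp of panels). This is my own toy sketch; the -0.35%/°C coefficient and all other numbers are just typical placeholder values, not anything from this thread:

```python
# Toy model (hypothetical numbers): panel output falls with cell temperature,
# and an undersized inverter clips whatever DC power exceeds its AC rating.
def panel_dc_power(p_stc_w: float, cell_temp_c: float,
                   temp_coeff_per_c: float = -0.0035) -> float:
    """DC output relative to standard test conditions (25 C), linear derating."""
    return p_stc_w * (1.0 + temp_coeff_per_c * (cell_temp_c - 25.0))

def inverter_ac_power(dc_w: float, ac_limit_w: float) -> float:
    """Clip DC power at the inverter's AC rating (conversion losses ignored)."""
    return min(dc_w, ac_limit_w)

dc = panel_dc_power(p_stc_w=6000.0, cell_temp_c=60.0)   # hot cells on a summer day
print(f"DC after heat derating: {dc:.0f} W")            # ~5265 W from 6000 Wp installed
print(f"AC after clipping:      {inverter_ac_power(dc, ac_limit_w=5000.0):.0f} W")
```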
As someone living in the far north, the sun not going above the horizon for months at a time makes panels a non-starter without 100% coverage from other sources, which makes them rather expensive unless the costs are externalized to someone else.
Depending on the location, there could be sources of geothermal energy, like in Iceland. There are also various prototype approaches to storing solar energy for the winter months, like molten salt (obviously a bit too spicy outside of industrial settings), or generating hydrogen and storing it in iron ore. The latter approach was recently pioneered by ETH Zürich.
https://ethz.ch/en/news-and-events/eth-news/news/2024/08/iro...
Sure. But, globally, you're the exception, not the rule.
I don't live nearly that far north, and the low sun angle for much of the year, frequent cloud cover, and the couple of weeks a year that a panel will have snow on it add up to it just not being economically viable vs other generation methods. If you DIY the whole thing out of used panels and other low-cost parts then it makes good sense, but that is a huge time sink, so it's kind of the exception. I hate my utility and look into stuff like this every year or so, so it's not like I'm using 2010 numbers for panel prices either.
Technically this is correct. Look at the small circle near the equator here which encircles more than half of world population: https://en.m.wikipedia.org/wiki/Valeriepieris_circle
The other half needs energy too though! And high-latitude areas are known to have dense enough populations and exceedingly high economic productivity, from Sweden to Ireland to New England.
I was commenting on the parent
>It’s economically viable to place them further north and in cloudier climates now
Which it really isn't, unless we're neglecting to count the need for long-term backup power. There still exists some amount of solar even now, though government subsidies and government-provided peaker plants may have something to do with it.
Globally, it's the US that is the exception. The case for solar falls apart elsewhere, be it because of solar irradiance, population density, prevalence of flying debris, or whatever it might be.
Only in the US, Australia, and a few other places does it make sense to just put up some panels for free energy. Incidentally, the same sometimes applies to EV arguments.
The next option for you would be to take the panels that you would have used and put them in an area where they work better.
Use the extra electricity to power machines that hydrogenate CO2 extracted from the atmosphere and turn it into methane.
Methane generated this way (being the principal component of natural gas) would allow us to deliver nearly carbon-free fossil fuels to you and people in your biome, and everyone wins.
As someone living north of the arctic circle.. there are other issues than just clouds.
Well, as long as you have a vast amount of storage capacity plus overprovision, or an alternative source of on-demand electricity, the cost of which I never see included when comparing to on-demand energy sources like nuclear or fossil fuels.
> Well as long as you have vast amount of storage capacity + overprovision
Why would that need to be included? The full needs do have to be taken into account, but that's a TCO calculation, not something to add to the cost of solar itself.
Nuclear energy in Europe tends to be way more expensive than initially budgeted. Resulting in a crazy difference in the kWh cost vs solar/wind. And there's more ways to store "electricity" than just batteries.
Because what is implicit in saying that solar (or wind) is cheaper than nuclear per watt produced is that it is a viable alternative to nuclear. But to be an alternative you need to be able to meet demand whether there is light/wind or not (with a near-zero probability of a blackout). And if you need to do that, then you need either a bunch of alternative on-demand sources of energy (the UK is using LNG) or some big storage capacity (plus overprovision to fill it when it is sunny/windy). Nuclear doesn't need that. If you don't factor in those costs, you are comparing apples and oranges.
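To make the apples-and-oranges point concrete, here is a deliberately crude sketch of what "comparing like with like" means; every figure in it is a made-up placeholder, and only the structure of the sum matters:

```python
# Deliberately oversimplified comparison; every number is a hypothetical placeholder.
# The point is only that dispatchable solar = generation + overprovision + storage.
def cost_per_kwh(gen_cost: float, overprovision: float = 1.0,
                 storage_cost: float = 0.0, stored_fraction: float = 0.0) -> float:
    """Rough cost per delivered kWh when part of demand must pass through storage."""
    return gen_cost * overprovision + storage_cost * stored_fraction

bare_solar   = cost_per_kwh(gen_cost=0.03)
firmed_solar = cost_per_kwh(gen_cost=0.03, overprovision=1.5,
                            storage_cost=0.10, stored_fraction=0.4)
nuclear      = cost_per_kwh(gen_cost=0.12)   # dispatchable on its own

print(f"bare solar:   {bare_solar:.3f} per kWh  (not dispatchable)")
print(f"firmed solar: {firmed_solar:.3f} per kWh  (the comparable number)")
print(f"nuclear:      {nuclear:.3f} per kWh")
```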
I believe nuclear isn’t an on demand energy source. It’s great for base load, but it doesn’t scale up or down quickly.
That was true in the 70s. I think anything built in Europe since the 80s can do load following, and France does that every day. I think by regulation new plants must be able to adapt their production by something like 5% per 10 minutes (or something of that order of magnitude), well within the range you need to meet intra-day variations.
Now whether it is optimal economically is another question. If you have some sources of energy in your grid that cost per use (e.g. fossil fuels), you should rather switch those off than nuclear, which costs the same whether you use it or not. But if your grid is almost all nuclear (e.g. France), you do load following.
> but it doesn’t scale up or down quickly.
The newer ones can. Though that's partly my gripe: there's always yet another new technological solution for the various drawbacks, which is often unproven (e.g. SMRs), and that often results in crazy cost overruns.
Nuclear has a crazy high fixed cost so ideally you still run it continuously.
Steam turbines are very good at what they do, so I don't think that's the problem. Turbines are well-understood technology, so if they're the major impediment to economic viability then everything else must be working amazingly well.
I do think economic viability will be a major problem though. The fusion hype crowd focuses on Q>1 as their major milestone, but that's still a long way from it being profitable to operate a fusion plant.
IIRC, a cascade of turbines, going from high-temperature water steam to basically room-temperature ammonia, can reach a total efficiency exceeding 50%.
Of course, ammonia is chemically active, and water at 700K is also chemically active, so the turbines, as usual, require special metallurgy and regular maintenance.
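As a rough sanity check on that >50% figure (my own sketch, taking the ~700 K steam mentioned above and a room-temperature heat sink as the assumptions), the Carnot limit bounds what any cascade of heat engines can do:

```python
# Carnot upper bound on any heat-engine cascade: eta = 1 - T_cold / T_hot.
# Real turbine cascades stay well below this, but it bounds the >50% claim from above.
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return 1.0 - t_cold_k / t_hot_k

print(f"{carnot_efficiency(700.0, 293.0):.0%}")  # ~58% for 700 K steam and a ~room-temperature sink
```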
Sure, but your cascade of turbines isn't free.
The capital cost of just the turbines is enough to make it hard to compete with solar and batteries in many situations.
Turbines are indeed very well understood and mature technology, and even where the fuel is almost free and the reactor (a coal burner) is cheap, steam turbines are too expensive to compete with solar electricity in most places now.
That's all just throwing good money after bad.
How many more trillion $$$ are going to be pissed away on this?
I'd say, shelve the idea for 50 to 75 years and then look at it again.
In the meantime, I think we could make major headway on a global electric grid that connects whatever part of the planet is sunny with all the rest. Add to this some major storage capacity, and I think we could resolve almost all of our energy problems with the money that would otherwise be wasted on further fusion efforts.
> In sunny places (and I fully acknowledge that's not all of the world) it's still going to be hard to beat sticking bits of special glass out in a field and connecting wires to it.
Why not both? Can't we utilize the special glass to fetch energy from a man-made fusion reactor, in similar ways as we use it to fetch energy from the natural fusion reactor in space?
Treating this as a serious question: because the big reactor in the sky has a built-in gravity well and a very big sphere of vacuum that works as a free containment system while still letting just enough energy out to be useful. Also we don’t need to refill it or handle the spent fuel.
In theory, yes. Those are called radiovoltaics and are being researched.
No.
A D+T or D+D fusion reaction produces neutrons and hard gamma radiation. The special glass cannot capture the energy of these directly.
The natural fusion reactor has a blanket that is literally thousands of kilometers thick. It effectively converts much of this into longer-wavelength electromagnetic radiation, from ultraviolet to infrared, with quite some visible light. That's what the special glass can make use of.
Also, the highly radioactive blanket is kept hundreds of millions km away from consumers, which helps alleviate the problem of disposing of radioactive waste. With a planet-based fusion reactor, we'd have to think where to put thousands and thousands of tons of slightly radioactive concrete which would realistically serve as a blanket.
The actual paper describing the construction of the MUSE Stellarator: https://www.cambridge.org/core/journals/journal-of-plasma-ph...
I'm having trouble understanding what's actually been accomplished here. The article provides a good overview of Tokamak vs Stellarator, but seems to jump back and forth between proclaiming this as an innovative breakthrough and saying it's just a framework to test ideas.
> In terms of its ability to confine particles, Muse is two orders of magnitude better than any stellarator previously built
Is it? It doesn't seem as if they have reached first plasma or have plans to do so anytime soon. Using electromagnets not only to confine but also to control the plasma is a big selling point of the stellarator design, and they don't seem to address this.
This seems really cool, and I love the idea of lower-cost fusion. (Or even just functional fusion.) There are about a dozen companies making real progress in fusion, but I can't quite figure out what this team has actually accomplished.
What am I missing?
Seems like the premise is that building these small experimental stellarators is a major cost in doing fusion research and therefore if we can bring the cost of these down from billions to less than a million, more teams can do more research faster, even if this specific design never generates any economic power. I have no idea if this premise is true or not -- I'm just a layman who read the article.
The struggle for funding can explain a lot of things
I admittedly don't know much about fusion reactors, but I do love that the thing which you create a star within is called a "Stellarator".
“PPPL researchers say their simpler machine demonstrates a way to build stellarators far more cheaply and quickly, allowing researchers to easily test new concepts for future fusion power plants.”
This quote reminded me of SpaceX's approach to engineering and why they have leapfrogged past Boeing. Instead of spending 10-20 years and billions on a single design, SpaceX iterates.
I want to like this, but at break-even temperatures these things just melt. How about making Shipstones from nuclear waste? You could make car-engine-sized batteries that would effectively last for years if properly shielded and which would provide essentially free power to run a dozen homes.
How can one estimate the progress of a given design? For example, in the photo there are no walls protecting the people in the lab from neutrons. Even the fusor 60 years ago, running at 100 kV, already generated a neutron flux requiring such walls.
There's absolutely no way to get fusion with permanent magnets and copper coils.
This is a plasma test stand, and because it is so simple, you can potentially iterate quickly through different field configurations. This is at least a little bit useful, because a full stellarator is extremely complicated to take apart, so you can't just change the coils around if you want to change something.
People don't understand the fundamental problem of fusion. It's a problem of energy loss. Of enormous energy losses.
Roughly speaking, energy can be mechanical (for particles) or radiative (for photons). The first is proportional to the temperature (the famous nRT) and the second to the fourth power of the temperature. The constant of proportionality is very small, and at everyday temperatures we generally don't think about it much. But at millions of kelvin, it starts to dominate all considerations.
Heat always moves from hot to cold. In the case of particles, the heat flow is proportional to the difference in temperature; in the case of radiation, to the difference of the fourth powers of the temperatures. But heat also travels from particles to photons and vice versa. It doesn't matter how.
The problem with fusion is now this. Suppose that you have a super-duper device, let's call it the brompillator. It brings an amount of deuterium-tritium mix to the required temperature, let's say 10 million kelvin. Now that volume of plasma is surrounded by cold stuff. You can imagine that you have some mirrors, or magnetic fields, or some magic stuff, but the cold hard fact is that the plasma will want to radiate to the exterior, and the flow of heat is proportional to the surface area times the difference of the fourth powers of the temperatures. Since for all practical purposes the outer temperature is zero, we are talking about the fourth power of 10 million kelvin. Now, the constant of proportionality is very small; it is called the Stefan-Boltzmann constant and has a value of about 6×10^-8 (call it 10^-7) W m^-2 K^-4. Let's say the surface area is 1 square meter. So the heat loss happens at a rate of 10^-7 times (10^7)^4 = 10^21 watts. That is 10^12 gigawatts. One GW is the output of a decent-sized nuclear power plant.
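To check the arithmetic in the paragraph above, here is a minimal sketch using the exact Stefan-Boltzmann constant; it assumes an ideal blackbody with emissivity 1, which replies further down argue does not hold for an optically thin plasma:

```python
# Back-of-the-envelope check of the loss estimate above.
# Assumes an ideal blackbody (emissivity = 1) radiating into ~0 K surroundings.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_power(temperature_k: float, area_m2: float = 1.0) -> float:
    """Radiated power P = sigma * A * T^4 for a perfect blackbody."""
    return SIGMA * area_m2 * temperature_k**4

p = blackbody_power(1e7)            # 10 million K, 1 m^2 of radiating surface
print(f"{p:.1e} W")                 # ~5.7e20 W, i.e. on the order of 10^21 W
print(f"{p / 1e9:.1e} GW")          # ~5.7e11 GW, vs ~1 GW for a fission plant
```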
Of course, you can try to shield that plasma, but that shield has to be 99.99999....9% effective, where the number of 9s needs to be about 15 or so.
That is the immensity of the challenge that nobody is willing to tell you about.
How was this overcome in the case of the thermonuclear bomb? People imagine that once you have a fission bomb, you just put some deuterium-tritium mix next to it, and voilà, you have a fusion bomb. No. The world's greatest minds worked on this issue for about 5 years. The solution was something like this: if you first compress the volume of fusion fuel significantly, then the heat losses are much smaller (remember, they are proportional to the area, and that's proportional to the square of the radius). They will still be tremendous, but you don't even aim to keep the reaction going for a long time. The duration of the fusion reaction in a thermonuclear bomb is still classified information, but public sources put it on the order of 1 microsecond. The heat losses are still tremendous, but for a short moment the heat gains from the fusion reaction are even greater, so ignition is achieved.
In the NIF experiment that achieved more than breakeven 2 years ago, the fusion lasted less than 10 nanoseconds [1].
If someone thinks the brompillator will achieve fusion and that will run for years, or even hours, or seconds, they don't understand the fundamental problem. Unfortunately, nobody is willing to ask hard questions about this, not even Sabine Hossenfelder.
[1] https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.132.065...
I don't disagree with this statement; fusion researchers do care about energy loss when they're evaluating fusion reactor feasibility. They talk more about neutron loss, bremsstrahlung radiation, and synchrotron radiation than about blackbody radiation. A paper on this: https://arxiv.org/pdf/2104.06251
So I did some searching, and found this stack exchange asking this question: https://physics.stackexchange.com/questions/415028/how-do-fu... . They argued that because fusion reactor plasma is optically thin, it doesn't radiate following blackbody radiation law. This textbook also say that: https://www.cambridge.org/core/books/abs/physics-of-plasmas/...
I tried to search more about plasma energy losses, and it becomes extremely complicated, with an insane number of equations. One thing I did get is that you can't model fusion reactor plasmas as blackbody radiators, because plasma is that complicated. If plasma were simpler, we would either have fusion already or we would have given up on fusion research a long time ago.
> because fusion reactor plasma is optically thin, it doesn't radiate following blackbody radiation law.
It still follows the laws of blackbody radiation - it's just that the emissivity of the body is part of the equation.
A classical blackbody has an emissivity of 1. This means not only that it absorbs radiation really well, it also means it's really good at radiating energy away.
Things that have low emissivity (all things transparent and all things reflective) are also really bad at radiating energy away. This is used for solar-thermal collectors today: you make them from an engineered material that is completely black in the visible range but highly reflective in the infrared. That way, they absorb sunlight and get hot, but they don't lose heat energy because they can't radiate it away as heat radiation.
And yes, fusion plasma is extremely, extremely transparent. Not only is it extremely thin (ITER or Wendelstein 7-X contain only 1-2 g of hydrogen during operation), hydrogen is also extremely bad at absorbing gamma rays (blackbody radiation at 1e8 K).
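To make the emissivity point concrete, here is a small illustration (mine, not from the thread) of how the same Stefan-Boltzmann expression scales once emissivity enters, plus the thermal energy scale kT at 1e8 K:

```python
# Gray-body radiation: P = epsilon * sigma * A * T^4.
# epsilon = 1 is the classical blackbody; an optically thin plasma has epsilon << 1.
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
K_B_EV = 8.617e-5    # Boltzmann constant, eV/K

def graybody_power(emissivity: float, temperature_k: float, area_m2: float = 1.0) -> float:
    return emissivity * SIGMA * area_m2 * temperature_k**4

T = 1e8  # fusion-relevant temperature, K
for eps in (1.0, 1e-6, 1e-12):
    print(f"epsilon={eps:.0e}: {graybody_power(eps, T):.1e} W per m^2")

# Characteristic photon energy scale at this temperature:
print(f"kT at 1e8 K ~ {K_B_EV * T / 1e3:.1f} keV")  # ~8.6 keV, i.e. keV-scale photons and up
```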
Inertial confinement fusion like the NIF is not intended to run continuously, so the 2ns duration is irrelevant. The surface area for the calculation is not the surface area of the machine, but the surface area of the volume in which the fusion is occurring which could be very much smaller than that.
The heat loss is practically limited by the mass of hydrogen fusing in the machine. To have a continuous heat flux of 10^21 watts you would need to fuse ~4*10^5 kg of hydrogen every second. Which clearly these machines are not intended to do.
You raise some good points.
> Inertial confinement fusion like the NIF is not intended to run continuously, so the 2ns duration is irrelevant.
Indeed. I do think ICF has a future. The issue I described applies to machines that attempt to achieve sustained fusion. Pulsed fusion is ok.
> The heat loss is practically limited by the mass of hydrogen fusing in the machine.
Yes, but it goes the other way too. If the heat loss is too high, you can't sustain fusion because you can't stay at the required temperature for long enough.
> People don't understand the fundamental problem of fusion. It's a problem of energy loss. Of enormous energy losses.
I'm not sure that's even true, because if you manage to crack that, you still have the problem that your sustainable reaction is pumping out most of its energy in the form of very fast neutrons, which are (a) very hard to harvest energy from and (b) extremely bad for people and materials if you don't. You could have a self-sustaining reaction that you can't actually use!
Aneutronic fusion has been previously mentioned, specifically HB11.
https://en.m.wikipedia.org/wiki/Aneutronic_fusion
It requires much higher temperatures, and thus suffers much higher bremsstrahlung. You can sniff out BS quickly with anyone claiming a steady-state aneutronic reactor. A working aneutronic design would necessarily be pulsed. Not that it can't be done, but you'd first need to pass through D-T and D-D performance metrics and then go another order of magnitude. No one's done that yet.
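For a sense of why the higher temperatures hurt, here is a rough sketch of the usual bremsstrahlung scaling (my own illustration; the prefactor is the commonly quoted formulary value and the p-B11 parameters are crude stand-ins, so only the ratio is meant to be indicative):

```python
import math

# Commonly quoted approximation for bremsstrahlung power density (SI units):
#   P_br ~ 5.35e-37 * Zeff * n_e * n_i * sqrt(T_e[keV])   [W/m^3]
# Treat the prefactor as approximate; the relative scaling is the point here.
def bremsstrahlung_w_per_m3(zeff: float, n_e: float, n_i: float, t_e_kev: float) -> float:
    return 5.35e-37 * zeff * n_e * n_i * math.sqrt(t_e_kev)

n = 1e20  # particles per m^3, a typical magnetic-confinement density
dt   = bremsstrahlung_w_per_m3(zeff=1.0, n_e=n, n_i=n, t_e_kev=15.0)   # D-T-like conditions
pb11 = bremsstrahlung_w_per_m3(zeff=5.0, n_e=n, n_i=n, t_e_kev=300.0)  # very rough p-B11 stand-in

print(f"D-T:   {dt:.1e} W/m^3")
print(f"p-B11: {pb11:.1e} W/m^3  (~{pb11 / dt:.0f}x higher)")
```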
You're talking about the same thing.
High energy neutrons leave the system. They cause damage to the container (ie neutron embrittlement) and that's a separate problem. But the real problem is the energy loss from the system.
Charged particles can be contained. Personally I think there are limits to even that because a high-temperature plasma is turbulent [1]. Containing that is just a hugely difficult problem.
I'm not convinced that nuclear fusion will ever be commercially viable.
All while we already have emission-free, reliable, and cheap energy production in the form of solar power.
[1] https://www.psfc.mit.edu/research/topics/plasma-turbulence
Also, fusion reactors will inevitably have poor volumetric power density, due to limits on power/area through the first wall and the square-cube law.
Engineering studies of stellarators found they tend to be larger and have worse economics than tokamaks.
We need a liquid with high heat capacity, large hydrogen content (for high neutron interaction rate) and a solid 300 year engineering history in heat engine applications. Better if it is nontoxic and environmentally friendly as well.
But what if you breathe it in!!!1! Dihydrogen monoxide is no joke, many people are killed by it every year.
I'm a physics layman, and I'm having some trouble reconciling the content of your comment with the fact that existing magnetic confinement experiments have reported maintaining a plasma at the right temperature for longer times (not with fusion, but with microwave heating, and with the power of those heaters in the 10 MW range).
Have I understood the consequences of those reports wrong? Does the heat loss you talk about only occur with fusion? (And if so, is it even a problem if the conditions for fusion to occur can be created by external heating this "easily"?)
In order to protect astronauts from decompression, the hull of a spacecraft has to be insanely good at stopping gas particles. Not 99.5% good, but like 99.9999999…% with 20 nines! That's very good.
But a thin metal sheet has no trouble doing this, as demonstrated by the Apollo lunar lander.
Some things are just not as hard as they sound. Magnetic confinement works very well. It easily achieves the necessary 9’s.
It’s just hard to keep it stable at millions of degrees, but that’s a different problem.
Wait til you guys hear about DNA transcription error rates!
Are you saying the way to contain plasma is the shape of a double helix
*Slaps proton*
These things can fit so many nines of reliability.
I'm not sure, but we can try to figure out what is going on. And by the way I'm a physics layman too. I just read a lot of books about fission and a few about fusion too, it happens to be my hobby. When I'm bored, the bookmarks that I browse are [1] and [2].
So, when reports state that a certain temperature was achieved and sustained for a certain period of time, what are they actually saying? We could go and find an article and get into some details, but I imagine they say that somewhere in the plasma that temperature was reached and sustained. But it is quite likely that that region is quite microscopic, maybe a very, very thin inner torus inside a larger torus. There is a gradient of temperature from the region where the announced temperature occurs to the walls of the device. But one way or another, that thin inner region can't have a surface area anywhere close to 1 square meter. To get to 1 GW of power, you need 10^-12 square meters, and to get to 10 MW you need 10^-14 m^2. That's about the surface area of a torus of (circular) length 3 m and diameter 1 femtometer. 1 femtometer is roughly the size of a nucleus of deuterium or tritium, so in principle this is the minimum diameter of a torus where you can talk about fusion.
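Checking the numbers in the paragraph above (same rounded constants the comment uses; an illustration, not from the source):

```python
import math

# Same assumptions as above: the rounded sigma ~ 1e-7 W m^-2 K^-4
# and a plasma at 10 million K radiating as a blackbody.
SIGMA_APPROX = 1e-7
T = 1e7  # K

def area_for_power(power_w: float) -> float:
    """Radiating area A such that sigma * A * T^4 equals the given power."""
    return power_w / (SIGMA_APPROX * T**4)

print(f"1 GW  -> {area_for_power(1e9):.0e} m^2")   # 1e-12 m^2
print(f"10 MW -> {area_for_power(1e7):.0e} m^2")   # 1e-14 m^2

# Thin torus check: tube length ~3 m, tube diameter ~1 femtometer.
length_m, diameter_m = 3.0, 1e-15
print(f"thin torus area ~ {length_m * math.pi * diameter_m:.0e} m^2")  # ~9e-15, i.e. about 1e-14 m^2
```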
[1] https://www.ncnr.nist.gov/resources/n-lengths/
[2] https://www.oecd-nea.org/janisweb/
Just wanted to say thank you for this comment, fascinating and a perfect example of the beauty of HN. I'm relatively fresh off The Making of the Atomic Bomb, and while fusion was not at all a focus there, this (incomplete) impression is exactly what I came away with.
Is there any chance you'd recommend any books related to these topics? The walk through decades of revelations in physics was the most enjoyable aspect of that book, I'd love to continue building on that story.
Well, if you liked "The Making of the Atomic Bomb", I can strongly recommend the follow-up, "Dark Sun", which covers both the Soviet atomic bomb program and the development of the H-bomb by the US.
Not sure how I missed that Rhodes wrote a continuation! The clarity of writing about physics for a layperson has been wonderful, glad to find there's more. Much appreciated.
One of the points I took away from that book was that "H-bomb" weapon design is as much about fission as fusion, with most designs being fission-fusion-fission and most of the energy coming from the final fission stage.
That book is absolutely great. As the sibling comment mentions, Dark Sun is also great.
Here are some more books I read on this topic. One was written by someone who was very close to the Ulam and Teller inner circle: "Building the H Bomb" by Kenneth Ford. Another is "Sun in a Bottle: The Strange History of Fusion and the Science of Wishful Thinking" by Charles Seife. And finally, you can't go wrong with any book written by James Mahaffey.
These are exactly what I was looking for. Many thanks!
It would be entirely reasonable to wonder if fission-pumped fusion could be scaled down and pulsed.
It would be a Lovecraftian nightmare of unmentionable proportions to actually operate, but you could imagine it breaking even.