Is particle physics dead, dying, or just hard?

(quantamagazine.org)

138 points | by mellosouls 13 hours ago

220 comments

  • mattlangston 10 hours ago

    Experimental particle physicist here. It's just hard.

    I measured the electron's vector coupling to the Z boson at SLAC in the late 1990s, and the answer from that measurement is: we don't know yet - and that's the point.

    Thirty years later, the discrepancy between my experiment and LEP's hasn't been resolved.

    It might be nothing. It might be the first whisper of dark matter or a new force. And the only way to find out is to build the next machine. That's not 'dead', that's science being hard.

    My measurement is a thread that's been dangling for decades, waiting to be pulled.

    • sashank_1509 4 hours ago

      What would the cost of the “next machine” be? Is it going to be tens of billions, or can we make progress with less money? If it is going to be tens of billions, then maybe we need to invest in engineering to reduce this cost, because it’s not sustainable to spend thirty years and tens of billions for every incremental improvement.

      • ForgotIdAgain 3 minutes ago

        I think the engineering progress made while building those machines is maybe more relevant for practical technical development than the discoveries they make.

      • sigmoid10 4 hours ago

        This kind of slow, incremental improvement that costs tens of billions of dollars and takes decades gave us the microchips that ultimately enabled you to type this comment on your phone/computer. The return on that investment is obvious.

        But it is not just about making money: the entire field of radiation therapy for cancer exists and continues to improve because people figured out how to control particle beams with extreme precision, and ever more economically, in order to study particle physics. Heck, commercial MRIs exist and continue to improve because physicists want cheaper, stronger magnets so they can build more powerful colliders. What if in the future you could do advanced screening quickly and without hassle at your GP's office, instead of having to wait for an appointment (and possibly pay lots of money) at an imaging specialist center? And if they found something, they could immediately nuke it without cutting you open? We're talking about the ultimate possibility of Star Trek-level medbays here.

        Let the physicists build the damn thing however they want and future society will be better off for sure. God knows what else they will figure out along the way, but it will definitely be better for the world than sinking another trillion dollars into wars in the Middle East.

        • accidentallfact 11 minutes ago

          I'm so sick of this "good guy approach". It didn't give us progress; it gave us the likes of Watt and Intel, highly celebrated bullshitters who stopped being relevant as soon as their IP deadlock expired.

          I suppose the only solution is underground science. Make enough progress in silence, and don't disseminate the results, until the superiority becomes so obvious that armed resistance becomes unthinkable.

        • brazzy 3 hours ago

          > This kind of slow, incremental improvement that costs tens of billions of dollars and takes decades gave us the microchips that ultimately enabled you to type this comment on your phone/computer.

          No. These two cases are absurdly different, and you're even completely misunderstanding (or misrepresenting) the meaning of the "tens of billions of dollars" figure.

          Microchips were an incremental improvement where the individual increments yielded utility far greater than the investment.

          For particle physics, the problem is that the costs have exploded with the size of facilities needed to reach higher energies (the "tens of billions of dollars" is for one of them), but the results in scientific knowledge (let alone technological advances) have NOT. The early accelerators cost millions or tens of millions and revolutionized our understanding of the universe. The latest ones cost billions and have confirmed a few things we already thought to be true.

          > Let the physicists build the damn thing and future society will be better off for sure.

          Absolutely not.

          • sigmoid10 an hour ago

            >Microchips were an incremental improvement where the individual increments yielded utility far greater than the investment.

            You should look up how modern EUV lithography was commercialised. This was essentially a big plasma physics puzzle. If ASML hadn't taken on a ridiculous gamble with the research (financially on the same order of magnitude as a new collider, especially for a single company), Moore's law would have died long ago and the entire tech industry would have been affected. And there was zero proof beforehand that this was going to work.

            • varjag 31 minutes ago

              EUV lithography would absolutely have been achieved even if the LHC had never been built.

          • plastic-enjoyer 3 hours ago

            > Absolutely not

            Engineers not being able to fathom that building these huge-ass, complicated machines to answer questions about the fundamentals of nature also solves other problems and produces inventions that improve and change our lives will never not be funny to me.

            • brazzy 2 hours ago

              Just as funny as armchair science enthusiasts not being able to fathom that research budgets are limited and it makes sense to redirect them into other, more promising fields when a particular avenue of research is both extremely expensive and has shown diminishing returns for decades.

              • hugh-avherald 40 minutes ago

                Does targeting research towards 'more promising' fields actually produce greater economic returns?

              • XorNot 11 minutes ago

                The more important question is: are you content with simply dismantling any progress in accelerator science at all for the next century? Because the LHC's successors won't be online till the 2050s at least. If you don't fund them now and start the work, then no one does the work, no one studies the previous work (because there's no more grant money in it), the next generation of accelerator engineers and physicists doesn't get trained, and the knowledge and skill base withers and literally dies.

                Because the trade off of no new accelerators is the definite end of accelerator science for several generations.

          • crispyambulance an hour ago

            >> Let the physicists build the damn thing and future society will be better off for sure.

            > Absolutely not.

            And what do YOU mean, "absolutely not"? You have no more say in what happens than anyone else, unless you're a high-level politician, who would still be beholden to their constituents anyway.

            And yet big science, like particle accelerators, STILL gets funding. There's plenty to go around. Sure, every once in a while a political imperative will "pull the plug" on something deemed wasteful or too expensive and maybe sometimes that's right. But we STILL have particle physics, we STILL send out pure science space missions, there are STILL mathematicians and theorists who are paid for their whole careers to study subject matter that has no remotely practical applications.

            Not everything must have a straight-line monetary ROI.

          • bayindirh 2 hours ago

            > Absolutely not.

            I'd not be so sure about that. Doing this research will probably allow us to answer the "it works but we don't know exactly why" cases in things we use every day (e.g. li-ion batteries). Plus, while the machines are getting bigger, the understood tech is getting smaller, as far as the laws of physics allow.

            If we are going to insist on the "Absolutely not" path, we should start with proof-of-work crypto farms and AI datacenters, which consume county or state equivalents of electricity and water resources for low-quality slop.

            • brazzy 2 hours ago

              That "probably" is really more of a "maybe" given the experience with the current big accelerators, and really needs to be weighed against the extreme costs - and other, more promising avenues of research.

              > If we are going to insist on the "Absolutely not" path, we should start with proof-of-work crypto farms and AI datacenters, which consume county or state equivalents of electricity and water resources for low-quality slop.

              Who exactly is the "we" that is able to make this decision? The allocation of research budgets is completely unrelated to the funding of AI datacenters or crypto farms. There is no organization on this planet that controls both.

              And if you're gonna propose that the whole of human efforts should somehow be organized differently so that these things can be prioritized against each other properly, then I'm afraid that is a much, MUCH harder problem than any fundamental physics.

          • Iulioh 3 hours ago

            I'm torn between "yes, these experiments are way too expensive and the knowledge is too niche to be really useful" and "we said this about A LOT of things and found utility in surprising ways, so it could be a gamble worth taking".

            That's the problem with cutting-edge research... you don't even know if you will ever need it, or if a trillion-dollar industry is waiting on just one number to be born.

            • brazzy 2 hours ago

              Yes, we don't really know. But at some point the gamble is just too big.

              Because the costs aren't just numbers. They represent hundreds or thousands of person-years of effort. You're proposing that a large number of people should spend their entire lives supporting this (either directly as scientists, or indirectly through funding it) - and maybe end up with nothing to show for it.

              And there are the opportunity costs. You could fund hundreds of smaller, yet still substantial, scientific efforts in many different fields for the cost of just one particle accelerator of the size we think is sufficient to yield some new observations.

          • bl0rg 2 hours ago

            > The latest ones cost billions and have confirmed a few things we already thought to be true.

            Yes, but we had hopes that it would lead to more. Had it led to more (something only knowable in hindsight), who knows where that would have ended us up? What if it had upended the standard model instead of reinforcing it?

            > Absolutely not.

            What are we supposed to do then? As humans, I mean. No one knows why we're here, what the universe really is like. We have some pretty good models that we know are wrong and we don't know what wonders the theoretical implications of any successor models might bring. That said, do we really need to motivate fundamental research into the nature of reality with a promise of technology?

            I'm not arguing for mindlessly building bigger accelerators, and I don't think anyone is - there has to exist a solid line of reasoning to warrant the effort. And we might find that there are smarter ways of getting there for less effort - great! But if there isn't, discrediting the avenue of particle accelerators due to its high upfront cost as well as historical results would be a mistake. We can afford it, and we don't know the future.

            • brazzy 2 hours ago

              > Yes, but we had hopes that it would lead to more. Had it led to more (something only knowable in hindsight), who knows where that would have ended us up? What if it had upended the standard model instead of reinforcing it?

              Sure, but it didn't. Which is knowledge that really should factor into the decision to build the next, bigger one.

              > What are we supposed to do then? As humans, I mean.

              Invest the money and effort elsewhere, for now. There are many other fields of scientific exploration that are very likely to yield greater returns (in knowledge and utility) for less. You could fund a hundred smaller but still substantial initiatives instead of one big accelerator, and be virtually guaranteed an exciting breakthrough in a few of them.

              And who knows, maybe a breakthrough in material science or high-voltage electrophysics will substantially reduce the costs for a bigger particle accelerator?

              • bl0rg 2 hours ago

                > Which is knowledge that really should factor into the decision to build the next, bigger one.

                It was always factored in, and of course it would be in any next iteration.

                > Invest the money and effort elsewhere, for now. There are many other fields of scientific exploration that are very likely to yield greater returns (in knowledge and utility) for less. You could fund a hundred smaller but still substantial initiatives instead of one big accelerator, and be virtually guaranteed an exciting breakthrough in a few of them.

                I agree with this to a large extent. I'm just not against particle accelerators as an avenue for scientific advancement, and in the best of worlds we could do both.

            • tokai 2 hours ago

              >I'm not arguing for mindlessly building bigger accelerators, and I don't think anyone is

              But you are, and they are. Just from the comments here it's clear that even suggesting not to spend untold billions on maybe pushing theoretical physics a little forward is met with scorn. The value proposition, either in knowledge or in technology, is just not well argued anymore beyond hand-waving.

              • bl0rg 2 hours ago

                No, I'm not, and neither is anyone else. It's common sense that we should explore options that require less effort, just as one would in any project. I'm saying that we can't discredit huge particle accelerators due to, in the grandest scheme of things, a small economic cost and the past results of a different experiment.

      • raverbashing 4 hours ago

        The next machine is not necessarily a longer LHC

        There are talks of a muon collider; there's also a spallation source being built in Sweden(?), and an electron 'Higgs factory' (and while the LHC was built to find the Higgs boson, it is not a great source of it - it was built as a generic tool that could produce and see the Higgs).

    • KolibriFly 27 minutes ago

      So, if the answer were obvious or quick, it wouldn't be worth building machines that take decades to design

    • hippich 41 minutes ago

      Is it hard as in:

      1) we know what to do, but it is expensive

      2) we don't know exactly what to do, but more people involved can increase the search speed, so we just need more people

      3) it is a purely sequential problem, and therefore it takes a lot of time

      • samus 28 minutes ago

        A combination to some degree. Scientists yearn to stumble upon something hitherto unexplainable that requires a new theory, or that validates or definitively rules out some of the more fringe theories.

        While other natural sciences often suffer from an abundance of things that "merely" need to be documented, or where simulation capability is the limit, particle physics is mostly based on a theoretical framework from the middle of the 20th century that has by now largely been explored.

        Getting ahead in particle physics consists of measuring many arcane numbers to the highest possible precision until something no longer lines up with existing theories or other measurements. More people could help with brainstorming and with measuring things that don't require humongous particle accelerators.

    • kakacik 4 hours ago

      It's a clickbait article title (from an otherwise good publication); of course it's not dead... we are now building an understanding of all the things we don't know yet: discrepancies like yours, a unified theory, and so on.

      Everybody knows we are not there yet, and nobody knows what the final knowledge set will look like, or whether it's even possible to cover it all (i.e. are quarks the base layer, or can we go deeper, much deeper, all the way down to Planck scales? the dynamics of singularities, etc.).

  • bananaflag 12 hours ago

    It's basically the opposite situation from 150 years ago.

    Back then, we thought our theory was more or less complete while having experimental data which disproved it (Michelson-Morley experiment, Mercury perihelion, I am sure there are others).

    Right now, we know our theories are incomplete (since GR and QFT are incompatible) while having no experimental data which contradicts them.

    • Sniffnoy 6 hours ago

      I wouldn't say that we have no experimental data which contradicts them. Rather, we do have experimental data which contradicts them, but no experimental data that points us in the direction of a solution (and whenever we go looking for the latter, we fail).

      Consider e.g. neutrino masses. We have plenty of experimental data indicating that neutrinos oscillate and therefore have mass. This poses a problem for the standard model (because there are problems unless the mass comes from the Higgs mechanism, but in the standard model neutrinos can't participate in the Higgs mechanism due to always being left-handed). But whenever we do experiments to attempt to verify one of the ways of fixing this problem -- are there separate right-handed neutrinos we didn't know about, or maybe instead the right-handed neutrinos were just antineutrinos all along? -- we turn up nothing.
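
      To make the oscillation evidence concrete, here is a minimal sketch of the standard two-flavor survival probability (illustrative Python; the parameters are roughly the measured atmospheric values, and the specific numbers are just for demonstration):

        import math

        def survival_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
            # Standard two-flavor approximation:
            # P(nu_a -> nu_a) = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E)
            # with dm^2 in eV^2, L in km, and E in GeV.
            return 1 - sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

        # Roughly the measured atmospheric parameters: dm^2 ~ 2.5e-3 eV^2, near-maximal mixing
        for L in (10, 100, 500, 1000):
            print(L, survival_probability(L, E_GeV=1.0, sin2_2theta=1.0, dm2_eV2=2.5e-3))

      The point is that a massless neutrino has no dm^2 at all, so any L/E-dependent disappearance of this form is direct evidence of mass.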

      • T-A 5 hours ago

        > the standard model neutrinos can't participate in the Higgs mechanism due to always being left-handed

        This again? It's only true if you insist on sticking with the original form of Weinberg's "model of leptons" from 1967 [1], which was written when massless neutrinos were consistent with available experimental data. Adding quark-style (i.e. Dirac) neutrino mass terms to the Standard Model is a trivial exercise. If doing so offends some prejudice of yours that right-handed neutrinos cannot exist because they have no electric or weak charge (in which case you must really hate photons too, not to mention gravity), you can resort to a Majorana mass term [2] instead.

        That question (are neutrinos Dirac or Majorana?) is not a "contradiction", it's an uncertainty caused by how difficult it is to experimentally rule out either option. It is most certainly not "a problem for the standard model".

        [1] https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.19.1264

        [2] https://en.wikipedia.org/wiki/Majorana_equation#Mass_term

        • TheOtherHobbes an hour ago

          It's trivial to add a matrix to account for neutrino masses, but that doesn't explain their origin.

          That is not a trivial problem at all. It certainly has not been solved, and it's possible experiments will say "Both the current ideas are wrong."

          • T-A an hour ago

            > It's trivial to add a matrix to account for neutrino masses

            The matrix you are thinking of is presumably the PMNS matrix [1]. It's equivalent to the CKM matrix for quarks [2]. The purpose of both is to parametrize the mismatch between flavor [3] and mass eigenstates, not "to account for neutrino masses" or "explain their origin".

            As far as the standard model is concerned, neutrino masses and quark masses all originate from Yukawa couplings [4] with the Higgs field. Adding such terms to Weinberg's original model of leptons is very much a trivial exercise, and was done already well before there was solid evidence for non-zero neutrino masses.
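
            Schematically (a sketch in standard textbook notation, not specific to any one paper), the Dirac option is just one more Yukawa term, which after electroweak symmetry breaking gives

              \mathcal{L} \supset -y_\nu \, \bar{L} \tilde{H} \nu_R + \mathrm{h.c.} \quad\Rightarrow\quad m_\nu = \frac{y_\nu v}{\sqrt{2}}, \qquad v \approx 246\ \mathrm{GeV}

            exactly as for the quarks and charged leptons; the only oddity is how tiny the coupling y_\nu has to be.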

            > it's possible experiments will say "Both the current ideas are wrong."

            Assuming that by "Both current ideas" you mean Dirac vs Majorana mass, those are the only available relativistic invariants. For both to be wrong, special relativity would have to be wrong. Hopefully I don't need to explain how extraordinarily unlikely that is.

            [1] https://en.wikipedia.org/wiki/Pontecorvo%E2%80%93Maki%E2%80%...

            [2] https://en.wikipedia.org/wiki/Cabibbo%E2%80%93Kobayashi%E2%8...

            [3] https://en.wikipedia.org/wiki/Flavour_(particle_physics)

            [4] https://en.wikipedia.org/wiki/Yukawa_coupling

    • atakan_gurkan 6 hours ago

      I disagree, but maybe only because we are using different definitions. For example, we have neutrino oscillations; these require neutrino mass, which is not part of the standard model of particle physics. In cosmology, there is the "lithium problem" (amongst others), which cannot be explained by Lambda-CDM. We know our physical theories are incomplete not only because our mathematical frameworks (GR & QFT) are incompatible (similar to the incompatibility of Maxwell's equations with the Galilean transformations that form the basis of Newtonian mechanics), but also because there are these unexplained phenomena, much like blackbody radiation at the turn of the previous century.

    • KolibriFly 23 minutes ago

      This era might be one where we have to earn the next clue much more slowly

    • Paracompact 11 hours ago

      What about underexplained cosmological epicycles like dark matter (in explaining long-standing divergences of gravitational theory from observation), or the Hubble tension?

      • joe_the_user 6 hours ago

        The dark matter theory, broadly, is that there is some amount of invisible matter that obeys the laws of Einsteinian gravity but isn't otherwise visible. By itself, it has considerable experimental evidence. It doesn't resemble Ptolemaic theories of planetary motion, notably in that it doesn't and hasn't required regular updating as new data arrives.

        It really fits well with the OP's comments. Nothing really contradicts the theory, but there's no deeper theory beneath it. Another comment mentioned the "nightmare" of dark matter having only gravitational interactions with other matter. That would be very unsatisfying for physicists, but it wouldn't be something that could really disprove any given theory.

        • geysersam 4 hours ago

          When you say dark matter theory doesn't require updates when new data arrives, it sounds like you don't count the parameters that describe the dark matter distribution to be part of the theory.

      • XorNot 11 hours ago

        This is your regular reminder that epicycles were not an incorrect theory addition until an alternative hypothesis could explain the same behavior without requiring them.

        • Paracompact 11 hours ago

          Sure, but in that regard dark matter is even more unsatisfying than (contemporary) epicycles, because not only does it add extra complexity, it doesn't even characterize the source of that complexity beyond its gravitational effects.

          • dataflow 7 hours ago

            FYI, very recently (as in this has been in the news the past few days, and the article is from December) an article was published that suggested we might already have experimental evidence for dark matter being primordial black holes, though there are reasons to doubt it as well. I just posted the article: https://news.ycombinator.com/item?id=46955545

            But this might be easier to read: https://www.space.com/astronomy/black-holes/did-astronomers-...

          • cozzyd 11 hours ago

            Even better, there are the "nightmare" scenarios where dark matter can only interact gravitationally with Standard Model particles.

            • Paracompact 11 hours ago

              Personally—and this is where I expect to lose the materialists that I imagine predominate HN—I think we are already in a nightmare scenario with regard to another area: the science of consciousness.

              The following seem likely to me: (1) Consciousness exists, and is not an illusion that doesn't need explaining (a la Daniel Dennett), nor does it drop out of some magical part of physical theory we've somehow overlooked until now; (2) Mind-matter interactions do not exist, that is, purely physical phenomena can be perfectly explained by appeals to purely physical theories.

              Such are the stakes of "naturalistic dualist" thinkers like David Chalmers. But if this is the case, it implies that the physics of matter and the physics of consciousness are orthogonal to each other. Much like it would be a nightmare to stipulate that dark matter is a purely gravitational interaction and that's that, it would be a nightmare to stipulate that consciousness and qualia arise noninteractionally from certain physical processes just because. And if there is at least one materially noninteracting orthogonal component to our universe, what if there are more that we can't even perceive?

              • jemmyw 5 hours ago

                I don't think any of this is particularly nightmarish. Just because we don't yet know how this complex system arises from another lower level one doesn't make it new physics. There's no evidence of it being new or orthogonal physics.

                Imagine trying to figure out what is happening on someone's computer screen with only physical access to their hardware minus the screen, and an MRI scanner. And that's a system we built! We've come exceedingly far with brains and minds considering the tools we have to peer inside.

                • Paracompact 2 hours ago

                  Knowing how to build a brain is different from knowing whether that brain has consciousness in the sense that you or I do. The question of consciousness appears to demand new/orthogonal physics because according to our existing physics, there's no sense in which you or I should "feel" any differently than a rock does, or a computer does, or Searle's room does, or a Chinese brain does, or the universe as a whole does, etc.

                  • squeefers 2 hours ago

                    > The question of consciousness appears to demand new/orthogonal physics because according to our existing physics, there's no sense in which you or I should "feel" any differently than a rock does,

                    deepak chopra may interest you

              • TheOtherHobbes an hour ago

                Yes, but it doesn't even need mysticism or duality.

                There's a more straightforward problem, which is that all of science is limited by our ability to generate and test mental models, and there's been no research into the accuracy and reliability of our modelling processes.

                Everything gets filtered through human consciousness - math, experiment, all of it. And our definition of "objective" is literally just "we cross-check with other educated humans and the most reliable and consistent experience wins, for now."

                How likely is it that human consciousness is the most perfect of all possible lenses, doesn't introduce distortions, and has no limits, questionable habits, or blind spots?

              • eucyclos 5 hours ago

                I think the old theory of the planes of existence has a lot of utility here - if you substitute "the dimensionality at which you're analyzing your dataset" for the hermetic concept of "planes of existence" you get essentially the same thing, at least in lower dimensions like one (matter) or two (energy). Mind, specifically a human mind, would be four-dimensional under the old system, which feels about right. No idea how you'd set up an experiment to test that theory, though. It may be completely impossible, because experiments only work when they work in all contexts, and only matter is ever the same regardless of context.

              • galaxyLogic 6 hours ago

                I don't think there is any mystery to what we call "consciousness". Our senses and brain have evolved so we can "sense" the external world, so we can live in it and react to it. So why couldn't we also sense what is happening inside our brains?

                Our brain needs to sense our "inner talk" so we can let it guide our decision-making and actions. If we couldn't remember sentences, we couldn't remember "facts" and would be much worse for that. And talking with our "inner voice" and hearing it, isn't that what most people would call consciousness?

                • jacquesm 6 hours ago

                  This is not nearly as profound as you make it out to be: a computer program also doesn't sense the hardware that it runs on; from its point of view the hardware is invisible until it is made explicit, as peripherals.

                  • dgfl 4 hours ago

                    You also don’t consciously use your senses until you actively think about them. Same as “you are now aware of your breathing”. Sudden changes in a sensation may trigger them to be conscious without “you” taking action, but that’s not so different. You’re still directing your attention to something that’s always been there.

                    I agree with the poster (and Daniel Dennett and others) that there isn’t anything that needs explaining. It’s just a question-framing problem, much like the measurement problem in quantum mechanics.

                • tehjoker 6 hours ago

                  another one that thinks they solved the hard problem of consciousness by addressing the easy problem. how on earth does a feedback system cause matter to "wake up"? we are making lots of progress on the easy problem though

                  • dgfl 4 hours ago

                    This is not as strong a rebuttal as you think it is. To me (and, I imagine, the parent poster) there is no extra logical step needed. The problem IS solved in this sense.

                    If it’s completely impossible to even imagine what the answer to a question is, as is the case here, it’s probably the wrong question to pose. Is there any answer you’d be satisfied by?

                    To me the hard problem is more or less akin to looking for the true boundaries of a cloud: a seemingly valid quest, but one that can’t really be answered in a satisfactory sense, because it’s not the right one to pose to make sense of clouds.

                    • Paracompact 2 hours ago

                      > If it’s completely impossible to even imagine what the answer to a question is, as is the case here, it’s probably the wrong question to pose. Is there any answer you’d be satisfied by?

                      I would be very satisfied to have an answer, or even just convincing heuristic arguments, for the following:

                      (1) What systems experience consciousness? For example, is a computer as conscious as a rock, as conscious as a human, or somewhere in between?

                      (2) What are the fundamental symmetries and invariants of consciousness? Does it impact consciousness whether a system is flipped in spacetime, skewed in spacetime, isomorphically recast in different physical media, etc.?

                      (3) What aspects of a system's organization give rise to different qualia? What does the possible parameter space (or set of possible dynamical traces, or what have you) of qualia look like?

                      (4) Is a consciousness a distinct entity, like some phase transition with a sharp boundary, or is there no fundamentally rigorous sense in which we can distinguish each and every consciousness in the universe?

                      (5) What explains the nature of phenomena like blindsight or split-brain patients, where seemingly high-level recognition, coordination, and/or intent occurs in the absence of any conscious awareness? Generally, what behavior-affecting processes in our brains do and do not affect our conscious experience?

                      And so on. I imagine you'll take issue with all of these questions, perhaps saying that "consciousness" isn't well defined, or that an "explanation" can only refer to functional descriptions of physical matter, but I figured I would at least answer your question honestly.

              • Dylan16807 5 hours ago

                How can consciousness have information about the material world if it doesn't interact with it in any way?

                And when your fingers type that you experience qualia, are they bullshitting because your fingers have never actually received any signals from your consciousness in any direct or indirect way?

              • geysersam 4 hours ago

                That would certainly be a difficult scenario. But it doesn't seem very likely. For example, consciousness and material systems seem to interact. Putting drugs in your blood changes your conscious experience etc.

              • im3w1l 7 hours ago

                I've thought about this possibility but come to reject it. If mind-matter interactions did not exist, then matter could not detect the presence of mind. And if the brain cannot detect the mind then we wouldn't be able to talk or write about the mind.

                • meindnoch 7 hours ago

                  Or, the mind is in spectator mode?

        • suddenlybananas 4 hours ago

          Scientific theories are not curve-fitting.

    • klipt 11 hours ago

      Doesn't that imply our theories are "good enough" for all practical purposes? If they're impossible to empirically disprove?

      • adrian_b 2 hours ago

        The existing theories are extremely far from being good enough for practical purposes.

        There exists a huge number of fundamental quantities that should be calculated from the parameters of the "standard model", but we cannot compute them, we can only measure them experimentally.

        For instance, the masses and magnetic moments of the proton, of the neutron and of all other hadrons, the masses and magnetic moments of the nuclei, the energy spectra of nuclei, of atoms, of ions, of molecules, and so on.

        The "standard model" can compute only things of negligible practical importance, like the statistical properties of the particle collisions that are performed at LHC.

        It cannot compute anything of value for practical engineering. All semiconductor devices, lasers, and other devices where quantum physics matters are not designed using any consistent theory of quantum physics; they are designed using models based on a great number of empirical parameters determined by measurement, for which quantum physics is only an inspiration for how the model should look and not a base from which the model can be derived rigorously.

        • jhrmnn an hour ago

          This depends very much on what "practical purposes" are. For almost all conceivable technology, relativistic quantum mechanics for electrons and light, i.e. QED, is a sufficient fundamental theory. This is unlike before quantum mechanics, when we basically didn't have fundamental laws for chemistry and solid-state physics.

      • hackingonempty 11 hours ago

        Yes, for all practical purposes. This is the position of physicist Sean Carroll and probably others. We may not know what is happening in the middle of a black hole, or very close to the big bang, but here on Earth we do.

        "in the specific regime covering the particles and forces that make up human beings and their environments, we have good reason to think that all of the ingredients and their dynamics are understood to extremely high precision"[0]

        0: https://philpapers.org/archive/CARCAT-33

        • throwaway81523 10 hours ago

          ER=EPR says something completely shocking about the nature of the universe. If there is anything to it, we have almost no clue about how it works or what its consequences are.

          Sean Carroll's own favorite topics (emergent gravity, and the many worlds interpretation) are also things that we don't have any clue about.

          Yes there is stuff we can calculate to very high precision. Being able to calculate it, and understanding it, are not necessarily the same thing.

      • Legend2440 11 hours ago

        Typically whenever you look closely at an object with complex behavior, there is a system inside made of smaller, simpler objects interacting to produce the complexity.

        You'd expect that at the bottom, the smallest objects would be extremely simple and would follow some single physical law.

        But the smallest objects we know of still have pretty complex behavior! So there's probably another layer underneath that we don't know about yet, maybe more than one.

        • jhanschoo 9 hours ago

          I agree, and I think that your claim is compatible with the comment that you are responding to. Indeed, perhaps it's turtles all the way down and there is systematic complexity upon systematic complexity governing our universe that humanity has been just too limited to experience.

          For a historical analogy, classical physics was and is sufficient for most practical purposes, and we didn't need relativity or quantum mechanics until we had instruments that could manipulate them, or that at least experienced them. And while I guess there were still macroscopic quantum phenomena, perhaps they could have just been treated as empirical material properties, without a systematic universal theory accounting for them, back when instruments were not precise enough to explore and exploit the predictions of such a theory.

          • adrianN 4 hours ago

            The experiments that led to the invention of quantum theory are relatively simple and involve objects you can touch with your bare hands without damaging them. Some are done in high school, e.g. the photoelectric effect.

            • jhanschoo 2 hours ago

              While I did hedge my point regarding macroscopic quantum phenomena, I think the quantum nature of the photoelectric effect would have been harder to discern without modern access to pure-wavelength lighting. But you could still rely on precise optics to separate mixed light, I suppose. Without even optics, it would be harder still.

        • epsilonsalts 7 hours ago

          Yeah that's the outcome theorized by Gödel.

          Incompleteness is inherent to our understanding as the universe is too vast and endless for us to ever capture a holistic model of all the variables.

          Gödel says something specific about human axiomatic systems, akin to a special relativity, but it generalizes to physical reality too. A written system is made physical by writing it out, and is never complete. That demonstrates that our grasp of physical systems themselves is always incomplete.

          • drdeca 7 hours ago

            Gödel’s incompleteness says almost nothing about this. I wish people wouldn’t try to apply it in ways that it very clearly is not applicable to.

            An environment living in Conway’s Game of Life could be quite capable of hypothesizing that it is implemented in Conway’s Game of Life.

            • longfacehorrace 2 hours ago

              That's not what they were saying.

              Systems can hypothesize about themselves, but they cannot determine why the rules they can learn exist in the first place. Prior states are no longer observable, so there is always an incomplete history.

              Conway's Game of Life can't explain its own origins, just itself, because the origins are no longer observable after they occur.

              What are the origins of our universe? We can only guess without the specificity of direct observation. Understanding is incomplete with only simulation and theory.

              So the comment is right. We would expect to be able to define what is now but not completely know what came before.

            • bananaflag 3 hours ago

              Indeed, as I think I commented before here, this kind of self-reference is exactly what makes Gödel's proof work.

            • mastermage 5 hours ago

              Now the question is: are we in Conway's Game of Life?

      • andreareina 6 hours ago

        The fundamental theories are good enough in that we can't find a counterexample, but they're only useful up to a certain scale before the computational power needed becomes infeasible. We're still hoping to find higher-level emergent theories to describe larger systems. By analogy, in principle you could use Newton's laws of motion (1687) to predict what a gas in a room is going to do, or how fluid will flow in a pipe, but in practice it's intractable and we prefer the higher-level language of fluid mechanics: the ideal gas law, the Navier-Stokes equations, etc.
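
        As a back-of-envelope sketch of why (round numbers, purely illustrative): the ideal gas law answers a room-scale question in one line, whereas the microscopic description would mean integrating Newton's equations for ~10^27 molecules:

          k_B = 1.380649e-23   # Boltzmann constant, J/K
          P = 101325.0         # 1 atm, Pa
          T = 293.15           # 20 C, in K
          V = 5 * 4 * 3        # a 5m x 4m x 3m room, m^3

          N = P * V / (k_B * T)          # molecule count from PV = N*k_B*T
          print(f"{N:.2e} molecules")    # ~1.5e27 -- hopeless to simulate one by one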

      • recursivecaveat 11 hours ago

        Maybe? We seem to be able to characterize all the stuff we have access to. That doesn't mean we couldn't, say, produce new and interesting materials with new knowledge. Before we knew about nuclear fission, we couldn't have predicted that anything would happen with a big chunk of uranium, or the useful applications of that. New physics might be quite subtle or specific, but still useful.

      • sixo 10 hours ago

        The point is not to make better predictions of the things we already know how to predict. The point is to determine what abstractions link the things we don't presently understand -- because these abstractions tend to open many new doors in other directions. This has been the story of physics over and over: relativity, quantum theory, etc. not only answered the questions they were designed to answer but opened thousands of new doors in other directions.

      • doctoboggan 11 hours ago

        The theories don't answer all the questions we can ask, namely questions about how gravity behaves at the quantum scale. (These questions pop up when exploring extremely dense regions of space - the very early universe and black holes).

      • colechristensen 7 hours ago

        Classical physics was indeed "good enough for all practical purposes" at the time as well... but those purposes didn't include electronics, nuclear power, most of our basic understanding of materials, chemistry, and a tremendous number of other things.

        The point being it's not at all clear what we might be missing without these impractical little mysteries that so far are very distant from every day life.

      • PlatoIsADisease 11 hours ago

        If I had to make a guess, we are at a pre-Copernicus level in particle physics.

        We are finding local maxima (induction) but the establishment cannot handle deduction.

        Everything is an overly complex bandaid. At some point someone will find something elegant that predicts 70% as well, and at some point we will realize: 'Oh that's great, the sun is actually at the center of the solar system; Copernicus was just slightly wrong in thinking planets make circular rotations. We just needed to use ellipses!'

        But with particles.

      • light_triad 11 hours ago

        There are still huge gaps in our understanding: quantum gravity, dark matter, what happened before the Planck time, the thermodynamics of life, and many others.

        Part of the problem is that building bigger colliders, telescopes, and gravitational wave detectors requires huge resources and very powerful computers to store and crunch all the data.

        We're cutting research instead of funding it right now and sending our brightest researchers to Europe and China...

      • idiotsecant 11 hours ago

        Absolutely not. Newtonian physics was 'good enough' until we disproved it. Imagine where we would be if all we had was Newtonian physics.

        • nancyminusone 11 hours ago

          You would still make it to the moon (so I've heard). Maybe you wouldn't have GPS systems?

        • mikkupikku 11 hours ago

          Newtonian physics is good enough for almost everything that humans do. It's not good for predicting the shit we see in telescopes, and apparently it's not good for GPS, although honestly I think without general relativity, GPS would still get made but there'd be a fudge factor that people just shrug about.
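
          The fudge factor is actually easy to estimate (a back-of-envelope sketch; the constants are standard, the orbital radius is approximate):

            GM  = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
            c   = 2.99792458e8     # speed of light, m/s
            R_e = 6.371e6          # ground clock at Earth's mean radius, m
            r_s = 2.6571e7         # GPS orbital radius (~20,200 km altitude), m

            v = (GM / r_s) ** 0.5                # circular orbital speed, ~3.9 km/s
            gr = GM * (1/R_e - 1/r_s) / c**2     # gravitational: satellite clock runs fast
            sr = -v**2 / (2 * c**2)              # velocity time dilation: runs slow

            drift = (gr + sr) * 86400            # seconds per day
            print(drift * 1e6, "us/day")         # ~ +38 microseconds/day
            print(drift * c / 1000, "km/day")    # ~ 11 km/day of ranging error if ignored

          So the "fudge factor" would be a clock drift of roughly 38 microseconds per day, which maps to kilometers of accumulated position error per day if nobody corrects for it.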

          For just about anything else, Newton has us covered.

          • z3phyr 5 hours ago

            Microchips? A lot of quantum physics is applied there, off the top of my head.

            • mikkupikku an hour ago

              Quantum mechanics is relevant to humanity because we build things which are very small. General relativity is not, because we're more or less incapable of actually doing things on a scale where it matters.

          • idiotsecant 8 hours ago

            Oh sure, nothing major. Just transistors, lasers, MRI, GPS, nuclear power, photovoltaics, LEDs, X-rays, and pretty much anything requiring Maxwell's equations.

            Nothing major.

          • cozzyd 9 hours ago

            quantum mechanics (also very much not Newtonian) is much more important to our day-to-day lives.

            • momoschili 7 hours ago

              this kind of distinction is quite stupid in general, as plenty of things that we rely on for day-to-day activities, such as our houses, desks, chairs, beds, shoes, clothes, etc., are all based on Newtonian/classical mechanics. Basically everything we use that existed pre-transistor strictly speaking required only classical physics.

            • refulgentis 7 hours ago

              Is it?

              • nerdsniper 7 hours ago

                Flash memory (quantum tunneling), lasers (stimulated emission), transistors (band theory), MRI machines (nuclear spin), GPS (atomic transitions), LEDs (band gaps), digital cameras (photoelectric effect)... the list does, in fact, go on, and on, and on.

                • narcraft 5 hours ago

                  Did you intentionally list things that are clearly not essential to day-to-day life?

                  • refulgentis 4 hours ago

                    I'd argue flash memory and transistors certainly are.

      • csomar 11 hours ago

        I think the problem is that GR and QFT are at odds with each other? (I am not quite versed in the subject and this is my high-level understanding of the “problem”)

    • throw_m239339 11 hours ago

      I find the idea that reality might be quantized fascinating, so that all information that exists could be stored in a storage medium big enough.

      It's also kind of interesting how causality allegedly has a speed limit and it's rather slow all things considered.

      Anyway, in 150 years we have absolutely come a long way. We'll figure it out eventually, but as always, figuring it out might lead to even bigger questions and mysteries...

      • tsimionescu 4 hours ago

        Note that "reality" is not quantized in any existing theory. Even in QM/QFT, only certain properties are quantized, such as mass or charge. Others, like position or time, are very much not quantized - the distance between two objects can very well be 2.5pi planck lengths. And not only are they not quantized, the math of these theories does not work if you try to discretize space or time or other properties.

      • csomar 11 hours ago

        If reality is quantized, how can you store all the information out there without creating a real simulation? (Essentially cloning the environment you want stored)

  • pjmlp 4 hours ago

    As a CERN alumnus: this isn't easy. The data is endless, processing it takes time, usually everything involved is new technology, and it all needs to be validated before being put into use.

    Thousands of people, across all engineering branches, worked for a few decades on bringing the LHC up before the Higgs came to be.

    This stuff is hard, and there is no roadmap on how to get there.

  • tasty_freeze 12 hours ago

    Here is one fact that seems, to me, pretty convincing that there is another layer underneath what we know.

    The charge of the electron is -1 and of the proton +1. They have been experimentally measured, out to 12 digits or so, to be the same magnitude, just opposite in sign. However, there are no theories for why this is -- they are simply measured and that is it.

    It beggars belief that these just happen to be exactly (as far as we can measure) the same magnitude. There almost certainly is a lower level mechanism which explains why they are exactly the same but opposite.

    • Paracompact 12 hours ago

      Technically, the charge of a proton can be derived from its constituent two up quarks and one down quark, which have charges 2/3 and -1/3 respectively (2 × 2/3 - 1/3 = +1). I'm not aware of any deeper reason why these should be simple fractional ratios of the charge of the electron; however, I'm not sure there needs to be one. If you believe the stack of turtles ends somewhere, you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?

      • auntienomen 11 hours ago

        There does appear to be a deeper reason, but it's really not well understood.

        Consistent quantum field theories involving chiral fermions (such as the Standard Model) are relatively rare: the charges have to satisfy a set of polynomial relationships with the inspiring name "gauge anomaly cancellation conditions". If these conditions aren't satisfied, the mathematical model will fail pretty spectacularly: it won't be unitary, it can't couple consistently to gravity, it won't allow high- and low-energy behavior to decouple, ...

        For the Standard Model, the anomaly cancellation conditions imply that the sum of electric charges within a generation must vanish, which it does:

        3 colors of quark × (up charge 2/3 + down charge -1/3) + electron charge -1 + neutrino charge 0 = 0.
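
        Or, as a quick exact-arithmetic check of the same sum (a trivial Python sketch):

          from fractions import Fraction

          up, down = Fraction(2, 3), Fraction(-1, 3)
          electron, neutrino = Fraction(-1), Fraction(0)

          # quarks come in 3 colors, so they enter the sum three times
          print(3 * (up + down) + electron + neutrino)   # 0 -- exact cancellation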

        So, there's something quite special about the charge assignments in the Standard Model. They're nowhere near as arbitrary as they could be a priori.

        Historically, this has been taken as a hint that the standard model should come from a simpler "grand unified" model. Particle accelerators and cosmology have turned up at best circumstantial evidence for these so far. To me, it's one of the great mysteries.

      • tasty_freeze 11 hours ago

        I'm aware of the charge coming from quarks, but my point remains.

        > you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?

        When the probability of coincidence is epsilon, then no. Right now they are the same to 12 digits, but that undersells it, because those are just the trailing digits. There is nothing that says the leading digits must be the same; e.g. one could be 10^30 times bigger than the other. Are you still going to just shrug and say "coincidence"?

        That there are 26 fundamental constants and these two just happen to be exactly the same is untenable.

        • jacquesm 6 hours ago

          I think I agree with you. It could be just a matter of static bias or some other fairly simple mechanism to explain why these numbers are the same.

          Imagine an object made of only red marbles as the 'base state'. Now you somehow manage to remove one red marble: you're at -1. You add a red marble and you're at +1. It doesn't require any other marbles. Then you go and measure the charge of a marble and you end up at some 12-digit number. The one state will show that 12-digit number negative, the other positive.

          Assigning charge as being the property of a proton or an electron rather than one of their equivalent constituent components is probably a mistake.

        • Paracompact 10 hours ago

          If you imagine the universe is made of random real fundamental constants rather than random integer fundamental constants, then indeed there's no reason to expect such collisions. But if our universe starts from discrete foundations, then there may be no more satisfying explanation to this than there is to the question of, say, why the survival threshold and the reproduction threshold in Conway's Game of Life both involve the number 3. That's just how that universe is defined.
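
          For reference, the whole rule set fits in a few lines (a minimal sketch; note the 3 appearing in both the birth and survival conditions):

            from collections import Counter

            def step(live):
                # Count the live neighbors of every cell adjacent to a live cell
                counts = Counter((x + dx, y + dy)
                                 for (x, y) in live
                                 for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                                 if (dx, dy) != (0, 0))
                # B3/S23: born with exactly 3 neighbors, survives with 2 or 3
                return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

            glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
            for _ in range(4):
                glider = step(glider)
            print(sorted(glider))   # the same glider, shifted one cell diagonally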

          • tasty_freeze 10 hours ago

            Why do you assume the two have to be small integers? There is nothing currently in physics which would disallow the electron to be -1 and the proton to be +1234567891011213141516171819. The fact they are both of magnitude 1 is a huge coincidence.

            • Paracompact 10 hours ago

              I'm not assuming they have to be small integers—I'm saying that if the universe is built on discrete rather than continuous foundations, then small integers and coincidences at the bottom-turtle theory-of-everything become much less surprising. You're treating the space of possible charge values as if it's the reals, or at least some enormous range, but I consider that unlikely.

              Consider: in every known case where we have found a deeper layer of explanation for a "coincidence" in physics, the explanation involved some symmetry or conservation law that constrained the values to a small discrete set. The quark model took seemingly arbitrary coincidences and revealed them as consequences of a restrictive structure. auntienomen's point about anomaly cancellation is also exactly this kind of thing. The smallness of the set in question isn't forced, but it is plausible.

              But I actually think we're agreeing more than you realize. You're saying "this can't be a coincidence, there must be a deeper reason." I'm saying the deeper reason might bottom out at "the consistent discrete structures are sparse and this is one of them," which is a real explanation, but it might not have the form of yet another dynamical layer underneath.

              • light_hue_1 7 hours ago

                Sparsity != symmetry.

                It's simple to say "ah well, it's sparse", but that doesn't mean anything and doesn't explain anything.

                Symmetries are equivalent to conserved quantities. They exist because something else is invariant under some transformation, and vice versa. We didn't discover arbitrary constraints; we found a conserved quantity and the implied symmetry.
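
                For the record, that equivalence is Noether's theorem: each continuous symmetry of the action yields a conserved current. Schematically, for a field \phi whose Lagrangian is invariant under \phi \to \phi + \delta\phi (up to a possible total-derivative term, which adds a corresponding piece),

                  j^\mu = \frac{\partial \mathcal{L}}{\partial(\partial_\mu \phi)}\,\delta\phi, \qquad \partial_\mu j^\mu = 0 \ \text{(on shell)}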

                "There are integers", "the numbers should be small" all of these are nothing like what works normally. They aren't symmetries. At most they're from some anthropic argument about collections of universes being more or less likely, which is its own rabbit hole that most people stay away from.

            • IsTom 2 hours ago

              If they were, I'd assume that there wouldn't be anyone in the universe to observe that.

            • jaybrendansmith 8 hours ago

              Perhaps only visible matter is made up of particles with these exactly matching charges? If they did not match, they would not stay in equilibrium, and would not be so easily found.

              • thegabriele 4 hours ago

                I like this survivorship bias: "evolution" works on everything, so why not on the shaping of the "constants" of the universe as we know it?

      • JumpCrisscross 11 hours ago

        > you have to accept there will eventually be (hopefully simple) coincidences between certain fundamental values, no?

        No. It’s almost certainly not a coïncidence that these charges are symmetric like that (in stable particles that like to hang out together).

        • Paracompact 11 hours ago

          Whence your confidence? As they say in math, "There aren't enough small numbers to meet the many demands made of them." If we assume the turtle stack ends, and it ends simply (i.e. with small numbers), some of those numbers may wind up looking alike. Even more so if you find anthropic arguments convincing, or if you consider sampling bias (which may be what you mean by, "in stable particles that like to hang out together").

          • JumpCrisscross 6 hours ago

            > if you find anthropic arguments convincing

            Which makes every constant fair game. Currently, we don't have a good process for explaining multiple universes beyond divine preference. Hence the notion that a random number just happened to settle on mirrored whole sums.

        • hackyhacky 11 hours ago

          > coïncidence

          Nïce

      • idiotsecant 11 hours ago

        Shrugging and calling it a coincidence is generally not an end state when figuring out how something works.

    • andyferris 6 hours ago

      The hint from quantum field theory (and things like lattice gauge theory) is that charge emerges from interesting topological states/defects of the underlying field (by "interesting topological shapes" I mean - imagine a vortex in the shape of a ring/doughnut). It's kind of a topological property of a state of the photonic field, if you will - something like a winding number (which has to be an integer). Electric charge is a kind of "defect" or "kink" in the photonic field, while color charge (quarks) are defects in the strong-force field, etc.
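
      The integer-valuedness is the key point, and it is easy to see in a toy example (a sketch, nothing specific to gauge theory): the winding number of a closed planar loop around the origin comes out an exact integer no matter how you wiggle the loop, which is what makes such charges stable:

        import math

        def winding_number(points):
            # Sum the signed angle between successive position vectors around a closed loop
            total = 0.0
            for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
                total += math.atan2(x1 * y2 - y1 * x2, x1 * x2 + y1 * y2)
            return round(total / (2 * math.pi))

        loop = [(math.cos(2 * math.pi * k / 100), math.sin(2 * math.pi * k / 100))
                for k in range(100)]
        print(winding_number(loop))                        # 1
        print(winding_number([(x, -y) for x, y in loop]))  # -1: orientation flipped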

      When an electron-positron pair is formed from the vacuum, we get all sorts of interesting geometry which I struggle to grasp or picture clearly. I understand that the fact that these are fermions with spin-1/2 can similarly be explained as localized defects in a field of particles with integer spin (possibly a feature of the exact same "defect" as the charge itself, in the photonic field, which is what defines an electron as an electron).

      EDIT:

      > However, there are no theories why this is -- they are simply measured and that is it.

      My take is that there _are_ accepted hypotheses for this, but solving the equations (of e.g. the standard model, in full 3D space) to a precision suitable to compare to experimental data is currently entirely impractical (at least for some things like absolute masses - though I think there are predictions of ratios etc that work out between theory and measurement - sorry not a specialist in high-energy physics, had more exposure to low-energy quantum topological defects).

      • quchen 4 hours ago

        (Note the post you’ve replied to mentioned electrons and _protons_, not positrons.)

      • RupertSalt 5 hours ago

        > interesting topological states/defects of the underlying field

        eddies in the space-time continuum?

    • cozzyd 9 hours ago

      As soon as charge is quantized, this will happen. In any quantization scheme you will have some smallest charge. There are particles with charge +2 (the Delta++, for example), but ... anything that can decay while preserving quantum numbers will decay, so you end up with protons in the end. (OK, the quarks have fractional charges, but that's not really relevant at the scales where we care about QED.)

      If the question is, why is quantum mechanics the correct theory? Well, I guess that's how our universe works...

    • rjh29 12 hours ago

      One argument (unsatisfying as it is) is that there are trillions of possible configurations, but ours is the one that happened to work, which is why we're here to observe it. Changing any of them even a little bit would result in an empty universe.

      • libraryofbabel 11 hours ago

        There’s a name for that: the Anthropic principle. And it is deeply unsatisfying as an explanation.

        And does it even apply here? If the charge on the electron differed from the charge on the proton at just the 12th decimal place, would that actually prevent complex life from forming? Citation needed for that one.

        I agree with OP. The unexplained symmetry points to a deeper level.

        • squeefers an hour ago

          > There’s a name for that: the Anthropic principle. And it is deeply unsatisfying as an explanation.

          i feel the same about many worlds

        • krzat 4 hours ago

          I find the anthropic principle fascinating.

          I was born into this world at a certain point in time. I look around, and I see an environment compatible with me: air, water, food, gravity, time, space. How deep does this go? Why am I not an ant or a bacterium?

          • GordonS 3 hours ago

            Presumably your parents weren't ants?

    • PaulHoule 12 hours ago

      If it wasn't the case then matter wouldn't be stable.

      • tasty_freeze 10 hours ago

        Agreed (well, assuming the delta is more than a small fraction of a percent or whatever). But this is begging the question. If they are really independent then the vast, overwhelming fraction of all possible universes simply wouldn't have matter. Ours does have matter, so it makes our universe exceedingly unlikely. I find it far more parsimonious to assume they are connected by an undiscovered (and perhaps never to be discovered) mechanism.

        Some lean on the multiverse and the anthropic principle to explain it, but that is far less parsimonious.

        • PaulHoule 9 hours ago

          Also note that the proton is not an elementary particle, so it is really a question of "are the various quarks really 1/3 and 2/3 of an electron charge?"

          Crackpots have found thousands of formulas that try to explain the ratio of the proton mass to the electron mass, but there is no expectation of a simple relationship between those masses, since the proton mass is the sum of all sorts of terms.

          • gsf_emergency_6 9 hours ago

            Crackpots are downstream of the "physics community" awarding cultural cachet to certain types of questions -- those with affordances they don't necessarily "deserve" -- but not others.

            (I use quotes because those are emergent concepts)

            Same as "hacker community" deciding that AI is worth FOMO'ing about

      • jiggawatts 6 hours ago

        An interesting early theory of gravity was: "What if opposite charges attract slightly more strongly than identical charges repel each other?"

        If you tally up the forces, the difference is a residual attraction that can model gravity. It was rejected on various experimental and theoretical grounds, but it goes to show that if things don't cancel out exactly then the result can still leave a universe that would appear normal to us.

      • libraryofbabel 11 hours ago

        Is that actually true, if the charges differed at the 12th decimal place only? That’s non-obvious to me.

        • baggy_trough 8 hours ago

          Yes, because matter would carry a residual charge whose electrostatic force would massively overpower gravity, even at that small a discrepancy.
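
          A back-of-envelope check (my own rough SI figures, not from this thread): give each hydrogen atom a net charge of 10^-12 e and compare electrostatic repulsion to gravitational attraction between two of them; the distance cancels out of the ratio.

              # Sketch: ratio of Coulomb repulsion to gravity for two hydrogen
              # atoms, each carrying a residual charge of eps * e.
              k  = 8.988e9     # Coulomb constant, N m^2 C^-2
              G  = 6.674e-11   # gravitational constant, N m^2 kg^-2
              e  = 1.602e-19   # elementary charge, C
              mH = 1.674e-27   # hydrogen atom mass, kg

              eps = 1e-12      # charge mismatch at the 12th decimal place
              ratio = k * (eps * e)**2 / (G * mH**2)
              print(f"{ratio:.1e}")  # ~1.2e12: residual repulsion dwarfs gravity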

    • andyfilms1 11 hours ago

      For a given calculation on given hardware, the 100th digit of a floating-point result's decimal expansion can be replicated every time. But that digit is basically just noise, and has no influence on the 1st digit.

      In other words: There can be multiple "layers" of linked states, but that doesn't necessarily mean the lower layers "create" the higher layers, or vice versa.
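
      A minimal sketch of the analogy (Python; purely illustrative): the exact decimal expansion of a float is perfectly reproducible, yet everything past the first several digits is rounding residue that carries no information about the leading digits.

          from decimal import Decimal

          x = 0.1 + 0.2
          # Expands the underlying binary float exactly. The output is identical
          # on every run, but the long tail is an artifact of binary rounding.
          print(Decimal(x))
          # 0.3000000000000000444089209850062616169452667236328125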

    • wvbdmp 12 hours ago

      Aren’t things like this usually explained by being the only viable configuration, or is that not the case here?

    • throwup238 12 hours ago

      Or why the quarks that make up protons and neutrons have fractional charges, with +1 protons mixing two +2/3 up quarks and one -1/3 down quark, and the neutral neutron is one up quark and two down quarks. And where are all the other Quarks in all of this, busy tending bar?

      • david-gpu 11 hours ago

        They have fractional charges because that is how we happen to measure charge. If our unit of charge had been set when we knew about quarks, we would have chosen those as fundamental, and the charge of the electron would instead be -3.

        Now, the ratios between these charges appear to be fundamental. But the presence of fractions is arbitrary.

        • jcranmer 11 hours ago

          > If our unit of charge had been set when we knew about quarks, we would have chosen those as fundamental, and the charge of the electron would instead be -3.

          Actually, I doubt it. Because of their color charge, quarks can never be found in an unbound state; they only appear inside various kinds of hadrons. The ways that quarks combine cause all hadrons to end up with an integer charge, with the ⅔ and -⅓ charges on the various quarks merely being the ingredients that make the totals come out to integers.
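
          A trivial bookkeeping sketch of that point (the quark content shown is the standard assignment):

              from fractions import Fraction

              # Quark charges in units of e; allowed combinations sum to integers.
              q = {"u": Fraction(2, 3), "d": Fraction(-1, 3), "dbar": Fraction(1, 3)}

              hadrons = {
                  "proton (uud)":  ["u", "u", "d"],
                  "neutron (udd)": ["u", "d", "d"],
                  "pi+ (u dbar)":  ["u", "dbar"],
              }
              for name, quarks in hadrons.items():
                  print(name, sum(q[x] for x in quarks))  # 1, 0, 1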

        • throwup238 11 hours ago

          Isn’t charge quantized? Observable isolated charges are quantized in units of e. You can call them -3 and +3, but that just rescales the quantum. The interesting question is still why the positive and neutral particles are non-elementary, made up of quarks carrying fractions of e, with the sums working out only because some of those quarks are negatively charged (and yet the electron is elementary).

    • jiggawatts 11 hours ago

      This is "expected" from theory, because all particles seem to be just various aspects of the "same things" that obey a fairly simple algebra.

      For example, pair production is:

          photon + photon = electron + (-)electron
      
      You can take that diagram, rotate it in spacetime, and you have the direct equivalent, which is electrons changing paths by exchanging a photon:

         electron + photon = electron - photon
      
      There are similar formulas for beta decay, which is:

         neutron = proton + electron + (-)neutrino
      
      You can also "rotate" this diagram, or any other Feynman diagram. This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.

      The precise why of this algebra is the big question! People are chipping away at it, and there's been slow but steady progress.

      One of the "best" approaches I've seen is "The Harari-Shupe preon model and nonrelativistic quantum phase space"[1] by Piotr Zenczykowski, which makes the claim that just like how Schrödinger "solved" the quantum wave equation in 3D space by using complex numbers, it's possible to solve a slightly extended version of the same equation in 6D phase space, yielding matrices whose properties match the Harari-Shupe preon model. The preon model claims that fundamental particles are further subdivided into preons, the "charges" of which neatly add up to the observed zoo of particle charges, and a simple additive algebra over these charges matches Feynman diagrams. The preon model has issues with particle masses and binding energies, but Piotr's work neatly sidesteps that issue by claiming that the preons aren't "particles" as such, but just mathematical properties of these matrices.

      I put "best" in quotes above because there isn't anything remotely like a widely accepted theory for this yet, just a few clever people throwing ideas at the wall to see what sticks.

      [1] https://arxiv.org/abs/0803.0223
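
      For a flavor of the preon-style charge bookkeeping (a sketch of Harari's original rishon assignments as I understand them; this is illustrative and not Zenczykowski's phase-space formalism):

          from fractions import Fraction

          # Two rishons: T carries charge +1/3, V carries 0; a trailing "~"
          # marks the anti-rishon, which negates the charge.
          charge = {"T": Fraction(1, 3), "V": Fraction(0, 1)}

          def total(preons):
              return sum(-charge[p[0]] if p.endswith("~") else charge[p]
                         for p in preons)

          composites = {
              "positron (TTT)":      ["T", "T", "T"],     # +1
              "up quark (TTV)":      ["T", "T", "V"],     # +2/3
              "down quark (T~V~V~)": ["T~", "V~", "V~"],  # -1/3
              "neutrino (VVV)":      ["V", "V", "V"],     # 0
          }
          for name, preons in composites.items():
              print(name, total(preons))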

      • tasty_freeze 10 hours ago

        > This is "expected" from theory, because all particles seem to be just various aspects of the "same things" that obey a fairly simple algebra.

        But again, this is just observation, and it is consistent with the charges we measure (again, just observation). It doesn't explain why these rules must behave as they do.

        > This very, very strongly hints that the fundamental particles aren't actually fundamental in some sense.

        This is exactly what I am suggesting in my original comment: this "coincidence" is not a coincidence but falls out from some deeper, shared mechanism.

        • jiggawatts 10 hours ago

          > this is just observation

          Sure, but that's fundamental to observing the universe from the inside. We can't ever be sure of anything other than our observations because we can't step outside our universe to look at its source code.

          > It doesn't explain why these rules must behave as they do.

          Not yet! Once we have a theory of everything (TOE), or just a better model of fundamental particles, we may have a satisfactory explanation.

          For example, if the theory ends up being something vaguely like Wolfram's "Ruliad", then we may be able to point at some aspect of very trivial mathematical rules and say: "the electron and proton charges pop out of that naturally; it's the only way it can be, nothing else makes sense".

          We can of course never be totally certain, but that type of answer may be both good enough and the best we can do.

    • smnplk 11 hours ago

      There are layers science can not access.

      • f30e3dfed1c9 9 hours ago

        Well OK then! Let's tell all the physicists they can close up shop now. They might not have realized it, but they're done. All their little "theories" and "experiments" and what not have taken them as far as they can go.

        • albatross79 8 hours ago

          We're already in the realm of virtual particles, instantaneous collapse, fields with abstract geometric shape and no material reality, wave particle duality, quantized energy etc. The project of physics was to discover what the universe was made of. None of these things can answer that. If intelligibility was the goal, we lost that. So in an important sense, they might as well have closed up shop. If you're interested in the specific value of a certain property to the nth decimal place, there is work to do, but if you're interested in the workings of the universe in a fundamentally intelligible sense, that project is over with. What they're doing now is making doodles around mathematical abstractions that fit the data and presenting those as discoveries.

        • paganel 3 hours ago

          > Let's tell all the physicists they can close up shop now.

          Yes, that's part of the plan. I mean, not to all the physicists, just to those whose work doesn't bring in results anymore, and it hasn't for 30 to 40 years now. At some point they (said physicists) have to stop their work and ask themselves what it is that they're doing, because judging by their results it doesn't seem like they're doing much, while consuming a lot of resources (which could have been better spent elsewhere).

      • jacquesm 5 hours ago

        By observing the discrepancies between theories we are accessing those layers. Whether we can access them with instruments is a different matter but with our minds we apparently can.

  • KolibriFly 28 minutes ago

    This feels less like a story about particle physics "failing" and more like a story about a field running out of easy leverage.

  • GlibMonkeyDeath 9 hours ago

    It's hard. Particle physics faces the problem that in order to dig down to ever smaller scales, ironically, ever larger experiments are needed. We've pretty much built large enough colliders for our current understanding. No one really knows how much more energy would be needed to expose something new - it might be incremental, within current technical reach, or it might be many orders of magnitude beyond our current capabilities. The experiments have become expensive enough that there isn't a lot of appetite to build giant new systems without some really good reason. The hard part is coming up with a theory to justify the outlay, if you can't generate compelling data from existing systems.

    Physics advances have been generally driven by observation, obtained through better and better instrumentation. We might be entering a long period of technology development, waiting for the moment our measurements can access (either through greater energy or precision) some new physics.

  • threethirtytwo 7 hours ago

    All of science is getting harder as the easiest discoveries are all pretty much behind us.

    LLMs were a breakthrough I didn't expect and it's likely the last one we'll see in our lifetime.

    • j-krieger 44 minutes ago

      The additional irony here is that LLMs are a tool likely forever damned to regurgitate knowledge of the past, unable to derive new information.

    • iterance 7 hours ago

      Specific fields may not advance for decades at a time, but we are hardly in a scientific drought. There have been dramatic advances in countless fields over the last 20 years alone and there is no good reason to expect such advances to abruptly cease. Frankly this is far too pessimistic.

      • threethirtytwo 7 hours ago

        I don't understand what is wrong with pessimism; that's not a valid critique. If someone is pessimistic but his description of the world matches REALITY, then there's nothing wrong with his viewpoint.

        Either way, this is also opinion-based.

        There hasn't been a revolutionary change in technology in the last 20 years. I don't consider smartphones to be revolutionary. I consider going to the moon revolutionary, and catching a rocket sort of revolutionary.

        Actually, I take that back: I predict Mars as a possible breakthrough, along with LLMs, but we got lucky with Musk.

        • andrewflnr 5 hours ago

          You imply your view "matches REALITY", then fall back to "Either way this is also opinion based." Nicely played. But the actual reality is that scientific discovery is proceeding at least as fast as it ever has. These things take time. 20 years is a laughably short time in which to declare defeat, even ignoring the fact that genetic and other biological tech has advanced leaps and bounds in that time. There's important work happening in solid state physics and materials science. JWST is overturning old theories and spawning new ones in cosmology. There's every reality-based reason to believe there will be plenty of big changes in science in the next 20 years or so.

          • threethirtytwo 4 hours ago

            >You imply your view "matches REALITY", then fall back to "Either way this is also opinion based." Nicely played.

            Oh, fuck off. Opinions exist in reality, do they not? Pessimism implies opinions biased towards negativity. My opinions bias towards reality, aka what I observe, not what is negative.

            There, clear?

            >But the actual reality is that scientific discovery is proceeding at least as fast as it ever has.

            No it's not. Space is the best example of this. Computing speed is another example. We are hitting physical barriers to technology and discovery in every practical dimension.

            >There's important work happening in solid state physics and materials science. JWST is overturning old theories and spawning new ones in cosmology. There's every reality-based reason to believe there will be plenty of big changes in science in the next 20 years or so.

            Astronomy? Give me a break: we're trying to infer what's millions of light-years away, so of course we're going to be stumbling and getting shit wrong all the time. A step function in astronomy would be actually going to the stars. I guarantee even your great-great-great-grandchildren won't go to one.

        • iterance 5 hours ago

          My critique is not due to pessimism, it is due to afactuality. Breakthroughs in science are plenty in the modern era and there is no reason to expect them to slow or halt.

          However, from your later comments, it sounds as though you feel the only operating definition of a "breakthrough" is a change inducing a rapid rise in labor extraction / conventional productivity. I could not disagree more strongly with this opinion, as I find this definition utterly defies intuition. It rejects many, if not most, changes in scientific understanding that do not directly induce a discontinuity in labor extraction. But admittedly, if one restricts the definition of a breakthrough in this way, then, well, you're probably about right. (Though I don't see what Mars has to do with labor extraction.)

        • tehjoker 6 hours ago

          genetic technology and computing technology have been the biggest drivers for a while. i do think it is remarkable to video call another continent. communication technology is disruptive and revolutionary though it looks like chaos. ai is interesting too if it lives up to the hype even slightly.

          catching a rocket is very impressive, but it's just a lower cost method for earth orbit. it does unlock megaconstellations tho

          • threethirtytwo 6 hours ago

            Yeah, none of those are step-function changes. Video calling another continent is a tiny step from TV: I receive video wirelessly on my TV, so I'm not that amazed when I can stretch the distance further with a call that has video. Big deal.

            AI is the step-function change. The irony is that it became so pervasive and intertwined with slop that people like you forget that what it does now (write all the code) was unheard of just a couple of years ago. AI surpassed the hype; now it's popular to talk shit about it.

            • incr_me 5 hours ago

              A step in which function are you talking about, exactly?

              • threethirtytwo 5 hours ago

                If you want it stated precisely, the function is human cognitive labor per unit time and cost.

                For decades, progress mostly shifted physical constraints or communication bandwidth. Faster chips, better networks, cheaper storage. Those move slopes, not discontinuities. Humans still had to think, reason, design, write, debug. The bottleneck stayed human cognition.

                LLMs changed that. Not marginally. Qualitatively.

                The input to the function used to be “a human with training.” The output was plans, code, explanations, synthesis. Now the same class of output can be produced on demand, at scale, by a machine, with latency measured in seconds and cost approaching zero. That is a step change in effective cognitive throughput.

                This is why “video calling another continent” feels incremental. It reduces friction in moving information between humans. AI reduces or removes the human from parts of the loop entirely.

                You can argue about ceilings, reliability, or long term limits. Fine. But the step already happened. Tasks that were categorically human two years ago are now automatable enough to be economically and practically useful.

                That is the function. And it jumped.

  • benreesman 5 hours ago

    It is almost always the case that when progress stalls for some meaningful period of time, a parochial taboo needs violating to move forward.

    The best known example is the pre- and post-Copernican conceptions of our relationship to the sun. But long before and ever since: if you show me physics with its wheels slipping in mud I'll show you a culture not yet ready for a new frame.

    We are so very attached to the notions of a unique and continuous identity observed by a physically real consciousness observing an unambiguous arrow of time.

    Causality. That's what you give up next.

    • gary_0 4 hours ago

      I'm pretty sure quantum mechanics already forgoes conventional causality. Attosecond interactions take place in such narrow slices of time that the uncertainty principle turns everything into a blur where events can't be described linearly. In other words, the math sometimes requires that effect precede cause. As far as we can tell, causality and conservation of energy are only preserved on a macroscopic scale. (IANAQP, but I'm going off my recollections of books by people who are.)
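
      For scale, a back-of-envelope energy-time uncertainty estimate (my own rough numbers): squeezing an interaction into a one-attosecond window already implies hundreds of eV of irreducible energy spread.

          # Delta_E >= hbar / (2 * Delta_t), evaluated at Delta_t = 1 attosecond.
          hbar = 1.0546e-34  # J s
          dt = 1e-18         # s
          dE = hbar / (2 * dt)
          print(dE / 1.602e-19, "eV")  # ~330 eV of unavoidable energy blur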

    • raincole 5 hours ago

      It's easy to give up existing concepts. It's called being a crackpot and you can find thousands of papers doing that online.

      • indymike 4 hours ago

        I'm not sure the crackpot is what we're talking about here. We're talking about something that violates the prevailing opinion in a way that can be verified, and results in a change in what we know to be true. The crackpot is mostly the product of a very aspirational worldview, and usually has bias and error under the hood that is often quite obvious.

    • fatbird 5 hours ago

      This is a common framing of the Copernican revolution, and it's wrong.

      Copernicus was proposing circular orbits with the sun at the center instead of the earth. The Copernican model required more epicycles for accurate predictions than the considerably better-proven Ptolemaic model did, with the earth at the centre.

      It wasn't until Kepler came along and proposed elliptical orbits that a heliocentric solar system was obviously a genuine advance on the model, both simpler and more accurate.

      There was no taboo being preserved by rejecting Copernicus's model. The thinkers of the day rightfully saw a conceptual shift with no apparent advantage and several additional costs.

    • mastermage 5 hours ago

      the fuck you mean giving up causality?

  • ggm 11 hours ago

    I am sure others will say it better, but the cat-in-the-box experiment is a shockingly bad metaphor for the idea behind quantum states and the observer effect.

    I will commit the first sin by declaring, without fear of contradiction, that the cat actually IS either alive or dead. It is not in a superposition of states. What is unknown is our knowledge of the state, and what collapses is that uncertainty.

    If you shift this to the particle, not the cat, what changes? Because if very much changes, my first comment about the unsuitability of the metaphor is upheld; and if very little changes, my comment has been disproven.

    It should be clear that I am neither a physicist nor a logician.

    • plomme 6 hours ago

      Well, you are in luck, because that was the point of Schrödinger's cat: it was constructed to show the impossibly odd implications of quantum mechanics.

      From the wikipedia page: “This thought experiment was devised by physicist Erwin Schrödinger in 1935 in a discussion with Albert Einstein to illustrate what Schrödinger saw as the problems of Niels Bohr and Werner Heisenberg's philosophical views on quantum mechanics.”

    • BalinKing 7 hours ago

      There are various theories about what's actually happening in quantum mechanics. Some theories have hidden variables, in which case the issue is simply one of measurement (i.e. there really is an "objectively correct" value, but it only looks to us like there isn't).[0] However, this is not known to be the case, and many theories really do claim that position and momentum fundamentally cannot both be well-defined at once. (The "default" Copenhagen interpretation is in the latter camp; AFAIK it's convenient in practice, and as a result it's implicitly assumed in introductory QM classes.)

      [0] Well, and the hidden variables are non-local, which is a whole 'nother can of highly non-intuitive worms.

      • ggm 7 hours ago

        I'm not qualified to say. But, because of inductive reasoning, I have some concern that underneath the next level of "oooh, we found the hidden variable" will be a Feynman moment of saying "yeah, that's defined by the as-yet unproven hidden-hidden variables, about which much conjecture is being made but no objective evidence exists, but if you fund this very large machine...."

    • sliken 10 hours ago

      Along similar lines, the double-slit experiment seems simple. Two slits let light through, and you get bands where it constructively or destructively interferes, just like waves.

      However, I still find it crazy that when you slow down the laser so that one photon at a time goes through either slit, you still get the bands. Which raises the question: what exactly is it constructively or destructively interfering with?
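
      The statistics of that one-photon-at-a-time buildup are easy to see with a toy Monte Carlo (a sketch only: it samples detections from the known two-slit intensity pattern rather than simulating any quantum mechanics, and all parameter values are made up):

          import numpy as np

          lam, d, a, L = 633e-9, 50e-6, 10e-6, 1.0  # wavelength, slit spacing,
                                                    # slit width, screen distance (m)
          x = np.linspace(-0.05, 0.05, 2001)        # positions on the screen (m)

          # Far-field two-slit intensity: cos^2 fringes under a sinc^2 envelope.
          I = np.cos(np.pi * d * x / (lam * L))**2 * np.sinc(a * x / (lam * L))**2
          p = I / I.sum()                           # intensity -> detection probability

          rng = np.random.default_rng(0)
          hits = rng.choice(x, size=20_000, p=p)    # 20,000 one-at-a-time detections
          counts, edges = np.histogram(hits, bins=101)
          print(counts)                             # fringes emerge from single hits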

      Still seems like there's much to be learned about the quantum world, gravity, and things like dark energy vs MOND.

      • squeefers an hour ago

        > However I still find it crazy that when you slow down the laser and one photon at a time goes through either slit you still get the bands.

        why does nobody mention the fact that the photon doesn't keep going through the same hole? like why is it randomly moving through the air in this brownian way? the laser gun doesn't move, the slit doesn't move, so why do different photons end up going through different holes?

      • ggm 10 hours ago

        I had a conversation about this on HN some months back. It's a surprisingly modern experiment: it demanded the ability to reliably emit single photons. Young's theory may be from 1800, but single-photon emission is from the 1970s-80s.

        (This is what I was told while exploring my belief that the fringes had always been seen in streams of photons, not emerging over repeated runs of single photons; and I was wrong.)

        • lefra 6 hours ago

          To get single photons, you just need to stack up enough stained glass in front of a light source. That's been achievable for aeons (though the photons will come through at random times).

          The difficult part is single-photon _detectors_; they're the key technology for exploring the single-photon version of Young's experiment (which originally showed that light has wave-like properties).

      • jasonwatkinspdx 7 hours ago

        The simplest answer here is "fields are real; particles are excitation patterns of fields." And as I understand it, that's generally the practical way most physicists think of it today.

        If I make the equivalent of a double-slit experiment in a swimming pool, then generate a vortex that propagates towards my plywood slits or whatever, it's not really surprising that the extended volume of the vortex interacts with both slits even though it looks like a singular "particle."

        • el_nahual 3 hours ago

          And yet if you place a detector at the slits to know which slit the single photon goes through, you get no interference pattern at the end.

  • beezle 10 hours ago

    I never liked that the physics community shifted from 'high energy' particle physics (the topic of the article) to referring to this branch as just 'particle physics', which I think leaves the impression that anything to do with 'particles' is now a dead end.

    Nuclear physics (ie, low/medium energy physics) covers diverse topics, many with real world application - yet travels with a lot of the same particles (ie, quarks, gluons). Because it is so diverse, it is not dead/dying in the way HEP is today.

    • grebc 8 hours ago

      What’s the saying… if your only tool is a collider?

  • mastermage 5 hours ago

    It's probably just very hard, in my opinion as a physicist.

  • davidw 12 hours ago

    It's impossible to tell without opening the box the particle physics is in.

  • mhandley 11 hours ago

    One interesting gap in the standard model is why neutrinos have mass: https://cerncourier.com/a/the-neutrino-mass-puzzle/

  • Rury 8 hours ago

    It's just hard. I mean... it could very well be that there are many deeper layers underneath what we know in particle physics, but that, at our scale, it is so infeasible to build something to analyze and decompose the nuanced behavior happening at that level that it's practically impossible to do so. Just as it is impossible to split an atom with your bare hands...

  • GMoromisato 12 hours ago

    The use of "AI" in particle physics is not new. In 1999 they were using neural nets to compute various results. Here's one from "Measurement of the top quark pair production cross section in pp̄ collisions using multijet final states" [https://repository.ias.ac.in/36977/1/36977.pdf]:

    "The analysis has been optimized using neural networks to achieve the smallest expected fractional uncertainty on the tt̄ production cross section"

    • BrandoElFollito an hour ago

      I did my PhD in physics using NNs back in 1997. The field was not thriving yet, but it was quite advanced already.

      I remember I used a library (THE library) from a German university which was all the rage at that time.

    • jdshaffer 12 hours ago

      I remember back in 1995 or so being in a professor's office at Indiana University, and he was talking about trying to figure out how to use neural networks to automatically track particle trails in bubble chamber results. He was part of a project at CERN at the time. So, yeah, they've been using NNs for quite a while. :-)

      • elashri 9 hours ago

        Particle identification using NN classifiers was actually one of the early success stories of NNs. These are pretty standard algorithms in tracking and trigger software in HEP experiments now. There are even standard tools in the field to help you train your own.

        What is more interesting currently is things like anomaly detection using ML/NNs, foundation models, etc.

  • aatd86 12 hours ago

    Isn't it the mathematics that is lagging? Amplituhedron? Higher dimensional models?

    Fun fact: I got to read the thesis of one my uncles who was a young professor back in the 90's. Right when they were discovering bosons. They were already modelling them as tensors back then. And probably multilinear transformations.

    Now that I am grown, I can understand a little more; I was about 10 years old back then. I had no idea he was studying and teaching the state of the art. xD

    • elzbardico 11 hours ago

      Tensors are pretty old in physics; they are a central concept in Einstein's General Relativity.

      You can find tensors even in some niche stuff in macroeconomics.

    • ecshafer 11 hours ago

      Tensors are like 200 years old in mathematics. Gauss talked about Tensors.

      • aatd86 7 hours ago

        What was new was not tensors. It was the representation in SU of mesons for photon-photon collisions. But even saying that is skimming the surface. I can't read beyond the knowledge gap.

  • jahnu 12 hours ago

    I find the arguments from those who say there is no crisis convincing. Progress doesn't happen at a constant rate. We made incredible, unprecedented progress in the 20th century. The most likely scenario is that it slows down for a while. Perhaps for hundreds of years again! Nobody can know. We are still making enormous strides compared to most of scientific history.

    • Insanity 12 hours ago

      Although we do have many more people working on these problems now than at any time in the past. That said, science progresses one dead scientist at a time, so it might still take generations for a new golden era.

  • tariky 6 hours ago

    To my uneducated eye it looks like they have been stuck in limbo for 120 years. Nothing practical has been created based on those theories. It is just words and calculations spinning in circles.

    I wish those people would focus on practical, real-world physics, so we could all enjoy new innovations.

    • WantonQuantum 6 hours ago

      The device you used to make this comment relies heavily on quantum effects to make efficient transistors. The necessary theoretical understanding of semiconductors did not exist 120 years ago.

    • potamic 5 hours ago

      You're right. If you were educated, you would have learnt about the numerous applications of particle physics in modern technologies.

    • jacquesm 5 hours ago

      > Nothing practical has been create based on those theories.

      Ever used GPS?

      A CD player?

      A laser?

      Semiconductors?

      • gary_0 4 hours ago

        Einstein laid the theoretical foundations for lasers in 1917, and it took over 40 years of "impractical" scientific work before the first functioning laser was built. It took decades more for them to become a cheap, ubiquitous technological building block. The research is still continuing, and there's no reason to assume it will ever stop bearing fruit (for the societies that haven't decimated their scientific workforce, anyway). Look at the insanity required to design and build the EUV lasers in ASML's machines, which were used to fabricate the CPU I'm using right now, over a century after Einstein first scribbled down those obscure equations!

        • jacquesm an hour ago

          I sincerely wonder how someone that is unaware of any of this finds their way onto HN, but at the same time it is an educational opportunity. 'nothing practical' indeed...

  • sprash 2 hours ago

    It is obviously not dead, but it should be: almost all of the technical and economic progress made in the last century was achieved with macroscopic quantum effects, yet particle physics spends a lot of energy and material resources measuring microscopic effects. The priorities are essentially inverted. At this point it is not even about discovery; experiments are relegated to precision measurements. What practical use will it be if we know the mass, charge distribution, or polarizability of some particles a few percent more precisely? About none.

  • nephihaha 2 hours ago

    When the model appears to have massive problems, maybe it's time to go back and revise it.

    • squeefers an hour ago

      or, if you're Michio Kaku, just parrot it on low-grade TV shows and public appearances, because it's easier to gain notoriety than to do science

  • gowld 12 hours ago

    Information content of the article:

    The discovery of the Higgs boson in 2012 completed the Standard Model of particle physics, but the field has since faced a "crisis" due to the lack of new discoveries. The Large Hadron Collider (LHC) has not found any particles or forces beyond the Standard Model, defying theoretical expectations that additional particles would appear to solve the "hierarchy problem"—the unnatural gap between the Higgs mass and the Planck scale. This absence of new physics challenged the "naturalness" argument that had long guided the field.

    In 2012, physicist Adam Falkowski predicted the field would undergo a slow decay without new discoveries. Reviewing the state of the field in 2026, he maintains that experimental particle physics is indeed dying, citing a "brain drain" where talented postdocs are leaving the field for jobs in AI and data science. However, the LHC remains operational and is expected to run for at least another decade.

    Artificial intelligence is now being integrated into the field to improve data handling. AI pattern recognizers are classifying collision debris more accurately than human-written algorithms, allowing for more precise measurements of "scattering amplitude" or interaction probabilities. Some physicists, like Matt Strassler, argue that new physics might not lie at higher energies but could be hidden in "unexplored territory" at lower energies, such as unstable dark matter particles that decay into muon-antimuon pairs.

    CERN physicists have proposed a Future Circular Collider (FCC), a 91-kilometer tunnel that would triple the circumference of the LHC. The plan involves first colliding electrons to measure scattering amplitudes precisely, followed by proton collisions at energies roughly seven times higher than the LHC later in the century. Formal approval and funding for this project are not expected before 2028.

    Meanwhile, U.S. physicists are pursuing a muon collider. Muons are elementary particles like electrons but are 200 times heavier, allowing for high-energy, clean collisions. The challenge is that muons are highly unstable and decay in microseconds, requiring rapid acceleration. A June 2025 national report endorsed the program, which is estimated to take about 30 years to develop and cost between $10 and $20 billion.

    China has reportedly moved away from plans to build a massive supercollider. Instead, they are favoring a cheaper experiment costing hundreds of millions of dollars—a "super-tau-charm facility"—designed to produce tau particles and charm quarks at lower energies.

    On the theoretical side, some researchers have shifted to "amplitudeology," the abstract mathematical study of scattering amplitudes, in hopes of reformulating particle physics equations to connect with quantum gravity. Additionally, Jared Kaplan, a former physicist and co-founder of the AI company Anthropic, suggests that AI progress is outpacing scientific experimentation, positing that future colliders or theoretical breakthroughs might eventually be designed or discovered by AI rather than humans.

  • Razengan 6 hours ago

    Maybe this is all we can learn from home and we need to get out more.

  • ktallett 12 hours ago

    Is it more that even the most dedicated and passionate researchers have to frame their interests in a way that will get funding? Particle physics is not the thing those with the cash will fund right now; AI and QC are the focus.

    • Legend2440 12 hours ago

      Well, it's hard to make an argument for a $100 billion collider when your $10 billion collider didn't find anything revolutionary.

      Scaling up particle colliders has arguably hit diminishing returns.

  • bsder 12 hours ago

    Theoretical physics progresses via the anomalies it can't explain.

    The problem is that we've mostly explained everything we have easy access to. We simply don't have that many anomalies left. Theoretical physicists were both happy and disappointed that the LHC simply verified everything--theories were correct, but there weren't really any pointers to where to go next.

    Quantum gravity seems to be the big one, but that is not something we can penetrate easily. LIGO just came online, and could only really detect enormous events (like black hole mergers).

    And while we don't always understand what things do as we scale up or in the aggregate, that doesn't require new physics to explain.

    • beezle 10 hours ago

      Please do not conflate the broad "theoretical physics" with the very specific "beyond the standard model" physics questions. There are many other areas of physics with countless unsolved problems/mysteries.

      • bsder 6 hours ago

        Sure, there are things like "Really, how do superconductors work?", but nobody (mostly) believes that understanding things like that requires "new physics".

        And, I think, most people would place that kind of stuff under "solid state physics" anyway.

        • squeefers an hour ago

          oh, i don't know: being able to predict the path of a particle seems pretty basic to me, and it cannot be done for any given particle.

    • mhandley 11 hours ago

      Neutrino mass is another anomaly, which is at least slightly easier to probe than quantum gravity: https://cerncourier.com/a/the-neutrino-mass-puzzle/

  • tehjoker 12 hours ago

    It's kind of legitimate, but it's kind of sad to see some of the smartest people in society just being like "maybe AI will just give me the answer," a phrase that has a lot of potential to be thought-terminating.

    • emmelaich 12 hours ago

      That's mentioned in the article too:

      >Cari Cesarotti, a postdoctoral fellow in the theory group at CERN, is skeptical about that future. She notices chatbots’ mistakes, and how they’ve become too much of a crutch for physics students. “AI is making people worse at physics,” she said.

      • yalok 12 hours ago

        this. Deep understanding of physics involves building a mental model & intuition for how things work, and the process of building it is what gives the skill to deduce & predict. Using AI to just get to the answers directly prevents building that "muscle" strength...

      • gowld 10 hours ago

        AI chatbots are also making people better at physics, by answering questions the textbook doesn't or the professor can't explain clearly, patiently. Critical thinking skills are critical. Students cheating with chatbots might not have put in the effort to learn without chatbots.

    • 0x3f 12 hours ago

      I'm quite happy that it might give me, with pre-existing skills, more time on the clock to stay relevant.

  • AIorNot 4 hours ago

    Curious what everyone thinks about this physicist's idea:

    - the universe as a neural network (yes, yes: moving the universe-model paradigm from clockwork, to machine, to computer, to neural network)

    I found it interesting and speculative, but also fascinating.

    See video here:

    https://youtu.be/73IdQGgfxas?si=PKyTP8ElWNr87prG

    AI summary of the video:

    This video discusses Professor Vitaly Vanchurin's theory that the universe is literally a neural network, where learning dynamics are the fundamental physics (0:24). This concept goes beyond simply using neural networks to model physical phenomena; instead, it posits that the universe's own learning process gives rise to physical laws (0:46).

    Key takeaways from the discussion include:

    • The Universe as a Neural Network (0:00-0:57): Vanchurin emphasizes that he is proposing this as a promising model for describing the universe, rather than a definitive statement of its ontological nature (2:48). The core idea is that the learning dynamics, which are typically used to optimize functions in machine learning, are the fundamental physics of the cosmos (6:20).

    • Deriving Fundamental Field Equations (21:17-22:01): The theory suggests that well-known physics equations, such as Einstein's field equations and the Dirac and Klein-Gordon equations, emerge from the learning process of this neural-network universe.

    • Fermions and Particle Emergence (28:47-32:15): The conversation delves into how particles like fermions could emerge within this framework, with the idea that network configurations useful for learning survive, similar to natural selection.

    • Emergent Quantum Mechanics (44:53-49:31): The video explores how quantum behaviors, including the Schrödinger equation, could emerge from the two distinct dynamics within the system: activation and learning. This requires the system to have access to a "bath" or "reservoir" of neurons.

    • Natural Selection at the Subatomic Scale (1:05:10-1:07:34): Vanchurin suggests that natural selection operates on subatomic particles, where configurations that are more useful for minimizing the loss function (i.e., for efficient learning) survive, and those that are not are removed.

    • Consciousness and Observers (1:15:40-1:24:09): The theory integrates the concept of observers into physics, proposing a three-way unification of quantum mechanics, general relativity, and observers. Consciousness is viewed as a measure of learning efficiency within a subsystem (1:30:38).

  • albatross79 7 hours ago

    Why are we even trying to look deeper? To fit our mathematical curves better? Abstract spacetime, fields, virtual particles, wave function collapse, quantized energy, wave particle duality, etc. This is all BS. And I'm not disputing the theories or the experimental results. These concepts are unintelligible. They are self contradictory. They are not even abstractions, they are mutually exclusive paradigms forced together into a bewilderment. I'm not disputing that the math fits the observations. But these are not explanations. If this is what it's come to, all we can expect from here on is to better fit the math to the observation. And in the end, an equation that tells us nothing about what we really wanted to know, like "what is it really"? Nobody is going to be satisfied with an equation, so why are we still funding this enterprise, for better lasers to kill bad guys?

    • WantonQuantum 6 hours ago

      I find quite a lot of it very satisfying. For example, the deep mathematical symmetries of gauge theory and how they relate to the observed forces of the universe is truly amazing.

      The excellent Arvin Ash has a very accessible video about it: https://www.youtube.com/watch?v=paQLJKtiAEE

      • squeefers an hour ago

        maybe that's the problem: satisfaction isn't understanding. string theory is exciting maths, but it fits nothing in reality. maybe scientists should go back to explaining reality instead of whatever this current paradigm is

    • drdeca 6 hours ago

      The universe is not obligated to appeal to your aesthetic tastes in its innermost functioning.

      Maybe you aren’t going to be satisfied with the sort of complicated mathematics which appears to be correct (or, on the right track).

      If you have complaints about the aesthetics of how the universe works, take it up with God.

      Personally, I think there is a lot of beauty to be found in it.

      I’ll admit that there are a few parts that go against my tastes (I don’t like needing to resort to distributions instead of proper functions), but that’s probably just intellectual laziness on my part.

      • squeefers an hour ago

        > The universe is not obligated to appeal to your aesthetic tastes in its innermost functioning.

        This is truly a cop-out. When science falters in explaining the world, we get answers like this. His argument isn't with the universe but with our own scientific theories. If you don't want your theories about the physical world to explain the physical world, then be an engineer. Science explains the world; engineers use those theories. QM has large gaps and doesn't actually explain much, but I guess the universe doesn't care whether our theories are wildly off the mark or not.

  • meindnoch 11 hours ago

    Maybe it's time for physicists to switch to agile? Don't try to solve the theory of the Universe at once; that's the waterfall model. Try to come up with just a single new equation each sprint!