Entropy Explained, with Sheep (2016)

(engineersedge.com)

89 points | by Nevin1901 4 days ago

29 comments

  • Maro 4 days ago

    I've written a few articles about Entropy (I'm a physicist working in DS).

    Almost all of them have Python code to illustrate concepts.

    -

    1. Entropy of a fair coin toss - https://bytepawn.com/what-is-the-entropy-of-a-fair-coin-toss...

    2. Cross entropy, joint entropy, conditional entropy and relative entropy - https://bytepawn.com/cross-entropy-joint-entropy-conditional...

    3. Entropy in Data Science - https://bytepawn.com/entropy-in-data-science.html

    4. Entropy of a [monoatomic] ideal gas with coarse-graining - https://bytepawn.com/entropy-of-an-ideal-gas-with-coarse-gra...

    5. All entropy related posts - https://bytepawn.com/tag/entropy.html
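
    A minimal sketch of the coin-toss case (my own illustration, not the code from those posts): Shannon entropy in bits is H(p) = -p log2(p) - (1-p) log2(1-p), and it peaks at exactly 1 bit for a fair coin.

      # Sketch only; assumes the same setup as the coin-toss post, not its actual code.
      from math import log2

      def coin_entropy(p):
          """Entropy in bits of a coin that lands heads with probability p."""
          if p in (0.0, 1.0):
              return 0.0  # a certain outcome carries no information
          return -p * log2(p) - (1 - p) * log2(1 - p)

      print(coin_entropy(0.5))  # fair coin: 1.0 bit, the maximum
      print(coin_entropy(0.9))  # biased coin: ~0.47 bits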

    • dr_dshiv 3 days ago

      What’s the difference between information entropy and physical entropy?

  • dr_dshiv 3 days ago

    I did a deep dive on entropy a couple years ago. I found the concept to be much harder to understand than I expected! Specifically, it was confusing to shift from the intuitive but wrong “entropy is disorder” to “entropy is about the number of possible microstates in a macrostate” (Boltzmann Entropy) https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula

    I was extra confused when I discovered that a spread-out cloud of hydrogen is lower entropy than the same cloud gravitationally bound together in a star. So entropy isn’t just about “spreading out,” either.

    I found that Legos provide a really nice example to illustrate entropy, so I’ll share that here.

    Consider a big pile of Legos, the detritus of many past projects. Intuitively, a pile of Legos is high entropy because it is disordered, but if we are trying to move beyond order/disorder, we need to relate it to microstates and macrostates.

    Therefore, a pile of Legos is high entropy because if you randomly swap positions of the pieces, it will still be the same macrostate: a big pile of Legos. Nevertheless, each of the Lego pieces is in a very specific position, and if we could snapshot all those positions, that would be the specific microstate. That means the macrostate of the pile has an astronomical number of possible microstates: there are many ways to rearrange the pieces that still look like a pile.

    On the other hand, consider a freshly built Lego Death Star. This is clearly low entropy. To understand why in terms of microstates: very few Legos can be swapped or moved before it stops really being a Death Star. The entropy is low because very few microstates (specific Lego positions) correspond to the given macrostate (being a Death Star).

    This specific case helped me grok Boltzmann entropy. To extend it, consider a box with a small ice crystal in it: this has many fewer possible microstates than the same box filled with steam. In the steam, molecules can pretty much be swapped and moved anywhere and the macrostate stays the same. With the crystal, if you start randomly swapping molecules into different positions, it quickly stops being an ice crystal. So an ice crystal is low entropy.
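
    To put toy numbers on the counting idea (a sketch of my own, with Boltzmann's constant set to 1 and a deliberately crude macrostate): take N bricks and define the macrostate as "how many bricks sit in the left half of the box". The microstate count W for each macrostate is then a binomial coefficient, and the Boltzmann entropy is S = ln W:

      # Toy Boltzmann entropy S = ln(W), where W counts microstates per macrostate.
      from math import comb, log

      N = 100  # bricks
      for k in (0, 50):      # macrostates: k bricks in the left half
          W = comb(N, k)     # microstates consistent with that macrostate
          print(f"k={k:2d}  W={W:.3e}  S={log(W):.1f}")

      # k= 0  W=1.000e+00  S=0.0   <- every brick pinned down: the Death Star
      # k=50  W=1.009e+29  S=66.8  <- maximal mixing: the pile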

    Now, the definition of what counts as a macrostate is very important in this… but this comment is long enough and I still haven’t gotten to the gym…

    • kitd 3 days ago

      Helpful post nonetheless!

  • kaonwarb 4 days ago

    I appreciate the explanation, but the very first example doesn't sit well with me. Water forming into ice cubes spontaneously looks weird simply because we’re not used to seeing it. Consider a time-lapse of an icicle forming as a sort of counter-example: https://m.youtube.com/watch?v=mmHQft7-iSU

    (Not refuting entropy as the arrow of time at all, just noting that a visual example is not great evidence.)

    • throwuxiytayq 4 days ago

      Ah, but the icicle isn’t really equivalent to water undripping and refreezing back into a nice cube-shaped object at presumed room temperature (since our point of reference is the first gif). That would always be weird to see IRL, no matter what. You could watch that gif a million times and you’d still shit your pants if it happened to the glass of water on your desk.

  • xtiansimon 3 days ago

    I like my entropy story with more steam engines:

    “The most misunderstood concept in physics", by Veritasium (YouTube, 2023) (https://youtu.be/DxL2HoqLbyA?si=5a_4lCnuv85lRb57)

  • markhahn 3 days ago

    Now imagine that this was in grade-school curricula!

    I really think education is mostly about providing higher-level intuitions - making correct thought habitual and thus easy.

    Part of what's so attractive about this particular article is how it would mesh with related fields (chemistry, statistics, politics, evolution, astrophysics, climate science, etc.).

  • LaundroMat 4 days ago

    So if I get this right, there is a vanishingly small (but nonzero) probability that a cracked egg returns to its initial state. Imagine that happening and being caught on video. We'd all believe we were living in a simulation and had witnessed a glitch.

    No one would believe the scientists explaining that, although highly improbable, the uncracked egg does make scientific sense.

    • speakeron 3 days ago

      It's not really about whether it makes scientific sense; it's that it's so very highly improbable (really, really improbable) that other explanations make more sense: the video's a fake, it's mass hysteria, or even that we're living in a simulation.

    • markhahn 3 days ago

      I think your question mainly demonstrates how much trouble most people have reasoning about exponentials. Not intending any personal insult, but unless you do it regularly (i.e., you are probably a physicist), you will use a term like "highly improbable" for quantities that can only be expressed in scientific notation.

      In other words, most humans have bad intuition about large numbers. And I'm not talking about "small" large numbers like "how many Teslas could Elon buy". I mean "how many atoms are in a chicken egg" (and what are their statistical properties at room temperature).
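
      For scale, here is a back-of-envelope sketch (my own numbers: a ~60 g egg, approximated as pure water at 18 g/mol):

        # Rough order-of-magnitude count of molecules in a chicken egg.
        # Assumptions (mine): egg mass ~60 g, composition approximated as water.
        AVOGADRO = 6.022e23          # molecules per mole
        egg_mass_g = 60.0            # assumed typical egg mass
        molar_mass_water_g = 18.0    # g/mol for H2O
        molecules = (egg_mass_g / molar_mass_water_g) * AVOGADRO
        print(f"~{molecules:.0e} molecules")  # ~2e+24

      The number of microstates is exponential in that ~2e24 count, which is the kind of quantity "highly improbable" fails to convey.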

  • NewsaHackO 4 days ago

    Also, one should check out Tenet; it's a pretty authoritative resource about this as well.

    • ruthmarx 4 days ago

      How so? That film is a narrative mess with some good action scenes and not much more.

      If you want to reference a relevant sci-fi, I'd say Asimov's The Last Question is a better fit.

  • xtrapol8 4 days ago

    > entropy is just a fancy word for ‘number of possible arrangements’

    It isn’t though.

    Entropy is a fancy word for potential distribution over negative potential. Negative potential is the “surface area” over which potential may distribute. The “number of possible arrangements” casually fits into this, yet misses some unintuitive possibilities, like the resistive variance or other characteristics not preempted by whoever constructed the intellectual model.

    Idealists insist entropy is a scalar state resolve of delta probability in their model. They are self deceived. Entropy is the existential tendency for potential to distribute toward equilibrium.

    As long as boffins can throw away results that do not correlate, they can insist it is anything they like.

    • ninetyninenine 4 days ago

      >Entropy is a fancy word for potential distribution over negative potential. Negative potential is the “surface area” over which potential may distribute.

      I don't understand this. Please elucidate.

      • IncreasePosts 4 days ago

        Consider that if you Google for `"resistive variance" entropy`, the only hit is this HN comment.

        It doesn't make sense because what they wrote makes no sense. Probably some looney with their own definition of entropy.

        • xtrapol8 4 days ago

          Well, this looney likes to point out that the manifold surface area (which is not always uniform) determines the rate or density of distribution. All this is accounted for in the math selected for whichever instance (“number of possible states” would count them), though a superior definition is one which is the most general while technically correct, without enumerating exceptions or extraneous clauses.

          Resistance controls the rate of flow of any potential. That you cannot parse English without an exact match of phrase is kind of ironic.

          I guess when arguing convention I shouldn’t be too casual with my own language.

          Entropy is the distribution of potential over negative potential. In every case. Period. All the other words we use describe how this happens in a specific context (thermal or information).

          Can you find an established definition that can be more succinctly regarded as this?

          Also I like insisting it is a phenomenon of existential reality, not a conceptual tool of the human mind.

          • ninetyninenine 3 days ago

            Can you explain exactly what potential is and what it means to have negative potential, in very fundamental terms? Like, explain it to me as if I’m 8. Use real-world examples.

            Then explain what a manifold surface area is and how this relates to it?

            • xtrapol8 a day ago

              Sure, thanks for taking an honest, curious interest. I actually posted a new HN entry which was removed (not “dead”, not “flagged”). After this reply, I will post the contents of that and you can take what you will from it.

              Potential - the great mystery. I’m pretty sure if I nail this I will solve “the grand unified field theory” and/or the “unified theory of everything”; don’t cringe, I think I’m on to something. EVERYTHING is potential. Existential existence is potential. Information is potential. Hungry? An orange is potential for the equilibrium of your hunger.

              Whatever light is may be tied in a knot and that is the basis of all matter. Potential is state unresolved. When QM theorists describe the super amplitude which resolves through interference, that super amplitude is universal potential. This gives rise to state. State is NOT the fundamental aspect. There is no smallest particle building up to the universe, there is potential subdividing into smaller particles through resolve. That is potential.

              Negative potential is another way of saying wherever potential is not. Like an ion missing an electron: the negative potential attracts the electron of another. A different way of saying this is “potential distributes to equilibrium”, though “negative potential” is technically more descriptive. Heat moves towards lower heat; though technically, even if heat is at equilibrium, everything is still moving wherever the math shows a sufficient gap for the thermal flux to vibrate (negative potential still exists at thermal equilibrium).

              Check out Roger Penrose’s Road to Reality; he devotes chapters to manifold surface areas. Essentially, three-dimensional space is not honest. Space-time actually has hyper-dimensional curvature (accretion and planetary ecliptics, for instance, show how spatial distribution forms an axis).

              Let’s talk about electricity. Voltage is literally defined as the potential (negative and positive) where an abundance of electron excitation may be closed in a circuit. Entropy is the fundamental force that causes the electrons (the negative is actually the “positive potential”) to rush to the positive side (which is technically the negative potential; don’t blame me, lots have noticed this oddity over the century). If you add resistance, this turns into heat, slowing the entropic property of voltage equilibrium down. This heat could be insulated, which would slow its dissipation (that is, entropy may be slowed through resistance). Heat is not entropy, however heat is the final exhaust of all work, making it the obvious measure (thus the laws of thermodynamics).

              Seeing things this way teaches us something important which unifies our understanding of “entropy” in both physics and information theory.

              For information, toss a handful of marbles (packed and “ordered” by your grip) into the air, and entropy will cause them to spread spatially along the “manifold” of the force arc of your toss.

              I will post an additional comment with the content of my HN post which was censored. It didn’t include all the explanations, though I worded my whole point more carefully. Thanks again for being more curious than anti-karmic; this one gave me a beating!

            • xtrapol8 a day ago

              Entropy is not enumerated states

              1 point by xtrapol8 1 day ago

              Entropy is an existential property describing the distribution of potential. Popularly, leading minds in physics, math, and information theory have long upheld a number of standing definitions which do not consistently coincide with observable or functional Truth (causing enumerations and conditions). These have represented the best of our ideas, and have been accepted among you generationally as immutable and foundational.

              These definitions are imperfect and addressing it as such has triggered a blowback of condescension revealing hypocrisy within the cult of science.

              I contend this monumental revelation is substantial, edging forward the understanding of all humankind.

              Entropy is an existential force, not an abstract idea. Entropy is the distribution of potential toward equilibrium.

              Our Universe IS NOT COMPOSED OF STATES. States are momentary values taken under measurement (as extolled by a recent Nobel Prize). If science can say “the universe is not locally real”, it is time to say entropy is not a number of states; it is potential distributing.

              The conceptual resistance this idea has provoked reveals an intellectually brittle character among the very ideas of discovery and understanding. This inflexibility demonstrates a blindness towards one’s own learned misconceptions, the antithesis of advanced learning (which resolves our simple truths).

              Entropy is not a construct of the mind, and it is not limited to known states of a modeled problem, in existential reality it includes all possibilities even those breaking intellectual models.

            • a day ago
              [deleted]
          • IncreasePosts 4 days ago

            I'm not interested in reading your word salad.

        • kelnos 4 days ago

          Your post is a good primer on bullshit detection: if you read something on the internet that sounds confidently authoritative, but you yourself are not well-versed in the subject, find what seem to be some key phrases and search the web for them. If you find lots of other hits from seemingly reputable sources, then what you've read may be correct. If you only find the thing you've just read, it's bullshit, with high probability.

          • xtrapol8 4 days ago

            Entropy is not the number of possible states in a system. Entropy includes outcomes not predicted by the mental model. Convention teaches this incorrectly. I’m the black sheep, bhaaaa.

      • xtrapol8 4 days ago

        The inverse square law is an example of potential distribution, not number of states.

      • xtrapol8 4 days ago

        Please refer to the reply to this comment!

      • lisper 4 days ago

        > I don't understand this.

        That's because it's nonsense.