34 comments

  • skybrian an hour ago

    I guess gigawatts is how we roughly measure computing capacity at the datacenter scale? Also saw something similar here:

    > Costs and pricing are expressed per “token”, but the published data immediately seems to admit that this is a bad choice of unit because it costs a lot more to output a token than input one. It seems to me that the actual marginal quantity being produced and consumed is “processing power”, which is apparently measured in gigawatt hours these days. In any case, I think more than anything this vindicates my original decision not to get too precise. [...]

    https://backofmind.substack.com/p/new-new-rules-for-the-new-...

    Is it priced that way, though? I assume next-gen TPUs will be more efficient?

    • nomel 15 minutes ago

      > but the published data immediately seems to admit that this is a bad choice of unit because it costs a lot more to output a token than input one

      And that's silly, because API pricing is more expensive for output than input tokens: 5x so for Anthropic [1], and 6x so for OpenAI [2]!

      [1] https://platform.claude.com/docs/en/about-claude/pricing

      [2] https://openai.com/api/pricing
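      The asymmetry shows up directly in a cost calculation. A minimal sketch, where the prices are illustrative placeholders (not current list prices) but the input/output ratio matches the 5x figure above:

      ```python
      # Why "per token" is an awkward unit: output tokens cost several
      # times more than input tokens. Prices are illustrative placeholders.
      INPUT_PRICE_PER_MTOK = 3.00    # $ per million input tokens (hypothetical)
      OUTPUT_PRICE_PER_MTOK = 15.00  # $ per million output tokens (5x input)

      def request_cost(input_tokens: int, output_tokens: int) -> float:
          """Dollar cost of one API request."""
          return (input_tokens / 1e6) * INPUT_PRICE_PER_MTOK \
               + (output_tokens / 1e6) * OUTPUT_PRICE_PER_MTOK

      # Same total token count, very different bills:
      print(request_cost(10_000, 500))   # prompt-heavy request
      print(request_cost(500, 10_000))   # generation-heavy request
      ```

      A prompt-heavy request and a generation-heavy request of the same total size differ in cost by roughly 4x here, which is why "tokens" alone is a lossy unit.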

    • brokencode 32 minutes ago

      Gigawatts seems like more a statement of the power supply and dissipation of the actual facility.

      I’m assuming you can cram more chips in there if you have more efficient chips to make use of spare capacity?

      Trying to measure the actual compute is a moving target since you’d be upgrading things over time, whereas the power aspects are probably more fixed by fire code, building size, and utilities.
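      A back-of-the-envelope sketch of that framing: treat the facility's power envelope as fixed and ask how many accelerators fit under it. All figures below are assumptions for illustration (a ~700 W accelerator and a PUE of ~1.3 are ballpark for current hardware, but vary by deployment):

      ```python
      # How many accelerators fit under a fixed power envelope.
      # All figures are illustrative assumptions, not vendor specs.
      FACILITY_POWER_W = 1e9   # 1 GW facility power budget (assumed)
      PUE = 1.3                # power usage effectiveness: cooling/overhead factor
      CHIP_POWER_W = 700.0     # per-accelerator draw, roughly H100-class (assumed)

      it_power_w = FACILITY_POWER_W / PUE      # power left for the IT load itself
      chips = it_power_w / CHIP_POWER_W
      print(f"{chips:,.0f} accelerators")      # on the order of a million

      # A more efficient chip in the same facility means MORE chips, not less power:
      EFFICIENT_CHIP_W = 500.0
      print(f"{it_power_w / EFFICIENT_CHIP_W:,.0f} accelerators")
      ```

      The power budget stays constant across hardware generations; chip efficiency just changes how much compute you extract from it, which is the sense in which gigawatts are the stable unit.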

  • mikert89 an hour ago

    There's no limit to the algorithms. People don't understand yet. They can learn the whole universe with a big enough compute cluster. We built a generalizable learning machine.

    • totaa an hour ago

      the question is whether we'll experience resource constraints before we get there. what if the step up to post-scarcity is gated by a compute level just out of our reach?

      • mikert89 an hour ago

        human ingenuity will solve this

        • __loam 13 minutes ago

          Or we'll have ecological collapse.

    • teaearlgraycold an hour ago

      Not sure if this is satire.

      Edit: What we have built is a natural language interface to existing, textually recorded information. Transformers cannot learn the whole universe because the universe has not yet been recorded into text.

      • 0x3f 22 minutes ago

        Based on a glance at their other comments: not satire.

      • supliminal 28 minutes ago

        It’s more than likely not.

      • erelong 27 minutes ago

        Poe's (c)law?

        • bryogenic a few seconds ago

          Poe’s (C)law: The more absurd AI-generated content becomes, the more likely people are to believe it is real.

      • alfalfasprout 16 minutes ago

        100% agreed. Sadly, there are lots of people out there with the "trust me bro, just need more compute" attitude. Hopefully we don't consume all the planet's resources trying.

        • xvector 10 minutes ago

          I reevaluated my priors long ago when I saw that scaling laws show no sign of stopping, no sign of plateau.

          Strangely some people on HN seem to desperately cling to the notion that it's all going to come to a halt. This is unscientific. What evidence do you have - any evidence - that the scaling laws are due to come to an end?

          • rishabhaiover 3 minutes ago

            I suspect it's not that people don't see the progress; they fail to fully trust laws that, unlike transistor scaling, aren't truly backed by physics. We see empirically that scaling works and continues to work.

          • 0x3f 2 minutes ago

            All the curves have been levelling off as expected. Not really sure what you're talking about.

  • Eufrat an hour ago

    Can someone explain why everything is being marketed in terms of power consumption?

    • NoahZuniga an hour ago

      It's more meaningful to most people than FLOPS/other measures of actual computing power.

    • teaearlgraycold an hour ago

      Because that’s the limiting factor

      • Eufrat 2 minutes ago

        I feel like that’s a bit glib?

        Surely more critical questions should be posed about why just buying a bunch of GPUs is a good idea? It feels like a cheap way to show that growth is happening, a bit like FOMO. It feels like nobody with the capital is questioning whether this is actually a good idea, whether it's a desirable way to improve AI models, or even whether the money is well spent. 1 GW is a lot of power; my understanding is that it's roughly the instantaneous demand of a city like Seattle. This is absurd.

        It feels like the big banks are already trying to CYA themselves by publishing reports saying AI has not contributed meaningfully to the economy.

      • zozbot234 11 minutes ago

        There's at least a decent argument to be made that the limiting factor is actually the physical silicon itself (at least at cutting-edge nodes) not really the power. This actually gives AI labs an incentive to run those specific chips somewhat cooler, because high device temperatures and high input voltages (which you need to push frequencies higher) might severely impact a modern chip's reliability over time.

      • serf 30 minutes ago

        kinda complicated though when you consider it fully. Power consumption really only measures the environmental impact; we come up with cleverer ways to use the same amount of power daily.

        it's kind of like an electric motor built before a strong understanding of Lorentz forces and Ohm's law: we don't really know how inefficient the thing is, because we don't really know where the ceiling is, aside from some loose theoretical computational-efficiency concepts that don't strongly apply to practical LLMs.

        to be clear, I don't disagree that it's the limiting factor, just that 'limits' is nuanced here between effort/ability and raw power use.

      • Animats an hour ago

        Somehow we must be doing this wrong.

        "Do you realize that the human brain has been liken to an electronic brain? Someone said and I don't know whether he is right or not, but he said, if the human brain were put together on the basis of an IBM electronic brain, it would take 7 buildings the size of the Empire State Building to house it, it would take all the water of the Niagara River to cool it, and all of the power generated by the Niagara River to operate it." (Sermon by Paris Reidhead, circa 1950s.[1])

        We're there on size and power. Is there some more efficient way to do this?

        [1] https://www.sermonindex.net/speakers/paris-reidhead/the-trag...

        • whimsicalism 40 minutes ago

          pretty sure evolution spent more time and energy getting there than we ultimately will

        • brianjlogan an hour ago

          I'd imagine one day there will be a limiting factor of cash to burn as well.

          • Animats an hour ago

            We're getting close. The first big AI bankruptcy can't be far off.

            • bdangubic 28 minutes ago

              Big Gov will bail out the big guys if/when necessary

    • jeffbee an hour ago

      It's easy to think about. Google reported a global average power consumption of 3.7GW in 2024, so you can think of this deal as representing an expansion of something like 10-15% of that 2024 baseline, if you assume 50% capacity utilization.

  • cebert 25 minutes ago

    I’m surprised Anthropic wanted to partner with Broadcom when they have such a negative reputation after antics such as their VMware acquisition.

    • ggm 4 minutes ago

      The VMware software rental market has no relevance to this deal, any more than IBM's role in 1930s data processing has to its business today, or Oracle's failure in the DC market has to licensing of its database product.

      It's just not material. Broadcom make devices they need, and Broadcom want to sell those devices and exclude another VLSI company from selling, so the two have an interest in doing business. That's all there is to it.

      About the most you could say is that the lawyers drafting whatever agreement they sign will think about future changes of pricing and supply, in light of what Broadcom did with VMware licensing costs.

    • Eufrat 19 minutes ago

      I think it’s also important to add the context that Broadcom’s CEO, Hock Tan, went on CNBC in October and had a vacuous conversation with Jim Cramer about their OpenAI “deal” at the time [0]. Nothing of substance was said; it was just endless loops about the opportunity of AI. It is now 6 months later and there has been nary a peep from Broadcom about any updates.

      I think Anthropic is a more grounded company than OpenAI because Sam Altman is insane, but it is still playing the same game.

      [0] https://www.youtube.com/watch?v=pU2HhJ3jCts

    • thundergolfer 6 minutes ago

      Broadcom builds the TPU chip; Google designs it. You can’t avoid partnering with Broadcom if you want TPUs in significant volume.

    • jeffbee 11 minutes ago

      Broadcom makes the TPU. If you want TPUs, you are working with Broadcom whether you want to or not.