35 comments

  • xnx an hour ago

    The Megahertz Wars were an exciting time. Going from 75 MHz to 200 MHz meant that everything (CPU limited) ran nearly 3x as fast (or better with architectural improvements).

    Nothing since has packed nearly the same punch, with the exception of going from spinning disks to SSDs.

    • rr808 2 minutes ago

      I still remember my first CPU with a heatsink. It seemed like a temporary dumb hack.

    • dlcarrier 12 minutes ago

      In my experience, SSDs had a bigger impact. Thanks to Wirth's Law (https://en.wikipedia.org/wiki/Wirth%27s_law), the steady across-the-board increase in processing power didn't equate to programs running much faster, e.g. Discord on a modern computer isn't any more responsive than an ICQ client on a computer 25 years ago; if anything, it's less.

      SSDs provided a huge performance bump to each individual computer, but took a generation or two of hardware to trickle their way to market saturation, so when yours got one you'd effectively be running the same software in a much more responsive environment.

      • vachina 5 minutes ago

        > Discord on a modern computer isn't any more responsive than an ICQ client on a computer 25 years ago; if anything, it's less.

        I feel this. Humanity has peaked.

    • st_goliath an hour ago

      > The Megahertz Wars were an exciting time.

      About a week ago, completely out of the blue, YouTube recommended this old gem to me: https://www.youtube.com/watch?v=z0jQZxH7NgM

      A Pentium 4, overclocked to 5GHz with liquid nitrogen cooling.

      Watching this was such an amazing throwback. I clearly remember the last time I saw it: an excited friend showed it to me on a PC in our school's library, a year or so before YouTube even existed.

      By late 2005, my Pentium 4 Prescott at home ran at some 3.6GHz without overclocking, 4GHz models for the consumer market had already been announced (but were plagued by delays), and surely 10GHz was "just a few more years away".

    • embedding-shape an hour ago

      > Nothing since has packed nearly the same punch, with the exception of going from spinning disks to SSDs.

      "Bananas" core counts gave me the same experience. Some years ago I moved to a Ryzen Threadripper and had similar "Wow, compiling this project is now 4x faster" and "processing these TBs of data is now 8x faster" moments, but of course that's specific to workloads where concurrency and parallelism are designed in from the ground up, not a general 2x speedup in everything.

    • HPsquared an hour ago

      SSDs were such a revolution though, and a really rewarding upgrade. I'd fit SSDs to friends' and family members' computers as an upgrade.

      • micv an hour ago

        Getting my first SSD was absolutely the best computer upgrade I've ever bought. I didn't even realise how annoying load times were because I was so used to them, and coming from C64s and Amigas, even spinning rust seemed fairly quick.

        It took a long time before I felt a need to improve my PC's performance again after that.

        • coffeebeqn 35 minutes ago

          There were quite a few mind-blowing upgrades back in the day. Getting my first sound card to replace the PC beeper was one of my most memorable moments.

          I remember loading up Doom, plugging in my shitty earbuds with their barely-long-enough cable, and hearing the "real" shotgun sound for the first time. Oo-wee

      • sigmoid10 an hour ago

        I once had a decade-old ThinkPad that suddenly became my work laptop once more, thanks to an SSD. It's a true shame they simply don't make them like that anymore.

      • dcminter 41 minutes ago

        Just before I installed an SSD was the last time I owned a computer that felt slow.

    • geon an hour ago

      GPUs for 3d graphics were a game changer.

      I can see why you wouldn’t consider it as impactful if you weren’t into gaming at the time.

  • Sharlin an hour ago

    The i486DX 33MHz was introduced in May 1990. A 30x increase, or about five doublings, in clock speeds over ten years. That's of course not the whole truth; the Athlon could do much more in one cycle than the 486. In any case, in 2010 we clearly did not have 30GHz processors – by then, the era of exponentially rising clock speeds was very decidedly over. I bought an original quadcore i7 in 2009 and used it for the next fifteen years. In that time, roughly one doubling in the number of cores and one doubling in clock speeds occurred.

    • bee_rider 33 minutes ago

      It's true that we haven't seen single-core clock speeds increase anywhere near as fast for a long while now. And I think everyone agrees that some nebulously defined "rate of computing progress" has slowed down.

      But we can be slightly less pessimistic if we're more specific. Already by the early '90s, a lot of the speed increase came from strategies like pipelining, superscalar execution, and branch prediction: instruction-level parallelism, in other words. Then in the 2000s we started using additional parallelism strategies like multicore and SMT.

      It isn’t a meaningless distinction. There’s a real difference between parallelism that the compiler and hardware can usually figure out, and parallelism that the programmer usually has to expose.

      But there’s some artificiality to it. We’re talking about the ability of parallel hardware to provide the illusion of sequential execution. And we know that if we want full “single threaded” performance, we have to think about the instruction level parallelism. It’s just implicit rather than explicit like thread-level parallelism. And the explicit parallelism is right there in any modern compiler.

      If the syntax of C were slightly different, to the point where it could automatically add OpenMP pragmas to all its for loops, we'd have 30GHz processors by now, haha.
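
      Something like the sketch below, I suppose. The pragma is real OpenMP, but today a human has to write it, which is exactly the point; the file name and the loop itself are just illustrative stand-ins:

        #include <stdio.h>

        #define N 1000000

        /* Hand-written stand-in for what the imagined auto-parallelizing C
           dialect would emit: an OpenMP work-sharing pragma on an ordinary
           for loop. Build with: cc -fopenmp loop.c */
        int main(void) {
            static float x[N], y[N];
            for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

            #pragma omp parallel for   /* the pragma the fantasy compiler adds */
            for (int i = 0; i < N; i++)
                y[i] = 2.0f * x[i] + y[i];

            printf("y[0] = %f\n", y[0]);   /* prints 4.000000 */
            return 0;
        }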

    • adrian_b an hour ago

      "The era of exponentially rising clock speeds" was already over in 2003, when the 130-nm Pentium 4 reached 3.2GHz.

      All the later CMOS fabrication processes, starting with the 90-nm process (in 2004), have provided only very small improvements in clock frequency, so that now, 23 years after 2003, desktop CPUs still have not doubled that clock frequency.

      In the history of computers, the decade with the highest rate of clock frequency increase was 1993 to 2003, during which the clock frequency rose from 66 MHz in the first Pentium up to 3.2 GHz in the last Northwood Pentium 4, an increase of almost 50 times.

      For comparison, in the previous decade, 1983 to 1993, the clock frequency in mass-produced CPUs had increased only around 5 times, i.e. at a rate about 10 times slower than in the next decade.

    • layer8 an hour ago

      On the plus side, the 486DX-33 didn’t require active cooling. The second half of the 1990s was when home computing started to become noisy, and the art of trying to build silent PCs began.

  • hedora 43 minutes ago

    The Athlon XP was the bigger milestone, as I remember it.

    They were both "seventh generation" according to their marketing, but you could get an entire GHz+ Athlon XP machine for much less than half the $990 tray price from the article.

    I distinctly remember the day work bought a 5- or 6-node cluster for $2000. (A local computer shop gave us a bulk discount and assembled it for them, so sadly, I didn't poke around inside the boxes much.)

    We had a Solaris workstation that retailed for $10K in the same office. Its per-core speed was comparable to one Athlon machine, so the cluster ran circles around it for our workload.

    Intel was completely missing in action at that point, despite being the market leader. They were about to release the Pentium 4, and didn't put anything decent out from then until the Core 2 Duo. (The Pentium 4 had high clock rates but low instructions per cycle, so it didn't really matter. Then AMD beat Intel to market with 64-bit support.)

    I suspect history is in the process of repeating itself. My $550 AMD box happily runs Qwen 3.5 (32B parameters). An Nvidia board that can run it costs more than 4x as much.

  • dd_xplore 2 hours ago

    I remember back in 2006 I used to browse overclocking forums to overclock my Pentium 4. I had tons of fun consuming all those instructions; I learned the BIOS, changed PLL clocks, memory clocks, etc.

    • rckclmbr an hour ago

      I bought a car radiator, dremeled out my case, and visited Home Depot for all the tubes and connectors. It's too easy nowadays to add watercooling.

  • mtucker502 2 hours ago

    What progress is being made in overcoming the current thermal limits blocking us from high clock rates (10GHz+)?

    • vessenes an hour ago

      Like any doubling rule, it has to stop somewhere. Higher energy usage plus smaller geometry means much more exotic analog physics to worry about in chips. I'm not a silicon engineer by any means, but I'd expect 10GHz clocks to be optical, or very exotically cooled, or not coming at us at all.

      • adrian_b an hour ago

        Reaching 10 GHz for a CPU will never be done in silicon.

        It could be done if either silicon is replaced with another semiconductor, or semiconductors are replaced with something else for making logic gates, e.g. organic molecules, making it possible to design a logic gate atom by atom.

        For the first variant, i.e. replacing silicon with another semiconductor, research is fairly advanced, but it would increase fabrication costs, so it will only be done once every method for further improving silicon integrated circuits has become ineffective or too expensive, which is unlikely to happen earlier than a decade from now.

      • FpUser 27 minutes ago

        Making RAM reads/writes faster would be of way more benefit.

    • brennanpeterson an hour ago

      None for normal compute, since energy density is still fundamental. But the interesting option is cryogenic computing, which can have essentially zero switching energy and clock rates in the tens of GHz.

      Some neat startups to watch for in this space.

    • HarHarVeryFunny an hour ago

      What would be the benefit? You don't need a 10GHz processor to browse the web or edit a spreadsheet, and in any case things like that are already multi-threaded.

      The current direction of adding more cores makes more sense, since more parallelism is really what CPU-intensive programs generally need.

      • nurettin 14 minutes ago

        Single-core speed is absolutely still needed, and often preferred to more cores. That's why we have AVX, AMX, etc.

    • magic_man 2 hours ago

      The dynamic power consumed is C·V²·f, and higher frequencies generally demand higher voltages too. It makes no sense to keep increasing frequency when it makes power so much worse.
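
      To see why, a toy calculation below; the capacitance and the assumption that voltage has to rise roughly linearly with frequency are invented ballpark figures, not real silicon data:

        #include <stdio.h>

        /* Toy model of dynamic switching power, P = C * V^2 * f.
           All constants are illustrative guesses, not measured values. */
        int main(void) {
            const double C = 1e-9;   /* effective switched capacitance, farads */
            for (double f = 1e9; f <= 5e9; f += 1e9) {
                double V = 1.0 + 0.15 * (f / 1e9 - 1.0);   /* assumed V(f) */
                printf("%.0f GHz @ %.2f V -> %5.1f W\n", f / 1e9, V, C * V * V * f);
            }
            return 0;
        }

      In this model, 5x the frequency costs about 13x the power, which is the whole argument in one loop.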

      • dlcarrier 2 minutes ago

        At lower frequencies, leakage current plays a larger role than gate capacitance, so for any given process node there's a sweet spot. For medium-to-low loads, it takes less power to rapidly alternate between cutting power to a core and running it faster than needed than to run it continuously at a lower frequency.

        Newer process nodes decrease the per-gate capacitance, increasing the optimal operating frequency.
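
        A toy version of that trade-off (every constant below is invented for illustration): finish a fixed amount of work fast and then power-gate the core, versus running slower the whole time while leakage keeps draining:

          #include <stdio.h>

          /* Race-to-idle toy model; all numbers are invented.
             Energy = (P_leak + C * V^2 * f) * time, with time = work / f,
             and power gating removes P_leak once the work is done. */
          int main(void) {
              const double work   = 1e9;   /* cycles of work to finish  */
              const double p_leak = 2.0;   /* leakage power while on, W */
              const double c_eff  = 1e-9;  /* switched capacitance, F   */

              /* Slow and steady: 1 GHz at 1.0 V, powered for the full second. */
              double e_slow = (p_leak + c_eff * 1.0 * 1.0 * 1e9) * (work / 1e9);

              /* Race to idle: 3 GHz at 1.2 V, done in a third of a second. */
              double e_race = (p_leak + c_eff * 1.2 * 1.2 * 3e9) * (work / 3e9);

              printf("slow: %.2f J  race-to-idle: %.2f J\n", e_slow, e_race);
              return 0;
          }

        With leakage that heavy, racing wins (about 2.1 J vs 3.0 J here); on a node with negligible leakage the comparison flips, which is why the sweet spot moves with the process.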

      • vlovich123 an hour ago

        So, heat. There are efforts to switch to optics, which don't have the heat problem so much, but they have the problem that it's really hard to build an optical transistor. Plus, anywhere you're interfacing with the electrical world, you're back to the heat problem.

        Maybe reversible computing will help unlock several more orders of magnitude of growth.

  • davidee 23 minutes ago

    I have very fond memories of my first dual-cpu Athlon machine.

    It was the workstation on which I learned Logic Audio before, you know, Apple bought Emagic. I took that machine, running very low-latency Reason, to live gigs with my band.

    Carting around a full-tower computer (not to mention the large CRT monitor we needed) next to a bunch of tube Fender & Ampeg amps was wild at the time. Finding a good drummer was hard; we turned that challenge into a lot of fun programming rhythm sections we could jam to and control in real time, live.

  • 1970-01-01 an hour ago

    Argh. The headline. The opener. Awful. Where are editors in 2026? There's no way an LLM would write this.

    The GHz barrier wasn't special. What was much more important was that AMD was giving Intel a hard time and there was finally real competition.

    • adrian_b an hour ago

      In terms of marketing, the "GHz" barrier was special, because surpassing it created a lot of recognition among the general public that the AMD Athlon CPUs were better than the Intel Pentium III CPUs.

      In reality, of course, what you say is true; the fact that the Athlon could provide a few hundred extra MHz of clock frequency was not decisive.

      The Athlon had many microarchitectural improvements over the Pentium III, which ensured much better performance even at equal clock frequency. For instance, the Athlon was the first x86 CPU able to do both a floating-point multiplication and a floating-point addition in a single clock cycle; the Pentium III, like all previous Intel Pentium CPUs, required 2 clock cycles for this pair of operations.

      This much better floating-point performance contrasted with the previous generation, where the AMD K6 had integer performance competitive with Intel's, but floating-point performance well below that of the various Pentium models (which hurt it in some games).

    • HarHarVeryFunny an hour ago

      There was a time when increased clock speeds, or more generally increased processor throughput, were important. I can remember when computers were slow even for things like browsing the web (and not just because internet connection speeds were slow), and paying more for a new, faster computer made sense. I think this period may well have lasted roughly until the "GHz era" or thereabouts, after which even the cheapest, slowest computers were all that anybody really needed, except for gamers, where the solution was a faster graphics card (which eventually led to GPU computing and the current AI revolution!)

      • 1970-01-01 44 minutes ago

        You're conflating a few things here. The Vista era was the biggest requirement hit; that was the time when people really needed a faster PC to continue browsing. Before that, you could get away with XP running on a sub-GHz processor.