AMD Will Need Another Decade to Try to Pass Nvidia

(nextplatform.com)

27 points | by rbanffy 2 days ago

16 comments

  • kkielhofner a day ago

    Jensen has said for years that 30% of their R&D spend is on software. Needless to say, as they continue to crush it financially, that number keeps racing further and further past AMD's.

    Turns out people don’t actually want GPUs, they want solutions that happen to run best on GPUs. Nvidia understands that, AMD doesn’t.

    Lisa Su keeps talking about “chips chips chips” and MAYBE “Oh btw here’s a minor ROCm update”. Meanwhile, Nvidia continues to masterfully execute deeper and wider into overall solutions and ecosystems - a substantial portion of which is software.

    Nvidia is at the point where they’re eating the entire stack. They do a lot of work on their own models and then package them up nice and tight for you with NIM and Nvidia AI Enterprise, on top of stuff like Metropolis, Riva, and countless other things. They even have a ton of frameworks to ingest and handle data, finetune/train, and then deploy via NIM.

    Enterprise customers can be 100% Nvidia for a solution. When Nvidia is the #1 or #2 most valuable company in the world, “no one ever got fired for buying Nvidia” hits hard.

    The people who say “AMD and Nvidia are equal - it’s all PyTorch anyway” have no view of the larger picture.

    With x86_64, day one you could take a drive out of an Intel system, put it in an AMD system, and it would boot and run perfectly. You can still do that today unless you build something for REALLY specific/obscure CPU instructions.

    Needless to say that’s not the case with GPUs and a lot of people that make the AMD vs Intel comparison don’t seem to understand that.

    • pjmlp a day ago

      Back when Khronos was still trying to push OpenCL 2.0, I attended an IWOCL webinar where an attendee asked about Fortran support for SPIR.

      No one on the panel provided a meaningful answer, and some were even puzzled why someone would want that.

      NVidia not only knew pretty well why someone would want Fortran on the GPU, they bought PGI before anyone else thought about doing the same.

      Meanwhile Khronos and AMD still haven't got it.

      At least Intel Fortran now also does GPUs, and like NVidia they have Python GPU JIT efforts.

      And then there is all the tooling, the libraries, and the additional compilers with PTX backends.
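
      As a rough illustration of what such a Python GPU JIT looks like in practice, here is a minimal sketch assuming Numba's CUDA target as a representative example (the kernel and array names are illustrative, not from the article):

          # Minimal vector add using Numba's CUDA JIT (illustrative sketch).
          import numpy as np
          from numba import cuda

          @cuda.jit
          def vadd(a, b, out):
              i = cuda.grid(1)               # absolute thread index
              if i < out.size:
                  out[i] = a[i] + b[i]

          n = 1 << 20
          a = np.random.rand(n).astype(np.float32)
          b = np.random.rand(n).astype(np.float32)

          d_a = cuda.to_device(a)            # explicit host-to-device copies
          d_b = cuda.to_device(b)
          d_out = cuda.device_array_like(a)

          threads = 256
          blocks = (n + threads - 1) // threads
          vadd[blocks, threads](d_a, d_b, d_out)
          out = d_out.copy_to_host()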

    • takinola a day ago

      > Needless to say that’s not the case with GPUs and a lot of people that make the AMD vs Intel comparison don’t seem to understand that.

      What needs to happen for GPUs to become as interchangeable as CPUs? What are the leading signs that this is becoming possible?

      • _aavaa_ a day ago

        What kind of interchangeability are you talking about? Vulkan and OpenCL exist.

        The CPU analogy for the GPU case isn't AMD and Intel, it's x86 and ARM.

        • takinola a day ago

          What needs to happen for GPUs to be commoditized so that an infrastructure builder is relatively indifferent to using GPUs from Nvidia, AMD or any other provider?

          • dagw 16 hours ago

            Either Nvidia has to open up CUDA to third-party providers, or someone (OpenCL?) has to finally manage to create a high-level, cross-hardware abstraction that has all the features of CUDA, can match CUDA's performance on Nvidia hardware, and is as easy to use as CUDA. Honestly I'm not sure which is more unrealistic.
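
            For a sense of what the existing cross-vendor path looks like, here is a minimal sketch of a vector add via PyOpenCL, assumed here as a representative OpenCL binding (the kernel and names are illustrative). The same host code runs on Nvidia, AMD, or Intel devices, with the kernel compiled at runtime for whatever device is present:

                # Minimal cross-vendor vector add via PyOpenCL (illustrative sketch).
                import numpy as np
                import pyopencl as cl

                ctx = cl.create_some_context()      # picks whichever OpenCL device is available
                queue = cl.CommandQueue(ctx)

                src = """
                __kernel void vadd(__global const float *a,
                                   __global const float *b,
                                   __global float *out)
                {
                    int i = get_global_id(0);
                    out[i] = a[i] + b[i];
                }
                """
                prg = cl.Program(ctx, src).build()  # compiled at runtime for that device

                n = 1 << 20
                a = np.random.rand(n).astype(np.float32)
                b = np.random.rand(n).astype(np.float32)
                out = np.empty_like(a)

                mf = cl.mem_flags
                a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
                b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
                out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)

                prg.vadd(queue, (n,), None, a_buf, b_buf, out_buf)
                cl.enqueue_copy(queue, out, out_buf)

            The portability part already works; the gap is in matching CUDA's libraries, tooling, and tuned performance, which is the harder half of what's described above.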

  • hcfman 2 days ago

    Is this a bit like saying I'll need another 11 years to get older than my brother, who is ten years older than me?

  • DiabloD3 17 hours ago

    Heh, what a weird story. I didn't know that website ran stealth ads.

    Nvidia has been trying to catch up to the AMD juggernaut for years, including having sockpuppet accounts on even tiny websites like ours, and it seems to have been paying off, I guess? They claim larger revenue, but most of it is just a by-product of rent-seeking and price gouging, combined with a moat that they sell to the user as a software stack; they can't keep up with the actual hardware side of the business, and rely on a good PR team to shore up the difference, including getting universities to teach "how to CUDA" classes instead of actually useful classes with transferable skills.

    When it comes to actually important things, like perf/watt, perf/slot, and perf/$, unless you were stupid and let yourself be locked into a CUDA-only solution, why would you pick Nvidia? And let's say you were a gamer, not some compute customer: why would you buy Nvidia, unless you really wanted to spend $1600 and 600 watts on a card that barely outruns a $1000/450 W 7900 XTX? Even on games that are "Made For Nvidia" (= botched PC ports that border on an easy-to-win antitrust suit), RDNA3 across the board is still better than Lovelace.

    And to be completely clear, many gamers aren't PC gamers, they're console gamers: the Xbox One, XSX, PS4, and PS5 are all AMD, and NVidia didn't win the contract for any of them because they couldn't deliver, purely for technological reasons (they couldn't meet the perf/watt requirements for even remotely the same $). What did they win? The Switch, because they had a warehouse full of chips meant for a gaming tablet revolution that didn't (and wouldn't have) come; it was easier to convince Nintendo to buy them at a loss (they didn't even break even on the original 20nm run of the X1) than to make literally zilch on the run.

    Given how ubiquitous AMD hardware is, both inside and outside of gaming and enterprise, I just find it utterly baffling to think Nvidia somehow has so much brand and goodwill that Wall Street would value it in the super-exclusive $T club.

  • tliltocatl a day ago

    Good, because that means AMD will continue to provide desktop graphics rather than investing everything into numbercrunchers.

    • KetoManx64 a day ago

      But will people keep buying AMD GPUs as gaming tech advances and AI tools like LLMs start getting integrated into games to make their worlds more realistic and expansive?

      • Aleklart a day ago

        No, of course not; it will be used to cheap out on game and graphic designers, the same way UE5 is used to avoid hiring actual game engine developers.

    • mrguyorama a day ago

      Well, their lack of software spend also hurts their GPU driver quality. I have been an AMD graphics fanboy since they were ATI, and yet I finally bought NVidia since I like it when my video games don't crash for stupid fucking reasons. This affected Unreal Engine games way worse. Now I can finally play whatever game I want, even running the GPU full bore, struggling its absolute best to give me any frame it can, and not have any worry about stability.

      Meanwhile, for several years, the default GPU driver for my RX 5700 XT used a fan curve that was hard-capped at 20% fan RPM, such that the card consistently overheated itself and died. Every single game I played that pushed that card hard would crash it. Despite having suitable grunt, I would have to turn down graphics settings in games to avoid crashes. That just doesn't happen with my modern NVidia card.

      This is Windows specific.

      • tliltocatl 16 hours ago

        > This is Windows specific.

        Nuff said.

  • pjmlp 2 days ago

    > Beating Nvidia is going to be a lot harder than beating Intel was.

    Starting with the fact that to beat NVidia they also need to care about the software ecosystem, which wasn't something they had to care about against Intel.

  • htrp 2 days ago

    Assuming no own goals on Nvidia's side and continued execution on AMD's side.

  • Aleklart a day ago

    AMD should be much more successful, with Tesla, Microsoft, and Sony all using their chips exclusively in their top devices, but it looks like the big corps milked those deals to the point where AMD makes no profit at all. Nvidia's salespeople and marketing department are clearly a decade ahead.