RTX 5090 and M4 MacBook Air: Can It Game?

(scottjg.com)

98 points | by allenleee an hour ago

24 comments

  • mywittyname 16 minutes ago

    > As much as I hate to admit it, step one in most of my projects now is to ask AI about it. Maybe it’ll tell me something I don’t know.

    Or, more likely, it will tell you something it doesn't know.

    Reminds me of yesterday, when I was arguing with ChatGPT that the 5070 Ti was an actual video card. It kept trying to correct me by saying I must have meant a 4070 Ti, since no such 5070 Ti card exists.

    • collabs 3 minutes ago

      Or, it will acknowledge that it made a mistake and continue to make the same mistake again.

      I asked Claude to generate an HTML page about PowerShell 7. It gave me a page saying 7.4 was the latest LTS release. I corrected it with links showing 7.6 was released in March and asked it to regenerate with the latest information.

      It generated basically the same page with the same claim that 7.4 was the latest release.

    • simonh 4 minutes ago

      Its training data only goes up to late 2024 or early 2025, so that might be why, though it does have access to the internet.

    • perarneng 5 minutes ago

      This is why I use Grok expert mode. It aggressively goes out searching the web for info. It's so much better than relying on year-old data.

      • _blk a minute ago

        Yes, I really like that about Grok. It had a few good qualities, but it was too verbose, so now it's mostly Claude for me.

  • divbzero a minute ago

    [delayed]

  • matthewfcarlson 14 minutes ago

    I have been bothering the VM team for years about VM GPU passthrough. I worked on the Apple Silicon Mac Pro, and it would have made way more sense if you could run a Linux VM and pass through the GPU that goes inside the case!

    Sadly, as you can tell, they have not taken me up on my requests. Awesome that other people got it working!

  • swiftcoder 31 minutes ago

    This is proper mad science, love it

  • coder68 27 minutes ago

    This seems pretty useful for AI inference if it can pass Apple approval. I've wanted to use my Nvidia GPUs with a Mac Mini, this would enable it to run CUDA directly. Very cool!

  • frollogaston 35 minutes ago

    I'm guessing the x86 emu is because Windows games are rarely built for ARM, right? Was kinda curious how an ARM VM would fare. Anyway, awesome article.

    • hparadiz 28 minutes ago

      Yes. Valve has done a ton of work here because it's required to be able to run x86 games on a Steam Deck, which has an ARM CPU.

      • hypercube33 12 minutes ago

        Steam Deck runs a full x86-64 AMD APU. The work Valve has done for that was to get Windows games to run seamlessly on Linux.

        Hopefully in 2026, with the Valve Index VR headset, which is ARM (Qualcomm?), we get what you're talking about here: basically Proton for Win32/64 on Linux ARM64.

        Side note: Windows on ARM isn't bad, it's just priced out of its league, and cooling is awful for gaming on current laptops. The only issue I had was OpenGL needing some obscure GL-on-DirectX thing for Maya3D to get games to work.

      • sva_ 11 minutes ago

        As sibling pointed out, the Steam Deck basically runs a Ryzen 3 7335U, which is x86.

      • bigyabai 26 minutes ago

        The Steam Deck is pure x86, it's not an ARM-based CPU. The Steam Frame might be what you're thinking of.

  • 24 minutes ago
    [deleted]
    • stock_toaster 15 minutes ago

      > running all performance-sensitive workloads in The Clown now

      I really hope "The Clown" isn't just a typo'd "The Cloud".

      If not, tell us more!

      I'm inspired/tempted by this to rename my external "higher grunt offload" machine to "clowntown".

  • nothinkjustai 6 minutes ago

    > As much as I hate to admit it, step one in most of my projects now is to ask AI about it. Maybe it’ll tell me something I don’t know.

    It’s these people, not the ones who refuse to use LLMs, who are, as they say, “cooked”.

  • delbronski 25 minutes ago

    Nicely done! Glad to see real hacking is still alive in the age of AI.

  • moralestapia 32 minutes ago

    Wow, phenomenal project and write-up, thanks for sharing it.

    "no - not in any practical sense today, and "maybe" only in a very deep, borderline-impractical research sense."

    This is why humans will always rule over crappy LLMs.

    • falcor84 23 minutes ago

      Wait, why? This is exactly what I as a human would have said in this situation.

      Or if you're referring to how the OP still decided to go ahead, I've seen AIs go ahead on impractical courses of action many times, and surprisingly succeed on some of them.

      • moralestapia 16 minutes ago

        And I see that you succeeded in not doing it.

        Congrats! Each one got what they wanted :).

    • csours 28 minutes ago

      I believe that LLM (and ML in general) tools really shine when they are developed and used AS tools.

      Unfortunately, I also believe that market forces may push away from this direction, as LLM companies try to capture the value stream.

    • rvz 24 minutes ago

      Exactly. AI psychosis is real.

      Never let an AI tell you that you cannot do something practical for your own self for research, discovery or for fun.

      The only thing that is close to impractical is expecting your non-technical friends or others to follow you without any incentive or benefit.