31 points | by smusamashah 4 hours ago ago
5 comments
Could it run on a MacBook? Only on a GPU device?
Will this run on a CPU? (as opposed to a GPU)
Why would you want to? It's like using a hammer for screws.
CPU compute is infinitely less expensive and much easier to work with in general.
To maximise the VRAM available for an LLM running on the same machine. That's why I asked myself the same question, anyway.