1 point | by msoul 7 hours ago
1 comment
Running an LLM locally on your own laptop is no longer a problem. And the models are improving every month.
“But which model should I choose?” That’s exactly why I built a detailed benchmark that measures not just quality, but also speed on a MacBook Pro.
The current best all-rounder: Qwen-3.6 35B
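For the speed side, a simple way to benchmark a local model is to time how long generation takes and divide by the number of tokens produced. This is only a sketch of that idea, not the author's actual benchmark: `fake_generate` below is a hypothetical stub standing in for whatever local runner you use (llama.cpp bindings, MLX, Ollama, etc.).

```python
import time

def tokens_per_second(generate, prompt, n_tokens):
    """Time one generation call and return throughput in tokens/sec."""
    start = time.perf_counter()
    produced = generate(prompt, n_tokens)  # expected to return a list of tokens
    elapsed = time.perf_counter() - start
    return len(produced) / elapsed

# Hypothetical stub standing in for a real local model runner.
def fake_generate(prompt, n_tokens):
    time.sleep(0.01)  # simulate inference latency
    return ["tok"] * n_tokens

tps = tokens_per_second(fake_generate, "Hello", 64)
print(f"{tps:.0f} tokens/sec")
```

In a real benchmark you would also want to report prompt-processing (prefill) speed separately from generation speed, since the two differ substantially on Apple Silicon.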