2 comments

  • KetoManx64 14 hours ago

    I think there's not much interest currently because of how inexpensive hosted LLMs are. If it costs me $0.001 per message, what's the point of running a model locally, other than working on something I don't want logged? Once the AI companies start running out of money and raise their prices, there will be a large migration of users and companies wanting to self-host.
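    The cost argument above is easy to sanity-check. A minimal back-of-envelope sketch, using the $0.001-per-message figure from the comment; the usage level is an assumption, not a measurement:

    ```python
    # Rough yearly API cost at the per-message price cited above.
    api_cost_per_message = 0.001   # figure from the comment
    messages_per_day = 200         # assumed heavy personal use

    yearly_api_cost = api_cost_per_message * messages_per_day * 365
    print(f"Yearly API cost at {messages_per_day} msgs/day: ${yearly_api_cost:.2f}")
    # → Yearly API cost at 200 msgs/day: $73.00
    ```

    At these prices, even heavy personal use costs less per year than the electricity for a local GPU box, which is the crux of the comment's point.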

  • GeoSys 9 hours ago

    I think there are a few problems, unfortunately:

    - It wouldn't quite work on mobile devices. Many people wouldn't want to download 4 GB onto their iPhone.
    - Many use cases involve real-time data, e.g. summaries of the latest news and events. That would require the LLM to be updated often and to perform actions like Google searches.
    - Switching devices loses the history, created artefacts, etc.

    While I think there are use cases, IMHO they would mostly appeal to tech experts rather than the wider audience.