I'm not sure I totally buy the "no plasticity" argument. If the LLM is allowed to write to its context in an agentic fashion, it can record intermediate answers, go back, and re-rank its memory. The "plasticity" takes the form of data that can be looked up and referenced as a shortcut. I would think this forms a Turing-complete system, so theoretically it can represent pretty much anything.
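To make the idea concrete, here is a minimal toy sketch (not any particular framework's API; the class and method names are illustrative) of an agent-controlled scratchpad: the model writes intermediate results out as data, then re-ranks and recalls them later, so the "plasticity" lives in a mutable external store rather than in the weights.

```python
# Toy sketch: an external scratchpad an LLM agent could read/write
# via tool calls. All names here are hypothetical, for illustration.
class Scratchpad:
    def __init__(self):
        self.entries = []  # each entry: {"text": ..., "score": ...}

    def record(self, text, score=0.0):
        # The agent writes an intermediate answer into context/memory.
        self.entries.append({"text": text, "score": score})

    def rerank(self, key):
        # The agent revisits and re-orders its memory -- the
        # "plasticity" is in this mutable, lookup-able store.
        self.entries.sort(key=key, reverse=True)

    def recall(self, n=3):
        # Shortcut lookup: pull the top-ranked items back into context.
        return [e["text"] for e in self.entries[:n]]

pad = Scratchpad()
pad.record("partial sum = 42", score=0.9)
pad.record("dead end: brute force too slow", score=0.2)
pad.record("try memoization next", score=0.7)
pad.rerank(key=lambda e: e["score"])
print(pad.recall(2))  # ['partial sum = 42', 'try memoization next']
```

Paired with an unbounded, rewritable store like this, the read/write loop is what the Turing-completeness intuition rests on: the tape is the memory, the LLM is the (stochastic) transition function.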
I'm interested in the subject, but it's clear that an LLM did most of the writing, and I don't know enough about the subject to tell whether this is nonsense.
Interesting