I guess it's impossible to simulate billions of dynamical neurons on current computer architectures; the bottleneck is memory bandwidth.
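To make the bandwidth claim concrete, here is a back-of-envelope sketch. All the numbers (neuron count, fan-in, update rate, GPU bandwidth) are illustrative assumptions, not measurements:

```python
# Rough memory-traffic estimate for simulating dynamical neurons.
# Every number below is an assumption chosen for illustration.
neurons = 1e9              # hypothetical brain-scale network
synapses_per_neuron = 1e3  # assumed average fan-in
bytes_per_weight = 4       # float32
timestep_hz = 1e3          # 1 ms integration step

# Each synaptic weight must be read at least once per timestep.
bytes_per_step = neurons * synapses_per_neuron * bytes_per_weight
bandwidth_needed = bytes_per_step * timestep_hz  # bytes/second

print(f"{bandwidth_needed / 1e12:.0f} TB/s required")

hbm_bandwidth = 3e12  # ~3 TB/s, ballpark for a current HBM-class GPU
print(f"~{bandwidth_needed / hbm_bandwidth:.0f}x one GPU's memory bandwidth")
```

Under these assumptions you need thousands of TB/s of weight traffic, a few orders of magnitude beyond a single accelerator, which is the sense in which memory bandwidth is the limit.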
I agree, but maybe there's no need to simulate billions of neurons right away. Artificial neural networks were pretty small at the time.
and they did not do much
Not true. AI has been around far longer than modern LLMs and has performed well in non-generative areas, often with orders of magnitude fewer parameters.
Dynamical neurons are not used because they add enormous computational cost and instability for limited practical gain.
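For a sense of where the extra cost comes from, here is a minimal sketch of a leaky integrate-and-fire (LIF) layer, a common model of a dynamical neuron. The parameter values are arbitrary placeholders. Unlike a static activation `y = f(W @ x)`, every neuron carries persistent membrane state that must be integrated at every timestep:

```python
import numpy as np

def lif_step(v, spikes_in, weights, dt=1e-3, tau=20e-3, v_thresh=1.0):
    """One Euler step for a layer of leaky integrate-and-fire neurons.

    v         : (N,) membrane potentials (persistent state)
    spikes_in : (M,) 0/1 input spikes for this timestep
    weights   : (N, M) synaptic weights (here: instantaneous voltage kicks)
    """
    v = v * (1 - dt / tau) + weights @ spikes_in  # leak + synaptic input
    spikes_out = (v >= v_thresh).astype(float)    # threshold crossing
    v = np.where(spikes_out > 0, 0.0, v)          # reset after a spike
    return v, spikes_out

# A static layer does one matmul per input; an LIF layer does one matmul
# per *timestep*, so 1 s of simulation at 1 ms resolution costs ~1000x more.
rng = np.random.default_rng(0)
v = np.zeros(4)
W = rng.normal(0, 0.5, size=(4, 8))
for _ in range(100):
    v, out = lif_step(v, rng.integers(0, 2, 8).astype(float), W)
```

That per-timestep loop, repeated for every neuron's state, is the "enormous computational cost" in a nutshell.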
What do you mean by "they add instability"?
Neural nets are already unstable during training, and dynamic weights amplify the problem: the network can end up in a totally unusable state at inference time.
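A toy demonstration of the amplification, under assumed parameters (this is a hypothetical minimal setup, not a claim about any specific model): a recurrent net that is stable with fixed weights diverges once the weights are allowed to grow via an unnormalized Hebbian-style rule, because nothing keeps the spectral radius below 1.

```python
import numpy as np

n = 8
rng = np.random.default_rng(0)

def run(dynamic, steps=60, eta=0.05):
    W = 0.9 * np.eye(n)   # fixed weights: spectral radius 0.9 < 1, stable
    x = rng.normal(size=n)
    for _ in range(steps):
        x = W @ x
        if dynamic:
            u = x / np.linalg.norm(x)
            W = W + eta * np.outer(u, u)  # Hebbian-style rank-1 growth,
                                          # no normalization or decay
    return np.linalg.norm(x)

stable = run(dynamic=False)    # state decays toward zero
unstable = run(dynamic=True)   # weight growth pushes spectral radius
                               # past 1 and the state blows up
```

With static weights the state shrinks every step; with the dynamic rule each update nudges an eigenvalue upward until the same loop diverges, which is the kind of unusable inference-time state being described.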