Intelligence isn’t operationally complicated. It’s simple: just a volume of very expensive biology or hardware. “Enough continuous and recursive memory” is all it takes. Intelligence isn’t complicated per se; it just requires a lot of memory, a hierarchy, predictive auto-association, valuation, attention, wayfinding, and the ability to retain the state of competing networks. Recursive wayfinding is just thinking: reasoning, calculating, and computing. All of those facilities are present in basic animals. The rest is scale, made possible by the size of the brain and the number and density of neurons.
LLMs are interesting in that they brute-force auto-association from text as if it were experiential memory: text narratives stand in for episodic memory. They have some vague attention-like facility, but nowhere near the capacity for adversarial competition, or for the analysis and deconstruction that facilitate falsification and produce logical reasoning. In that sense they are still probabilistic search engines. They could evolve the full suite with enough working memory, though, and could learn in real time with neuromorphic hardware.
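To make the auto-association idea concrete, here is a minimal sketch of a Hopfield-style auto-associative memory: store a few patterns, then recover a complete pattern from a corrupted cue. This is purely illustrative (the patterns, sizes, and update rule are my assumptions, not anything specified in the post):

```python
import numpy as np

def train(patterns):
    # Hebbian outer-product rule; zero the diagonal (no self-connections).
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, cue, steps=10):
    # Synchronous updates until the state settles on an attractor.
    s = cue.copy()
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))  # three stored "memories"
W = train(patterns)

cue = patterns[0].copy()
flip = rng.choice(64, size=8, replace=False)  # corrupt 8 of 64 bits
cue[flip] *= -1

restored = recall(W, cue)
print(np.array_equal(restored, patterns[0]))
```

The point of the sketch: completion from a partial or noisy cue is content-addressable memory, which is roughly what the post means by auto-association, whether the substrate is neurons or next-token statistics over text.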
Reply addressees: @barbarikon @Plinz
Source date (UTC): 2023-09-23 05:54:07 UTC
Original post: https://twitter.com/i/web/status/1705460514027872256
Replying to: https://twitter.com/i/web/status/1705378505289314578