Russell
Yes, and so LLMs will evolve into 1) specialized agents (already happening), 2) increasing association between increasingly abstract concepts, 3) competing predictions, 4) wayfinding between predictions, and 5) auditing predictions for the possibility of acting upon them.


Source date (UTC): 2024-07-08 19:07:33 UTC

Original post: https://twitter.com/i/web/status/1810390280912384459

Reply addressees: @RussellJohnston @RolandBasilides

Replying to: https://twitter.com/i/web/status/1810388770660720703


IN REPLY TO:

Unknown author

The reason we require reason, so to speak, is that our auto-associative memory (intuition) proposes possibilities that are too uncertain, or too many competing (undecidable) ones, to choose from. Reason then acts as a wayfinding process (navigation, maze-solving) to arrive at some one of those possibilities. And this process continues iteratively until some possibility is judged possible. 😉

Original post: https://x.com/i/web/status/1810388770660720703
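The propose-then-narrow process described above can be sketched as a small loop: an "intuition" step proposes too many scored candidates, and a "reason" step iteratively eliminates the weakest until one possibility remains to act on. This is a minimal illustrative sketch, not anything from the thread; the candidate names, scores, and function names are all hypothetical.

```python
def intuition(context):
    # Hypothetical stand-in for auto-associative recall: propose several
    # competing possibilities, each with a rough plausibility score.
    return {
        "take the left corridor": 0.4,
        "take the right corridor": 0.7,
        "turn back": 0.3,
        "wait and observe": 0.5,
    }

def reason(candidates):
    # Wayfinding by elimination: iteratively discard the weakest candidate
    # until a single possibility remains (maze-solving over the options).
    candidates = dict(candidates)
    while len(candidates) > 1:
        weakest = min(candidates, key=candidates.get)
        del candidates[weakest]
    return next(iter(candidates))

print(reason(intuition("maze junction")))  # → take the right corridor
```

In practice the elimination criterion would itself be uncertain, so the loop would re-run with updated scores, matching the "iteratively continues" framing in the reply.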
