No. Attention isn’t enough:

All these problems need to be solved:
Sensation
Disambiguation (perception)
World Modeling (synthesis)
Auto Association (adversarial prediction)
… Episode formation
… … Gain-Loss Valence
… … … Social, Moral, Ethical Valence
Attention (state)
… Selection (goal)
… … Narration
Release of Action
Recursion (wayfinding) (reinforcement)

Tesla has world modeling down.

World modeling allows cause-and-effect prediction,
and as a consequence prevention of hallucination,
and as a consequence logical testing,
and as a consequence ethical testing.
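The cause-and-effect chain above can be sketched in miniature: a world model is a transition function that predicts the next state from the current state and an action, and a claimed outcome is accepted only if simulation reproduces it, rather than on linguistic plausibility alone. Everything in this sketch (the one-dimensional world, the function names, the dynamics) is an invented illustration, not anything Tesla or any LLM actually implements.

```python
# Toy world model: a 1-D position with "left"/"right" actions.
# All names and dynamics here are illustrative assumptions for this sketch.

def world_model(state: int, action: str) -> int:
    """Predict the next state from the current state and an action."""
    return state + {"left": -1, "right": 1}[action]

def simulate(state: int, actions: list[str]) -> int:
    """Roll the model forward over a sequence of actions."""
    for action in actions:
        state = world_model(state, action)
    return state

def consistent(claimed_outcome: int, state: int, actions: list[str]) -> bool:
    """Logical test: accept a claimed outcome only if the model predicts it.
    This is 'prevention of hallucination' in miniature."""
    return simulate(state, actions) == claimed_outcome

print(consistent(2, 0, ["right", "right"]))  # model agrees: True
print(consistent(5, 0, ["right", "right"]))  # hallucinated claim: False
```

The same check generalizes: logical testing asks whether a conclusion is reachable by simulation, and ethical testing asks whether a predicted consequence is acceptable before releasing the action.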

LLMs are trying to brute-force AI through language, top down, which is the opposite of all previous models (my generation's), which worked from the bottom up.

The brute-force method can work because it is easily testable. But until we have world models, recursion, and wayfinding (many of which are in progress), the rather obvious limits of present AIs as anything other than glorified search engines are going to be challenging.

Reply addressees: @IntuitMachine


Source date (UTC): 2023-07-24 20:26:46 UTC

Original post: https://twitter.com/i/web/status/1683574464305328129

Replying to: https://twitter.com/i/web/status/1683431882325622784
