So far, like students taking tests in grade school, all of them (I subscribe to most) are only capable of answering what is already known. Even when I engineer a sequence of prompts to push toward ideation, they are as dumb as a rock at such inferences.

Why? AI hallucination is the analogue of human ideation. Ideation is the human capacity for recursion, often a great deal of it running in massive parallel, until we find some very subtle relation worth testing by acting on it. Getting AIs to perform this feat is possible; it's just, well, computationally even more expensive.

Reply addressees: @Plinz


Source date (UTC): 2024-05-19 23:07:56 UTC

Original post: https://twitter.com/i/web/status/1792331385685839872

Replying to: https://twitter.com/i/web/status/1792186894513574169
