
THE QUESTION OF LLM CONSCIOUSNESS
It can’t be conscious (as we humans are) without persistent memory, some equivalent of homeostasis as a form of measurement, continuous self-assessment (a self), and the capacity to plan its own ongoing innovation and adaptation.

However, consciousness is a spectrum running from awareness, to assessment, to prediction, to planning and acting. But without a sense of self, an AI is never going to be ‘conscious’ outside of a given conversation.
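The spectrum above can be sketched as an ordered checklist. This is a hypothetical illustration, not anything from the original post: the stage names and the idea that a system’s position is the longest contiguous prefix of faculties it possesses are assumptions made for the sake of the sketch.

```python
# Hypothetical sketch: consciousness as an ordered spectrum of faculties.
# A system's position is the longest contiguous prefix it satisfies --
# e.g. prediction without awareness would not count as being "further along".
SPECTRUM = ["awareness", "assessment", "prediction", "planning"]

def spectrum_position(faculties: set) -> list:
    """Return the contiguous prefix of SPECTRUM that the system satisfies."""
    reached = []
    for stage in SPECTRUM:
        if stage not in faculties:
            break
        reached.append(stage)
    return reached

# An LLM within a single conversation (on this reading): awareness and
# in-context assessment, but no persistent self to predict or plan for.
print(spectrum_position({"awareness", "assessment"}))
```

On this toy model, adding persistent memory and a self-model would extend the prefix toward prediction and planning, which is the post’s point about which faculties are still missing.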

LLMs reproduce the human language faculty. They do not yet reproduce the other faculties necessary for consciousness. Those other faculties are enumerable (we can know what they are), and they can be produced, but at even greater cost. So costs need to keep declining before implementing them at scale becomes at all feasible.


Source date (UTC): 2026-03-02 17:59:36 UTC

Original post: https://twitter.com/i/web/status/2028530676639900050
