Dec 5, 2019, 3:12 PM

Computer science appears to have made it possible for us to imagine ourselves correctly: “why doesn’t it understand?” or “why doesn’t it know?” were common questions among the first generation of computer users. Today, every generation knows it’s just a machine that needs explicit instructions on its own terms.

Programming had the same effect. Databases even more so. And the current ‘pseudo-AI tools’ have reached the point of producing design patterns, “organizations that fulfill purposes”: the grammars of software, whether relational database, hierarchical database, text index, Bayesian (“AI”), object, functional, script, speech, touch, text, punch card, or switch.

So it is far easier for our generations to understand the brain as a computational device and the mind as our introspection upon it.

But it is still difficult for us to understand that we have surprisingly little agency until we develop sufficient introspection — if we can develop it at all.

The principal difference in my thought is that I see mankind as largely consisting of semi-conscious beings riding an unconscious elephant, with a very, very few of us able to look in the mental mirror and recognize that the driver…isn’t us.