Human nature invests the minimum to gain the maximum and is quite lazy when it comes to unnecessary precision, but then attempts to use imprecise terms (and ideas) to solve precise problems.
Human expertise in the sciences (deflationary grammars) serves to deflate any given level of abstraction. I have a chart you need to see.
I understand the unification of the sciences and I think plenty of other people do – but it’s just very different from what we’d expected.
—“the inference would be that a corresponding language of specificity would be a part of that.”—
Well, exactly.
–“However, that is not why we have failed at achieving”—
The reason we failed is that there is a market for agency via deception (non-correspondence, inconsistency, and incoherence), just as much as there is a market for agency via truthfulness (correspondence, consistency, and coherence).
Ergo, just as we have eliminated the markets for violence, theft, fraud, free riding, etc., we can eliminate the market for falsehoods: by law. The problem was (and is no longer) the absence of a criterion for warranty of due diligence against falsehood of information entered into the informational commons.
In other words, I’m not ‘selling’. I’m not interested in convincing people that crime is crime, only in producing law that states what crime is, and therefore outlaws it.
People will then respond accordingly – as they always have done – to incremental suppression of parasitism.
And that is the means by which we have produced civilization: the incremental suppression of parasitism through the incremental expansion of the law, by discovering and cataloging the means by which man engages in parasitism.
So I am not really writing philosophy (choice and preference), but law (necessity and truth).
Hence my lack of concern for what 'people think'. In every generation, people have 'thought' that outlawing each form of parasitism was bad, because it forces them into survival in the service of others in the market – and non-survival if they do not serve.