**Hello.** I’m an analytic philosopher. This is Part 2 of my walkthrough of Chris’ CTMU paper: the Introduction.
Chris’ text is in italics, mine in normal font.
*Bucking the traditional physical reductionism of the hard sciences, complexity theory has given rise to a new trend, informational reductionism, which holds that the basis of reality is not matter and energy, but information.*
Beginning at least with Hayek’s paper in the 1930s in the social sciences, if not well before that in physics, and during the failed operationalist movement (Brouwer in Math, Bridgman in Physics, Mises in Economics, and arguably Hayek in Law), the general consensus was that ‘information necessary to change state’ was a superior model to physical forces when modeling phenomena, as a way of preventing us from making errors and assumptions. So yes, that transformation occurred early in the 20th century. It has simply been more commonly adopted since then, as it has spread through colloquial literature.
We do NOT claim that the universe consists of information, but that information is the best model we have to work with, since it has no mass, and therefore will lead us to fewer errors of reasoning. Every few centuries the model changes. In the past we saw Computers, Machinery, Steam, Blood, Airs and fumes, Water, ‘Essences’ and Plato’s forms. The purpose of each evolutionary step in our use of analogies when describing nature (that which has no intention) has been to reduce the cognitive biases present in our priors.
*The relationship between physical and informational reductionism is a telling one, for it directly mirrors Cartesian mind-matter dualism,*
Correct. The rest of the paragraph suggests we are repeating the retrenchment to the mind-body problem, and I think that’s false. It’s pretty clear at this point that we understand (or at least I and others do) how consciousness is created from fragmentary information combined with fragmentary memory over a few seconds.
*Mathematically, the theoretical framework… the mathematics of probability must be causally interpreted to be scientifically meaningful…*
I don’t think anyone of any significant understanding would say that. No one buys the ludic fallacy. I think we merely state that probability provides suggestions on the one hand and falsification of errors and biases on the other. Either we solve for operations (changes in state) and states, or we are just making excuses. Probability does not tell us causality. It tells us where we might look for causality. If you can’t explain something causally, then you’re just ‘making stuff up’.
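To make that concrete (my toy, not anything in Chris’ paper): here’s a confounded pair of variables. The correlation is nearly perfect, yet intervening on one does nothing to the other. Probability told us where to look; it did not tell us the cause.

```python
import random

random.seed(0)

# A minimal sketch (my illustration): Z causes both X and Y, so X and Y
# correlate strongly. But intervening on X does nothing to Y. Probability
# points at an association; only the causal scenario says what to expect
# from an intervention.
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.1) for zi in z]  # X is caused by Z
y = [zi + random.gauss(0, 0.1) for zi in z]  # Y is also caused by Z

def corr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

print(f"corr(X, Y)     = {corr(x, y):.2f}")  # ~0.99: screams 'relation!'

# Now intervene: set X by fiat, breaking its tie to Z.
x_forced = [random.gauss(0, 1) for _ in range(n)]
print(f"corr(do(X), Y) = {corr(x_forced, y):.2f}")  # ~0.00: no arrow X -> Y
```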
*…because probabilities are therefore expressly relativized to specific causal scenarios,*
Well, all that means is that probability fails us at high causal density. We know this. Particularly in economics, where it appears that a single regression analysis is about as sophisticated as you can get without introducing more error than the ignorance you remove. We have the same problem in physics, because we simply can’t afford to conduct the experiments necessary to test our theories.
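A sketch of that failure mode with made-up numbers of my own: ten nearly collinear predictors and a small sample. The regression ‘works’, but the coefficient estimates swing so wildly from sample to sample that we’ve added more error than the ignorance we removed.

```python
import numpy as np

rng = np.random.default_rng(0)

# A minimal sketch of "high causal density": ten nearly collinear
# predictors, small sample. The fit looks fine, but the coefficient
# estimates vary wildly across repeated samples.
def fit_once():
    n, k = 30, 10
    base = rng.normal(size=(n, 1))
    X = base + 0.05 * rng.normal(size=(n, k))  # predictors share one cause
    beta_true = np.zeros(k)
    beta_true[0] = 1.0
    y = X @ beta_true + 0.1 * rng.normal(size=n)
    return np.linalg.lstsq(X, y, rcond=None)[0]

fits = np.array([fit_once() for _ in range(200)])
print("true coefficients: [1, 0, 0, ...]")
print("std of estimates: ", np.round(fits.std(axis=0), 1))  # huge scatter
```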
If we instead talk about the content of our ideas, fantasies, and free associations, it’s pretty obvious from intellectual history that we have been increasing precision over the millennia, and that we keep increasing the complexity of our frame, with the innovations of Socrates, Copernicus, Smith/Hume, Darwin, Maxwell, and Einstein largely reducible to providing a new frame. I think in retrospect Turing/Chomsky will be thought of in the same terms. It’s too bad Brouwer, Bridgman, Mises, Popper, and Hayek will not be.
*…of an evolutionary state ultimately entails the specification of an absolute (intrinsic global) model with respect to which absolute probabilistic deviations can be determined.*
Well, sure, but don’t we already know that model sufficiently? The problem is that the possible transformations and combinations at even genetic levels of complexity are simply beyond what we call MATHEMATICAL models, and require SIMULATIONS. I mean, that’s what Wolfram has been saying for many years now. We need a new math: a math of operational combinations. And our current math is still taught like it’s a supernatural construction rather than a tediously simple grammar of positional relations.
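For the skeptical reader, here is the kind of thing Wolfram means, in a dozen lines (my sketch, not his code): Rule 30, a one-dimensional cellular automaton. No closed-form mathematics predicts its long-run pattern; you learn what it does only by running the operations.

```python
# Rule 30: each cell's next state is determined by its 3-cell
# neighborhood, read as a 3-bit index into the rule number's bits.
# Simple rule, complex output: a simulation, not an equation.
WIDTH, STEPS = 64, 24
RULE = 30  # output bit for each neighborhood pattern (7..0)

row = [0] * WIDTH
row[WIDTH // 2] = 1  # a single live cell in the middle

for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = [
        (RULE >> (row[(i - 1) % WIDTH] * 4 + row[i] * 2 + row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```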
*…with a language…*
Yes, but what constitutes a language other than a deflationary grammar that iteratively reduces ambiguity, and a set of semantic terms **limited to the dimensions of the constant relations that we seek to test across states?**
*Furthermore, in keeping with the name of that to be modeled, it must meaningfully incorporate the intelligence and design concepts, describing the universe as an intelligently self-designed, self-organizing system.*
Sorry, man. That doesn’t follow. Period. It follows only that, if we wish to describe this reality that can evolve complexity from a fairly small number of forces (rules), we must produce a deflationary grammar and semantics limited to the production of categories (referents, relations, and values) and the information necessary to change the state of those categories. The fact that entropy and the conservation of energy produce complexity with a few forces, or that biology can be constructed of a few elements, or that the complexity of a sentient and cognitive mind can be produced by a four-bit instruction set in large numbers, is simply tedious physics all the way up.
*Even cognition and perception are languages based on what Kant might have called “phenomenal syntax”. With logic and mathematics counted among its most fundamental syntactic ingredients, language defines the very structure of information. This is more than an empirical truth; it is a rational and scientific necessity.*
Um, that is more than an abuse of language. Instead, semantic content and grammar are limited by the sensory inputs and associative capacity of the primitive brain regions and the cortical layers. Or, said more simply, smarter creatures can create more combinations over longer iterations, with lower friction of neurological transmission.
Again, this is pretty well covered ground.
To say that it is an empirical, rational, logical, and necessary conclusion is obvious. However, we need frames (sets of constant relations that allow us to recategorize existing knowledge), and a language to express them in, because we ARE LIMITED by the ‘jumps’ (chains of relations) we can make.
Those of us with higher abilities make much longer jumps (chains of relations) and those of us with lower abilities (and general knowledge) can make fewer. Training can compensate for the challenge. Hence the generational problem of teaching the various means of calculation: reading, writing, arithmetic, mathematics, grammar, logic, rhetoric, physics, chemistry, biology, and economics (cooperation). Even history and fiction allow us to make general ‘calculations’ in the broadest sense.
*Of particular interest to natural scientists is the fact that the laws of nature are a language.*
Or, stated more clearly: all language consists of grammars of decidability that reduce, through deflation and disambiguation, that which is incomparable because of the overloading of relations to that which is comparable and decidable because of limited (testable, or at least identifiable) relations.
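To illustrate what I mean by a grammar of decidability (a toy of my own construction, not Chris’): a term carries its dimension, and comparison is admitted only between terms of the same dimension. Overloaded comparisons are rejected outright rather than silently conflated.

```python
from dataclasses import dataclass

# A toy grammar of decidability (my sketch): terms carry a dimension,
# and the grammar admits comparisons only between terms of the SAME
# dimension, leaving only statements that are testable, i.e., decidable.
@dataclass(frozen=True)
class Term:
    value: float
    dimension: str  # e.g. "length", "price"

def decide(a: Term, b: Term) -> str:
    if a.dimension != b.dimension:
        raise TypeError(f"undecidable: {a.dimension} vs {b.dimension}")
    return "a > b" if a.value > b.value else "a <= b"

print(decide(Term(3.0, "length"), Term(2.0, "length")))  # decidable: "a > b"

try:
    decide(Term(3.0, "length"), Term(2.0, "price"))      # overloaded relation
except TypeError as e:
    print(e)  # undecidable: length vs price
```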
*The existence of these laws is given by the stability of perception.*
Um, I think that’s close. The existence of those laws is determined by the hierarchy of consequences made possible by opportunities for the preservation of energy against entropy. We DISCOVER those laws because it is only POSSIBLE for a neural network to identify constant relations; identifying anything else would be an evolutionary disadvantage, since it’s simply too biologically expensive, and would slow us to the point where reaction times were suicidal.
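A toy version of the claim (mine, and deliberately simplified): a Hebbian-style weight accumulates only for an input whose relation to the signal is constant; a weight tied to noise washes out. Detecting anything other than constant relations doesn’t pay.

```python
import random

random.seed(0)

# A minimal sketch (my illustration): a Hebbian-style update with decay.
# The weight grows only for an input whose relation to the signal is
# CONSTANT; the weight tied to noise hovers near zero.
w_constant, w_noise = 0.0, 0.0
rate, decay = 0.1, 0.01

for _ in range(2000):
    signal = random.choice([-1.0, 1.0])
    x_constant = signal                   # always co-varies with the signal
    x_noise = random.choice([-1.0, 1.0])  # no stable relation to the signal
    w_constant += rate * x_constant * signal - decay * w_constant
    w_noise += rate * x_noise * signal - decay * w_noise

print(f"weight for constant relation: {w_constant:.2f}")  # grows toward rate/decay
print(f"weight for noise:             {w_noise:.2f}")     # hovers near zero
```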
*…. they can be regarded as distributed “instructions” from which self-instantiations of nature cannot deviate; ….*
Correct. I would use OPERATIONS or TRANSFORMATIONS rather than implying intention in the universe by calling them ‘instructions’, but since we’re talking analogies here, I suppose it’s close enough. My problem is with this whole attribution of intent to that which is merely a necessity of the combination of forces without which the universe of space-time could not (as we understand it) exist.
*…On a note of forbearance…*
Ok. Look. Now we are really getting into making excuses.
If you want to make the argument that it’s possible to construct a grammar and semantics limited to the dimensions of existential reality, expressed in terms of the sense perception available to humans at human scale, then sure. That’s nothing other than stating that we can create an algorithmic model of the universe, even if we cannot create a mathematical one. And if you want to argue that such a model will be more analogous to genetic permutation at high causal density, leading to infrequent but high returns in indirect concert (so-called punctuated equilibria), then, I mean, that’s pretty obvious.
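And for ‘punctuated equilibria’, a minimal sketch under assumptions of my own choosing: a ‘royal road’ fitness function under point mutation, where most changes are neutral and the gains arrive in rare jumps after long stasis.

```python
import random

random.seed(1)

# A toy "royal road" landscape (my sketch): fitness counts only fully
# correct 8-bit blocks, so most mutations are neutral or harmful, and
# improvements arrive in rare jumps: long stasis, sudden gains.
BLOCK, N_BLOCKS = 8, 8
GENOME = BLOCK * N_BLOCKS

def fitness(g):
    return sum(all(g[b * BLOCK + i] for i in range(BLOCK)) for b in range(N_BLOCKS))

g = [random.randint(0, 1) for _ in range(GENOME)]
best = fitness(g)
for step in range(200_000):
    child = g[:]
    child[random.randrange(GENOME)] ^= 1  # single point mutation
    f = fitness(child)
    if f >= fitness(g):  # accept neutral and beneficial mutations
        g = child
        if f > best:
            best = f
            print(f"step {step:>7}: fitness jumped to {f}/{N_BLOCKS}")
```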
This whole anthropomorphic projection, though, is still hanging over our heads, and while I can cut through the poetic and ASD speech, and I understand the benefits of poetic and aspie speech in certain circumstances, I am also cognizant of the use of Abrahamic argument (pilpul), whether Jewish, Augustinian, Kantian, or French-postmodern, to overload even the best mind’s ability to test the constancy of relations, and to deceive thereby.
*…“reality theory is wedded to language theory and they beget a synthesis”, has the advantage that it leaves the current picture of reality virtually intact.*
Well, there is the promise again, but so far I can’t find delivery on it.
We’ll continue with the next section….