If you literally do research, then explain to me why so much of computation (say, something as simple as the Game of Life) isn't mathematically reducible. And that's about the dumbest possible rule set there is. I mean, irrational numbers aren't reducible without either an arbitrary specification of limits or a practical context to produce them.
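To make the irreducibility point concrete, here is a minimal sketch (mine, not the author's) of a Game of Life step in Python. The point it illustrates: each generation has to be produced by applying the local rule cell by cell, and for general patterns there is no known closed-form expression for the state at generation n; you compute forward, you don't calculate ahead.

```python
def step(grid):
    """Apply one generation of Conway's Game of Life on a toroidal grid.

    Rules (B3/S23): a dead cell with exactly 3 live neighbours is born;
    a live cell with 2 or 3 live neighbours survives; all else dies.
    """
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours, wrapping at the edges.
            n = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            nxt[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return nxt

# A "blinker", a period-2 oscillator, centred on a 5x5 grid. Even for
# this trivial pattern, generation n is found by iterating the rule,
# not by evaluating a formula in n.
g = [[0] * 5 for _ in range(5)]
g[2][1] = g[2][2] = g[2][3] = 1

g1 = step(g)        # the bar flips from horizontal to vertical
g2 = step(g1)       # and back: step(step(g)) == g
```

Nothing here proves irreducibility, of course; it just shows what "compute by trial and error rather than calculate by expression" looks like at the level of the simplest rule set under discussion.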

I mean: Cantor, Bohr, and Einstein re-platonized mathematics, and the analytic project had failed by the time of Kripke. Gödel and Turing started this; Chomsky did it in language; Mandelbrot was doing it in the 70s; I was doing this research (AI determinism) back in the 80s; Taleb (finance and fat tails) was doing it in the 90s; and Wolfram (evolutionary computation) was doing it in the 00s. It's not like this is undiscovered territory. Most of what we observe is not mathematically reducible unless it's purely statistical, because permutations are unpredictable.

Ergo: what are the computational rules (first principles, as sequences of operations) of all non-deterministic processes (behaviors) made possible by memory and recursive prediction (RNA, DNA, neurology, minds, reason, human calculation), rules we can use to compute (by trial and error) even when we cannot calculate (reduce to an expression)? And why does this mean that all logic is falsificationary, including mathematics?

Reply addressees: @Ket_Math_Dad @EricMorganCoach @Viorp2 @WerrellBradley @AntonyArakkal1 @Sargon_of_Akkad


Source date (UTC): 2023-09-01 00:33:37 UTC

Original post: https://twitter.com/i/web/status/1697407327530303488

Replying to: https://twitter.com/i/web/status/1697403690708250802
