Theme: Measurement

  • Everything Is Reducible to Calculation at Degrees of Precision and Degrees of Agency.

    READ JUST ONE PAPER BY POPPER, HAYEK, MISES, AND TURING BEFORE CHOMSKY – OR YOU WILL MISS THE POINT. Understand that Chomsky applied Turing to grammar (language). That’s his insight. Just as Popper really only needed to produce one paper (the sources of knowledge and of ignorance), Hayek one paper (the use of knowledge in society), Mises one (economic calculation), and Turing one (the Turing machine and recursive computation), Chomsky produced one paper (grammar). Most of this occurred in the pre-postmodernist period, prior to 1960. We have been fighting pseudoscience since then, for the simple reason that universities could sell shit courses to proles, and in doing so finance postmodern propaganda through tuition debt and the burning of intergenerational savings (retirement savings) as tuition fees. This is the most suicidal economic set of policies in western history, other than perhaps the selling of indulgences. Once we further understand that universities do not teach but simply filter in and out, we realize how catastrophic this entire state-propaganda-driven experiment has been. Instead we should put the vast majority of people to work in part-time apprenticeship (entry-level work) by age 12-14, and limit university training to those forms of calculation (STEM) that cannot be learned until the brain is more fully matured. EVERYTHING IS REDUCIBLE TO CALCULATION AT DEGREES OF PRECISION AND DEGREES OF AGENCY.


  • “Corporal Punishment, Limited Self Expression, Sacredness and the elimination of

    —“Corporal Punishment, Limited Self Expression, Sacredness and the elimination of self expression”—

    You get what you measure, but is what you measure that which is good?

    Stress is good.


    Source date (UTC): 2018-05-31 15:22:02 UTC

    Original post: https://twitter.com/i/web/status/1002208589547753474


  • photos_and_videos/TimelinePhotos_43196237263/33902929_10156390998717264_80183126

    photos_and_videos/TimelinePhotos_43196237263/33902929_10156390998717264_8018312627161661440_o_10156390998712264.jpg

    ( Working on Constant Relations, Logic and Grammars at my new ‘ideal writing spot’. Restaurant with huge covered porch, electricity, wireless, and friendly service. )

    Andrei Vamenscu: This man has the will to change society, do you? (May 30, 2018 3:27pm)
    Andrei Vamenscu: I enjoyed the personality interactions with you on The Daily Shoah and The Public Space. Please continue doing more of that in the future. (May 30, 2018 3:30pm)
    Truxton Olmstead: Was the recent Shoah episode behind their paywall? (May 30, 2018 4:02pm)
    Andrei Vamenscu: Yeah it was a Friday episode. (May 30, 2018 4:04pm)
    Vengefül Bobmoran: Curt definitely shines when there’s smart people around to bounce ideas and ask for clarifications. (May 30, 2018 4:23pm)
    Gary Knight: Looks like you’re working on a fried chicken recipe 😬 (May 31, 2018 8:12am)
    Mary Romano: Yes. Divert some of your beer money towards something useful. (Jun 01, 2018 7:57am)


    Source date (UTC): 2018-05-30 14:44:00 UTC

  • NOTES: Constant vs contingent vs inconsistent vs non-relations. Recursive Contin

    NOTES:

    1. Constant vs contingent vs inconsistent vs non-relations.
    2. Recursive Continuous Disambiguation vs Scale of Set of Constant Relations (density)
    3. Cumulation of association vs falsification of associations
    4. Computational efficiency.
    5. State Persistence vs breadth search vs depth search
    6. We cannot know the intelligence of distant ancestors.
    7. Planning a series of steps in sequence must emerge – which requires recursion.
    8. Consciousness must emerge, meaning, the ability to compare states.
    9. Cooperation must emerge, meaning, the ability to empathize with intent.
    10. At some point we must develop sufficient computational ability to manipulate our bodies in some way that allows for unambiguous communication, or a means of continuous disambiguation, that is fast enough for one another to make use of in real time, and easy enough for one another to retain.
    11. And at some point, given sufficient computational ability, memory, and state persistence independent of recursion, language must emerge.
    12. At some point the value of such communication must be such that its cost is offset by its rewards.
    13. And we should see a cliff in history where there is a dramatic change when we did develop those abilities. And we do see it – rather recently.

    But language requires a system of measurement. The system of measurement is limited by our senses. And as such, meaning refers to a set of measurements, eventually reducible to analogies to human experience.

    So while semantic content (measurements) must vary from species to species, grammar (continuous recursive disambiguation) should be universal, in the sense that it varies predictably with computational abilities.

    We can understand a child, a person with 60 IQ, 70 IQ, and so on, up to 200+ IQ. But as far as I can tell the set of measurements (the basis of semantics) remains the same, and all that changes is the scope of the state persisted, the depth of recursion, the density and distance of relations, and the ability to model (forecast). In other words, simple people are in fact simply ‘more simple’ in the density of content of their semantics, their use of grammar, and the models (stories) that they can construct with them.

    So universal grammar, as a set of computational minimums and efficiencies, should always exist; and human universal grammar, as universal grammar limited to human measurements (semantics), does exist. And any organism with sufficient computational (neural) capacity should develop some means of communication using some variation of universal grammar, and some sense-perception- and action-dependent semantics. May 29, 2018 4:28pm
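    As an aside, the contrast in item 5 between state persistence, breadth search, and depth search can be made concrete. The sketch below is illustrative only (the graph of ‘relations’ and all names are invented, not from the post): breadth search must persist an entire frontier of states at once, while depth search persists only the current path, via recursion.

```python
from collections import deque

# A small, hypothetical graph of relations between concepts.
GRAPH = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": [],
    "E": [],
}

def breadth_first(start):
    """Breadth search: persists a whole frontier of states at once."""
    seen, order, frontier = {start}, [], deque([start])
    while frontier:
        node = frontier.popleft()
        order.append(node)
        for nxt in GRAPH[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return order

def depth_first(start):
    """Depth search: persists only the current path, via recursion."""
    seen, order = set(), []
    def visit(node):
        if node in seen:
            return
        seen.add(node)
        order.append(node)
        for nxt in GRAPH[node]:
            visit(nxt)
    visit(start)
    return order

print(breadth_first("A"))  # ['A', 'B', 'C', 'D', 'E']
print(depth_first("A"))    # ['A', 'B', 'D', 'C', 'E']
```

    The trade-off is the one the note names: breadth search costs memory (persisted state) for short paths, depth search costs deep recursion for minimal state.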


    Source date (UTC): 2018-05-29 16:28:00 UTC

  • Will Aliens Use the Same Grammar?

    (and thus be comprehensible?) Um. I don’t think they’ll be different, for reasons I hope to publish this year. Although there is a substantial difference… Chomsky can take 40 minutes to communicate an idea, and if you look at his sentence structure and vocabulary it’s extraordinary. I cannot match Chomsky’s context retention during his discourses. This is how I know he’s smarter than I am. His ability to ‘maintain state’ while communicating complex relations and stories is exceptional. Despite working at it terribly hard, I find ‘simplification’ extremely difficult, and I find I use a variation on Latin grammar and more 19th-century sentence structure, and overwhelm the audience very easily with content. If you listen to young adults, they often have trouble forming complete sentences, paragraphs, and narratives with any degree of precision (they require shared context). Some people (me when I was younger), and many people in the tech field for example, speak very, very fast with very high word counts. Some people cannot manage that at all. Some people use large vocabularies to concentrate more content in fewer words while preserving or increasing precision. Some groups use terms (English, German) and some tones (Chinese), where terms are more precise because they are less demanding of deduction. Some groups use (awful) high-context grammar, and some low-context grammar. It appears that once you develop the ability to communicate in language, all that matters is the increasing content and precision of that communication method. So we evolved from simple vocal sounds, serialized. Others might have evolved from parallel tones. Maybe others from some other form of display. Language must at least originate with analogy to experience, so it’s possible that creatures with different senses or processing (octopods) might use analogies that took us time to decode.

    So if you look across just that set of dimensions, you can imagine that some very smart species would speak very quickly, in very precise, very dense grammar, with a very large vocabulary, with long sentences (transactions) and long narratives, in serial (informationally limited) or more parallel (informationally dense) means. And so their context-retention ability and processing ability would be higher than ours. That said, for the reasons that Chomsky defends his universal grammar (and for the same reasons that, while the number base and the vocabulary would change, all mathematical systems would be the same), the grammar itself would not differ. Once you grasp that the term ‘grammar’ means ‘continuous disambiguation’, and that actions in the real world cause languages to eventually converge on the descriptive through nothing other than competition, the rest follows. This continuous disambiguation is important because it corresponds to falsification (eliminative), just as continuous construction corresponds to justificationism (cumulative). And as such, it turns out that since falsehood has a higher truth content than truth claims, the via negativa of continuous disambiguation is the counterintuitive but descriptive and necessary means of communicating truth content. (Apologies if this is too dense an argument.)
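    The eliminative reading of ‘continuous disambiguation’ can be illustrated with a toy sketch (all candidate senses and cues below are invented for illustration, not the author’s method): every interpretation starts alive, and each successive cue eliminates the interpretations inconsistent with it, rather than constructing a single interpretation from scratch.

```python
# Toy eliminative disambiguation: cues falsify (eliminate) candidate
# senses; what survives is the meaning. All data here is invented.
CANDIDATES = {
    "bank (river)": {"water", "shore", "fish"},
    "bank (money)": {"money", "teller", "loan"},
    "bank (tilt)": {"plane", "turn"},
}

def disambiguate(cues):
    """Start with every interpretation live; eliminate each candidate
    that is inconsistent with a new cue (eliminative / via negativa),
    instead of building one interpretation up (cumulative)."""
    live = set(CANDIDATES)
    for cue in cues:
        live = {c for c in live if cue in CANDIDATES[c]}
    return live

print(disambiguate(["water"]))  # {'bank (river)'}
print(disambiguate(["turn"]))   # {'bank (tilt)'}
```

    With no cues, every sense remains live; each cue can only narrow the set, which is the sense in which the process is eliminative rather than constructive.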
