October 17th, 2018 6:14 PM
Jean Piaget: the Superior Psychogenetic Cognition of Europeans, Part I
Part 1 of 2
Everyone has heard about Jean Piaget’s (1896-1980) theory of the cognitive development of children. But no one knows that his theory placed Europeans at the top of the cognitive ladder with most humans stuck at the bottom — unless Europeans taught them how to think.
Piaget is widely recognized as the “greatest child psychologist of the twentieth century.” Unlike many other influential figures, Piaget’s discoveries have withstood the test of time. His argument that human cognition develops stage by stage, from the sensorimotor stage, through the preoperational and concrete operational stages, to formal operations, is generally endorsed in psychology and sociology texts as a “remarkably fruitful” model. This is not to deny that aspects of his theory have been revised and supplemented by new insights. One important criticism is that his fixed sequence of clear-cut stages does not always capture the overlapping and uneven process of cognitive development. But even his strongest critics admit that his observations accurately show that substantial differences do exist between the cognitive processes (linguistic development, mental representations of concrete objects, logical reasoning) of children and adults.
Suppression of Piaget’s Cross-Cultural Findings
What the general public does not know, and what the mainstream academic world is suppressing, is that many years of cross-cultural empirical research by Piaget and his followers demonstrated that the stages of mental development of children and adolescents reflect the stages of cognitive evolution “humankind” has gone through from primitive, ancient, and medieval to modern societies. The cognitive processes of humanity have not always been the same, but have improved over time. The civilizations of the world can be ranked according to the level of cognitive development of their populations. The peoples of the world differ not only in the content of their values, religious beliefs, and ways of classifying things; they differ in the cognitive processes they employ: their capacity to understand, for example, the relation between objects and concepts, their awareness of objective time, their ability to draw inferences from data and to project these inferences into the hypothetical realm of the future. Most humans throughout history have been “childlike” in their cognitive capacities, unable, for example, to recognize contradictions between belief and experience, or to conceive of multiple causes for a single event. Europe began to produce adolescents capable of reaching the stage of formal operational reasoning before any other continent, whereas to this day some nations barely manage to produce adults capable of formal operations.
This aspect of the cross-cultural comparative research conducted by Piaget and his associates has been suppressed. Critics interpreted the lack of formal reasoning among adolescents in many non-Western societies as evidence that his model lacked universal application, rather than as further confirmation that his theory of child development, first worked out through extensive research on children in the West, could be fruitfully applied outside the West. Because many critics erroneously assumed that Piaget’s theory was about how all children naturally mature into higher levels of cognition, they took this lack of cognitive development in pre-modern cultures as a demonstration that different cultural contexts produce different modes of cognitive development. Piaget’s stages, however, should not be seen as stages that every child passes through simply by getting older; they are not biologically predetermined maturational stages. There is, admittedly, a teleological tendency in Piaget’s account, with each of the four stages unfolding naturally as the child ages in a modern environment. But this criticism ignored the implications of his cross-cultural studies, which were carried out in his later years, and which made it evident that the ability to reach the stage of formal operations depends on the type of science education children receive rather than on a predetermined maturation process.
It can be argued, actually, that Piagetian cross-cultural studies made his theory all the more powerful in offering a precise and orderly account of the cognitive psychological development of humankind in world history, from hunting and gathering societies through agrarian societies to modern societies. This was not just a theory about children but a grand theory covering the cognitive experience of all peoples throughout history, from primitive peoples with a preoperational mind, to agrarian peoples with a concrete operational mind, to modern peoples with a formal operational mind. One of the rare followers of this cross-cultural research, the German sociologist Georg W. Oesterdiekhoff, observes that “thousands of empirical studies across all continents and social milieus, from the 1930s to the present” (2015, 85) have been conducted demonstrating that, depending on the level of cultural and scientific education, the nations of the world in the course of history can be identified as preoperational (the stage of children from their second to their sixth or seventh year of life), concrete operational (the stage from roughly age seven to twelve), and formal operational (the stage of cognition from age twelve onward).
Adults living in a scientific culture are more rational (and intelligent) than adults living in pre-modern cultures. For example, according to studies conducted in the 1960s and 1970s, even educated adults living in Papua New Guinea did not reach the formal stage. Australian Aborigines who were still living a traditional lifestyle barely developed beyond the preoperational stage in their adult years. Without a population that has mentally developed to the level of formal operations, which entails a capacity to think about abstract relationships and symbols without concrete forms, to grasp syllogistic reasoning, to comprehend algebra, and to formulate hypotheses, there can be no modernization.
However, despite all the studies confirming Piaget’s powerful theory, from about 1975-1980 a “wave of ideological attacks” was launched across the Western academic world against any notion that the peoples of the Earth could be ranked in terms of their cognitive development. According to Oesterdiekhoff, “nearly all child psychologists of the first two generations of developmental psychology knew about the similarities between children and pre-modern man,” but “due to anti-colonialism, student revolt, and damaged self-esteem of the West in consequence of the World Wars this theory as the mainstream spirit of Western sciences and public opinion declined gradually” (2014a, 281). As another author observed in 1989, “any suggestion that the cognitive processes of the older child might possess any similarities to the cognitive processes of some primitive human cultures is regarded as being beneath contempt” (Don LePan, 1989).
I came across Oesterdiekhoff’s research after a long search through Piagetian theory. I was wondering what his stage theory might have to say about the cognitive development of peoples in history. But I could find only sources on Piaget as a cognitive psychologist of children as such, not as a grand theorist of the cognitive development of humanity across world history — until I came to Oesterdiekhoff’s many publications, which draw on pre-1975 Piagetian research as well as current research. This research, as Oesterdiekhoff notes, “no longer belong[s] to the center of attention and research interests. Most social scientists have never heard about these researchers and have only a very scanty knowledge of them” (2014a, 280).
Oesterdiekhoff is blunt and ambitious in his arguments. His claim is that Piagetian theory is “capable of explaining, better than previous approaches, the history of humankind from prehistory through ancient to modern societies, the history of economy, society, culture, religion, philosophy, sciences, morals, and everyday life” (2014a). He believes that the rise of formal operational thinking among Europeans was the decisive factor in the rise of modern science, enlightenment, industrialism, democracy, and humanism in the West. The reason why India, China, Japan, and the Middle East did not start the Industrial Revolution “lies in their inability to evolve the stage of formal operations” (2014a).
Primitive and pre-modern peoples cannot be described as having a rational disposition similar to that of modern peoples because they are at the preoperational and concrete operational stages of cognition. Primitive adults share basic aspects of the preoperational thinking of children no more mature than eight years old. Adults in pre-modern civilizations share the concrete operational thinking of 6-12 year olds.
Children and premodern adults share the same mechanisms and basic understandings of physical dimensions such as length, volume, time, space, weight, area, and geometric qualities. Both groups share the animistic understanding of nature and regard stones, mountains, woods, stars, rivers, winds, clouds, and storms as living beings, their movements and appearances as expressions of their will, intentions, and commitment. Premodern humans often manifest the animistic tendencies of modern children before their sixth year. Fetishism and natural religion of premodern humans reside in children’s mentality before concrete operational stage . . . The biggest parts of ancient religions are based on children’s psychology and animism before the sixth year of life (2016, 301).
It is not that adults in primitive and pre-modern cultures are similar to children in modern cultures in their emotional development, experience, and ability to survive in a hostile environment. It is that the reasoning abilities of adults in pre-modern cultures are undeveloped. As Lucien Lévy-Bruhl (1857-1939) had already observed in Primitive Mentality (1923), a work recently reissued (2018) by Forgotten Books, the primitive mind is devoid of abstract concepts, analytical reasoning, and logical consistency. The objective-visible world is not distinguished from the subjective-invisible world. Dreams, divination, incantations, sacrifices, and omens, not inferential reasoning and objective causal relations, are the phantasmagorical doors through which primitives gain access to the intentions and plans of the unseen spirits that they believe control all natural events.
The visible world and the unseen world are but one, and the events occurring in the visible world depend at all times upon forces which are not seen . . . A man succumbs to some organic disease, or to snake-bite; he is crushed to death by the fall of a tree, or devoured by a tiger or crocodile: to the primitive mind, his death is due neither to disease nor to snake-venom; it is not the tree or the wild beast or reptile that has killed him. If he has perished, it is undoubtedly because a wizard had “doomed” and “delivered him over”. Both tree and animal are but instruments, and in default of the one, the other would have carried out the sentence. They were, as one might say, interchangeable, at the will of the unseen power employing them (2018, 438).
I have reservations about the extent to which the rise of operational thinking on its own can explain the uniqueness of Western history (as I will explain in Part II), but I agree that without children or adolescents reaching the stage of operational thinking, there can be no modernization. The geographical, economic, or cultural factors that led to the rise of science and the Industrial Revolution are not the matters we should be focusing on. The rise of a “new man” with psychogenetic capacities — psychological processes, personality, and behavior — suited for formal operational reasoning needs direct attention if we want to understand the rise of modern culture.
Cultural Relativism
But first, it seems odd that Oesterdiekhoff holds two seemingly diametrically opposed outlooks, “cultural relativism and universality of rationality,” responsible for the discrediting of Piagetian cross-cultural theory. He does not explain what he means by “universality of rationality.” We get a sense that by “cultural relativism” he means the rejection of unreserved confidence in the superiority of Western scientific rationality. Social scientists after the Second World War did become increasingly ambivalent about setting up Western formal thinking as a benchmark to judge the cognitive processes and values of other cultures, even though the non-Western world was happily embracing the benefits of Western science and technology.
The degree to which this relativism has come to afflict Western thinking can be witnessed right inside the otherwise hyper-scientific field of cognitive psychology today. Take the well-known textbook Cognitive Psychology (2016), by IBM Professor of Psychology at Yale University Robert Sternberg; it approaches every subject in a totally scientific and neutral manner — except the moment it touches the subject of intelligence cross-culturally, when it immediately embraces a relativist outlook, informing students that intelligence is “inextricably linked to culture” and that it is impossible to determine whether members of “the Kpelle tribe in Africa” have less intelligent concepts than a PhD cognitive psychologist in the West. Intelligence is “something that a culture creates to define the nature of adaptive performance in that culture and to account for why some people perform better than others on the tasks that the culture happens to value.” It is “so difficult,” it says, to “come up with a test that everyone would consider culture-fair — equally appropriate and fair for members of all cultures” (503-04).
If members of different cultures have different ideas of what it means to be intelligent, then the very behaviors that may be considered intelligent in one culture may be considered unintelligent in another (504).
This textbook pays detailed attention to the scientific achievements of Piaget, but portrays him as someone who investigated the “internal maturation processes” of children as such, without considering his cross-cultural findings, which clearly suggest that children in less developed and less scientific environments do not mature to the formal stage. Pretending that such findings do not exist, the book goes on to criticize Piaget for ignoring “evidence of environmental [cultural] influences on children’s performance.”
I am not suggesting that cultural relativism has taken over the Western sciences in the way it has the humanities, sociology, history, and philosophy. But there is no denying that this relativism is being effectively used by scientists against any overt presumption that Western knowledge is “superior” to the knowledge of African tribes and Indigenous peoples. No cognitive psychologist is allowed to talk about the possible similarities between the minds of children and the minds of adult men in pre-modern cultures.
Cultural Universals
Oesterdiekhoff does not define “universality of rationality,” but we can gather from the literature he uses that he is referring to anthropologists who argue that all humans are rationally inclined: primitive and pre-modern peoples are not “illogical” or “irrational.” The “actual structures of thought, cognitive processes, are the same in all cultures.” What differs are the “superstructural” values, religious beliefs, and ways of classifying things in nature. Primitive peoples, Islamic and Confucian peoples, were quite rational in the way they went about surviving in the natural world, making tools, building cultures, and enforcing customs that were “adaptive” to their social settings and environments. They did not develop science because they had different priorities and beliefs, and were less obsessed with mastering nature and increasing production.
The anthropologist Claude Lévi-Strauss and the sociologist Émile Durkheim were among the first to argue that the primitive mind is “logical in the same sense and same fashion as ours” and that the only difference lies in the classification systems and thought content. George Murdock and Donald Brown, in more recent times, came up with the term “cultural universals” (or “human universals”) to refer to patterns, institutions, customs, and beliefs that occur universally in all cultures. These universals demonstrated, according to these anthropologists, that cultures differ a lot less than one might think by just examining levels of technological development. Murdock and Brown pointed to strong similarities in the gender roles of all cultures, the common presence of the incest taboo, and similarities in religious and healing rituals, mythologies, marriage rules, use of language, art, dance, and music.
This idea about the universality of rationality and “cultural universals” was subsequently elaborated in a more Darwinian direction by evolutionary psychologists. Evolutionary psychology is generally associated with “Right wing” thinking, in contrast to cultural relativism, which is associated with “Left wing” thinking. Evolutionary psychologists like E. O. Wilson and Steven Pinker hold that these cultural universals are naturally selected, biologically inherited behaviors. They believe that rationality is a naturally inherited disposition among all humans, though they don’t say that the levels of knowledge across cultures are the same. Humans are rational in the way they go about surviving and co-existing with other humans. These universals were selected because they enhanced the adaptability of peoples to their environments and improved the group’s chances of survival. Some additional cultural universals observed in all human cultures are bodily adornment, calendars, cooperative labor, cosmology, courtship, divination, division of labor, dream interpretation, food taboos, funeral rites, gift-giving, greetings, hospitality, inheritance rules, kin groups, magic, penal sanctions, puberty customs, residence rules, soul concepts, and status differentiation.
Evolutionary psychologists are convinced that the existence of cultural universals amounts to a refutation of the currently “fashionable” notion that all human behaviors, including gender differences, are culturally determined. But if the West is so similar to other cultures, why did modern science, along with liberal democratic values, develop in this civilization alone? Evolutionary psychologists search for general explanations — the notion of cultural universals meets this criterion, Western uniqueness does not; therefore, they either ignore this question or reduce Western uniqueness to a concatenation of historical factors, varying selective pressures, and geographical good luck. They point to how modern science has been assimilated by multiple cultures, from which point they argue that science is not culturally exceptional to the West but a universal method that produces universal truths “for humanity.”
Can one argue that universalism is a cultural attribute uniquely Western and therefore relative to this culture?
Piagetian Universalism and IQ Convergence
Piagetian theory is also universalist in maintaining that all cultures are now reaching the stage of formal operational thinking; the West merely initiated formal reasoning. More than this, according to Oesterdiekhoff, this cognitive convergence is happening across all the realms of social life, because changes in the cognitive structures of humans bring simultaneous changes in the way we think about politics and institutional arrangements. The more rational we become, the more we postulate enlightened conceptualizations of government in opposition to authoritarian forms. Drawing on the extension of Piagetian theory to explain the moral development of humans (initiated by Piaget and elaborated by Lawrence Kohlberg), Oesterdiekhoff writes that once humans reach stage four, they start to grasp “that rule legitimacy should follow only from a correct rule installation, that is, from the choices of the players involved” (2015, 88).
Thus, they regard only rules correctly chosen as obliging rules. Only democratic choices install legitimate rules. Youth on the formal stage surmount therefore the holy understanding of rules by the democratic understanding. They replace an authoritarian understanding of rules, laws, and customs by a democratic one. Thus, they invent democracy in consequence of their cognitive maturation (2015, 88-9).
The emergence of the adolescent stage of formal operations gave birth not only to the new sciences after 1650 but also to philosophers such as Locke, Montesquieu, and Rousseau, who formulated the basic principles of constitutional government, representative institutions, and religious tolerance. Extensive cross-cultural research has shown that “children do not understand tolerance for deviating ideas, liberty rights for individuals, rights of individuals against government and authority, and democratic legitimacy of governments and authorities” (2015, 93). They are much like the adults of premodern societies, or current backward Islamic peoples, who take “laws and customs as unchangeable, eternal, and divine, made by god and not modifiable by human wishes or choices” (2015, 90).
This argument may seem similar to Francis Fukuyama’s thesis that modernizing humans across the world are agreeing that liberal-democratic values best satisfy the human longing for a state that recognizes the right of individuals to pursue their own happiness within a constitutional order based on equal rules. The difference, a crucial one, is that for Fukuyama the rise of democracy came from the articulation and propagation of new ideas, whereas for Oesterdiekhoff psychogenetic maturation is a precondition of democratic rule. Adults who were raised in a pre-modern culture and have a concrete operational mind can “never surmount” this stage, no matter how many books they read about the merits of liberal democracy. These adults will lack the appropriate ontogenetic development required for a democratic mind.
The absence of stimuli and forces of modern culture during early childhood in premodern cultures prevents later psychological development from going beyond certain stages . . . Unused developmental opportunities in youth stop the development of the nervous system, thus preventing psychological advantages in later life. This explains why education and enlightenment, persuasion and media programs could not draw adult premodern people out of their adherence to magic, animism, ordeal praxis, ancestor worship, totemism, shamanism, and belief in witches. Such people, moving in adulthood to modern milieus, cannot surmount their anthropological structures and their deepest emotions and convictions (2016, 306-7).
Moreover, according to Oesterdiekhoff, with the attainment of higher Piagetian stages come higher IQ levels. Psychogenetic differences, not biological genetic differences, are the decisive factor. “All pre-modern peoples stood on intelligence levels of 50 to 70 [IQ points] or on preoperational or concrete operational levels, no matter from what race, culture or continent they have come” (2014b, 380).
Not only the Western nations, but all modernizing nations have raised their scores. The rises in stage progression and IQ scores express the greatest intelligence transformations ever in the history of humankind and stem solely from changes in culture and education. When Africans, Japanese, Chinese and Brazilians have raised their intelligence so dramatically, where is the evidence for huge genetic influences? Huge genetic influences might be assumed if Europeans had always had higher intelligence and if African, Indians, Arabs and Vietnamese had been unable to raise their intelligence to levels superior to that of Europeans 100 years ago. But Latin Americans and Arabs today do have higher IQ scores than Europeans had 100 years ago . . . Where is the leeway for genetic influences to affect national intelligence differences? (Ibid).
IQ experts would counter that only psychometric data about levels of heritable general intelligence can explain the rise of formal operational thinking. But even if we agree that a gap in IQ scores between American blacks and American whites has remained despite the Flynn effect and similar levels of education and income, it is very hard to attribute the remarkable increases in IQ identified over the last century to heredity. Oesterdiekhoff’s argument that “all modernizing nations have raised their IQ scores,” and that operational thinking has been central to this modernization, is a strong one.
Formal Reasoning is not a Cultural Universal
The skills of the formal operational stage cannot be said to be biologically primary abilities that humans inherit genetically; they are biologically secondary abilities requiring a particular psycho-cultural context. Formal thinking came to be assimilated by other nations (most successfully in East Asian nations with a high average IQ, but far less so in sub-Saharan nations where to this day witchcraft prevails). The abilities associated with the first two stages (e.g., control over motor actions, walking, mental representation of external stimuli, verbal communication, ability to manipulate concepts) have been acquired universally by all humans since prehistorical times. These are biologically primary qualities that children across cultures accomplish at the ages and in the sequence more or less predicted by Piaget. They can be said to be universal abilities built into human nature, ready to unfold with only a little educational socialization, and explainable in the context of Darwinian evolutionary psychology. These cognitive abilities can thus be identified as “cultural universals.”
The concrete-operational abilities of stage three (e.g., the “ability to conserve” or to know that the same quantity of a liquid remains when the liquid is poured into a differently shaped container) are either lacking in primitive cultures or emerge at later ages in children than they do in modern cultures. These cognitive abilities may also be described as biologically primary, as skills that unfold naturally as the child matures in interaction with adult members of the society. In modern societies, all individuals with a primary education acquire concrete operational abilities. The aptitudes of this stage can be reasonably identified as universally present in all agrarian cultures.
This is not the case at all with formal operational skills. The skills associated with this stage (inductive logic, hypothesis testing, reasoning about proportions, combinations, probabilities, and correlations) do not come to humans naturally through socialization. There is abundant evidence that even normally intelligent college students with long years of education have great difficulty distinguishing between the form and the content of a syllogism (a worked example appears below), as well as mastering other formal operational skills. Oesterdiekhoff acknowledges that
Only when human beings are exposed to forces and stimuli typical of modern socialization and culture do they progress further and develop the adolescent stage of formal operations (2016, 307).
But, again, as critics of Piaget have observed, even in modern societies where children inhabit a rationalized environment and adolescents are taught algebra and a variety of formal operational skills, many students with a reasonable IQ find it difficult to think in this way. According to P. Dasen (1994), only one-third of adults ever reach the formal operational stage. Evolutionary psychologists have thus disputed the idea that this stage is bound to unfold among most humans as they get older, so long as they receive a reasonably modern education. There are many “sub-stages” within this stage, and the upper sub-stages require a lot of schooling and students with a keen interest and aptitude for this type of reasoning. This lack of universality in learning formal operational skills has persuaded evolutionary psychologists to make a distinction between the biologically primary abilities of the first three stages and the biologically secondary abilities of stage four. Formal reasoning is principally a “cultural invention” requiring “tedious repetition and external motivation” for students to master.
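To make the form/content distinction concrete, consider the classic “Barbara” schema of the syllogism (a standard textbook illustration, not an example drawn from the works cited here):

\[
\text{All } M \text{ are } P; \quad \text{all } S \text{ are } M; \quad \therefore \text{ all } S \text{ are } P.
\]

The schema is valid whatever is substituted for S, M, and P. Filled with absurd content (“all elephants are insects; all insects are pianos; therefore all elephants are pianos”), the inference remains formally valid even though every statement in it is false. Judging validity while bracketing content is precisely the formal operational skill that many students find so difficult.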
If the ability to engage in formal thinking is so particular, a biologically secondary skill in our modern times, would it not require a very particular explanation to account for the origins of this cognitive stage in an ancient world devoid of a modern education? If the rise of “new humans” with a capacity for formal thinking was responsible for the rise of the modern world, and the existence of a modern education is an indispensable requirement in the attainment of this stage among a limited number of students, how did “new humans” grow out of a pre-modern world with a lower average IQ?
In the second part of this article, it will be argued that Europeans reached stage four long before any other people on the planet because Europeans began an unparalleled intellectual tradition of first-person investigations into their conscious states. This is a type of self-reflection in which European man began to ask who he is, how he knows that he is making truthful statements, what the best life is, and whether he is self-deceived in his beliefs and intentions. This is a form of self-knowledge that was announced in the Delphic motto “know thyself.” It would be an error, however, to describe the beginnings of this self-consciousness as a relation to something in oneself (an I or an ego) from which a predicate, an outside to which the subject relates, is derived. The first-person consciousness of Europeans did not emerge outside the being-in-the-world of the aristocratic community of Indo-Europeans. Europeans began a quest for rationally justified truths, for objective standards of justification, and for the realization of the good life in a reflective self-relation, coupled with socially justified reasons about what is morally appropriate.
References
Brown, Donald (1991). Human Universals. Philadelphia: Temple University Press.
Dasen, P. (1994). “Culture and cognitive development from a Piagetian perspective.” In W. J. Lonner & R. S. Malpass (Eds.), Psychology and Culture. Boston: Allyn and Bacon.
Genovese, Jeremy (2003). “Piaget, Pedagogy, and Evolutionary Psychology.” Evolutionary Psychology, 1: 217-237.
LePan, Donald (1989). The Cognitive Revolution in Western Culture. London: Macmillan Press.
Lévy-Bruhl, Lucien (2018). Primitive Mentality [1923]. Forgotten Books.
Oesterdiekhoff, Georg W. (2012). “Was pre-modern man a child? The quintessence of the psychometric and developmental approaches.” Intelligence, 40: 470-478.
Oesterdiekhoff, Georg W. (2014a). “The rise of modern, industrial society. The cognitive developmental approach as key to disclose the most fascinating riddle in history.” The Mankind Quarterly, 54 (3/4): 262-312.
Oesterdiekhoff, Georg W. (2014b). “Can Childlike Humans Build and Maintain a Modern Industrial Society?” The Mankind Quarterly, 54 (3/4): 371-385.
Oesterdiekhoff, Georg W. (2015). “Evolution of Democracy. Psychological Stages and Political Developments in World History.” Cultura: International Journal of Philosophy of Culture and Axiology, 12 (2): 81-102.
Oesterdiekhoff, Georg W. (2016). “Child and Ancient Man: How to Define Their Commonalities and Differences.” The American Journal of Psychology, 129 (3): 295-312.
Sternberg, Robert (2003). Cognitive Psychology. Third Edition. Nelson Thomson Learning.
This article was reproduced from the Council of European Canadians website.
Part 2 of 2
Why did the West rise to become the most powerful civilization, the progenitor of modernity, the culture with the most prodigious creators? Proposed answers abound. But it may be that a child psychologist, Jean Piaget, has offered the best theoretical framework to explain the difference between the West and the Rest. Part II of this article continues the examination of Georg Oesterdiekhoff’s application and elaboration of Piagetian theory in his ranking of the cognitive development of the peoples of the world. It praises the fundamental insights of this elaboration while arguing that the psychogenetic superiority of European children should be traced back to the appearance of new humans in ancient Greek times who started to realize that their consciousness is the highest point on which all else depends.
Oesterdiekhoff on the Origins of Western Operational Thinking
Why did Europeans reach the fourth stage of formal operations long before any other peoples in the world? When pressed (in an exchange) about the causes of the emergence of stage four, Georg Oesterdiekhoff responded that
schooling and other cultural factors must have been more elaborated in early modern Europe than in Asia, antiquity, and medieval times. The trigger to arouse the evolution of formal operations would have been especially the systems of education (2014b, 376).
He then added:
Admittedly, this begs the question about the causes of this alleged fact and necessitates yet another level of causal explanation . . . I rather prefer cultural explanations and think about the possible relevance of the advantages of the Greek/Roman alphabet or Aristotelian logic, phenomena fostering the use of abstraction and logic (2014b, 376).
But this is as far as Oesterdiekhoff goes in explaining why the ancient Greeks reached the fourth stage first. He prefers, rather, to jump right into the modern era, the seventeenth century, as the century in which formal operational thinking really emerged, from which point he identifies “the rise of formal operations, the cognitive maturation of people” (in itself) as the “cause” of the rise of modern Europe. He insists that his Piagetian theory “is crucially a causal theory of modernity” (2014b, 375). But no explanation is provided as to the original causes of the rise of formal operational thinking.
If Oesterdiekhoff’s point is that, without a population whose children have ontogenetically developed a capacity for formal operations, you cannot have adults engaging in formal operational thinking, then I agree that this ontogenetic development is a precondition for a modern society. But we still need an explanation for the rise of “new humans” (to use his own words) capable of formal operational thinking. Does he mean that the Greek/Roman alphabet and Aristotelian logic already contained the seeds of formal reasoning? The alphabet is indeed the most abstract symbolic system of writing, in which both consonants and vowels are represented. And can it be denied that Aristotle’s theory of the syllogism is at the level of stage four, considering that this theory teaches that we can abstract altogether from the concrete content of an argument and judge its merits solely in terms of how the terms are formally or logically connected to each other?
Oesterdiekhoff says that the Ionian philosophers (in the sixth century BC) were the first to establish the concrete operational stage and, in the same vein, implies that Aristotle’s philosophy did not rise above this concrete stage. “Aristotle’s physics strongly resembles the animistic physics of children aged 10 before they establish the mechanistic world view.” “The formal operational stage comes into being predominantly with Descartes in the 17th century” (2016, 304). We can agree that it “comes into being predominantly” in this century, but if we also agree that this stage has “many sub-stages” (as Oesterdiekhoff points out), why can’t we identify Aristotle’s extensive writings on logic, induction and deduction, affirmations and contradictions, syllogisms and modalities, definitions and essences, species, genus, differentia, and the categories as the beginnings of stage four?
Oesterdiekhoff knows he needs some origins, and admits he is caught in a chicken-and-egg dilemma. He writes about “a positive feedback loop” in the interrelationship between “the knowledge taught in schools and universities” in modern Europe and the rise of formal reasoning. But instead of “finding the causes for the emergence of formal reasoning in Europe some centuries ago,” he prefers to say that the “highest stage, the stage of formal operations, directly accounts for the rise of modern sciences” (2014a, 269). “The rise of formal operations in the Western world after 1700 is the single cause of the rise of the sciences, industrialism, enlightenment, humanism, and democracy” (2014a, 287).
This may be understandable, since Oesterdiekhoff is not a historian. He has, in my view, made a fundamental contribution to the “rise of the West” debate by explaining the direct relevance of Piaget’s theory of cognitive development. None of the participants in this debate care to talk about “cognitive development,” but assume (along with the academic establishment) that all humans across all cultures and throughout history (since we became Homo sapiens in Africa) are equally rational.
Oesterdiekhoff wants to fit Western history within a stage theory of developmental psychology in which ancient/medieval times are clearly demarcated from modern operational stages. He writes of the “child-like stages” of peoples living in the pre-modern world, including Europeans, and says that the cognitive age of pre-modern adults “typically corresponds to that of children” before the age of 10. “Medieval philosophy, be it Platonic or Aristotelic, regarded nature and reality as living things, ruled by God, and other spiritual forces. It had no concept of physical laws.” “The rise of formal operations became a phenomenon of major importance as late as the 17th century.” “The kernel of Enlightenment philosophy is the surpassing of childlike mental states, of the world of fairy tales, magic, and superstition, as it prevailed in the pre-modern world” (2014a, 292-295).
He qualifies this estimation a bit when he writes that “formal operations . . . evolved in the intellectual elite of early modern Europe and slowly spread to other milieus.” But his pervasive message is that it was only during the 1700s, or even “after the 1700s,” that Europeans came to reach the operational stage. There is no reason to disagree if he means that only the 1700s and after saw sufficient numbers of Europeans maturing into the last stage, making possible a full-scale industrial revolution. But we still need an explanation of the origins of “new humans,” the first humans who matured into the fourth stage.
I understand that many will be tempted to point to social and educational forces as the causes of this initial cognitive transition to operational thinking. They will argue that as literacy was mastered, and as institutes of learning were established, and arithmetic, reading, and other subjects were taught, a major shift occurred in human mental activity. This emphasis on the educational environment is a view often attributed to the Soviet psychologist A. R. Luria (1902-1977). From this claim, it takes only one step to the identification of the “social and economic” mode of production as the “underlying” factor of this cognitive revolution, thus combining Piaget and Marx’s historical materialism. The ancient Greeks developed operational thinking in the new milieu of urban life, growing trade in the Mediterranean, and money exchanges. The flaw in this explanation is that not only were all these new economic ways present in greater abundance in the older and larger civilizations of Mesopotamia and elsewhere, but all these commercial and urban activities only required concrete operational habits of thinking.
The view I will propose in a later section, albeit in a suggestive manner, presupposes what I wrote in The Uniqueness of Western Civilization about the aristocratic culture of Indo-Europeans, and in a number of articles at the Council of European Canadians about the masculine preconditions of individualism, the higher fluidity of the Western mind, the multiple intelligences of Europeans, and the bicameral mind. It is that Oesterdiekhoff underplays the importance of self-consciousness, the awareness humans have of their own identity as knowers, in contradistinction to everything that is not-I, in the development of cognition. Europeans were the first to reach the fourth stage, long before any other people, because they were a new breed of humans who evolved a uniquely high level of self-awareness, an ability to differentiate clearly between their conscious “I” and the physical world; that is, an awareness of their own minds as distinguished from their appetitive drives, the conventions of the time, and the world of invisible spirits. This introspective awareness of the role of the human mind as the active agent of cognition is what allowed Europeans to reach the fourth stage so early in their history.
It is no accident that the main precursor of the modern concept of mind is the ancient Greek notion of nous. Plato’s identification of three distinct parts of the soul — the rational (nous), the appetitive (epithumia), and the spirited (thymos) — can be classified as the first psychological contribution in the Western tradition. Both the appetitive and the spirited parts of the soul are about desires, but the appetitive part is about the biologically determined desires humans have for food, sex, and comfort, whereas the spirited part is about “passion,” the emotions associated with the pursuit of honor and glory, feelings of anger and fear. Plato anticipated the Cartesian dualist separation of mind and body when he argued that the mind was immaterial and immortal, whereas the body was material and mortal. He also understood that the Indo-Europeans were the most “high-spirited” peoples in the world, once observing that “the Thracians and Scythians and northerners generally” were peoples “with a reputation for a high-spirited character” (Francis M. Cornford, trans., The Republic of Plato, 132). Aristotle added to this observation a distinction between the “high-spirited” but barbaric passions of “those who live in Europe” and the “high-spirited” but “intelligent” virtues of the Hellenic peoples. Aristotle further observed that while the peoples of Asia were intelligent, they were “wanting in spirit and therefore they are always in a state of subjection and slavery.”
I trace this high-spirited character to the uniquely aristocratic culture of Indo-Europeans. One may be tempted to think that the intelligent-rational virtues of the Greeks could manifest themselves only once the rational part of their soul was brought to bear on their strong thymotic drives. But Plato was correct in observing that the rational part would be locked in unending combat with the demands of the appetites were it not for the intervention of the spirited part, the strong sense of aristocratic pride and honor of the Greeks. The spirited part helps reason subdue the appetitive part and, in the same vein, helps reason channel the high-strung energies of the spirit away from barbaric and chaotic actions into a will-to-knowledge, a courage to break through the unknown, and thus brings forth the first sub-stages of the formal operational stage.
Before I say more about this explanation, I want to outline why I think Europeans, under their own initiative, not just in ancient Greek and Roman times but again in the High Middle Ages, after the decline of the Dark Ages (500 AD to 1100 AD), were the developers of formal operational habits of thinking long before any other people were compelled to adopt these habits under Western pressure.
Ancient Greeks were the First “New Humans”
Relying on Piaget’s criterion that the ability to think in a deductive way without handling concrete objects is a necessary component of the formal operational stage, it is hard to deny that the first clear signs of this stage were evident in Greek culture around the fifth century BC. We learn in Reviel Netz’s The Shaping of Deduction in Greek Mathematics: A Study in Cognitive History (1999) that Greek mathematics produced knowledge of general validity, not only about the particular right triangle ABC of the diagram, for example, but about all right triangles. This formal operational trait, this ability to reason about mathematical objects in a purely abstract way, is what makes Greek mathematics historically novel in comparison to all preceding “concrete” operational mathematics. This type of reasoning was very exclusive, to be sure, restricted to a small number of Greeks; it has been estimated that at most there were a thousand mathematicians throughout Greek antiquity, a period lasting a full millennium.
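The contrast can be made concrete with a standard illustration (mine, not Netz’s). Euclid’s proof of the Pythagorean theorem (Elements I.47) establishes, for every right triangle with legs a and b and hypotenuse c, that

\[
a^{2} + b^{2} = c^{2},
\]

a claim whose truth does not depend on any particular drawn figure; the triangle ABC of the diagram serves only as an arbitrary representative. Earlier Near Eastern mathematics, by contrast, recorded procedures verified on particular numerical cases, such as the (3, 4, 5) triangle, without a general proof.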
How about scientific accomplishments during the Hellenistic era (323-31 BC)? Oesterdiekhoff seems aware of Hellenistic science when he writes that “Roman intellectuals no longer understood the superior contributions of the Hellenistic scholars” (2014a, 281). Can one say that the cognitive processes of the Hellenistic elite were at a level under the age of 10 after reading Lucio Russo’s The Forgotten Revolution: How Science Was Born in 300 BC and Why It Had to Be Reborn? Can one really say that the institutionalization of scientific research in the Museum and Library at Alexandria, which contained more than five hundred thousand papyrus rolls and funded one hundred scientists and literary scholars, was not an educational establishment promoting formal operational thinking? We learn from Russo’s book about the conics of Apollonius and the invention of trigonometry by Hipparchus, about Archimedes’s work on hydrostatics and the mechanics of pulleys and levers, the first formal science of weight, about Aristarchus’s heliocentric proposal, and about Eratosthenes and his calculation of the circumference of the Earth (from the noon sun’s 7.2° shadow angle at Alexandria, one-fiftieth of a full circle, and the 5,000 stadia separating Alexandria from Syene, he inferred a circumference of about 250,000 stadia).

The hypothetico-deductive form of Euclid’s Elements is undeniable; circles, right angles, and parallel lines are explicitly defined in terms of a few fundamental abstract entities, such as points, lines, and planes, on the basis of which many other propositions (theorems) are deduced. (Newton, by the way, was still using Euclidean proofs in his Principia.) While the Romans did not make major contributions in mathematics and theoretical science, it should be noted that Claudius Ptolemy, while living under Roman rule in Alexandria in the second century AD, wrote highly technical manuals on astronomy and cartography. The Almagest, which postulates a geocentric model, employs pure geometric concepts combined with very accurate observations of the orbits of the planets. It postulates epicycles, eccentric circles, and equant points, the latter being imaginary points in space from which uniform circular motion is measured.

Attention should also be paid to the “formal-rational” codification and classification of Roman civil law into four main divisions: the law of inheritance, the law of persons, the law of things, and the law of obligations, each subdivided into a variety of kinds of laws, with rational methods specifying how to arrive at the formulation of particular rules. The rules upon which legal decisions were based came to be presented in categories headed by definitions. The most general rules within each of these categories were the principles from which more specific rules were derived. This ordering was in line with a formal operational mode of reasoning, for the rules were presented without reference to the factual settings in which they were developed, and the terminology used in these rules was abstract.
This effort at a rationally consistent system of law was refined and developed through the first centuries AD, culminating in what is known as Justinian’s Code (compiled during Justinian’s reign, 527 to 565 AD), which served as the foundation of the “Papal Revolution” of the years 1050-1150, associated with the rise of Canon Law. This Papal Revolution created a modern legal system: by securing the Church’s corporate autonomy, its right to exercise legal authority within its own domain; by analyzing and synthesizing all authoritative statements concerning the nature of law, the various sources of law, and the definitions of and relationships between different kinds of laws; and by encouraging whole new types of laws.
Medieval Europeans
Oesterdiekhoff acknowledges in passing that ancient Greece saw “seminal forms of democracy . . . for a certain period,” a form of state which actually entails, in his view, the fourth stage of cognitive development. If Greek democracy was short-lived, what about the republican form of government during ancient Roman times, and the impact this form of government had on the modern Constitution of the United States? We can also mention the representative parliaments and estates of medieval Europe. To be sure, ancient Greece and Rome, and the Middle Ages, were far from the formal operational attainment of modern Europe (even if we draw attention to the continuation of witchcraft and magic in Enlightenment Europe).
It is telling, however, that according to Charles Radding’s book, A World Made by Men: Cognition and Society, 400-1200 (1985), new lines of formal operational reasoning were “well established by 1100” in some European circles. I say “telling” because this book (one of only two) directly employs Piaget’s theory to make sense of Europe’s intellectual history. Oesterdiekhoff references this work without paying attention to its argument. From a Europe that employed ordeals of boiling water and glowing iron to decide innocence and guilt, that “looked for direction” to divinely inspired pronouncements from superiors, kings, abbots, or the ancients, and that was rarely concerned with “human intention,” we see (after 1100) a growing number of theologians insisting that humans must employ their God-given reasoning powers to determine the truth. Whereas theological disputes before 1100 were settled “by citing authority,” “it was even increasingly the case [after 1100] that the very authority of a text’s author might be denied or disregarded” (p. 204). Using “one’s own judgment” was encouraged, combined with the study of logic as “the science of distinguishing true and false arguments.”
Although Radding is not definitive and barely elaborates key points, he understands that this increase in logical cognition entailed a new awareness of the distinction “between the knower and what is known,” between the I and the not-I. Medieval thinkers actually went beyond the ancient Greeks. For Plato, an idea existed and was correct if its origins were outside the mind, in the world of immaterial and perfect forms, which he differentiated from the untrue world of physical things. Perfect ideas were independent of the human mind, outside space and time, immutable. These ideas were not the products of human cognition. While the only way the human mind could apprehend these ideas was through intense training in geometrical (formal) reasoning, the aim was to reach a world of godlike forms to which the human mind was subservient.
While Aristotle transformed Plato’s forms into the “essences” of individual things, he believed that universal words existed in individual objects, or that abstract concepts could be equated with the essences of things. It is not that Aristotle did not perceive any dividing line between the supernatural, the world of dreams, and the natural; it is that he was a conceptual realist who believed that the contents of consciousness really existed as the essences of particular objects. Medieval nominalists showed a deeper grasp of the relationships between the mind and the external world by abandoning the notion that Forms (or ideas) represent true reality, the source of the mind’s ideas, and arguing instead that general concepts are mere names, neither the essences of things nor forms standing outside the material world. Only particular objects existed, and the role of cognition was to make true statements about the world of particular things even though ideas are not things but mental tools originated by men.
Nominalism represented a higher level of awareness of the role the human mind plays in cognition and of the distinction between the knower and the world outside. While Plato distinguished reason from the world of sensory phenomena, including natural desires, and, in so doing, identified the faculty of reasoning in its own right, he viewed human (intellectual) activity as dependent or subservient to a world of perfect and purely immaterial forms existing independently of the mind. Moreover, among medieval philosophers we find (in Peter Abelard, for example) a greater emphasis on intention, the view that the intention of humans should be considered in determining the moral worth of an action. Human action should not be attributed to supernatural powers or evil forces entering into human bodies and directing it. Humans have a capacity to think through different courses of actions, and for this reason human actions cannot be understood without a consideration of human intentions.
Radding brings up the emerging “idea of nature as a system of necessary forces,” in opposition to the early medieval idea of miraculous events, as well as the “treatment of velocity itself as a quantity . . . comparing motion that follows differently shaped paths” in the work of Gerard of Brussels in the early 1200s (p. 249). A better example of formal operational concepts would be Nicole Oresme’s (1320-1382) depiction of uniformly accelerated motion, which was not about motion in the real world but an effort to explain how velocity increases uniformly over time in a totally abstract way (see the restatement after this paragraph). This view anticipated Galileo’s law of falling bodies. Among other examples Radding brings forward to elucidate this medieval shift to formal operational thinking are the observation that by the reign of Henry II (1133-1189) the idea had taken root that consultation of members of the upper classes should be the norm in the workings of the monarchy, and the legal idea that mental competence should be a prerequisite in deciding criminal behavior.
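Oresme’s geometric demonstration is today summarized as the mean speed theorem. In modern notation (a restatement for clarity, not Oresme’s own symbolism), a body uniformly accelerated from velocity v₀ to v₁ over time t covers the same distance as a body moving uniformly at the mean of those velocities:

\[
s = \frac{v_0 + v_1}{2}\,t, \qquad \text{and, from rest } (v_1 = at): \quad s = \tfrac{1}{2}at^{2},
\]

the relation Galileo later established empirically for falling bodies. Reasoning about a quantity varying over purely hypothetical states, rather than over observed concrete motions, is exactly the kind of thinking Piaget assigned to the fourth stage.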
The Birth of Expectation in the Early Modern Era
Don LePan’s book, The Cognitive Revolution in Western Culture (1989), agrees with Radding that “there is considerable evidence of at least the beginnings of changes in the cognitive processes occurring among the educated elite in the twelfth century” (p. 45). But he believes that new cognitive processes began to spread in the early modern period (or the Renaissance), when Europeans developed the capacity of “expectancy,” which he defines as “the ability to form specific notions as to what is likely to happen in a given situation” (p. 75). It is around this sense of expectancy, LePan says, that most of the cognitive processes Piaget identifies with the fourth stage are clearly evident. This sense of expectancy involves a “rational assessment of probabilities”: evaluating “disparate pieces of information” within a chain of events and circumstances as to whether something is likely to transpire in the future or not, drawing inferences from this information, and projecting “these inferences into the hypothetical realm of the future” (pp. 74-75). Before this capacity developed, the sense of future expectation that humans had was of a predetermined sort, or accidental and beyond reason, in which an outcome was believed to happen “regardless of the intervening chain of events” (p. 79) and without any objective assessment of the human intentions and events through which the future event would likely come about.
This sense of expectancy involved the emergence of an ability to think in terms of abstract universal time, as contrasted to the commonly held notion of pre-modern peoples that “time moves at variable speeds, depending on the nature and quality of the events”. Among primitives, the recounting of past events, or history, is merely an aggregation of disconnected anecdotes without any sense of chronology and causal relationship and no grammatical distinction between words referring to past events or to present events. The past is conceived similarly to the present. While early Christian historians did have a sense of chronology, a universal history where all events were framed within a temporal sequence, they did not have a framework of abstract and objective time. They were more interested in detecting the plan of God rather than in how humans with intentions made their own history.
Because pre-modern peoples lack a framework of abstract and objective time, the “when” of an event is merely a matter of its coming before or after other events, not of the length of time elapsed between it and other events. Pre-modern peoples are also incapable of distinguishing between travelling the same distance and travelling at the same speed; they lack the habit of thinking of velocity as a quantity distinct from distance and time. Without a temporal conception in which causes can be thought of as anterior to their effects, it is not possible to consider historical events in terms of causal relations within a sequence of past, present, and future events.
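A simple worked example (my illustration, not Radding’s or LePan’s) shows what treating velocity as a quantity in its own right amounts to:

\[
v = \frac{d}{t}: \qquad
d = 100\text{ km},\ t = 2\text{ h} \Rightarrow v = 50\text{ km/h};
\qquad
d = 100\text{ km},\ t = 4\text{ h} \Rightarrow v = 25\text{ km/h}.
\]

Two travellers may cover the same distance yet move at different speeds; telling the cases apart requires forming the ratio of distance to time as a third quantity distinct from either, which is the abstraction Radding credits Gerard of Brussels with beginning to articulate.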
For these reasons, pre-modern peoples were unable to form expectations of a hypothetical future, that is, to think about what will happen in terms of multiple chains of causation and the ways in which these causes, sometimes operating simultaneously in different places, may bring about a future effect. LePan is particularly keen to show that William Shakespeare’s originality was a result of his ability to create complex plots which gave the audience “a continual sense of anticipation . . . by drawing them into [an] unfolding pattern of connections with the past and the future of the story” (p. 175). The curiosity of a pre-modern audience is restricted to what will happen next within a sequence of episodes in which the reader or audience is confident about what is likely to happen, or what the final outcome will be; there is, therefore, no sense of expectation as to whether it will happen, no concern to envisage the hypothetical possibilities of situations, no weighing of causes and intentions against each other, and no judgment of the probable outcome.
As to what brought about this new sense of expectation and the spread of the habits of thought associated with stage four, LePan is inclined to follow A. R. Luria’s argument that the causes of cognitive change are social and educational. He is rather vague: as society changes, literacy is mastered, the level of education increases, and cognitive processes change. Which came first, new cognitive processes, new ways of educating children, or new “underlying economic changes”? They “reinforced each other.” LePan carefully distances himself from any claim that Europeans were genetically wired for higher levels of cognition. Even though he rejects the establishment idea that “all peoples think with exactly the same thought processes,” he believes that all humans are equally capable of reaching this stage. Without realizing that Piaget laid the groundwork for Kohlberg’s moral stages, he insists there is no “direct correlation between degrees of rationality and degrees of moral goodness” (p. 15). The book ends with a strange “postscript” about how he had been living with his wife in rural Zimbabwe for the previous two years. He says he wishes the primitive and the modern mind could co-exist with each other, praises the cultural “vitality” of this African country, and then concludes with the expectation that “if something like a new Shakespeare is to emerge, it will be from the valleys of the Niger or the Zambezi” (p. 307). The subtitle of the first volume of The Cognitive Revolution in Western Culture is The Birth of Expectation. He never wrote a second volume.
The uniqueness of the West frightens academics. They have concocted every imaginable explanation to avoid coming to terms with the fact that Europeans could not have produced so many transformations, innovations, renaissances, original thinkers, and the entire modern world without superior intellectual powers and superior creative impulses. The tendency for some decades now has been to ignore the cultural achievements of Europeans, to minimize them, or to reduce the “rise of the West” to a single transformation, the Industrial Revolution, currently seen as the only event that brought about the “great divergence.” The prevailing interpretation paints these achievements as no better than what transpired in any other primitive culture, and, indeed, far worse, inasmuch as the West was different only in its imperialistic habits, obsessive impulse for military competition, and genocidal actions against other races.
I agree with Oesterdiekhoff that the faster cognitive maturation of European peoples “is the decisive phenomenon” in need of explanation if our aim is to account for the rise of modern scientific society. I will leave aside the question of whether this is the only factor requiring explanation if our aim is to account for other unique attributes of the West, such as the immense cultural creativity of this civilization. Cultural creativity in the arts presupposes a higher level of cognitive development, but it would be one-sided to reduce all forms of creativity to formal operational habits. Once these cognitive habits are established, formal operations can be performed at the highest level of expertise by individuals who are not creative, but who have a high IQ and a very good education. Computers can be programmed to perform formal operations, but it is hard to say that they are self-conscious beings rather than automata unthinkingly executing prescribed actions. Computers do not understand the meaning of the real world about which they are processing information; they are not “aware” of what they are thinking about; and, having no sense of self, they cannot examine their own thoughts, exercise free will, or show a spirited character. Obviously, humans who engage in formal operations are not computers. But if we equate the human intellect with formal operational thinking and identify this capacity as the defining trait of modern culture and Western uniqueness, we are endorsing a computational model of human consciousness.
Self-Consciousness is Uniquely European
Oesterdiekhoff and LePan sought to derive the origins of formal operational habits from the prior presence of proto-formal habits, the alphabet, Aristotelian logic, and literacy; but, knowing this was a self-referential explanation, they also brought in educational institutions, implying thereby that these institutions were created by proto-formal thinkers who taught children formal operations, which is still a self-referential account. We need to step outside the world of formal operations to understand its origins. Oesterdiekhoff identifies Descartes as the first thinker to offer a systematic methodology for the pursuit of knowledge based strictly on formal operational principles. It is not a coincidence that Descartes is also known as the first modern philosopher for having postulated self-consciousness as the first principle of his formal-deductive philosophy. Descartes showed himself to be very spirited in daring to doubt and repudiate all authority and everything he had been taught, arriving at the view that the only secure foundation for knowledge lay in self-consciousness. The only secure ground for formal operations was his certainty that he was a thinking being, despite doubting everything else. Everything could be subjected to doubt except his awareness that his own mind is the one authority capable of deciding what counts as true knowledge, not the external senses and not any external authority.
The Cartesian idea that self-consciousness can ground itself on its own would be superseded by later thinkers who correctly set about connecting self-consciousness to an intersubjective social context (a social setting I would identify as singularly European, since no other setting could have generated this Cartesian idea). My point for now is that Piaget’s fourth stage, in its modern form, would have been impossible without self-consciousness. Descartes did not invent self-consciousness; ancient Greece saw the beginnings of self-conscious new humans; but he did offer its first modern expression, with more sophisticated expressions to follow. It is worth citing Hegel’s treatment of Descartes in his Lectures on the History of Philosophy:
Actually we now first come to the philosophy of the modern world, and we begin this with Descartes. With him we truly enter upon an independent philosophy, which knows that it emerges independently out of reason . . . Here, we may say, we are at home, and like the mariner after a long voyage over the tempestuous seas, we can finally call out, “Land!” . . . In this new period the essential principle is that of thought, which proceeds solely from itself . . . The universal principle is now to grasp the inner sphere as such, and to set aside the claims of dead externality and authority; the latter is to be viewed as out of place here (Hegel’s Lectures on the History of Philosophy, trans. Haldane & Simson, vol. III, p. 217).
The key idea is that thought proceeds from itself, out of reason, independently of all external authorities. The biological roots of this declaration of independence by the human, thinking subject are to be found in the obsession men have shown across all cultures to affirm the male ego in contradistinction to the enveloping, womblike environment. This struggle for male identity is only a sexual precondition, though an always-present one, for the subsequent appearance of self-awareness and the first inklings of human individuality. The first cultural signs of individualism are to be found in prehistoric Indo-European societies uniquely ruled by “high-spirited” aristocratic men living in a state of permanent mobility and adversity, for whom the highest value in life was honorable struggle to the death for pure prestige. It was out of this struggle by aristocratic men seeking excellence in warfare worthy of recognition from their aristocratic peers that the separation and freedom of humans from the undifferentiated world of nature and the undifferentiated world of collectivist-despotic societies was fostered.
Cognitive and evolutionary psychologists, and philosophers of mind, take it for granted that humans as humans are self-conscious beings, aware of themselves as living. “Consciousness is the greatest invention in the history of life; it has allowed life to become aware of itself,” said Stephen Jay Gould. This is true if by self-consciousness we mean the awareness humans have of their first-person inner experiences: pain, feelings, and memories. Human beings are constantly trying “to understand, respond to and manipulate the behavior of other human beings,” and in so doing they learn to read other people’s behavior, feelings, and interests by examining their own thoughts and feelings, imagining what it is like to be in the other person’s shoes. This capacity to reflect on one’s own states of mind and emotions in order to understand the behavior of others is a biologically ingrained trait found in all humans, selected by nature. Nicholas Humphrey, in a very insightful short book, The Inner Eye, identifies this capacity as a form of “social intelligence” that evolved among gorillas and chimps. Consciousness was selected by nature because it enhanced the ability of these primates to survive within social settings characterized by “endless small disputes about social dominance, about who grooms who, about who should have first access to a favourite food, or sleep in the best site” (p. 37). In dealing with these issues, primates “have to think, remember, calculate, and weigh things up inside their heads” (p. 39). They have to learn to read the minds of other gorillas by looking into their own and imagining what it is like to be in another gorilla’s situation.
This social intelligence is very different from, but just as important as, the technical and natural intelligence required to acquire food and find protection in a hostile environment. I will not rehearse Steven Mithen’s extension of Nicholas Humphrey’s argument, the claim that consciousness emerged not when primates learned to predict the social behavior of other members of the group, but when Homo sapiens during the Upper Paleolithic era achieved enough “cognitive fluidity” between the different intelligences: social, linguistic, technical, and natural. Neither will I rehearse Julian Jaynes’ argument that such advanced peoples as the Mesopotamians and Egyptians were still lacking in self-consciousness, without “an interior self,” subservient to powerful gods controlling and arresting the development of their cognitive processes. I have already presented, in the first part of this article, Piaget’s scientifically based argument that pre-modern peoples had “childlike” minds, which made it very difficult for them to rely on their own reasoning powers and to attain independence from the influence of unknown spirits and age-old mandates accepted without reflection.
I will conclude by asserting that it goes against the entire history of actual cognition and intellectual development, as well as the history of science, mathematics, psychology, physics, and chemistry, to be satisfied with the degree of consciousness found in primates, Upper Paleolithic peoples, and the non-Western civilizations, which never reached the stage of formal operations and which stagnated intellectually after the Bronze Age, and, in the cases of China and the Islamic world, after about 1300 AD. Europeans reached a higher level of consciousness starting in ancient Greek times, with their spirited discovery of the faculties of the mind and their increasing awareness of their own agency as human beings capable of understanding the workings of the world in terms of self-determined or rationally validated regularities. This was coupled with their growing awareness that man was the measure of all things: a subject with a spirited will-to-be-conscious of himself as a free subject who takes himself to be the “highest point” on which all else depends, rather than a mere object of nature and mysterious forces. But this self-consciousness was in its infancy in ancient times, and it would take the German Idealism of the 1800s to attain a full account of how the (self-conscious) I can be shown to lie at the very basis of all knowledge, and, beyond this outlook, to develop a philosophical-historical account demonstrating a full awareness that this self-conscious I was self-generated only within the particular cultural setting of Western Civilization.
References
Brown, Donald (1991). Human Universals. Philadelphia: Temple University Press.
Dasen, P. (1994). “Culture and cognitive development from a Piagetian perspective.” In W. J. Lonner & R. S. Malpass (eds.), Psychology and Culture. Boston: Allyn and Bacon.
Genovese, Jeremy (2003). “Piaget, Pedagogy, and Evolutionary Psychology.” Evolutionary Psychology, Volume 1: 127-137.
Humphrey, Nicholas (2002). The Inner Eye: Social Intelligence in Evolution. Oxford University Press.
LePan, Donald (1989). The Cognitive Revolution in Western Culture. London: Macmillan Press.
Lévy-Bruhl, Lucien (2018). Primitive Mentality [1923]. Forgotten Books.
Oesterdiekhoff, Georg W. (2012). “Was pre-modern man a child? The quintessence of the psychometric and developmental approaches.” Intelligence 40: 470-478.
Oesterdiekhoff, Georg W. (2014a). “The rise of modern, industrial society. The cognitive developmental approach as key to disclose the most fascinating riddle in history.” The Mankind Quarterly, 54, 3/4, 262-312.
Oesterdiekhoff, Georg W. (2014b). “Can Childlike Humans Build and Maintain a Modern Industrial Society?” The Mankind Quarterly, 54, 3/4, 371-385.
Oesterdiekhoff, Georg W. (2015). “Evolution of Democracy. Psychological Stages and Political Developments in World History.” Cultura: International Journal of Philosophy of Culture and Axiology 12 (2): 81-102.
Oesterdiekhoff, Georg W. (2016). “Child and Ancient Man: How to Define Their Commonalities and Differences.” The American Journal of Psychology, Vol. 129, No. 3, pp. 295-312.
Radding, Charles M. (1985). A World Made by Men: Cognition and Society, 400-1200. The University of North Carolina Press.
Sternberg, Robert (2003). Cognitive Psychology, Third Edition. Nelson Thomson Learning.