(Palazzi, 2023)
TABLE OF CONTENTS
1. Introduction
2. Background: The Cognitive Limits of Rationality
3. Cognitive Blindspots
4. The Myth of the All-Seeing Eye: The Limits of Perception
5. The Epistemological Crises
6. Conclusion
The essay that follows is being published in six installments, one per section; this is the third installment.
But you can also download and read or share a .pdf of the complete text of the essay, including the REFERENCES, by scrolling to the bottom of this post and clicking on the Download tab.
The Limits of Reason: Cognitive Psychology, The Epistemological Crisis, and Epistemic Humility, #3
There is a crack, a crack in everything
That’s how the light gets in. (Cohen, 1992)
[H]ow would we feel if science came up against experimental and intellectual brick walls, so that after centuries of trying, man finally concluded that the world was constructed – if upon intelligible principles at all – upon principles so bizarre as to be perfectly undiscoverable or unfathomable by the human mind? What if [humankind] became totally convinced that the world simply could not be understood, that the world is and always must remain an intellectual surd? Science might then continue as it pertains to technology, but not as it pertains to theory. What if all hope of theoretical understanding were permanently lost? (Davis, 1987: p. 293)
Only those who stop at the right moment prosper in philosophy, those who accept the limit and the comfort of a reasonable level of worry. Every problem, if one touches the bottom, leads to bankruptcy and leaves the intellect naked: No more questions and no more answers in a space without horizons. The questions turn against the mind which conceived them: It becomes their victim. Everything becomes hostile: [their] own solitude, [their] own audacity, absolute opacity, and the manifest nothingness. Woe to [that person] who, having reached a certain point of the essential, has not stopped! History shows that the thinkers who climbed to the limit of the ladder of questions, who laid their foot on the last rung, on that of the absurd, have given to posterity an example of sterility, whereas their peers, who stopped half-way, have fertilized the mind’s flow; they have been useful to their fellows, they have passed down some well-crafted idol, a few polished superstitions, a few errors dressed up as principles, and a system of hopes. (Cioran, 1949: pp. 115-116)
3. Cognitive Blindspots
A large body of work in 20th and 21st century psychology extends the critique of human rationality much further than merely demonstrating that animals share in rational capacities to various degrees. Briefly: humans frequently commit a range of cognitive errors, such as base-rate neglect (Henrion & Fischhoff, 1986); framing errors (Rothman & Salovey, 1997); preference reversals and the prominence effect (Gilovich, 1991); omission biases (Gilovich, 1991: p. 97); the status quo bias (Ritov & Baron, 1992); availability bias (Tversky & Kahneman, 1974); hindsight bias (Fischhoff & Beyth, 1975); ordering effects (Schwitzgebel & Cushman, 2012); anchoring and adjustment (Wistrich et al., 2005); and probability errors (Bar-Hillel & Falk, 1982; Brilmayer, 1983; Bishop & Trout, 2005; Rosenhouse, 2009). Experts, as well as “ordinary people,” commit such errors (Brilmayer, 1983). Humans, from a behavioural economics perspective, have been viewed as “predictably irrational”:
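The base-rate neglect error in this list has a simple arithmetic core that a worked example can make concrete. The sketch below (an illustrative calculation only; the prevalence and error rates are hypothetical and not drawn from any of the studies cited above) applies Bayes’ theorem to the classic medical-testing question: given a positive result on a fairly accurate test for a rare condition, how likely is it that the patient actually has the condition?

```python
# Hypothetical figures for a rare condition and an imperfect test.
prevalence = 0.01          # P(condition): 1% of the population
sensitivity = 0.90         # P(positive | condition)
false_positive_rate = 0.09 # P(positive | no condition)

# Total probability of testing positive.
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Bayes' theorem: P(condition | positive).
posterior = sensitivity * prevalence / p_positive
print(round(posterior, 3))  # -> 0.092
```

Asked this question informally, people commonly answer something near 90%, attending to the test’s accuracy while neglecting the 1% base rate; the correct posterior here is under 10%.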
we are pawns in a game whose forces we largely fail to comprehend. We usually think of ourselves as sitting in the driver’s seat, with ultimate control over the decisions we make and the direction our life takes; but alas, this perception has more to do with our desires—with how we want to view ourselves—than with reality. (Ariely, 2009: p. 321)
According to Justin E. H. Smith’s historical study of the waves of rationality and irrationality in human history, irrationality is an ineliminable part of the human condition: there is something of a historical seesaw between reason and unreason, in unending cycles (Smith, 2019).
Humans, outside of academic environments, usually have limited time, knowledge, and access to information, as well as limited computational capacities, so probability and formal logic play a much more reduced role in daily life than they do in scientific practice (Gigerenzer et al., 1999). Sound reasoning and decision-making in terms of the laws of probability require unfeasibly large amounts of time, knowledge, and computational capacity, so much human decision-making makes use of fast and frugal heuristics, rather than the calculation of probabilities, utilities, and Bayesian models; rationality is “bounded” (Simon, 1982). The cognitive limits of the human mind, and the impossibility of calculating optimal strategies “in the field,” mean that sub-optimal decision-making must rely on approximate methods exploited within the structure of the environment (Simon, 1956; Elster, 1979).
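Gigerenzer’s fast and frugal heuristics are genuine algorithms, and one of the best known, “take-the-best,” can be sketched in a few lines. The example below is a minimal illustration under invented, hypothetical data (the options and cue values are made up for the example, not taken from Gigerenzer’s studies): the heuristic compares two options cue by cue, in order of presumed cue validity, and decides on the first cue that discriminates, ignoring all remaining information, rather than weighing and integrating every cue.

```python
def take_the_best(a, b, cues):
    """Pick between options a and b using ordered binary cues.

    cues: list of functions, ordered from most to least valid,
    each returning 1 or 0 for an option. The first cue on which
    the options differ decides; all later cues are ignored.
    """
    for cue in cues:
        va, vb = cue(a), cue(b)
        if va != vb:
            return a if va > vb else b
    return None  # no cue discriminates: the heuristic must guess

# Toy "which city is larger?" task with hypothetical cue values.
cities = {
    "A": {"capital": 1, "has_team": 1},
    "B": {"capital": 0, "has_team": 1},
}
cues = [
    lambda c: cities[c]["capital"],   # most valid cue: is it a capital?
    lambda c: cities[c]["has_team"],  # less valid cue: major sports team?
]
print(take_the_best("A", "B", cues))  # -> A
```

The design point is the one the paragraph above makes: the procedure is frugal (it stops at the first discriminating cue) and fast (no probabilities or utilities are computed), yet in structured environments such lexicographic rules can perform surprisingly well.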
Robert A. Burton, in On Being Certain, says that psychology as a discipline faces something of an existential crisis, in light of research indicating that much of cognition occurs outside of consciousness (Burton, 2008: pp. 146-147), without direct access to the “adaptive unconscious,” thereby making human behavior a mystery: hence we are strangers to ourselves (Wilson, 2002: p. 16). As such, Burton concludes, we do not come to know what we know by conscious rational deliberation and the careful balancing of reasons, and
we are left challenging the common sense and folk psychology understanding of ourselves, including knowing the degree to which we are consciously responsible for our thoughts and actions. (Burton, 2008: p. 146)
Of course, philosophical defenders of rationality, intentional action, and free agency, will disagree, and can supply challenging counter-arguments (Hanna, 2006, 2018; Hanna & Maiese, 2009).
Gerd Gigerenzer, in Gut Feelings, concludes from his review of the psychological literature that “much of our mental life [is] unconscious, based on processes alien to logic: gut feelings, or intuitions” (Gigerenzer, 2007: p. 1). The present essay will develop this idea, which can be called “cognitive blindspots,” in more detail and fully embrace its paradoxical consequence: if true, and accepted, the position is self-undermining, or perhaps “trans-rational,” but in an interesting way, revealing yet another antinomy of reason, the self-undermining aspect of contemporary science, and the inherent limitations of reason.
To devilishly complicate things even further, much of this research into cognitive errors and biases is based upon psychological studies using W.E.I.R.D. (Western, Educated, Industrialized, Rich, and Democratic) subjects, and it has been argued that the samples used by behavioral scientists to establish various claims about human behavior and psychology, including
visual perception, fairness, cooperation, spatial reasoning, categorization and inferential induction, moral reasoning, reasoning styles, self-concepts and related motivations, and the heritability of IQ (Henrich et al., 2010: p. 1),
may not be universal at all, nor valid for the human race in general. In other words, research into biases could itself be biased in subtle ways, and perhaps self-undermining as well! We might be lost in the epistemological fog.
As one who is epistemically humble would expect, there is psychological literature challenging the idea that biases, errors, and self-fulfilling prophecies, at least in the area of social psychology, are as prevalent as social and cognitive psychologists think, to the extent of making human social life a maze of errors, if not deceptions, undermining the rationality and validity of social judgement and perception. Lee Jussim, in Social Perception and Social Reality, has put the opposing case: this position in cognitive and social psychology has exaggerated the importance, extent, and pervasiveness of cognitive errors, which, while in many cases real, by no means dominate human life (Jussim, 2014, 2017).
Jussim’s position parallels that of L. J. Cohen, who argued, in an iconic 1981 paper in Behavioral and Brain Sciences, against leading cognitive-error theorists such as Kahneman and Tversky, that the admittedly widespread existence of cognitive errors and biases in human life and science does not make the prospects of human rationality “bleak,” for
[t]he presence of fallacies in reasoning is evaluated by referring to normative criteria which ultimately derive their own credentials from a systematisation of the intuitions that agree with them. These normative criteria cannot be taken, as some have suggested, to constitute a part of natural science, nor can they be established by metamathematical proof. Since a theory of competence has to predict the very same intuitions, it must ascribe rationality to ordinary people. (Cohen, 1981: p. 317)
In what follows, I’ll be concerned primarily with putting the more general metaphysical case for anti-rationalism, and in particular with addressing the foundational, epistemological, and metaphysical issues raised by Cohen, rather than simply rehearsing the debate between Kahneman and Tversky, on the one hand, and Jussim, on the other, which deals primarily with the rationality of ordinary life. Here, the rationality of science itself is the target, and that includes, of course, psychology itself; philosophy and formal logic will be dealt with in other essays.
Against Professional Philosophy is a sub-project of the online mega-project Philosophy Without Borders, which is home-based on Patreon here.
Please consider becoming a patron!