Howard A. Doughty
Twenty years ago, I presented a paper to the Hawai’i International Conference on Education wherein I attempted to cool the enthusiasm for “critical thinking,” a phrase then becoming all the rage in postsecondary education.
I didn’t dispute the concept’s good intentions. As adopted across institutions, within academic disciplines, and by classroom teachers alike, it encouraged clear thinking and expression, urged the pursuit of evidence-based research, and disdained methodological bias, ideological orthodoxy, political partisanship, prejudice, and narrow-mindedness in all their forms. Critical thinking advocates echoed the eighteenth-century European “Enlightenment” in promoting the scientific method and the sentiments built into Max Weber’s canonical essay, “Science as a Vocation.”
Following David Hume, these early modernists distinguished between truth claims that are analytic, a priori, and necessary by definition versus those that are synthetic, a posteriori, and contingent upon support from the knowable external world. The first claims include statements of mathematical certainty and consistency with universal rules of logic. They are exempt from the need to provide operational definitions of concepts and observable referents in the purportedly “real” world. They involve formal statements that set rules for subsequent discussions. They are not at issue.
The second claims involve statements about contestable ethical beliefs and practical choices. They allow for two (or more) sides to a problem. Whereas a triangle cannot have anything other than three sides, moral claims about “human rights,” for example, are statements of preference that require empirical support. Their truths are not self-evident; they must be grounded in fact.
No political preferences, literary interpretations, aesthetic approvals, moral judgements, theological speculations, technological improvements, practical applications, etc., can be said to be true with absolute, eternal, transcendental certainty. Scientists must always allow for uncertainty: even the laws of gravity or evolution retain some infinitesimal chance that things could be otherwise; rules may one day be shown to have exceptions; new information may call for revision. The pursuit of truth is mostly the elimination of falsity, with new questions always arising.
As Hume argued, you cannot derive an ‘ought’ from an ‘is’; accordingly, critical thinking was intended to protect us against normative arguments unsupported by measurable references to the material real world. Normative finalities were verboten. Einstein’s dictum that “God does not play dice with the universe” notwithstanding, Heisenberg’s uncertainty principle urged modesty in human thought.
Critical thinking was meant to train us to rely upon both logical reasoning to guide our inquiries and verifiable data to supply reliable answers. Open-mindedness guarded against empty-headedness. Like 1950s television detective Sgt. Joe Friday’s interviews with witnesses to crimes, we were taught to seek “just the facts, ma’am!” Hypotheses came first; full-blown explanatory theories linking confirmed hypotheses came last… if at all.
Biblical lore has it that our primordial parents enjoyed life in Eden with total innocence with all needs met and all worries absent. Then, a certain talking snake persuaded Eve (who else?) to eat from the tree of knowledge and, thereby, to invade the territory of her Creator and to usurp a tiny sliver of His hegemony… at which time, the jig became abruptly and permanently up. Our “original sin” was to seek to think for ourselves.
Ever since, we’ve done some formidable thinking. We seem to have sorted out something about biological and cosmological evolution. We have manufactured useful tools as well as weapons of individual torture and mass annihilation. We have made music and decided that ninety feet between bases should be the standard distance for baseball diamonds (a choice that the iconic sports commentator Red Smith provisionally declared to be: “…the closest man has ever come to perfection”).
CFIC thoughtfully advertises itself as committed to critical thinking; however, CFIC’s mission statement conjoins science and compassion, its vision stresses reliance on evidence, and its inventory of values includes the elimination of prejudice, a dedication to diversity and inclusion, and a firm commitment to social justice.
I, too, support these commitments, but I wonder about the consistency of favouring both value-free empiricism and explicitly “liberal humanistic” values. Not only revanchist religious “fundamentalists” and biblical literalists, but also strict materialists and the several heirs to the tradition of logical positivism, might find reasons to object.
What to do?
One escape is to acknowledge that pure objectivity is its own style of magical thinking. Ever since Francis Bacon optimistically declared that “Knowledge is power,” we have implicitly understood that all forms of knowledge are for something. Wholly disinterested inquiry (casual curiosity, idle whimsy) may exist, but purposeful research is inherently invested in the aspiration to comprehend and control. So, in his “Epilogue” to Knowledge and Human Interests, Jürgen Habermas pointed to three kinds of knowing and knowledge-constitutive interests: the empirical/analytical sciences, which facilitated control over the material world; the historical/hermeneutical sciences, which facilitated control over social relations; and the emancipatory sciences, which approached self-control (and liberty) beyond the realm of necessity.
This is not the place to explore such flights of imagination and innovation, but I do think that there are some still-unpolished gems of understanding that can be brought forward to revitalize our projects without wasting effort debating the likes of Jordan Peterson and the intellectually lazy practice of tossing verbal bric-a-brac back and forth.
How to begin?
We must recognize that human knowledge is human, that it is innately value-laden. The epistemological “magical realism” that posits a categorical distinction between fact and value is a profound misreading of both history and science. For the adventuresome, a quick peek at Marx’s Contribution to the Critique of Hegel’s Philosophy of Right can be instructive: “The criticism of religion is the prerequisite of all criticism.”
What remains is not to exclude values from facts, but to inquire about which values inhabit our scientific presumptions and to self-consciously select those projects that suit our purposes.
Nothing that I have said implies that humanistic values necessarily flow (or flow exclusively) from Marx, much less that one need be even vaguely a “Marxist” (as I most emphatically am not). Rather, it points to the fact that, in our desperation to build a permanent base for humanist thought and action, we have too easily ceded the discussion of morals, ethics, and politics to those who fail to notice that every institutional authority operates from a rigorous hierarchy of corporate interests, both public and private. The values of those economic, political, and educational institutions predestine the results of every funded inquiry we undertake: officially sanctioned work necessarily reflects the economic, political, and cultural interests of its funders and regulators.
We may be sure that the calculations about our scientific and technological work are not being made in a normative vacuum, but for purposes congruent with the ideas and ideals of those with at least the temporary power to control science and technology. Governments will guide science and technology in ways consistent with their electoral success. Private enterprises will undertake initiatives designed to maximize profits. Educational institutions will design programs with keen eyes on labour market projections. Even CFIC will adhere to guidelines outlining our values.
There is nothing to be gained by maintaining the illusion of ideological neutrality; on the contrary, it invites unforced errors and reveals hypocrisy. We should recognize instead that evidence-based knowledge is “baked in” to the “small-l” liberal project, broadly defined and openly pursued.
Excellent article! Thank you, Howard. I remember back to my university days, when my professors were skeptical of “value-free sociology.” They said that pure objectivity was a myth. They told students that the best they could do was to declare their potential biases up front in their research and list their working assumptions in Chapter One.
Not a bad beginning for any course – especially in the social sciences. Another (most prominently featured in Anthropology) is to start with an exploration and advocacy of “cultural relativism.” There are other aspects of the issue that arise without falling into some of the traps inherent in what’s elastically known as “postmodernism,” an umbrella term for all sorts of “decentering” exercises meant to undermine “scientism,” but which mainly succeed in providing fodder for the likes of Jordan Peterson and his claim that 75% of all faculty in post-secondary Humanities and Social Science departments should be terminated as covert agents of moral degradation and the decline of modern civilization – one more time.
It’s all very tricky and the sinkholes are many, but if we are alert to general cultural, specific institutional, disciplinary, and personal presumptions and “working assumptions,” we can still produce work of excellence (or as much as our flawed imaginations and tilted “values” may allow). On the whole, if we heed a simple admonition to be true to ourselves and to locate our work along the spectrum (spectra?) of intersecting ideological patterns, we can at least say that we are being honest, and that is something worthwhile … and something AI is inherently incapable of saying, since its biases lack what Hume called sentiments, a feature of human beings who remain, according to Alexander Pope, the “proper study” of (and by) our species.
Apologies, by the way, to any hide-bound “originalists” who disdain my opting for “political correctness” in substituting “human” and “our species” for “MAN.” I like to think that Mr. Pope, like most (mostly) men of his time, would have moved on with the times.