My eldest nephew was three years old the first time I let him down. We were watching a video about deep-space exploration, and I had just finished explaining that “we” had launched Voyagers 1 and 2 in 1977, to study the outer solar system and interstellar space.
“We? Did you do that?”
“Well, no, I wasn’t even alive. Scientists back then launched those probes.”
“Then why did you say ‘we’?”
“I meant ‘we’ in the general sense. Human beings. Our species did this amazing thing.”
“Then why didn’t you say that?”
Good point. Why didn’t I? My nephew had observed a rhetorical move often used without close examination. “We” are destroying our environment. “We” have been a brutal and oppressive species. And in a common vein of secular discourse, until the Enlightenment, “we” lived in a hopelessly mystical world.
The danger of that “we” is two-fold. It allows us to rest on unearned laurels, like when I treated Voyager missions as points of personal pride. It also oversimplifies our histories until they no longer offer enough nuance to meaningfully inform the creation of a better world.
The COVID-19 pandemic has been a devastating illustration of this danger. Scientific illiteracy has fueled deadly waves of noncompliance with public-health mandates. Then there’s all the desperate band-wagoning over whatever freshly trending alternative therapy a given pundit, celebrity, or influencer recommends.
It’s easy and perhaps even comforting to blame individuals for their refusal to “understand the science”. Why can’t they simply “be more rational”? From the pandemic’s outset, though, it was clear that many still think of “Science” as an unwavering body of knowledge. It’s not, though, and never was. It’s a methodology for discovery. And its output is a series of well-tested theories that still require further trial-and-error to advance in new terrain.
And COVID-19 was—and is—very new terrain.
Ever was it thus
Individual scientists are also human. Socioeconomic pressures shape their work. And the scientific method relies on replication studies to correct for bias and error. When treated as an absolute authority, abstractly deified concepts like “Reason” and “Science” could only ever disappoint. Throughout the still-unfurling biomedical saga of SARS-CoV-2, we needed (and often failed) to remember the human beings behind the work.
But the failure doesn’t just belong to the average Joe reading the news. Popular secular figures who prop up myths of pre- and post-Enlightenment thinking have unfortunately added to this disillusionment. Our species didn’t “level up” its neurobiology in the 17th and 18th centuries, so why do so many pretend otherwise? In wielding histories of Western empiricism as a simplistic bludgeon, rationalists often only end up affirming that magical thinking has not left us after all.
What is this myth of the Enlightenment that secular folk sometimes take for granted when talking about histories of progress? Where does it come from? And what does it leave out about baseline human behaviors, which we could sorely stand to learn from today?
500 years of myth-making: from Renaissance Humanism to the Age of Reason, and then some
We call the study of the writing of history “historiography”. But we don’t need to dig deeply into academic analysis to find our first red flag. Magical thinking permeates every part of the myth of pre- and post-Enlightenment thought.
For one, when did this Enlightenment or “Age of Reason” begin? When did it end? And how, in that not-so-long-ago pre-internet era, was consensus achieved between scholars across France, Germany, England, Scotland, Spain, and Italy?
There is no perfect answer to any of these questions, for the same reason that the poet Petrarch (1304-1374) is considered an early Renaissance writer, but later-born Chaucer (1343-1400) is considered medieval. To riff off Drew Carey’s late-20th-century improv show, Whose Line Is It Anyway?: the rules are made up and the points don’t matter. Historical periods are generally named after the fact—and the naming process? Well, surprise, surprise: it’s often bound up with the nation-building work people do by telling stories about a culture’s greatest hits. Look at “my” people’s great thinkers! Look at everything “my” people accomplished!
One of the most influential definitions of an “Enlightenment” mentality comes from the German philosopher Immanuel Kant. His 1784 essay “An Answer to the Question: What is Enlightenment?” argued that enlightenment was “man’s emergence from his self-imposed immaturity.” He then defined “immaturity” as “the inability to use one’s understanding without another’s guidance.” The freedom to articulate and act upon one’s own thoughts, in other words, was the Enlightenment’s core concern.
This essay emerged near what is usually called the end of the Enlightenment. You can see hints in it of the more aggressive history-formation that followed, which imprinted Kant et al. into the European canon. Early- to mid-1800s histories of empirical science and Western philosophy then began to establish a curriculum of Enlightenment-era thought.
How far back does this “enlightenment” go?
Kant’s essay should be a bit unsettling to read today, though, considering where “freedom” discourse finds us. After all, his argument is that man’s “self-imposed immaturity” arises from a lack of courage to speak one’s mind. Enlightenment, for Kant, meant holding views that openly critiqued any formal mandate encroaching on personal autonomy. In his essay, this external encroachment could be a restriction on belief, or even state taxation. And so, yes, as you might have guessed, this thesis differs significantly from ideas of scientific progress and empirical rigor more commonly associated with Enlightenment thought today.
So, to fully understand what “the Enlightenment” tends to signify in secular discourse today, we have to go further back.
Way further back. We’re talking the 1300s, and the start of Renaissance Humanism. That’s when Christians exhausted by medieval theology turned to classical-antiquity thinking, going back “to the sources”. Reviving older Latin and Greek thought, however, required a commitment to expanding literacy, and to studies in rhetoric and moral philosophy.
(This earliest version of “humanism”, then, shares with today’s an interest in improving the quality of knowledge used to inform worldly action.)
Here’s the problem, though: what originally emerged to counter convoluted medieval philosophy soon led to another body of thinkers being taken for granted. By the early 16th century, cults of personality and arguments from authority drawn from earlier centuries were everywhere. And they aggravated people like Paracelsus. This Swiss-German Renaissance physician could not stand fellow physicians’ refusal to give up the prestige of being known as secret-keepers for ancient and revered practitioners. He wanted them to pursue universal medical knowledge. They wanted money and accolades for their elect learning.
And when I say “could not stand”…
Paracelsus famously burned the texts of lauded (and outdated) writers in the field. This was done to make the point that a good physician learned from the body and world around him. A good physician, that is, would refuse to rest on the laurels of others from prior eras. Paracelsus was part of one of many pro-experimentation disciplinary revolutions preceding what we regard as the “Enlightenment”: a movement with differently canonized starting-points across Europe.
The difference in those starting-points, though, gives us a clue as to why Enlightenment-mythologizing takes the form it does today.
Let’s consider two. For the French, René Descartes’ Discourse on the Method of Rightly Conducting One’s Reason and Seeking Truth in the Sciences launched the Enlightenment in 1637, with its famous maxim, “I think, therefore I am.” For many in English circles, Isaac Newton’s Philosophiæ Naturalis Principia Mathematica got the ball rolling far later, in 1687.
These are very different texts. Descartes was outlining a methodology for truth-seeking from doubt, through reason, that respected social mores. His methodology also took the existence of Self, Reason, and the Christian God (as a guarantor of Reason) as givens. Newton’s mathematics, on the other hand, supported existing astronomical data with elaborate calculations. His Principia established the laws of motion and universal gravitation, along with the masses and orbital mechanics of celestial bodies.
Potay-to, potah-to, right? Eh…
Today, many secular authorities like to name-drop Descartes and Newton, along with other noted philosophers like Francis Bacon and Blaise Pascal. They do this to extol the wonders of “reason” triumphing over the mysticism of faith and ending the “Dark Ages”. How glorious their work, which shone a light on an otherwise hopelessly superstitious world!
And yet, all four were strong theists.
Newton is a particularly fascinating case, because his physics plainly illustrated a mechanical universe that could run itself once set in motion. However, this finding only deepened his interest in contrapuntal research into how his Christian God still showed up in Creation. He spent some twenty years on alchemy, trying to find evidence of divine intervention through transmutation. Later, he immersed himself in Biblical prophecy, attempting—much like the Renaissance humanists—to glean secret Christian wisdom from the ancients.
And does this overt spirituality discount these philosophers’ findings and contributions to the advancement of scientific method and formal argumentation? Not in the slightest.
But the misrepresentation of these thinkers, to wield history as a blunt-force instrument against the supposed “rise” in irrationality today? Oh, yes, that changes plenty. It serves as an excellent reminder of one common human behavior throughout all these centuries and epochs. It tells us that “we” have always been interested in elevating past thinkers to condemn the present. (Or, if the present is already deeply nostalgic, then in going the opposite route, by condemning anything but self-reasoned truth.)
What does secular discourse lose, when it employs histories of Western thought this way?
Rethinking the “Enlightenment”
Translation took time. As such, many Enlightenment ideas from the 1700s were still spreading across Europe as its countries moved into what’s known as the Romantic Era or “Age of Reflection”. This period would be followed (with fuzzy temporal borders, and an overlap of 20 years) by the Victorian Era.
The existence of both cultural periods should have cast immediate doubt, though, on the idea that the Enlightenment had radically transformed how human beings understand the world.
For one, the hagiographical biographies of past philosophers that emerged in these eras would have had Paracelsus pitching a fit. All their glorifying of individual “genius” from past eras was not in keeping with the search for universal truths through worldly exploration.
For another, many artists and philosophers of these periods viewed science as too mechanistic. To them, many sciences diminished the value of the “whole” by seeking to quantify the natural world’s constituent parts. Granted, “Science” still flourished in the 1800s. The practice developed more rigid disciplinary frameworks, improved instrumentation, and yielded a range of industrial applications. But a popular preference for aesthetics routinely challenged its advances. After all, Beauty, the arts, and the mystic were surely still vital to understanding the cosmos, no?
Oh yeah, the Romantics and Victorians were superstitious
Contemporary mythologizing of the “Enlightenment” tends to mumble over this part, though. Folks like to skip to post-1859—or better yet, to the 1870s. (A sort of handwavy “and then Darwin!”, if you will.) But even this period was messy. This was when Darwinian ideas most clearly joined with eugenicist thinking (courtesy of Herbert Spencer’s “survival of the fittest” and Francis Galton’s familial lineages of genius). And yes, this was also when T.H. Huxley and John Tyndall helped to normalize secular approaches to science. But even this most “empirical” moment was also heavily informed by anxieties of empire, race, and class upheaval.
Such biases then carried over to the early 20th century, when the collapse of balance-of-power imperial rule set into motion devastating wars, state oppressions, and military inventions drawing from accelerated scientific discovery. You won’t find much ethical guidance from humanist thought at this time, when “Science” instead served old irrational biases, and new.
When, then, did our species ever live through an era of reason’s triumph? It’s held up often enough in claims about today’s supposed devolution into irrationalism. But when was its heyday? When was this golden age of empirical thought to which rationalists call for a return?
The dangers (and insights) of Enlightenment thought
It is not a neutral act to pretend there was ever a discrete heyday for rationalist and empirical thought. Today’s human beings don’t differ widely, biologically speaking, from the audiences for that wide range of philosophers who, over 500 years, across a handful of major European languages, advanced what is now grouped vaguely together as “Enlightenment” thinking.
Historical accuracy isn’t really the point, though, of lumping together a wide range of resistance-thinking under this vague notion of the “Enlightenment”. It’s about fostering the idea that some people are superior simply for having countered the status quo and perhaps fought for independent thought. Alarmist claims today about the rise of inferior mass beliefs give us an easy opponent, and then our speaker gets to be our beleaguered champion.
But this gamification of history only impoverishes all our intellects. For one, it leaves us less likely to recognize bias in our own “purely rational” thinking. For another, it keeps us from a deeper understanding of how human behavior has always coherently informed worldly actions.
Which leaves us where, exactly?
The Enlightenment didn’t change everything for the human species, and that’s okay. Really. It is.
How we use the idea of the Enlightenment today, though, can make a difference, as we’ll discuss in this week’s “Tooling Around”. Philosophers of past centuries were complex individuals, containing multitudes, just as we are today. They sustained significant amounts of cognitive dissonance in their everyday lives, just as we do now. And the work they produced, like our own, was often loftier in aim than their practical realities could ever hope to match.
Look upon today’s world of warring information silos, then, and don’t despair.
When we see people acting “irrationally” even though the “science is plain as day”, it does not serve us to pretend that we’re witnessing the fall of human intellect. Its golden age never existed. We’ve always been in a muddled mess of reason and superstition, wavering between deference to past authority and self-determined truth.
And if we can hold in humanistic tension the knowledge that every generation has its share of people aspiring to greater clarity? If we strive to remember that every generation has fallen short of the purist’s mark? Then maybe, just maybe, we can shift our expectations into something more pragmatic, and actionable. We can choose to be less adversarial, and less fixated on the construction of historical opponents and champions. Instead of those tired ways of arguing, we can advance some truly “higher” thinking about the problems of our day.
And if we pull it off, who knows?
“We” might even contribute something that future generations would be just as proud to mistake for achievements all their own.
Let’s talk about it, in this week’s next post.