What Happened to the Polymaths?
Preserving the ‘unity of the intellect’ is essential to securing and expanding the possibilities of progress
By Timothy Sandefur
The ideal behind the phrase “Renaissance man” dates back, in one way or another, to the 15th century. It seems to have originated in connection with Leon Battista Alberti, an Italian architect and poet who wrote pathbreaking scholarship on cryptography as well as Europe’s first treatise on art theory. “A man can do all things if he will,” Alberti wrote, and his compatriots seem to have agreed. Foremost among them was Leonardo da Vinci, the archetypal “universal man” (uomo universale in Italian), whose mastery of art and science made him equally comfortable engineering bridges or painting the Mona Lisa.
But the basic principle of the Renaissance man was not the creation of Italian or even Western civilization. It harkened back to a principle that the 12th century Muslim scholar Averroes called “the unity of the intellect”: the idea that all comprehension is ultimately the same process, whether devoted to the arts, the sciences, or even an ordinary technical task like fixing a broken tool.
Of course, the Renaissance man did not vanish with the end of the Renaissance. Later polymaths such as Francis Bacon, Robert Hooke, Johann Wolfgang von Goethe, Thomas Jefferson and Alexander von Humboldt were multifaceted talents equally at home in the sciences and the humanities. Even in modern times, figures such as Jacob Bronowski, Albert Schweitzer and Loren Eiseley devoted their energies to wide-ranging pursuits of art, science and philosophy.
Their dream was, in E.O. Wilson’s term, “consilience”: the convergence of multiple intellectual pursuits into a single model of the world, one that would give us the unprecedented ability to empower humanity and relieve its suffering. That consilience would make good on the promise of classical liberalism: that there are universal human values to which all people—regardless of their cultural or historical background—can appeal and aspire. Yet today, the dream of such shared principles is in disrepute, and the idea of a universal intellect making meaningful contributions to multiple intellectual endeavors seems increasingly improbable. Even while the academy trumpets its commitment to “multidisciplinary approaches,” intellectual specialization and cultural relativism have actually made it harder for anybody to add significantly to more than one area of scholarship.
The Knowledge Burden
Part of this is because of how much we’ve learned since Alberti’s and da Vinci’s time. In 2009, Northwestern University Professor Benjamin Jones attributed the eclipse of the uomo universale—or as we would say today, the uomo o donna universale—to the “knowledge burden,” that is, the difficulty any student faces in getting up to speed on a subject. Nowadays, it takes so long to master the basics that by the time any would-be scholar does so, he or she has spent many of the most creative years of life simply reaching the frontier. As a result, innovation today is more likely to come from teams of people than from any single genius with a brilliant idea.
There’s certainly something to that. In 18th century America, when the field of architecture was in a primitive state, it was possible for an amateur like Jefferson to transform the nation’s architecture by introducing the ideas of 16th century Italian architect Andrea Palladio. That made Jefferson the country’s first significant designer of public buildings (even though his designs were often rather flawed). But today, with the innovations of great 20th century architects like Frank Lloyd Wright, Ludwig Mies van der Rohe and I.M. Pei in place, it’s less likely that a brilliant self-taught dilettante could significantly influence the field.
Another reason solo creativity seems to have been more common in previous ages is that access to knowledge was confined to a narrower range of people than in our day. Simply put, Jefferson had fewer competitors than any intellectual now does, in part because few of his contemporaries had anything like the tools he had at his fingertips. It’s no coincidence that he was both a polymath and the owner of America’s largest personal library (which became the nucleus of the Library of Congress). Rare is the poor genius who can make breakthroughs without knowing what has come before.
It does happen, of course—Jefferson’s contemporary, Benjamin Banneker, for example, was an almanac writer, surveyor, clockmaker and astronomer, despite being born Black in Maryland in 1731, with nothing like the privileges the aristocratic Jefferson enjoyed. But the odds are doubtless stacked in favor of those who can more easily obtain the knowledge they need to work creatively, or who have the encouragement of fellow thinkers, which is so important to creativity.
The fact that few people had the opportunities enjoyed by polymaths of the past explains not only the distinctiveness of such figures in their own day, but also why it’s harder to be a Renaissance man or woman now. Over time, increasing abundance and broader access to scholarly resources tend to flatten out the bell curve, resulting in a world in which more people make modest individual contributions and fewer stand out as universal geniuses. In fact, would-be uomini o donne universali nowadays are competing not only against specialists in each field they encounter, but also against a past stocked with figures like Jefferson and da Vinci.
The Ubiquity of Distraction
There’s also the ubiquity of distraction in today’s world. Before the advent of television and radio, many people doubtless found daily life intensely boring and turned to scholarly pursuits to relieve the tedium. We certainly owe more to boredom than we realize. It was to pass the time during a plague “lockdown” that Isaac Newton fashioned his theory of gravitation, and Charles Darwin came up with the idea of natural selection after reading Thomas Malthus’ An Essay on the Principle of Population “for amusement.”
Modern life is so pervaded by entertainment and “breaking news” that an aspiring Renaissance boy or girl is likely to be sidetracked a thousand times before even cracking a book of physics or biology. This fact once led William F. Buckley to observe that the last person who can plausibly be said to have known everything there was to know was the 16th century Dutch humanist Erasmus—and that anyone today who wanted to have a reputation for knowing everything should just read “People” magazine cover to cover.
It’s not just that pop culture wastes the brain as junk food wastes the body. It’s also the nature of the distractions. TV and smartphones erode the attention span and turn users’ appetites toward immediate gratification to such a degree that they can destroy the habit of concentration. Anybody who has sat down to read a book, only to find himself a few minutes later holding the book in one hand and scrolling through Twitter with the other, can testify that it’s harder than ever to focus on the deep reading necessary for serious intellectual enterprises.
In his now-classic book “Amusing Ourselves to Death,” Neil Postman bemoaned the way 20th century Americans were being “narcoticized by technological diversions.” Reflecting on the classic Lincoln-Douglas debates of 1858, Postman asked, “Is there any audience of Americans today who could endure seven hours of talk? or five? or three? Especially without pictures of any kind?” And he was writing in the ’80s, before cable television, let alone the internet and smartphones, became ubiquitous.
But while it’s easy to air cranky complaints about the destruction of “the old ways,” it’s important to keep matters in perspective. For one thing, our age may offer more distractions than existed a century ago, but it’s not as if everybody in 1923 was devouring the collected works of Shakespeare. Most of those who waste their time on intellectual junk food today would probably have done the equivalent then, too, devoting their energies to “narcoticizing” themselves in potentially more dangerous ways—like, say, narcotics. One reason Prohibition took effect in 1920 was that American alcohol consumption was tremendously high, and violence was a common result. If smartphone distraction is substituting for barroom brawls, that’s a good thing.
For another, much of the pervasive distraction available today is of greater intellectual or aesthetic value than we tend to realize. Thanks to Wikipedia, Archive.org, Spotify and other resources, we now carry in our pockets the ability to access the greatest scholarship and art in world history, largely for free. Thus, whether we realize it or not, today’s intellectual menu is more nutritious than it was generations ago. With the press of a few buttons, any of us can listen to hundreds of recordings of Rachmaninoff—including Rachmaninoff’s own performances—which no previous generation could have done. Even the best-educated person on earth in 1923 would have been lucky to hear Rachmaninoff perform in person once. Anyone who sees a beautiful bird while on vacation can look it up online and learn within seconds what its Latin name is, what its eating habits and typical lifespan are, and whether there’s a charity one can support to help prevent its extinction.
As Richard Dawkins once put it, “You could give Aristotle a tutorial”—not because each of us is individually more brilliant than the great Greek philosopher, but because our intellectual world has built cumulatively to a greater height than he could access. And although superstition and fraud are still regrettably commonplace, our atmosphere of knowledge has been purged of much of the nonsense that polluted even the great minds of Aristotle’s day. Thus there’s a sense in which each of us is a Renaissance man or woman without even realizing it. Within a few minutes, any person with access to the internet can learn more about a bird, or Rachmaninoff—or about how to fix a broken dryer or carburetor—than da Vinci or Jefferson was likely to learn in a year. The internet and other technological resources have democratized the uomo universale into the popolo universale.
But that’s true, of course, only if we choose to take advantage of it. There’s a kind of moral hazard in the bounty of our knowledge: It can become an excuse—even a subconscious one—for intellectual laziness. The ready availability of such a rich body of research and art makes it easy to take it for granted, and to devote our energies elsewhere, confident that if a question comes up, we can “just Google it.” That’s not necessarily a bad thing. On the contrary, being a Renaissance man or woman is energy-intensive, and the great advantage of intellectual specialization is that it’s more efficient to let doctors worry about treating cancer, engineers focus on building satellites, lawyers attend to the lawsuits—and to consult them only when one needs their expertise. But specialization taken too far, like intellectual sloth, poses a threat to the mind no less than physical lethargy does to the body.
As the creators of the “liberal arts curriculum” knew, a broad exposure to ideas in a wide range of fields makes one more creative and insightful, particularly because it enables people to recognize patterns or similarities in different fields of study. When Darwin encountered Malthus’ “growth model”—a principle of economics—he realized that it held the seeds of an insight equally applicable to the world of biology. Modern evolutionary biology, in fact, owes its very origin to this “multidisciplinary” insight. It’s impossible to guess how many equally brilliant connections today’s researchers may be overlooking right now by not attending to what’s being said on the other end of campus.
The Two Cultures
One obstacle here is a lingering prejudice within the intellectual world—one the 20th century polymath C.P. Snow labeled the problem of the “two cultures.” Snow—a chemist, novelist and government official—coined that term in 1959, in a lecture in Cambridge, England, in which he complained that intellectuals had become divided into two warring camps, which he called the “literary” and “scientific.” The scientific culture was marked by openness to innovation, skepticism and humanism, but was handicapped by a prejudice against the humanities, whereas the literary culture was tainted by a kind of nostalgic prejudice against technology and everything it stood for.
In short, Snow thought novelists, poets and playwrights had failed to embrace, or even comprehend, the spirit of 20th century science. Their work remained tethered to an obsolete romanticism, which viewed science as a vulgar enterprise, destructive to social stability and corrosive to the human spirit; one thinks of William Blake’s “dark satanic mills.” Scientists knew better, Snow continued, but they tended to retaliate by treating the arts as frivolous and irrelevant, and reveling in a certain smug philistinism. Snow concluded that this dichotomy was unhealthy for Western culture, and urged writers to learn basic science and embrace it in their work—and scientists to find ways to integrate their insights into inanimate nature with the moral and political philosophical concerns of the humanities.
Snow’s lecture was a mild criticism of the scholarship of his day, but it triggered an enraged response from literary critic F.R. Leavis three years later. In a lecture of his own, Leavis accused Snow of advocating a crude, dehumanizing consumerism, and lambasted Snow’s own novels in terms so ruthless that newspapers were afraid to reprint his words for fear Snow would sue for libel. Snow, however, took Leavis’ attack as evidence of his thesis. To Snow, Leavis’ words proved that the gatekeepers of 20th century culture failed to appreciate the improvements in the standard of living that science, technology and industrialization had brought about, and were wasting their time bemoaning the loss of the supposedly more spiritual ways of the past. Leavis and his supporters were essentially reactionaries, nostalgic for a pre-industrial tranquility that had never actually existed. It was a mere illusion, created by romanticism and snobbery.
“Industrialization,” Snow had already insisted in his original lecture, “is the only hope of the poor.” He continued:
I use the word “hope” in a crude and prosaic sense. I have not much use for the moral sensibility of anyone who is too refined to use it so. It is all very well for us, sitting pretty, to think that material standards of living don’t matter all that much. It is all very well for one, as a personal choice, to reject industrialization—do a modern Walden if you like, and if you go without much food, see most of your children die in infancy, despise the comforts of literacy, accept twenty years off your own life, then I respect you for the strength of your aesthetic revulsion. But I don’t respect you in the slightest if, even passively, you try to impose the same choice on others who are not free to choose. In fact, we know what their choice would be. For, with singular unanimity, in any country where they have had the chance, the poor have walked off the land into the factories as fast as the factories could take them.
In Snow’s eyes, literary disdain for technological advancement and the creature comforts that industrialism makes possible was a kind of anti-humanism. The only solution was to reconcile the literary and scientific worlds with a new approach to literature that would celebrate the spirit of the scientific enterprise and appeal to Averroes’ “unity of the intellect”—in other words, art that, instead of bemoaning the alleged “loss of innocence” or “destruction of nature” that scientific and commercial innovation brings about, would honor the virtues that make progress possible, and highlight the way it had banished disease, poverty and ignorance.
The Culture of Hope
More than six decades later, it’s clear Snow was right—and that the problem has only grown worse. Except for some science fiction (still a niche genre), the world of culturally respectable drama, poetry and fiction remains deeply reactionary, notwithstanding some of its practitioners’ pretenses to radicalism. Art that rejoices in the conquest of nature is still rare; far more common are works such as “Jurassic Park,” “Avatar” or “Strange World,” which portray technological and industrial progress as a destructive force antagonistic to the “human” values of beauty, peace and authenticity.
Worse still, the humanities have recently fallen under the spell of a doctrine overtly hostile to the idea of the “unity of the intellect” which made the uomo universale possible. The search for such unity assumes that knowledge is a universal good and that understanding nature and mankind is a goal to which all people can and should aspire. Postmodernism, however—the philosophical foundation of such enterprises as critical race theory, third-wave feminism and even fat studies—rejects the notion that there are such things as universal truths, and holds instead that knowledge is always “situated” in experience or status.
This means there are different “ways of knowing,” each of which is fixed within a person’s racial, sexual and cultural background, and none of which can claim greater validity than any other. Indeed, Postmodernism views such claims as “hegemony”—i.e., intrusions upon one’s identity that one must resist, violently if necessary.
Postmodernism’s war against universality has made startling headway, even in mathematics and the hard sciences, where asserting that 2 + 2 = 4, or that males and females are objectively, biologically different, or that obesity is unsightly and unhealthy is regarded as a form of “epistemic injustice.” The inevitable result of viewing knowledge as a form of power in this way—of blurring reason and will—is to fracture the academy into warring camps, and to wall off disciplines from those who might otherwise “appropriate” them.
Far from being postmodern, this ideology is a reactionary retreat into precisely the medievalism against which Alberti, da Vinci and the other universal men were rebelling. In fact, it would have been familiar to Averroes, the great Muslim thinker who originated the phrase “unity of the intellect.” He was born in 1126, about 30 years after the publication of a book called “The Incoherence of the Philosophers,” which excoriated Muslim intellectuals who found value in the works of such Greek thinkers as Plato and Aristotle.
In its condemnation of Muslim “heretics” for embracing rational philosophy and science, “Incoherence” proved fantastically influential—and helped bring about the Islamic Dark Age, during which the Muslim world faded from its role as the vanguard of human inquiry. Averroes resisted the book’s influence in his writings and in his role as a judge, but eventually its adherents gained the upper hand. He was convicted of heresy, stripped of his title and banished. His books were publicly burned.
Today’s postmodernist assault on the unity of knowledge is generally housed on the political left, but it has a large and growing constituency on the right as well, among postmodern conservatives who reject the “cosmopolitanism” of free markets or embrace nationalistic and even racial theories to condemn the international alliances created to protect the democratic world at large. Recycling the stale romanticism Leavis championed, right-leaning anti-technology activists are just as quick to denounce life-improving technological innovations as the left is to use politics and even violence to block innovation in the name of, say, protecting the environment, even when the innovations in question would produce a cleaner world.
But whoever’s to blame, it’s clear that the cleavage between the literary and scientific cultures has worsened in the past generation, notwithstanding the vast improvements brought about by information technology, molecular biology and other breakthroughs. And the consequence hasn’t just been to crack the edifice of the greatest of all human enterprises—the study of the world and humanity—but to retard the curing of disease, the feeding of the hungry, the development of new sources of energy, and the creation of art that could be as appealing to the spirit as it is satisfying to the senses.
Fortunately, there are those who continue to champion the unity of the intellect. From computer scientist David Gelernter to Queen guitarist and astrophysicist Brian May to the late philosopher Roger Scruton, many scholars still pursue their vision through multiple fields of study, contributing to the growth of knowledge and to what the Renaissance man Francis Bacon called “the relief of man’s estate.” And the bounty of knowledge available to everyone today gives those of us with lesser gifts far more potential to add our own share, even if we don’t rise to their level of achievement and fame.
In his 1995 book “The Culture of Hope,” the poet Frederick Turner—one of the few artists to treat these issues seriously—called for a “rebirth of the humanities, re-founded upon the rich knowledge pouring out of the sciences, and a revival of literature as the tongue of the species.” But, he continued, accomplishing that would require humanities scholars to “give up many cherished principles ... of critical and cultural theory.... They will have to reject most of their current poststructuralist, multiculturalist, and feminist authorities.”
There’s little sign of that happening soon. But humanity has no real alternative. Either we return to the unity of the intellect and open a new chapter of progress, or progress will grind to a halt and the intellectual enterprise will shatter as Averroes’ own culture did. Restoring the unity of the intellect—and bridging the gap between the arts and the sciences—is essential not just to preserving and improving our own lives, but to expanding the possibilities of human progress for future generations.