Defending the Constitution of Knowledge
Benjamin Klutsey and Jonathan Rauch discuss biases, cancel culture, the importance of institutions and more

In this twelfth installment of a series on liberalism, Benjamin Klutsey, the director of academic outreach at the Mercatus Center at George Mason University, talks with Jonathan Rauch about fallibilism, groupthink, cancelers and trolls, and why the constitution of knowledge is better than the marketplace of ideas. Rauch is a senior fellow in the Governance Studies program at the Brookings Institution. His latest book is “The Constitution of Knowledge: A Defense of Truth” (2021). Previous works include “The Happiness Curve: Why Life Gets Better After 50” (2018), “Political Realism: How Hacks, Machines, Big Money, and Back-Room Deals Can Strengthen American Democracy” (2015) and “Gay Marriage: Why It Is Good for Gays, Good for Straights, and Good for America” (2004).
This series also includes interviews with Alan Charles Kors, Emily Chamlee-Wright, Ilana Redstone, Richard Ebeling, Robert Talisse, Danielle Allen, Roger Berkowitz, Virgil Storr, Kevin Vallier, Juliana Schroeder, John Inazu and Peter Boettke.
BENJAMIN KLUTSEY: Today, we’re joined by Jonathan Rauch. He’s a journalist. He’s a senior fellow at the Brookings Institution. He’s the author of eight books and many articles on public policy, culture and government. He’s a contributing writer of The Atlantic and recipient of the 2005 National Magazine Award, the magazine industry’s equivalent of the Pulitzer Prize. His recent book, “The Constitution of Knowledge: A Defense of Truth,” is the subject of our conversation today, and I hope our listeners will order copies. It is a very insightful book, and it confronts some of the serious challenges of our time, and I certainly felt inspired reading it.
Thank you so much, Jon, for joining us and also for writing the book.
JONATHAN RAUCH: I’m very happy to be with you, Ben.
Biases and Groupthink
KLUTSEY: Thank you. Now, I guess we’ll just delve right in. You spend a good bit of the book talking about epistemology and biases. One thing that stood out to me—reason is the slave of the passions; we’re almost inherently biased. One concept that also comes out of that was epistemic tribalism, the idea that we think with our tribes. Can you unpack that for us? Is that the same as groupthink?
RAUCH: Yes, more or less. Humans, in our evolutionary environment, we starve, we die if we’re not in harmony with our group and our tribe, and seeing eye to eye with them on a lot of things. It’s very important to us to find social harmony with our group. One way we do that is by, both consciously but also unconsciously, harmonizing our beliefs with the people around us.
There’s a famous experiment from 1951—it’s been replicated many times since with many variations. You take eight people; you put them in a room together; you hand them what they’re told is a vision test. What it is, is there’s a line on the left and there are three lines on the right of different lengths, and you say, “Which line on the right is the same length as the line on the left?” You make it ridiculously obvious, and individuals doing this exercise by themselves always get it right because it’s obvious the correct answer is B, for example.
Now, you put that person in a room with seven other people, and you ask all eight which line matches the line on the left. But here’s the thing: The other seven are confederates of the experimenter, and they all give the same wrong answer. They say the answer is C. When it comes to the turn of that last person, who’s the actual subject, what do they say? A third of the time in trials, they’ll also say C, even though the right answer is staring them in the face and it’s obvious.
Seventy-five percent of people will do this in at least one trial—the trials are repeated. Then, when you ask them afterwards, some of them will say, “Well, I knew better, but I was going along with the group,” but some of them will say, “Well, I thought maybe it was some kind of optical illusion and they were seeing something I wasn’t.”
What happens here, as you can imagine, is that you get groups of people who are all reinforcing each other, all saying the same thing. Whether they all believe in the same God or the same religion, or they all want to condemn the same enemy or outsider, or they all share certain sacred beliefs. They may think that they’re consulting with each other and checking their beliefs, but actually, they’re in an echo chamber.
These effects can all reinforce each other so that we wind up going down these rabbit holes of tribal belief. This is so fundamental. It’s not like evidence goes from our eyes to our brains and then we say, “Oh, what are other people thinking?” and then tune our beliefs; it’s more like evidence goes from our tribes to our eyes and then to our brains because this process isn’t even conscious. There’s other evidence showing that you don’t even have to think about this. Just being with other people will actually change what you perceive.
That’s why we need science, Ben—liberal science, I call it—the whole system, our truth-seeking system. It exists to get around these fundamental tribal and groupthink biases.
Fallibilism and Skepticism
KLUTSEY: Right, so we need some guardrails to help us get to the truth and be more accurate. Now, fallibilism is an important concept you discuss in the book—one that helps us improve our knowledge of facts and ideas so we can understand certain phenomena. How does fallibilism work, and can we incorporate it into our daily lives?
RAUCH: Yes, we can and we should, and if we’re academics or scientists or mainstream journalists or lawyers or judges, we do incorporate fallibilism. Philosophers, ancient and fairly modern, looked at the problem I just described—groupthink. They were well aware of these problems. They were also well aware of a lot of other cognitive flaws that incline us to come to a lot of wrong conclusions about the world.
By the mid-16th century or so, they were well down a path of what’s called skepticism, which is the belief that humans really can’t know anything because we’re never certain. We could always be mistaken. Sometimes, we’re most mistaken when we’re most sure that we’re right, so feeling certain is no guarantee of anything. Talking to other people is no guarantee of anything—we could all be just reconfirming each other’s biases. The skeptics threw up their hands and said, “Well, that means we can’t know anything because we can never be sure.”
A new school comes along, starting around the time of the Scientific Revolution but really building out its main ideas a couple hundred years later. It’s now called fallibilism. That’s the idea that knowledge is always provisional. The things we believe on any given day might be wrong, and they always need to be checked and rechecked as new evidence comes along, but that doesn’t mean we can’t know anything. Just the contrary. Fallibilists had this revolutionary idea: Knowledge is not what we’re completely certain of; it’s what we’re not completely certain of. It’s what we learn through a constant process of trial and checking.
That doesn’t mean that we start every morning from scratch—we have to get up every day and re-prove that gravity exists or rediscover the speed of sound. It doesn’t mean we get up every day and have to argue with a Holocaust denier. We don’t. The reason for that is if a belief, if a proposition stands up to being tested again and again—lots of different tests, lots of different people, lots of different viewpoints—over time, if it withstands that process, it gets a heavy presumption. That becomes our base of knowledge. It’s going to take a lot of work to unseat something that’s fundamental to our base of knowledge. It’s going to have to be a pretty radical, important new piece of evidence.
That very rarely happens, but in principle, we might all be wrong. That has an important ramification, which is the core of the constitution of knowledge, which is what my book is about. It’s a social ramification. If any of us could be wrong, that means that any priest or prince or politburo or potentate or anything else beginning with a P could also be wrong. That means no one gets special standing to say, “I know it’s true. It’s been revealed to me. You have to believe it. There’s only one possible truth here.”
Which means no one gets to end the conversation, ever. Which means we’re all involved, all of us. You and me on this podcast, the listeners, everyone in the reality-based community is involved in this constant search for where our mistakes might be. That’s the fundamental idea behind fallibilism: We find knowledge by hunting for error, not by hunting for truth. At the end of the day, what stands up in this hunt for error, that’s what we know.
Humility and Compromise
KLUTSEY: I imagine that involves some humility as well. There are a lot of people who say, “I could be wrong, but—” and then go ahead and state whatever it is they believe or are arguing. Sometimes they mean it, but oftentimes they don’t really think they could be wrong—it’s just a figure of speech. I’d imagine this is something that requires a lot of humility.
RAUCH: Yes, kind of. This gets into some of the really cool ideas behind “The Constitution of Knowledge.” The constitution of knowledge is our social operating system for figuring out what’s true in a way that keeps us in touch with reality and doesn’t involve killing each other, or ostracizing or jailing each other, and it works a lot like the U.S. Constitution. It’s a decentralized process, it works according to rules, everyone can participate but no one person or group can be in charge—very similar structure.
They’re similar in another way, too: When James Madison writes the Constitution, what he’s really doing is creating a compromise-forcing mechanism. He’s saying the only way you can make a law and get the power in society to use force against other people is by persuading them. That’s the only way to do it. You’re going to have to compromise in order to make laws and everything else. But he’s not saying that you need to walk in the door saying, “I’m here to compromise. I don’t really have principles. I don’t feel strongly. I’m happy to take half a loaf.” He wants people to have very strong views, and he understands that’s where the energy in politics is going to come from—people developing strong positions and advancing them strongly with enthusiasm, with zeal.
The key thing is that you force them to compromise whether they want to or not, and you get them to accept the legitimacy of a system that forces them to compromise so that they don’t walk in the door saying, “Compromise is a terrible thing. It should never happen.” They walk in the door recognizing that they’re not going to get everything that they want. That doesn’t mean they, individually, have to be full of doubt and trepidation. It just means they have to say, “You know what? I’m part of a system that’s going to make this decision. I can’t make it alone.”
Well, the same is true in the constitution of knowledge. The biggest driver of scientific advance, knowledge advance, is people who strongly present and defend a hypothesis and really push hard on it. A lot of these people can be very dogmatic. A lot of scientists never change their minds when they’re wrong; they just die off and are replaced by younger scientists. But you don’t need them all to go in thinking, “Well, I don’t really feel strongly about this. I’m not really sure I’m right.” You just need them to submit to the system that forces them to persuade other people and understand they might lose the argument, and if they do lose the argument, they should go ahead and work on something else.
Same idea so, yes, I’m for epistemic humility; I try to practice it myself. But I’m also someone with strong views, and I’m willing to defend them fiercely, and both of those things are needed. Like the U.S. Constitution, the constitution of knowledge holds them in balance. It gets that wonderful creative tension out of this contrast.
KLUTSEY: Really interesting. The constitution of knowledge is the set of institutions—the set of values, the professionals and all the different nodes, the network—that help to advance the accuracy of facts and ideas and things like that. Is that right?
RAUCH: That is right. Shall I elaborate?
The Marketplace of Ideas
KLUTSEY: Yes, and also touch on your qualms with the concept of the marketplace of ideas. Because I think you’ve pushed back on that phrasing a little bit, and because there are certain measures in place that don’t necessarily make it a full market in the way that we think about commercial markets.
RAUCH: If you ask most Americans where knowledge comes from, where truth comes from, how we know it’s true, they’ll probably say marketplace of ideas. That’s a phrase that goes back 102 years to Justice Oliver Wendell Holmes. The notion there is, basically, if lots of different people have a lot of different ideas and they advance them through free speech, knowledge will emerge from the contest of ideas. Whenever you present this concept to an undergraduate, they’ll say, “Well, how do we know that the true idea will emerge from the marketplace of ideas? After all, people in commercial markets make a lot of bad decisions, and a lot of faulty products wind up coming out.”
This turns out to be a very profound question because the marketplace of ideas metaphor—I love it, as far as it goes, but it leaves out the key part. It’s like saying that our U.S. Constitution is a marketplace of votes: We just all get together and vote, and policy comes out of that magically. That misses all of this stuff in between, which is the really important stuff. It’s all the checks and balances that pit ambition against ambition or, in the case of science, bias against bias. It’s all the institutions that incentivize constitutional, lawful behavior—everything from the courts to the schools that teach us civics so we understand how to behave.
Same thing in the constitution of knowledge. This is things like the scientific organizations, the academic promotion process, the schools that teach how to do writing and research, the newsrooms. I’m a journalist by profession. The magic in mainstream journalism, the reason it works, isn’t that you have individuals who always get it right. They often get it wrong. It’s the newsroom; it’s this hub where people are challenged. They come in and say to their editor, “I have a story. I think this is a good story,” and the editor says, “Wait, have you checked this? Why is it important?”
It’s always this process, this structured process, of having to persuade others, having to make your case according to a lot of protocols. Like, I’m going to reveal my age here, but the famous “Saturday Night Live” line is—if I started an article for a magazine or a research journal with the words “You ignorant slut,” or something like that, this would get me nowhere. I wouldn’t get published because that’s not how you do it in the reality-based community. You have to present your ideas in a way that’s not insulting and that’s not ad hominem. You’ve got to expose yourself to criticism by saying, “Here’s the literature that this is based on.” You’ve got to show your work. You’ve got to say, “Here are my sources for this.”
There’s tons of stuff going on that structures this marketplace, so it’s not just anything goes, and if it is anything goes, the outcome is really bad, actually. We just chase each other down these epistemic rabbit holes, confirming each other’s beliefs, following our groups. We insult each other and we go ad hominem because there’s no incentive not to do that. It looks, frankly, a lot like Twitter.
The big part that’s missing in the middle is all of that structure, all of those norms and institutions, where people are forced to present their ideas in coherent ways and forced to expose themselves to criticism from other people in each of those nodes, whether it’s a newsroom or an academic journal or a conference. Each of those then weighs the ideas, and then they pass them on to another node in the network. If something passes again and again, if it goes through all those filters, it becomes knowledge.
Most stuff falls out very quickly because it turns out to be wrong.
Canceling and Trolling
KLUTSEY: Yes, and as you effectively argue, the emergence of digital media challenges the constitution of knowledge, taking phenomena like troll culture and cancel culture to a new level of difficulty in how we foster a reality-based community that can transmit accurate information to society. We throw around these terms trolling and canceling. Can you unpack them for us?
RAUCH: Yes, there are a bunch of them. The first big theme of my book is the one we’ve been discussing, which is it’s not the marketplace of ideas, it’s the constitution of knowledge. It’s this whole system, and we need to understand it and we need to defend it because, like the U.S. Constitution, it doesn’t just defend itself. We have to teach it in schools; we have to help people understand it; we have to understand attacks on it.
Where are those attacks coming from? Well, the system has always had enemies since the dawn of science, and the enemies are always changing. The big enemies today are using tactics that are known as information warfare; that’s defined as organizing and manipulating the social and media environment for political advantage, specifically to dominate, to divide, to disorient and ultimately to demoralize your political opponents. There are a bunch of ways to do that, and they all basically aim at the same thing, which is undermining this process that we’re talking about, where we’re forced to compare and contrast our views, go through these institutions, and persuade each other in order to make knowledge.
How can you do that? Well, one critical thing the constitution of knowledge does is it doesn’t put you in jail if you’re wrong; it doesn’t kill you if you’re wrong. The way this system works is we punish our hypotheses instead of each other. What happens if I make a mistake is others find it, and I lose the argument, and that’s it—and that’s not great, but life goes on. That frees me to make many more mistakes. The magic of science is not that it doesn’t make mistakes, but that it makes them incredibly quickly and uses millions of people around the world at any given moment to find them. That’s how you find the good stuff.
How could you subvert that? One way is that you could punish mistakes using social pressure; you could make people afraid to hypothesize and speak out. That’s cancel culture. That’s where you use social pressure so that people get socially punished: They lose their job, they lose their reputation, they lose their friends if they’re accused of saying something wrong. This is actually a way to dominate people. It’s not a way to find knowledge, it’s a form of domination.
The notion here, if you go back to the experiment I told you about earlier, is, of course, you can do two things. One is you can chill people—you can make them afraid to speak out—but you can also actually change their thinking. Remember that one person in the room with eight people; seven of them said the wrong thing? Suppose seven of those people are saying, “The answer you just gave is unacceptable. That is just immoral. Shame on you. How could you believe that? We are not your friends anymore.” That’s actually going to affect the way that eighth person thinks. They’re going to say, “This is a shameful idea. It must be wrong.” You begin to doubt. You begin to feel ashamed.
That’s what canceling is trying to do. It’s actually messing with your brain to change what you think is true and what you think is acceptable for people to believe. That’s a powerful effect, such that even relatively small minorities, like on campuses—it’s always small pressure groups that are using these tactics—can intimidate much larger groups by making people feel like, “I don’t know what’s safe to say anymore, so I’m just going to keep my mouth shut.” That’s canceling. That’s the use of social intimidation.
Another one is trolling. That’s very popular now on the right, especially with Donald Trump and his supporters. Trolling is attention hijacking. Something else the constitution of knowledge does—the reality-based community, again, it doesn’t punish you for being wrong. It just directs attention away from you. If I’m going around saying the CIA has planted fillings in my teeth that are controlling my brain, that’s a hypothesis, but I’m very unlikely to be able to get anyone in the reality-based community to spend any time or resources checking it. They’ll just ignore it because it’s inconsistent with everything else, and because I have no standing to say it, and all kinds of other reasons. That’s how we find truth. That’s how we stay oriented. We just ignore the bad stuff, and we focus our resources and time on the ideas that are really worth developing and might actually be true.
Suppose you can subvert that by hijacking people’s attention. It turns out that we are wired so that if someone insults us or our group or our sacred ideas, we want to rush to the defense of our group. We want to prove to our friends, to our colleagues, “I’m a good citizen. I support us. I’m not a racist. We are not racist. This is outrageous.” Trolls exploit that psychological vulnerability: They use deliberate offense, ad hominem attacks, harassment, what they call lulz; they say outrageous things. They use conspiracy theories, knowing that we will rise to the defense of what we think is true and what we think is just—and that allows them to dominate the debate.
Now, this is something Donald Trump did every day. When he had a bad news day, he would do something—like once, he accused a former member of Congress of murder completely out of the blue. That may be some crazy shit, but people cannot stop talking about it and they cannot stop thinking about it because it’s so outrageous. It’s such a violation of everything we hold dear. If they do enough of this, 24/7, they can completely disorient us and dominate the conversation, even though very little of what they’re saying is actually true.
These are powerful tactics of information warfare. They’re hard to resist. We need to understand them; we need to counter them. They don’t just go away by themselves.
Feedback Loops and Disinformation Tactics
KLUTSEY: As I was reading this, I kept wondering whether these cultures are reacting to each other—the trolls are being canceled, and the ones canceling are being trolled. Or are both cultures part of this broader phenomenon of our addiction to outrage and sensationalism that you talk about earlier in the book?
RAUCH: Oh, they definitely build on each other. They build on other things, too, like conspiracy theories. Which—another natural tendency we have is to look for explanations of things that involve intentions and human beings, even if it’s just a coincidence. All of these groups, these factions that are employing these tactics, they can all feed on each other, and they do, in fact.
The Russians are masters at using these things in conjunction with each other. They’ll use what’s called a “firehose of falsehood” technique. That’s a very powerful technique. That’s something else that Trump has perfected and applied to American politics. That’s where you spout so many untruths, falsehoods, half-truths, exaggerations and conspiracies so fast that the media gatekeepers, the reality-based community, can’t begin to keep up, because by the time they’ve refuted one of them, there are already 10 more out there. And anyway, refuting them repeats them, and repeating them further implants them in people’s minds.
That creates an impossible situation. If you flood the zone with conspiracy theories that are also trolling people and that are also canceling people—if you use all of this at the same time—you can see why you get some pretty mighty snowball effects, and you can see how it would be possible to demoralize and dominate and divide and disorient American society.
That’s a big message of my book, Ben. We say, why is America so polarized? Why are we not getting along? What’s accounting for all this extremism? There are lots of reasons. I don’t deny there are lots of reasons—everything from the decline of organized religion to stagnation of income for high-school-educated white men and everything in between. But let’s not forget, a big factor, and I think probably the single biggest factor, is that our society is being systematically targeted with powerful techniques of disinformation warfare—I should say, powerful techniques of information warfare that are working.
It’s right under our nose. This is not just happening by some natural disaster or accident of fate. There are nameable people and organizations that are doing this to us for power and profit. Some of them are abroad—Russia and Vladimir Putin—but the most important ones are here at home. That includes cancel culture on the left, but especially—the biggest danger of them all right now is the MAGA right, which is applying Russian-style disinformation tactics to American politics in a way, at a magnitude, with an effectiveness that no one ever even dreamed of trying before, and it’s working.
Trust in Institutions
KLUTSEY: One of the models that you offer is Wikipedia, your model for creating a reality-based community. Basically, you need a professional social network that is empirically driven to improve the accuracy of the information we’re able to access. However, part of the challenge we’ve seen over the years—as Martin Gurri has nicely detailed in his book, “Revolt of the Public”—is that there seems to be some kind of rejection of the elites and professional classes—journalists, public officials and so on—because they’ve made some grave mistakes in the past; think of 9/11, the financial crisis and things like that.
They cannot be trusted—I think that’s the perception of the folks who are revolting against the elites—but you’re saying that we should trust this group anyway because they have a self-correcting mechanism.
RAUCH: You should trust the system of which they’re a part. Not in every individual case; not every individual answer is right. Of course not. The whole point of this system is that it makes errors but finds them quickly. The test of this system is, does it correct its errors? I would argue, for example, that the two most important things the United States government needs to do are to protect us from foreign attack and to put a floor under the economy when the bottom falls out massively.
I would argue that, in fact, our institutions have served us remarkably well. If you had told me on September 12, 2001, that there would be zero major successful terrorist attacks on the United States in the next 20 years, I would’ve said you were nuts. Well, guess who won the argument with al-Qaida? And when the bottom fell out of the U.S. economy, we didn’t have a major depression. That situation happened many times in the 19th century; unfortunately, it’s part of the financial cycle that there are these occasional big busts, runs on banks. The reason that didn’t happen this time but did happen in the 1930s is that the Federal Reserve and two presidents and the much-maligned Congress took some very strong actions to keep the economy together—and it worked.
I don’t disagree with someone like Martin Gurri that there have been real problems with these institutions because who could deny that? But there’ve also been major successes. There’s at least a major difference of emphasis between me and the predominant school, which is that institutions deserve to lose trust because they’ve done so poorly. I think actually the opposite is true. I ask people, “So, which country’s or era’s institutions would you rather have than ours right now?” Because they usually don’t want any others.
My view is actually—a better way to think of this is that there has been a concerted campaign that started in the ’60s and ’70s on the left, then migrated to conservative media and the conservative world on the right, and then became pretty all-embracing after that. There’s been a 50-year campaign to demolish the reputation and credibility of our institutions and to stop thinking institutionally at all—to just assume we don’t even need these intermediaries, that with more personal choice, you’ll always get better results.
If you spend 50 years targeting your institutions for abuse, calling them incompetent, saying that they’re a drag on freedom, that’s going to have an effect. Then, if you add to that the rise of the internet and the way it can turbocharge these deliberate tactics of disinformation, which are all about demolishing trust in society—dividing, disorienting, demoralizing, dominating—when you add all of that, you don’t just have, “Oh, institutions fail. That’s too bad.” You have, “No, this wasn’t suicide. This wasn’t death by natural causes. This was murder. This was an organized, sustained effort to undermine the institutions that we as a society depend on.” So, yes, let’s improve the institutions; they can and should do better. That’s always the case. But couldn’t we please first pivot and look at the people and organizations that are actively working to demolish trust in America?
Viewpoint Diversity
KLUTSEY: I think in terms of epistemic tribalism, one of the reasons the professionals involved in the constitution of knowledge aren’t trusted is that there isn’t as much viewpoint diversity in places like academia. You cited a poll that said that 8 out of 326 experimental social psychologists were conservative. This is similar to other surveys across multiple disciplines. I think you’d find that with the media as well.
I was wondering, will a Wikipedia model foster viewpoint diversity? Because one would think that they face the issue of groupthink and epistemic tribalism that you described earlier as well.
RAUCH: Wikipedia is really interesting because it’s an example of how you can translate the constitution of knowledge to a disaggregated, online world and do it very well. I’ve got a whole section on Wikipedia, which I won’t rehearse here, but the bottom line is Wikipedia is not an unstructured platform where anyone says anything. It’s highly structured.
There are hierarchies, there are levels, there are rules that keep people well-behaved and on the same page, give them incentives to stay reality-based. They have to show their work. If they abuse the system on more than one occasion, they can be suspended or kicked off. Everyone can be corrected at all times by anyone. It’s a huge network, which ensures that there’s plenty of diversity—the key word you’re getting at now, Ben, and it’s so important: diversity.
James Madison, father of the political Constitution, had many extraordinary insights. He’s basically an alien from outer space. One of his greatest was—so you’ve got, in any system, the problem that one faction might take over, using some of the techniques we’re talking about. How do you prevent that? Before Madison, the answer had been you want to have a small republic that can be well ordered and well supervised.
Madison says, “No, the opposite is true. You extend the sphere. You need diversity of politics. That ensures that no one faction can take over, get a monopoly of ideas; that there will always be plenty of raw material to do politics, to introduce new ideas, new groups, new coalitions.” Diversity is your friend—that’s one of the core liberal ideas.
The same is true in the constitution of knowledge. If you’ve got eight people in a room who all believe the same thing about topic X, by definition, they cannot do good science on topic X because science is about finding your errors, finding your biases. If people share their biases, then there is no one to test them. Science works—I mean science broadly defined; I mean the whole shebang, journalism, even law, government—this works by pitting antagonistic biases against each other. That means you have to have a diversity of biases.
Now, flash-forward to an American university campus today. Suppose you have a proposition like, “Affirmative action backfires against the people it’s meant to help.” That’s an empirical proposition. It deserves to be tested. It may be wrong, it may be right. But if you’re in an environment where everyone on that campus, either out of belief or out of intimidation, is saying, “You can’t say that here. That’s racist, sexist, homophobic, whatever, so we’re not considering that proposition”—that means you’re not going to test it, and that means you’re not going to do good science.
Part of what we need to do here is get people in the reality-based community, and especially in academia, to do a better job of cultivating all kinds of diversity—not just demographic diversity, but viewpoint diversity. That’s also true in our newsrooms, where sometimes there just aren’t enough conservative voices and sometimes, as we’ve seen in some famous recent episodes, where there are conservative voices, they’re called out, they’re silenced, they’re intimidated, they’re even fired.
Diversity is important for two reasons. One is just we’ll get it wrong if we don’t have people who disagree with us to check us. The second is we lose the public’s trust if the public starts to think, “Well, what the reality-based community is, is it’s a racket of people with one point of view who are trying to put that on the rest of us and indoctrinate us.”
COVID-19 and Collective Learning
KLUTSEY: When we think about the emergence of COVID-19 and the efforts to combat the disease, does that perhaps give us a sense of how well the constitution of knowledge has fared and, in some ways, its flaws as well? I’m thinking of the mobilization of brainpower globally to resolve it as a great positive, but I also think about some of the mistakes in the beginning and how dismissive the professional class was about the origins of it, particularly the lab-leak theory. Perhaps this illustrates both the positives and some of the challenges that it faces? What are your thoughts?
RAUCH: Call me Pollyanna, but I think this episode indicates positives all the way around. Remember, there’s nothing unusual about making mistakes, about jumping to wrong conclusions and then getting involved in echo chambers, and then not asking the right questions and then being wrong. But in most human societies, the way you correct errors is by killing the people or ostracizing or excluding the people who have them, but that only makes it worse, right? The only way you develop knowledge is—errors are going to happen, and the test of the system is do you find the errors? When you find the errors, can you correct the errors? Then, of course, you’ll make a whole new set of errors.
Lab-leak hypothesis—for reasons which I think actually, at the time, were good reasons, not bad reasons, journalists piled on too quickly into the idea that that hypothesis couldn’t possibly be true, that it was a nutty conspiracy theory. By the way, not all journalists did that—The New York Times did not make that mistake—but a lot of people did. But other journalists at The New York Times and The Washington Post and The Wall Street Journal stuck with the story; these were literate, fact-based scientific reporters.
They said, “This might have happened.” Partly as a result of their work, the media then comes back later and says, “Wait a minute, we got that wrong.” Then we have the giant post-mortem that’s going on right now—which is the reason you know about this and you’re asking me about it—which is you have this giant examination in the media of, “How did that mistake happen, where did we go wrong, and how do we prevent it next time?”
I’m not praising the initial mistake, of course. I mean, mistakes are not good. I am saying this is exactly what you want to have happen, and this is what never happens in a society that’s run by, say, an authoritarian dictator, just for example. They don’t correct their mistakes. They take the whole society down to the grave with them.
The other plus side is the one you mentioned, and that’s the vaccine in my arm that’s protecting me right now. Humans evolved for small tribal societies in which knowledge increased basically zero from generation to generation. Because of the constitution of knowledge and its ability to organize and harness error-seeking on a global scale, millions of people with countless resources—all able to swivel and focus on key questions at key moments—can bring that brainpower to bear in this search for error.
Because of that, humans are functioning orders of magnitude above our design capacity, and humans are now creating, I would argue—I can’t prove this, but I think it’s true—in a single morning, the human species now produces more knowledge than we did in the entire first 200,000 years of our history.
That knowledge includes the incredible ability, when a new virus comes along, to discover the genome over a weekend, design a vaccine over 10 days, and have it in people’s arms less than a year later. That’s the miracle of the constitution of knowledge, and that miracle is known as objective knowledge, human knowledge, which is independent of you and me, actually.
It is a thing. It is out there. It is bigger than any one of us. All humans could die off tomorrow, and in a thousand years, an alien civilization could land on our planet, decode our books, go through our databases and use the knowledge that we have found. This is a species-transforming technology.
KLUTSEY: That is great, and that’s positive all around. Reminds me of something that you also talk about in the book: “The Republic of Science” by Polanyi. That is considered to be something like a society of explorers, where each node—or each scientist or professional in that realm—is trying to solve a puzzle, and they will discover parts that work and then discover, at the other end of the spectrum, that someone else has discovered another part of the puzzle that makes sense, and they can put it together and they learn and—
RAUCH: Can we learn collectively?
KLUTSEY: Exactly.
RAUCH: I love that. That’s such an important point because the reality-based community, the system we’re just describing—people sometimes talk about gatekeepers, as if there was ever a time when any one individual or organization could vote an idea up or down and determine if it lives or dies. This is a social network, and it’s the greatest social network that’s ever existed by far. There are many, many ways through it. Many journals in which you get published, newspapers, magazines; many different ways that you could move through the courts, which are a key part of the reality-based community.
No one person gets to decide, but if they individually decide, as ideas work through the system, “Hey, this idea is better than that idea. We’re going to give preference to this one and pass it on to the next,” what you get is this vast network of people and organizations in which someone on one side of the world can come up with a hypothesis, and others around the world can demonstrate that it’s right. Then it can ramify, can pulse through the whole network, changing all kinds of other hypotheses as other people and institutions realize, “Okay. Well, now if that’s true, we have to adjust this.”
It’s almost a fantastic organism, like a giant biosphere, except what it’s doing is creating and sustaining knowledge. No other system can do that. Trolls can’t do that, cancelers can’t do that, disinformation attackers—they’re all parasitic, nihilistic. All they can do is destroy, undermine faith in institutions. They cannot create the vaccine that’s protecting me right now. That’s one of the great advantages of the reality-based community—it’s reality-based.
KLUTSEY: It’s easier to tear down than to build up.
RAUCH: Yes, it sure is.
Is Jonathan Rauch Optimistic?
KLUTSEY: Usually, this is a segment where I ask whether the guest is optimistic or not, but I would note that you mention in the book that what’s happening is cause for alarm, but not cause for fatalism. My guess is that you’re quite optimistic that a rebooted constitution of knowledge will limit the kind of polarization and the erosion of trust in facts and data that we’re observing right now.
RAUCH: I’m conditionally optimistic, otherwise known as hopeful, and here’s the reason: The constitution of knowledge, like the U.S. Constitution, is not self-maintaining. It just doesn’t automatically defend itself against purposeful attacks. People need to understand these attacks that we’re talking about, and they need to rise to the challenge.
Right now, we’re in a pretty dire situation: 70% of Republicans believe that the 2020 election was rigged, and therefore, by implication, America is no longer a democracy. They’re pushing this—which is a mass propaganda campaign on an unprecedented scale—they’re pushing this through every available channel. The courts, the Arizona recount that’s going on right now, Donald Trump, Republican politicians, conservative TV, conservative radio—they’re pumping this stuff out and undermining our democracy for political gain.
That doesn’t counter itself. People like you and me and many other people need to understand what’s going on and counter-organize, counter-mobilize and push back. There are tons of ways we can push back, and that’s already starting, to a very encouraging extent—there’s a bunch of stuff on that in the book—but it doesn’t happen automatically.
Frankly, I think if the liberal side—that is, the people who believe in the constitution of knowledge, in pluralism, in this critical, fact-based, reality-based system—if we get our act together and begin focusing on defending the constitution of knowledge and its precepts and going after these attackers with countermeasures, I think we squash them like a bug, frankly. On the other hand, if we don’t, if we just sit it out and say, “Well, it’s the fault of flaws in our institutions. Americans are sadly polarized, and gee, that’s too bad. Wish this weren’t happening,” then I think they win because they’re active, they’re organized and they’re using very powerful tactics.
A lot of this, I think, depends on whether we wake up to what these institutions of knowledge are and how we defend them.
The Role of Courage
KLUTSEY: As we get to the end here, I’d love for you to take a moment to reflect on the role of courage in all of this. In order to ensure that the constitution of knowledge doesn’t become dogma, and to push back against canceling and trolling, and to build institutions that will sustain it, that you were talking about, what is the role of courage in the context of—is there a call to action here in your book, and what do you want audiences to take away from this?
RAUCH: There are tons of prescriptions and calls to action and very specific ideas, many taken from actions that are being done right now. But the role of the individual is necessary, not sufficient. The reason it’s not sufficient is that any one individual cannot do much all by herself to counter, say, a disinformation attack. She needs the help of others. She needs institutions.
For example, in a canceling situation, what you’re usually talking about is a small but very well-organized and often quite disciplined group of people who dominate the rest of us by—if you raise your hand and say what they think is the wrong thing, they’ll come and try to get you fired. They’ll make your life miserable. They’ll get you investigated. They’ll accuse you of stuff. They’ll blacken your reputation. You know how it works. It’s very important for individuals to stand up against that, but they need the help of other people. You need to counter-mobilize so that people know that they will get the support and the mobilization so that they can organize and put pressure on in the other direction.
That’s starting to happen. We’re seeing a whole upswelling of groups and organizations that are pushing back against canceling, against illiberal woke-ism. You’re also seeing counter-mobilization against disinformation: watchdog groups around the world that are actually inside the conspiracy networks, figuring out where the next attack is coming from, notifying social media companies. Social media companies are, in turn, beginning to counter-organize, and that’s very important.
Now, to get to your other point. At the end of the day, none of that works if individuals don’t say, “I care about truth. I am not going to lie about someone in order to take them down. I am not going to join a cancel campaign. In fact, if someone I know, or someone in my academic community or my social world, is being attacked for having the wrong opinion and threatened not just with correction, like, ‘You’re wrong and here’s why,’ but ‘No, you deserve to lose your job’—even if I disagree with that person, I should step forward and say, ‘What you cancelers are doing to this person is wrong. This is bullying. This is propaganda.’”
If enough people do that and begin to stand up and begin to push back, life gets much harder for the cancelers and the propagandists. Because remember, what they’re ultimately out to do is demoralize the other side. If they can convince you that you’re in a small minority, that your beliefs are shameful and no one shares them, you’re demoralized. You feel helpless. If you’re flooded with disinformation at such a rate that you can’t possibly keep up if you’re a journalist, or even keep track if you’re in the public, you’re demoralized. If you’re canceled, you’re demoralized. Demoralization is demobilization. That means you feel weak and helpless; it’s futile to push back. That means you’re politically ineffective and inactive, and that’s the ultimate goal of all of these campaigns on all sides.
Individuals are the starting point of drawing the line and saying, “I know what you’re up to. It’s not going to work; you can’t demoralize me. I am speaking out, and I don’t care if you call me a homophobe or a bigot or whatever. I’m going to stand my ground.” That takes guts, that takes courage, and these people need support. You can’t do it by yourself. But yes, it has to start with individuals arming themselves against these kinds of epistemic attacks.
KLUTSEY: That’s wonderful because courage is critical to this.
RAUCH: Yes, I liken it to—courage is a good word, but I think an even better word is what our founders called civic virtue. That is to say, they understood the U.S. Constitution does not defend itself; that you need a population that understands the importance of being literate, getting facts before you vote, paying attention to public affairs; understanding that you’re not going to win in every situation; that the other party might win, but that doesn’t give you the right to overthrow the system.
There are all kinds of things that require—that put, actually, pretty heavy burdens on us as individuals, which is why Benjamin Franklin, when he was asked after the Constitutional Convention, “What kind of system have you given us?” famously replied, “A republic, if you can keep it.” That’s up to us.
The same concept of civic virtue applies to the constitution of knowledge. That means you don’t lie. You don’t make stuff up, even if it’s to your political advantage to do so. You don’t use bullying and intimidating tactics, even if it’s to your political advantage. You’re not truthy—you don’t say things just because you think they ought to be true. You don’t gang up on people. You accept criticism; you don’t have to agree with it, but you understand it’s going to happen and that others might disagree.
This all requires a burden on ourselves to look away from our tribal side, our instinctive side, and stand up for these values which are very demanding, but which are required if we want a free society that can actually produce knowledge and keep us moored to reality, and not have to settle our disagreements by going to war.
KLUTSEY: Thank you, Jon. I think this is a great place to end it. Really appreciate you coming on the podcast and talking to us about the book. The title is “The Constitution of Knowledge: A Defense of Truth.” Thank you, Jon.
RAUCH: Thank you, Ben. It was a pleasure to be with you.
KLUTSEY: Likewise.