The Importance of Critical Thinking and Civil Discourse in Today's Polarized World

Stanford’s Robert MacCoun shows how scientific habits can sharpen judgment and strengthen civil discourse in a polarized society.


In a world where confidence is rewarded and humility can feel like a liability, Stanford Law professor Robert MacCoun argues for something radical: fewer unwavering opinions, more critical reflection, and a better way to disagree. On Stanford Legal, MacCoun joins co-hosts Pamela Karlan and Diego Zambrano for a conversation about how “habits of mind” borrowed from science can help citizens, lawyers, and policymakers think more clearly and function more effectively in a pluralistic society.

MacCoun is the James and Patricia Kowal Professor of Law at Stanford Law School, a professor by courtesy in Stanford’s Psychology Department, and the university’s senior associate vice provost for research. Trained as a social psychologist, his work sits at the intersection of law, science, and public policy, with decades of research on decision-making, bias, and the social dynamics that shape how evidence is interpreted. In the episode, he draws on his most recent book, Third Millennium Thinking: Creating Sense in a World of Nonsense, co-authored with Nobel Prize–winning physicist Saul Perlmutter and philosopher John Campbell, to explain why probabilistic thinking, intellectual humility, and what he calls an “opinion diet” are essential tools for modern civic life.

This episode originally aired on February 5, 2026.


Transcript

Robert MacCoun: Most of the things scientists know, they know because other scientists have told them, and it’s a system of trust, and it’s a fallible system of trust. One of the suggestions we have in the book is that people put themselves on an “opinion diet” and that most people have way more opinions than they actually need to have.

Pam Karlan: This is Stanford Legal, where we look at the cases, questions, conflicts, and legal stories that affect us all every day. I’m Pam Karlan with Diego Zambrano. Please subscribe or follow this feed on your favorite podcast app. That way you’ll have access to all our new episodes as soon as they’re available.

A lot of great things come in threes. There’s the three wise men, the three tenors, and we’re going to have a discussion today about three guys whose professions all begin with P: a philosopher, a psychologist, and a physicist. Diego, why don’t you introduce our guest?

Diego Zambrano: Rob MacCoun is the James and Patricia Kowal Professor of Law at Stanford Law School, one of our colleagues and a professor in the Stanford Psychology Department. He also serves as the university’s Senior Associate Vice Provost for Research, and he’s trained as a social psychologist. We’re going to be talking about his book, as Pam mentioned, co-authored with Saul Perlmutter and John Campbell, called Third Millennium Thinking. And this is based on a wildly popular course that the three of them taught at UC Berkeley that’s all about how to use scientists’ tricks of the trade to make the best decisions and to solve difficult problems. And so maybe we should start with getting a broad overview of what motivated the book, Rob.

Robert MacCoun: So, it started with a message to call Saul Perlmutter about a decade ago. I immediately recognized his name as someone who got a Nobel Prize in physics for his work basically discovering dark energy, which is 60-something percent of the universe. And I couldn’t imagine why he was calling me. And my first thought was, ah, committee work, probably going to be asked to be on a committee. But he had this idea about teaching a class for undergraduates, an interdisciplinary course. And it was based on his sense of alarm that we were seeing more and more misinformation and disinformation and less and less critical thinking.

And he also felt that the tenor of arguments in popular discourse was very different than the tenor of arguments among physicists. Even though physicists passionately disagree with each other all the time, they manage to talk to each other without coercion or violence, and only a little bit of name calling. And so we tried to figure out what a course like this would look like, and I think it ended up really being about what is the proper authority for science in a democratic society? I think a lot of us think that science ought to play a really important role in social problem-solving when we’re trying to get things done. But we’re also nervous about the idea of some sort of epistocracy where scientists control our lives and make all the decisions. And the course is about the tension between democracy and the scientific system and trying to help citizens think about that in a constructive way.

Diego Zambrano: So one way to think about this is that you give the reader a cluster of tools to think about both public policy questions and individual decision-making in your regular life. And your point is that scientists have developed over time a lot of techniques for thinking about issues and the kind of evidence that you need to come to a conclusion about something. So give us some ideas of what you think the most important tools are. Just a taste.

Robert MacCoun: Sure. And one thing to say is that when we talk about the tools of the trade of science, we’re not talking about the methodologies taught in science methods books. We’re not talking about laboratory practices. We’re not saying citizens need to have a complete mastery of inferential statistics or any kind of laboratory protocols. We’re really talking about what we call the “habits of mind” of a scientist—the things that they learn from their mentors that aren’t actually written down in books.

And some of these will be familiar to listeners and others maybe less so, but I’ll just mention a few of them. We talk a lot about the distinction between noise and bias, and the importance of being able to distinguish between them. Noise is random, and noise is what I think most people intuit to be the real challenge of science: that life is just chaotic and noisy.

And one of the things we try to explain is that noise is actually the easier problem to solve. We know a lot about how to manage noise. A much more difficult problem is bias. And here, bias can range from your demographic category influencing the way you think about a problem, or what part of the country you grew up in, to who your mentor was and what orientation they gave you. So a lot of the book is really about trying to figure out bias.
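The noise/bias distinction MacCoun draws can be made concrete with a small simulation (a hypothetical illustration, not an example from the book): averaging more noisy-but-unbiased measurements converges on the truth, while no amount of averaging removes a systematic bias.

```python
import random

random.seed(0)
TRUTH = 10.0  # the quantity we are trying to estimate

def measure(bias=0.0, noise=1.0):
    # One measurement: the true value, plus any systematic bias,
    # plus a random (Gaussian) noise term.
    return TRUTH + bias + random.gauss(0, noise)

def average(n, bias=0.0, noise=1.0):
    # Average n independent measurements.
    return sum(measure(bias, noise) for _ in range(n)) / n

# Noise washes out as the sample grows; bias does not.
print(round(average(10_000, bias=0.0, noise=2.0), 2))  # lands near 10.0
print(round(average(10_000, bias=1.5, noise=2.0), 2))  # lands near 11.5
```

More data fixes the first estimate but only makes the second one confidently wrong, which is why MacCoun calls bias the harder problem.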

Another theme that runs throughout the book is the power of probabilistic thinking and moving away from getting trapped into thinking in absolute “either or” categories. And part of the power of probabilistic thinking is it provides a face-saving way for people to decide that maybe they’re wrong about something. If you categorically assert something is true, it’s very hard to later admit you’re wrong.

Diego Zambrano: Let’s get concrete on that because I think that’s a fascinating part of the book. Most of the time when we have discussions about, say, public policy, people have an opinion, right? Should we raise the minimum wage: yes or no? And generally, people speak as if it’s just a matter of supporting a policy or not. But your point, and this is something that’s been made in a lot of different contexts, is that we actually should quantify how confident we are about that belief. Instead of saying, “yes, I support a higher minimum wage because I think that won’t hurt unemployment,” you should say, “I have about a 65% confidence that this is a good policy and I support a high minimum wage because I don’t think it will affect employment, but it’s possible that it may, there’s still a 35% chance there that I’m wrong.” And when you quantify things that way, it leads to much better decision making and to much better thinking.

So why is that so helpful?

Robert MacCoun: For starters, it works even if you don’t quantify, even if you just use linguistic hedges, like “maybe” or “more likely than not,” or “probably.” The law is very careful about its words about certainty and probability and so on. But lay people are often, in ordinary life, very casual about it.

We would have students go through an exercise where they would try to have a conversation about something you might argue about at a dinner party, and actually quantify every statement. And usually the students, after a while, are just bursting out laughing because they…

Pam Karlan: … yes, it would seem to me … you’re right to talk about the linguistic rather than the numerical in a sense, because the numerical might almost reintroduce exactly the problem you’re trying to solve, which is people’s spurious belief that they absolutely know the answer, right? Because if you say “there’s a 62% chance of this happening,” that seems almost as certain, in a weird way, as saying, I’m certain it’s going to happen. Whereas if you say, “I think I’m pretty strongly convinced …” that would put you in a different position.

Robert MacCoun: Yes. I make a distinction between so-called lay life and scientific life. So, scientists, when they’re acting as scientists, really do need to quantify it. And they need to have some theory of error. And we talk in the book about examples where scientists get on Twitter and make extremely overconfident pronouncements with no hedging. But I think in ordinary life, yes, the hedging is the key part, not the numbers per se. It’s Cromwell’s plea: to think it possible that you might be wrong, just to consider the possibility that you could actually be wrong, and to realize how few things we actually really know, including scientists.

Most of the things scientists know, they know because other scientists have told them, and it’s a system of trust and it’s a fallible system of trust. One of the suggestions we have in the book is that people put themselves on an “opinion diet” and that most people have way more opinions than they actually need to have.

And I think for a lot of us, there’s this terror that we’ll be boring if we don’t have opinions—that opinions are what make conversation interesting, and I have some sympathy for that. It can be fun to play with an idea by playing devil’s advocate and taking an extreme position, but in fact there are relatively few things we really need to have an opinion on, especially if we actually don’t know. If it’s some new topic that’s just entered public discourse, like some new virus, the reasonable thing to say would be, “I have no idea. Hey, I really want to know more about this, but right now I have no idea.” And I think one of the things we saw in COVID was that overconfidence on both sides of the public debate.

Diego Zambrano: Yeah, there’s too many opinions. Although I just … I don’t want to spend too long on this point, but on the probabilistic thinking, the reason why numbers can be useful sometimes is because we don’t mean the same thing when we use the word “probably,” right?

There’s this apocryphal story about, I think it was the Kennedy administration during the Cuban Missile Crisis, where his advisors were telling him the likelihood that there were nuclear weapons in Cuba. But it turns out that while they used the same words linguistically—probably, likely—numerically they had completely different takes on the likelihood.

So that’s why it can be helpful. But of course, I agree with you, both of you, that socially it’s awkward to state a number on how confident you are on something.
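The mismatch Zambrano describes can be sketched in a few lines (the advisors and numbers here are invented for illustration): two people use the same hedge words while attaching very different probabilities to them.

```python
# Hypothetical numeric readings of the same hedge words by two advisors.
advisor_a = {"probable": 0.80, "likely": 0.70, "serious possibility": 0.30}
advisor_b = {"probable": 0.55, "likely": 0.60, "serious possibility": 0.80}

# The same word can hide a large disagreement that only numbers reveal.
for word in advisor_a:
    gap = abs(advisor_a[word] - advisor_b[word])
    print(f"{word!r}: A means {advisor_a[word]:.0%}, "
          f"B means {advisor_b[word]:.0%} (gap {gap:.0%})")
```

Stating the number, however awkward socially, is what exposes the gap that the shared vocabulary conceals.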

Robert MacCoun: What you just talked about is a major professional interest of mine. I’ve done a lot of research on trying to quantify reasonable doubt and preponderance of evidence, and clear and convincing evidence, trying to understand how people understand those concepts.

And it can be quite startling: my research suggests people think they’re applying a reasonable doubt standard, but numerically, what they seem to be doing in their decision making is much closer to preponderance of the evidence, even in criminal cases. So yes, the numbers do matter.
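One way to see the gap MacCoun’s research points to is with conventional numeric glosses on the standards of proof (the thresholds below are common textbook approximations, not figures from his studies).

```python
# Conventional numeric glosses on legal standards of proof (approximate
# textbook values, not MacCoun's data).
STANDARDS = {
    "preponderance of the evidence": 0.51,
    "clear and convincing evidence": 0.75,
    "beyond a reasonable doubt": 0.90,
}

def convicts(p_guilt: float, standard: str) -> bool:
    """Would a juror applying `standard` decide against the defendant
    given this degree of belief in guilt?"""
    return p_guilt >= STANDARDS[standard]

# A juror 60% sure of guilt clears a preponderance-like threshold but
# not a reasonable-doubt threshold; that is the gap being measured.
print(convicts(0.60, "preponderance of the evidence"))  # True
print(convicts(0.60, "beyond a reasonable doubt"))      # False
```

A juror who says “beyond a reasonable doubt” but behaves at the 0.51 line is, numerically, applying preponderance.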

Pam Karlan: Yes, it’s interesting, because the sort of standard jury instruction on reasonable doubt is, “It’s not something that is fanciful. It’s not something that’s crazy. It’s something that might keep you up at night, and it’s a doubt that, if it were about an important issue in your own life, would cause you to stop,” which is very different from what you get when you ask people for a number and they say 98 percent or the like.

Robert MacCoun: So I have argued, and I’ve got a new paper where I’m making the argument more forcefully, that reasonable doubt in juries is actually not an epistemic concept, it’s a social concept. It’s not…

Pam Karlan: Yes, I remember you gave a workshop on this that was just terrific.

Robert MacCoun: Yes. I believe, based on studies I’ve done, not just speculation, that what actually happens in the jury is not so much that each person is persuaded that there’s some doubt they didn’t previously have. Rather, they see that the other citizens, acting in good faith, seem to have a doubt, and so they say, “I believe the community has reasonable doubt because my peers have reasonable doubt.”

And this brings me to the second half of the book. The first half is on these habits of mind, but the second half is on habits of community, and we talk about how scientists can’t rely on habits of mind alone, because it’s too easy to fool yourself even with the best habits of mind. You need to ensconce yourself in a community of people who will call each other’s BS, who will hold each other accountable to their best standards of evidence. And that’s not always fun, but it leads to better science, and I think it leads to better citizenship. And it’s not always fun because it means getting together with people who completely disagree with me.

And if I’m operating in good faith, I have to listen to them and accept the possibility that they might actually persuade me that I’m wrong about something.

Diego Zambrano: I think that’s a fascinating part of the book that I want to get to, especially the Deliberative Democracy Lab at Stanford, which you talk about in the book. There’s also a more general sense of what you call scientific optimism, which pervades the entire book: a kind of “can do” spirit that you get out of trying to resolve problems, trying to figure out what the truth is. And you talk about how this should suffuse how we think about public policy questions. But a lot of research has shown recently that the news is a lot more negative than it used to be, and you see in the culture a kind of broader negativism. Why do you think that is? And is that a major reason why you wanted to write this book too?

Robert MacCoun: Yes. So, this is something that emerged in the course of working on the book; it took us a while to articulate this concept of scientific optimism. And I want to be clear: scientific optimism is not a belief that everything’s going to turn out fine. It’s not optimism about outcomes. It is an optimism that problems have solutions, that if you identify a problem, there is a solution out there somewhere if we work hard enough.

It’s not “we can just wait and everything’ll turn out fine.” We have to work hard. But scientists tend to, when they hear about a problem, assume it can be solved, and so they work together to solve it. And that may be the most important thing that we teach students in courses based on the book: the idea that problems actually can be solved. And I find that younger generations right now are very pessimistic about the possibility of solving problems. And here it’s interesting, because I think sometimes we can confuse pessimism and skepticism, and so I would distinguish optimism versus pessimism from credulity versus skepticism.

And so science is all about skepticism. It’s about knocking down bad arguments, trying to find weaknesses in arguments. And that’s like the brake pedal on your car. You need a brake pedal on your car, but if all you learn is the brake pedal, you’re missing the accelerator pedal.

And scientific optimism is the gas pedal that keeps you moving forward. And if you only have skepticism without optimism, you get cynicism. And there’s a lot of cynicism right now. One of the things I notice online is that the easiest way to look like you’ve won an argument is to take the most cynical position possible. It’s just very hard to top someone who takes the most cynical position possible. So you end up looking like the smartest person in the room when you’re cynical, but you also end up deflating everybody’s enthusiasm, everyone’s energy. And we’re now in a culture where we just indulge in all this cynicism, and so I think scientific optimism is a really important trait to cultivate.

Pam Karlan: And some of the cynicism seems to be about the scientific method itself. I’m just thinking about what’s happening in the Department of Health and Human Services as a kind of … object lesson of this.

Robert MacCoun: Yeah. So, part of the problem was brought on by scientists themselves, and we try to be open about that in the book. We talk about “pathological science” and we talk about scientific confidence. But you know what, scientists have become very isolated from the general community, and very elitist, and often very paternalistic. And that’s extremely off-putting. And a lot of people, I think, are disenchanted with science and are quite happy to read stories about scientists screwing up, or scientists who were biased, or the pharmaceutical industry making scientists say whatever they want.

And so people have …  I don’t want to overstate the case. Surveys still show that scientists have higher trust ratings than politicians, lawyers…

Pam Karlan: And people believe in a lot of the science, even if they don’t believe in the scientists in weird ways. You think about … modern life depends so much on a huge number of scientific innovations that people take for granted. They take for granted that the brakes on their car, to go back to your example, are actually going to work. They take for granted that the airplane is actually going to fly.

Robert MacCoun: I think people love science and people don’t love scientists. So I think there was a time where people…

Pam Karlan: It’s kind of like how people hate Congress, but they like their own congressmen. It’s kind of the flip side version.

Robert MacCoun: I think science has to meet … scientists have to meet people halfway and the fact of the matter is that the professional reward system of science can reward people for pompous behavior, but also for taking extreme positions and refusing to admit they’re wrong. And my own scientific discipline of psychology is the poster child for problematic science because of all the evidence of the replicability crisis.

Diego Zambrano: Yes.

Robert MacCoun: But I’m actually extremely proud of my field because of the way it responded to that crisis. People in psychology now are really at the forefront of developing new methods to try to minimize personal biases in the research to try to get ahead of that problem.

Diego Zambrano: So Rob, you’re at this intersection of social science and law. You teach at a law school, and we’ve been mostly talking about the scientific process. And we mentioned a few of the criminal law standards, but where do you think the legal process fits into this? Do you think law embraces many of these scientific heuristics and tools, or that we actually teach the opposite? We teach advocacy rather than truth-seeking, and so is that a problem? Are you indicting law, or do you think law doesn’t…

Robert MacCoun: So, it’s funny you ask, because just on Thursday I gave a talk at Yale Law School about adversarialism and inquisitorialism, this traditional distinction that law schools make between different forms of legal fact finding.

There is a point of view that inquisitorialism works best for science. There was a paper by Thibaut and Walker years ago called “A Theory of Procedure,” where they argue that science is about truth conflicts and requires inquisitorial methods. Law, they argued, is about conflicts of interest and requires adversarial methods.

And I think there are various problems with that stark dichotomy. First of all, every legal case is full of factual questions, as you both know, and scientists interact with the legal system as expert witnesses all the time. Second of all, there’s a lot of adversarial behavior in science. So I think talking about these as ideal types gets problematic. The rule of law and the rule of science are two pillars of civilization, and they share something very important in common, which is, in my view, that science and law are both proceduralism on steroids.

They’re both about not just imposing outcomes on people or just grabbing the outcomes you want, but they’re both about going through procedural steps which have a rationale to them that is believed to lead to better outcomes, but the better outcomes come from complying with procedures and not skipping the procedures.

And so this procedural way of thinking is something lawyers and scientists have in common, and it’s not actually the way businesspeople conduct their lives, or the way people in their ordinary life behave. So proceduralism is, I think, really important for both.

Diego Zambrano: I also think law schools, hopefully, do a great job teaching students how to think about the best arguments from the other side, steel-manning the argument from the other side, really coming up with… that’s what we should be teaching a lot.

Pam Karlan: Yeah. Even if you’re just thinking instrumentally about how to do what’s best for your client, understanding what the best argument is on the other side and thinking about how to respond to it, even if the other side doesn’t make that argument, is really important. I find all the time in the clinic, the brief on the other side of one of our cases won’t be very good, but we need to figure out what the best arguments are and figure out how to deal with those, especially because the decision-maker, the judge might come up with those arguments on his own or her own. And if you just rely on what the other side says, you could be missing something really quite critical.

Robert MacCoun: Yes. It’s such an important skill to cultivate. And here, I struggle with this: my own politics tend to be left of center, but I take very seriously the idea that universities have lost some legitimacy in the eyes of the public because we are seen as disproportionately representing just one side of political issues. And it’s a very uncomfortable topic for us all to confront, and I won’t say it’s easy to know what to do about it. But I do think, for our students, if all you do is argue with imaginary conservatives in your head, you’re going to win every time. What you need is to actually sometimes argue with real conservatives, not imaginary conservatives.

And part of the challenge here is that for argument across very extreme points of view to work, there have to be some ground rules of good faith, or it just doesn’t work. And at the end of the book we talk about the challenge of trying to argue in good faith, because for us, good-faith arguments require entering into the discussion actually open to the possibility that you might end up being persuaded by the other side. If the only purpose of the argument is to bully the other side into submission, then you’re not arguing in good faith, and that’s a challenge. It’s easier to talk about than to actually do. It’s very unpleasant to do sometimes.

Diego Zambrano: This point that you’re making refers back to what you said earlier about the habits of community, right? That you want to have communities that are intellectually diverse so that you get confronted with different ideas. And of course, in our profession, half of the judiciary is going to be appointed by one party and the other half by the other party, on average. And so lawyers have to face judges from different political persuasions and have to be prepared to make arguments that appeal to those judges. And so that’s a really important point.

Now, we’ve been talking so far about two things, and we’ve touched a little bit on the third one. First, what motivated the book: you mentioned the current state of information and misinformation, and the importance of scientific optimism, of a can-do mentality.

And then second, all the different tools that the book talks about: habits of mind, habits of community. Finally, the last part of the book touches on decision-making processes and deliberative democracy. We’ve touched a little bit on having expert communities. I wanted to ask you a couple questions about this. Before we get to the deliberative democracy point: what happens when you have an issue where reams and reams of evidence don’t settle it? And it actually is a factual question, not just values, though I do want you to talk about that too. So take the minimum wage example that I started with. Does the minimum wage increase unemployment or not? You have decades now of studies, split almost evenly on both sides. Meta-analysis upon meta-analysis, basically finding no clear takeaway from the research. Hundreds of economists, right, on both sides. We can’t settle it. What do you do in a situation like that?

Robert MacCoun: I always point out to my students that we only look to experts for hard problems. We don’t look to experts for trivial problems like whether it’s a good idea to wear a coat in the winter. You go to experts for the hard problems. And so these are going to be the problems that are very difficult to resolve. And in particular, when it comes to public policy, a lot of these problems are very difficult to resolve for ethical reasons, because we can’t just do experiments, with an actual intervention, that might shed light on some key questions.

So the issues can persist for a very long time. I think one of the key tools is provisional thinking: when you make decisions, you don’t necessarily have to make a decision for all time. You can make a tentative decision, and you can make it in a way that allows you to actually learn more by doing. So you can tentatively adopt a policy, but with a sunset clause under which you’re going to evaluate it. You’re going to set some goals for things you would expect to see if the theory of the case is correct. And if the evidence starts coming in looking like the theory of the case was wrong, then you revisit what you do.

But we tend to act as if every decision is for all time. We’re going to adopt rent control for all time, or we’re going to abolish rent control for all time, as if it’s a dichotomous variable that’s locked in place, as opposed to a dial that we may need to keep adjusting. And I think most real public policy problems are not problems you solve; they’re problems you manage. They’re dials that you’re constantly twisting. We want to finally reach this equilibrium where everything is perfect, but I don’t think a mature outlook on public policy would lead you to expect we’re ever going to reach that point. You’re always going to be twisting the dials and testing your ideas.

Diego Zambrano: And I, of course, agree with that. Experimentation, changes … although something like the minimum wage is particularly difficult because it’s sticky: once you raise it, there’s no way a politician is going to lower it. So there are…

Pam Karlan: No, but all you need to do is wait for a time of inflation and not raise it, and that’s the equivalent of lowering it. But I think you’re absolutely right that policies can become very sticky. And that’s the difference in some sense between statutory law and common law, where with common law it’s much easier to move the dials around.

Robert MacCoun: Yes, and to bring this back to probabilistic thinking: it tends not to dichotomize things. It tends toward shades of gray. But it’s often very difficult to make rules in terms of shades of gray. We feel a need to draw bright lines and to force reality to comply with dichotomies that might not actually exist.

Diego Zambrano: I wonder if you could talk a little bit about the Deliberative Democracy Lab, and maybe the decision-making process that you outline in the last chapters of the book, where you have both expert input and community input for values, and then you try to bring factual thinking and values thinking in at different stages of a process, and then you make a decision.

Robert MacCoun: Yeah, so at the end of the book we consider a variety of different ideas that are being tested for trying to improve public deliberation about real-world problem-solving. I won’t talk about all of those now, we don’t have time, but James Fishkin’s idea of the deliberative opinion poll is fascinating. The idea is that you bring citizens together for an extended period of time, where they get briefed by experts on the issue, get to ask the experts questions, and get to educate themselves before they actually weigh in and state their opinions. And citizens who participate in this, their views often change quite a bit throughout the course of the process, and sometimes their views don’t change. But when you then poll the people, you’re polling people who actually have learned a lot about the issue and have thought very carefully about it. It also has the nice effect of reminding the experts that they’re accountable to a public, and that their claims have great import in public life. And so it cultivates, I think, an important sense of responsibility among scientists. So I think it’s a very exciting model.

Pam Karlan: Thanks to our guest, Rob MacCoun. This is Stanford Legal. If you’re enjoying the show, tell a friend and please leave us a rating or review on your favorite podcast app. Your feedback improves the show and helps new listeners to discover us. I’m Pam Karlan with Diego Zambrano. See you next time.