Curiosity Daily

Inside Pseudoscience And Conspiracy Theories

Episode Summary

Renowned skeptic Dr. Steven Novella, host of The Skeptics' Guide to the Universe and author of the NeuroLogica Blog, joins the Curiosity Podcast to explain how to tell the difference between reality and fantasy – and why it's sometimes hard to do so. He brings years of experience as a neuroscientist and researcher to take a look inside the minds of both skeptics and those who believe in pseudoscience and conspiracy theories. Dr. Novella is an academic neurologist at Yale University School of Medicine. In addition to his work on The Skeptics' Guide to the Universe podcast, he is the president and co-founder of the New England Skeptical Society. His NeuroLogica science blog covers news and issues in neuroscience, but also general science, scientific skepticism, philosophy of science, critical thinking, and the intersection of science with the media and society.

More from Steven Novella: The Skeptics' Guide to the Universe; New England Skeptical Society; NeuroLogica blog.

Additional resources discussed: "The Righteous Mind: Why Good People Are Divided by Politics and Religion"; The Skeptic Movement; Why Is There a Skeptical Movement?; Carl Sagan's Life and Legacy as Scientist, Teacher, and Skeptic; Science Curiosity and Political Information Processing (Study); Bill Nye Had a Fixed View on GMOs. Then Something Happened.

Follow Curiosity Daily on your favorite podcast app to get smarter with Cody Gough and Ashley Hamer — for free! Still curious? Get exclusive science shows, nature documentaries, and more real-life entertainment on discovery+! Go to https://discoveryplus.com/curiosity to start your 7-day free trial. discovery+ is currently only available for US subscribers.

Episode Notes

Renowned skeptic Dr. Steven Novella, host of The Skeptics' Guide to the Universe and author of the NeuroLogica Blog, joins the Curiosity Podcast to explain how to tell the difference between reality and fantasy – and why it's sometimes hard to do so. He brings years of experience as a neuroscientist and researcher to take a look inside the minds of both skeptics and those who believe in pseudoscience and conspiracy theories.

Dr. Novella is an academic neurologist at Yale University School of Medicine. In addition to his work on The Skeptics' Guide to the Universe podcast, he is the president and co-founder of the New England Skeptical Society. His NeuroLogica science blog covers news and issues in neuroscience, but also general science, scientific skepticism, philosophy of science, critical thinking, and the intersection of science with the media and society.

More from Steven Novella:

The Skeptics' Guide to the Universe
New England Skeptical Society
NeuroLogica blog

Additional resources discussed:

"The Righteous Mind: Why Good People Are Divided by Politics and Religion"
The Skeptic Movement
Why Is There a Skeptical Movement?
Carl Sagan's Life and Legacy as Scientist, Teacher, and Skeptic
Science Curiosity and Political Information Processing (Study)
Bill Nye Had a Fixed View on GMOs. Then Something Happened.

Follow Curiosity Daily on your favorite podcast app to get smarter with Cody Gough and Ashley Hamer — for free! Still curious? Get exclusive science shows, nature documentaries, and more real-life entertainment on discovery+! Go to https://discoveryplus.com/curiosity to start your 7-day free trial. discovery+ is currently only available for US subscribers.

 

Full episode transcript here: https://curiosity-daily-4e53644e.simplecast.com/episodes/inside-pseudoscience-and-conspiracy-theories

Episode Transcription

CODY GOUGH: I'm curious, how is a person supposed to know what's true?

 

STEVEN NOVELLA: Yeah, that's the $64,000 question, right? It's a process. You never really arrive at any final conclusion. It's just that you question everything and just always remain open to the possibility that you're wrong and try to keep becoming less wrong as we say. Like, you're never definitively correct. It's more that we're just trying to minimize error as much as possible.

 

[MUSIC PLAYING]

 

CODY GOUGH: Here with the award-winning curiosity.com, I'm Cody Gough.

 

ASHLEY HAMER: And I'm Ashley Hamer. Today we're going to learn about scientific skepticism.

 

CODY GOUGH: Every week, we explore what we don't know because curiosity makes you smarter.

 

ASHLEY HAMER: This is the Curiosity podcast.

 

[MUSIC PLAYING]

 

CODY GOUGH: For some people, pseudoscience and conspiracy theories are kind of fun to talk about, but others take things a little too far. And when they do, why can't you convince them that a fact is a fact? Here to discuss, my guest this week is Dr. Steven Novella, an academic neurologist at Yale University School of Medicine, and one of America's favorite skeptics.

 

I'm here with Steve Novella. He is the founder of the New England Skeptical Society, the host of The Skeptics' Guide to the Universe podcast since May 2005, and author of the NeuroLogica blog. As if he wasn't busy enough, I think the most important title here is also a pen-and-paper role-playing gamer.

 

STEVEN NOVELLA: Occasionally.

 

CODY GOUGH: Occasionally. What's your poison right now?

 

STEVEN NOVELLA: My brother Jay and I are running a Star Wars campaign for some friends of ours. So we're sort of co-GMing that. And I sort of reconnected with my medical school friends, because we had a campaign at that time and we've been trying to run our old campaign completely online. So we're doing it-- because they all live all over the country, so we can't physically get together. But it's actually working out pretty well.

 

CODY GOUGH: That's really incredible to me, because your whole, I guess, career or personality is based around being skeptical. But you have a pretty strong interest in science fiction and fantasy and things like that.

 

STEVEN NOVELLA: Oh, yeah, absolutely. Yeah. I'm a total nerd when it comes to all things science fiction and fantasy. But I think that not only is that not incompatible with being a skeptic, a rationalist, a scientist, but I think it's actually helpful. Because I see a lot of people who also crave something fantastical, something interesting and entertaining in their life, but they confuse it with reality.

 

I think a lot of people who believe in pseudoscience or conspiracy theories or whatever, they're just bored. They should seriously just take up role playing and then cleanly separate their fantasy from reality.

 

CODY GOUGH: That actually makes a lot of sense. So a lot of the conspiracy theorists are just thinking 9/11 is some big thing because they aren't watching Game of Thrones.

 

STEVEN NOVELLA: I think it's part of the appeal. I don't think they would see it that way. But we know from research, actually a recent study in fact, that a lot of people believe in conspiracy theories because they want to feel special. And I think also because they just want the world to be a more interesting place.

 

It's easy just to get into your boring routine, and who wouldn't want the Earth to be visited by aliens or for there to be some big secret thing that you're privy to? Like you're one of the few people who know what's actually going on. It's massively entertaining. It takes over their lives often.

 

ASHLEY HAMER: There have been a few studies into what makes people believe in conspiracy theories. In 2008, Jennifer Whitson and Adam Galinsky made a group of volunteers feel a lack of control by showing them random cards and making them choose the correct ones without giving them the rules.

 

Then, they tested to see how likely they were to see imaginary patterns, like images in noise and correlations in stock market data. People who felt that lack of control were more likely to make up patterns that weren't there.

 

And in 2016, Princeton University researchers Damaris Graeupner and Alin Coman found that when they made people feel socially excluded, either by having them remember a time they felt that way or by telling them another volunteer rejected them as a partner, those people were more likely to agree with conspiracy theories and demonstrate superstitious thinking. Socially isolated, with no control over your life. There's a reason that the stereotypical conspiracy theorist is a loner in his parents' basement.

 

STEVEN NOVELLA: And I have no problem with people finding entertainment in whatever they want to do to make their lives more interesting and happy. But yeah, I think the big risk comes when they want to believe it so much that they start confusing it with reality. And I think they just need to invest more time in some hobbies, some benign hobbies that don't involve distorting reality.

 

CODY GOUGH: That makes a lot of sense. So I've got to ask, how is a person supposed to know what's true?

 

STEVEN NOVELLA: Yeah, that's the $64,000 question, right? And there's no easy answer to that. I think the short answer, or short-ish answer, is that it's a process. You never really arrive at any final conclusion. It's just that you question everything, you follow some kind of valid process to try to figure out what information is reliable and valid and what information perhaps is sketchy or not reliable.

 

And just always remain open to the possibility that you're wrong and try to keep becoming less wrong, as we say. Like, you're never definitively correct. It's more that we're just trying to minimize error as much as possible.

 

CODY GOUGH: But you said that you need to be open to being wrong. Nobody wants to be open to being wrong.

 

STEVEN NOVELLA: Yeah, it certainly goes against our basic psychology. We want things to be simple and we want to be correct. Those are often at cross purposes to each other. Because that being-- something being understandable, simple, gives us a sense of control. It allows us to mentally check that box. Yeah, I get this. I understand that. I totally have my head wrapped around it. I'm going to take this as reality and then move on and not worry about it. That's nice. It reduces the chaos and complexity of our lives.

 

And we have to do that to some extent. But the risk is when you get to the point where you're no longer able to consider the possibility that you're wrong about something, but you always have to be open to that possibility. Because chances are you are wrong about most things, or at least your understanding is massively incomplete.

 

And unless you are the world's expert on some narrow topic, there are other people who are going to know a lot more about it than you do. So anyway, I think that the bottom line is that you should always be in a state of learning and never think that you've ever arrived on any topic or on any idea. And that's probably the best state to be in.

 

CODY GOUGH: I agree that's a good state to be in. But you mentioned questioning everything. And I'm kind of curious about the motivation behind that. If I tell you I believe in God or something like that, is your immediate reaction that you clench up and you think, that can't possibly be logical? I mean, what's the motivation behind being so skeptical? Is it because you like being a contrarian, or do you like when things are accurate? What's the motivation?

 

STEVEN NOVELLA: I think a lot of people confuse being a contrarian with being a skeptic. I have encountered this, especially online: people who I would characterize as contrarians, who fancy themselves skeptics, but they're not. I think scientific skepticism ultimately is about having opinions and beliefs that are as close to reality as possible.

 

And being a contrarian means taking a contrary view just for its own sake. And so sometimes that will involve denying beliefs which are probably close to being true, or taking a position that's actually farther from the truth just because it's contrary.

 

And also it makes you predictable in that way. If you're always taking the contrary view, that's actually a simplistic approach as well. It's easy to do that. It's kind of knee jerk. It's like it's easy to be cynical. It's kind of cheap. You're avoiding a lot of the hard intellectual work it takes to thoroughly evaluate something, to really question, to ask, God, is this really, really true?

 

You bring up belief in God, which is tricky because a lot of people believe in God not because they want to have beliefs which are true but for other reasons. Because, whatever, they think it's a virtue to have faith, or because they want or think that there's something deeper to the universe. It's complicated. It's very cultural, personal, ideological. It's not purely people wanting to believe what's true.

 

And so you have to ask, if someone says they believe in God, a lot of it depends on who are they to me? What's my relationship with them? And if it's such a relationship that I could take it further, the next thing I'd want to know is why? And if they say, well, I just choose to believe because it makes me feel good, then fine. That's a personal choice they could make for themselves.

 

If they tell me, because I have scientific proof that God exists, that's at the other end of the spectrum. Now they're in the scientific arena. It's like, OK. Well, if you're going to make a scientific claim, now you have to defend that position with logic and evidence. If you just make a faith-based claim of personal choice, I can't really address that in any way in terms of being a skeptic. That's just a personal choice.

 

CODY GOUGH: So your particular brand of skepticism then is more about a curiosity and wanting to explore the why versus just being right about something.

 

STEVEN NOVELLA: Yeah. I mean, I try to be correct. I mean, that is part of the motivation. I don't want to believe stuff that's wrong. But it is also deeply interwoven with scientific curiosity. So scientific skepticism, a lot of which is modeled after Carl Sagan but we've evolved I think a lot beyond that point, is about combining those two things.

 

Cutting away what's probably not true, but being really enthusiastic about how amazing our universe is and how fun and interesting it is to explore it and to have that curiosity. So I think Sagan, as a science communicator, was probably the best at simultaneously combining those two things.

 

ASHLEY HAMER: Carl Sagan is definitely the most famous figure in scientific skepticism. But when you trace its history, you find a lot of founding fathers. Most people generally put the start of the modern skeptics movement in 1976, when philosopher Paul Kurtz, along with Sagan, the magicians James Randi and Martin Gardner, psychologists Ray Hyman and B.F. Skinner, and others founded the Committee for the Scientific Investigation of Claims of the Paranormal, or CSICOP.

 

But things really took off in the 2000s with the rise of the internet. And some estimates put the number of skeptics today at more than a million worldwide. In any case, Sagan's style of skepticism was less about exposing errors and more about showing what was correct through the wonder of science. That's why he continues to be such an inspiring force years after his death.

 

STEVEN NOVELLA: And we're always trying to balance that as well. Like we're science enthusiasts, we're technophiles. We're always interested in the next big thing. But at the same time, we have to go, yeah, but we can't get so enthusiastic that we believe in something prematurely or we forget to ask important questions.

 

And that's basically the scientific attitude. Every working scientist I know combines those two things. You have to be curious and want to know how reality works. But at the same time, you have to really be dedicated to a harsh, no-nonsense process of cutting away error and what isn't true. Otherwise, you veer towards pseudoscience. You end up just validating what you want to be true. And it's really easy to do that, unfortunately.

 

CODY GOUGH: Isn't it also easy to mix up real science and pseudoscience if a couple studies maybe disagree on the results of a particular experiment even within the scientific community? Isn't that difficult?

 

STEVEN NOVELLA: Oh, it's massively difficult. It's a mess. It's chaos. First of all, there's no demarcation. There's no sharp demarcation between science and pseudoscience. And philosophers of science literally call that the demarcation problem.

 

There's just this fuzzy gray zone between the two ends of a continuum, a spectrum with pretty solid science at one end and completely worthless, rank pseudoscience at the other. But there's a lot of stuff in the middle. No study is flawless. Every study has limitations if not outright flaws.

 

Scientists are people and they make a lot of mistakes and they have a lot of biases. And no one study is ever going to give you the definitive complete answer to anything. So that means there's always room for judgment.

 

You have to think about, what does all the evidence show? What's the quality of that evidence? What do people who disagree with it have to say? And why do they disagree with it? How solid a consensus is there? What really is the evidence for this, or is this just our placeholder for now? Is it more just a model that really hasn't been shown empirically yet?

 

You have to ask all of these questions and then really try to understand what the experts say to arrive at some kind of reasonable opinion about any scientific question. And again, that takes a lot of work. There's really no shortcut or simplistic way to do that.

 

And so it opens the door for denialism and pseudoscience and sloppy science, and bad science, biased science of every variety. And it's easy. It's really easy to get lost in that. It really is a very high-energy state to maintain a robust evaluation of scientific claims.

 

Most of the time for most topics we really should just be trying to understand what the scientific consensus is because we don't even have the background to do a deep dive on the evidence itself.

 

CODY GOUGH: So I also think, in addition to all the scientific factors that are weighing in, there's also the outside influences. I think about some of America's earlier days, in the early days of television and radio, you would have advertisements saying 9 out of 10 doctors say that this cigarette is the best, and talking about how cigarettes aren't dangerous and things like that.

 

Then you've got people today that take that, use it as an example, and say, hey, listen. The cigarette people just paid off a bunch of scientists. Well, now climate change. That's the whole thing. Well, aren't a bunch of solar power investors and wind power investors just paying off all these scientists to back up climate change?

 

STEVEN NOVELLA: So it becomes very easy to do that, to deny anything you don't like by saying, oh, it's big pharma, it's big whatever. Big solar, which is ridiculous. Or it's just that scientists are trying to get funding for their research. I mean, you could invent a secondary gain, a motive, to deny anything. There's always a bogeyman that you can point to.

 

So the fact that you can make up a shell argument like that doesn't really tell you much. You have to look for evidence that it's actually going on and try to figure out, what does the scientific evidence say despite these accusations?

 

So there isn't any evidence that there's any big conspiracy of the world's climatologists to fake climate change. Certainly the solar industry doesn't have the resources to be controlling the world's scientists. If anything, the fossil fuel industry with its billions of dollars, they're putting their thumb on the scale.

 

But they are unable to do it. They are unable to get the world's scientists to agree with their agenda. They're trying, but they can't do that. So they've had to satisfy themselves with causing doubt and confusion.

 

And they're actually using the same people in many cases and certainly the same strategy, as the tobacco industry did, to try to delay any kind of regulation against tobacco by just sowing doubt and confusion. Well, we're not sure. There is no consensus.

 

Look at this scientist over here, he disagrees. Just trying to muddy the waters as much as possible. And if you want the waters to be muddied because it goes along with your ideology, yeah, you really can. It's kind of prepackaged for you.

 

Here you go. Here are five reasons to deny this science you don't want to believe in the first place. You have to have a real dedication, again, to understanding what the science actually says to rise above that.

 

Conspiracy theories are just too cheap and easy. I mean, you can't-- if you're just invoking it out of whole cloth just to deny something, your chances of that actually being reality are slim to nothing.

 

All you're doing is then taking one step back, but you have to then provide evidence that there's actually a conspiracy. And of course, they can't do that because there isn't one because the idea is actually kind of silly if you know scientists.

 

That isn't to say scientists are never biased. They are. It just takes a long time for everything to come out. You need a lot of scientists from different parts of the world, different kinds of approaches, and their biases will tend to average out over time.

 

I'm always asking myself, is this a mature science? Is this mature to the point where we've sorted out a lot of the controversies and biases and we're getting to a consensus that is robust? And that's a continuum. There's no sharp demarcation line.

 

So you should always be thinking, not just what does the science say, but how reliable and robust is it? How much of a consensus is there? How mature is it? And not just look for trivial reasons to deny it because you don't like what it says.

 

CODY GOUGH: I know you mentioned that you see a lot of those kinds of arguments online. It seems to me that things are taken out of context more than they ever have and people are using straw man arguments more than they ever have. You've been a skeptic for a long time. Do you think that things are worse than they used to be in terms of the conversation?

 

STEVEN NOVELLA: No. I think overall people have always been making straw man arguments and denialist arguments. And going back to the beginning of my skeptical career, 20-plus years ago, it's the same. I just think more people maybe are getting involved, or it's easier to engage with a lot of people, so you're seeing it more. But I think that people are fundamentally the same.

 

I think maybe politically things may have become a little bit more divisive or polarized. But in terms of how people argue, I haven't seen a big difference there. I do see a lot more awareness of the tools of skepticism, which is a good step in the right direction. I think people are a lot more aware of cognitive biases and heuristics and logical fallacies, but I don't think they understand them deeply enough.

 

And so they have a sophomoric understanding, a lot of people, say, of logical fallacies, which is a good start, but you're not quite there. And so it becomes really easy to shut down meaningful discussion by simply labeling something a logical fallacy.

 

ASHLEY HAMER: Wait, wait, wait. What's a logical fallacy again? It's a principle in philosophy that's defined as an error in reasoning that makes an argument invalid. You may have heard the term non sequitur, which literally means does not follow. All logical fallacies are non sequiturs because their conclusions don't follow their premises in a logical way.

 

One famous one, and one Cody and Steve use a lot in this interview, is the straw man fallacy. That's where you argue against some misrepresentation of someone's position instead of the one they actually hold. Instead of trading blows with the idea, you're attacking a straw man you made of it. There's also ad hominem, where you attack the person instead of their idea. And the appeal to authority, where you say your argument is valid simply because some important person says it is. For a deeper dive, just search fallacy on curiosity.com.

 

STEVEN NOVELLA: And they're really not optimized to be used that way. You really shouldn't try to use a logical fallacy that way. You should use it to understand your own thinking and your own arguments better and to make them better and more valid, tighter, stronger arguments.

 

Just using it to say, oh, that's a straw man, or that's an ad hominem, just to throw a label, anyone can do that. And in fact, the denialists and the pseudoscientists are doing that. They immediately adopted the verbiage of logical fallacies and biases that skeptics use and then twisted it to their ends by using them very superficially. Using them wrong, essentially.

 

So I think that we're halfway there. I think the knowledge of these tools of critical thinking is much more out there than it used to be, but we need to keep pushing to raise understanding of these critical thinking tools to a much higher level.

 

Certainly a lot of individuals do, I think, have a more nuanced or sophisticated understanding. I see a lot of that too. So I think we're actually making a lot of progress, but there's still a long way to go.

 

CODY GOUGH: That's encouraging to hear from you, that you think we're maybe starting to steer the ship in the right direction. If you had your wish of maybe the next step that people would take, what should people know and be aware of that will help us continue that progress?

 

STEVEN NOVELLA: Again, I don't think there's any one thing, just like all the things that we've been talking about, knowledge of science or anything. It's a lifelong process. And maybe that's the thing that I would want people to know, is don't stop. Don't think because you could name 20 logical fallacies that you're done, that you're a critical thinker or you're a skeptic.

 

Because when you first learn these things, you're going to use them wrong. You're going to use them just to validate what you already believe. You have to get to a pretty high level before you actually start making your own beliefs and arguments more valid, more rational.

 

So it's the same thing with science, and this has been studied as well, interestingly. Again, 30 years ago, we might have said, oh, yeah, the more people understand science, the less they'll believe in pseudoscience. But then we studied that. I mean, the royal we. Psychologists studied that. And they found that that relationship isn't that simple.

 

That the more people understand science based upon their level of education, actually the more they believe in pseudoscience. Until you get to the highest levels, like post-graduate science education, then belief in pseudoscience plummets.

 

ASHLEY HAMER: A 2016 study from Yale bears this out. They used a basic science quiz to measure volunteers' scientific knowledge, then let them choose something to read to measure their scientific curiosity. Those who read about science over sports or politics got a higher curiosity score.

 

Next, the participants rated how concerned they were about a variety of scientific but politically charged issues, like global warming and fracking. Unsurprisingly, Democrats were more likely to judge those two issues as risky and Republicans were less so.

 

The researchers found that the more scientific knowledge a person had, the more likely they were to be polarized on the issues. But one thing made the Democrats and Republicans get the closest to meeting across the aisle: curiosity. Dems and Republicans with the most scientific curiosity were the least polarized. Maybe we should change our tagline to curiosity makes you less biased. Doesn't quite roll off the tongue.

 

STEVEN NOVELLA: And I think it's the same thing with critical thinking. I think as people get more critical thinking tools initially, they just use them to justify their own beliefs. They get more confident in their own wrong beliefs and are better able to defend them. They're basically just more sophisticated and intelligent about rationalizing what they want to believe in the first place.

 

And you have to get to a pretty high level before you actually start challenging your own beliefs and changing your own beliefs, making them better, more nuanced, more versatile, and more legitimate, more robust because you've actually applied those critical thinking tools systematically to your own arguments and your own position.

 

So you have to sort of pass through that initial phase of just using this knowledge to be a more sophisticated believer in the same nonsense you've always believed. And some people, I think, get stuck there. So keep pushing through, I guess, is the bottom line.

 

CODY GOUGH: And it's almost like no matter how good the science is or how sound the scientific study is, it almost comes down to the psychology of how to communicate with people and how to convince people to change their way of thinking, right? I mean, do you look at a lot of the psychology of this kind of stuff?

 

STEVEN NOVELLA: Yeah, absolutely. From the beginning, we've always had a very keen interest in social psychology and in the psychology of belief. It's always been part and parcel of skepticism because essentially we are science and critical thinking communicators. And so we're trying to not only do science communication, but get people to think more critically and understand critical thinking.

 

And the only way to really do that is to understand the psychology of your own belief. Because if you're a motivated believer, we call this motivated reasoning, it doesn't matter. There's no way I can give you information to get you out of that belief.

 

You have to, at some point, have some insight into your own psychology. Otherwise, again, you're just going to use all of these tools just to reinforce the belief that you want in the first place. So you have to at some point confront the psychology of belief.

 

And being a skeptic means that you consciously prioritize having beliefs which are valid over beliefs which are not valid. You have to care more about the process of how you go about evaluating beliefs than any particular conclusion. You have to relish being proven wrong as an opportunity to change your belief and to make it less wrong. To make it better.

 

And you have to sort of turn that psychology around. Because I think we inherently start out being curious, but at the same time being really defensive about things we already believe and things we want to believe. That's a massive confirmation bias and what psychologists call a desirability bias. And until you turn that on its head, you're just going to be servicing those beliefs, not really being skeptical.

 

CODY GOUGH: I mean, I think of the book The Righteous Mind by Jonathan Haidt. I don't know if you've read that.

 

STEVEN NOVELLA: I haven't read that one.

 

CODY GOUGH: Oh, OK. Yeah, the subtitle is Why Good People Are Divided by Politics and Religion, and it talks about what you were saying, that humans are basically-- we have this belief system. And when something challenges it, it's almost like our brains do backflips trying to rationalize my idea-- not fitting that into reality but fitting that into my reality and trying to rationalize that.

 

STEVEN NOVELLA: Yeah, absolutely. Yeah, we call that motivated reasoning. We're really good at it. And ironically, smart people are better at it. That's what I was saying before. If you're better educated, you have some knowledge of science, some knowledge of critical thinking, that just feeds your motivated reasoning until you get to the point where you really start to turn it inward.

 

You actually not only are OK with the idea but almost welcome the idea of changing what you believe, because that's a marker that you're moving. You're progressing. You're moving in, hopefully, the right direction. If you're just using it to justify beliefs that you've had for a long time, chances are you're just engaging in motivated reasoning.

 

Another way to look at this: I'm always the most suspicious of beliefs that I have, conclusions I come to, that are in line with my own ideology. So if I have a particular worldview and this supports my worldview, I have to be especially suspicious of it, because that's where I'm going to be most vulnerable. That's when my motivated reasoning is going to try hard to engage.

 

And also just confirmation bias. I'm going to want to just, oh, yeah, that supports what I believe and want to believe. So yeah, I'll believe that it makes sense. I'm not going to question it. But that's exactly when you should question it the most.

 

CODY GOUGH: Wow. That is a tall order.

 

STEVEN NOVELLA: Yeah, it's a high-energy state, absolutely. And it's easy to-- it takes a lot of vigilance, a lot of practice, and a lot of dedication. Yeah, it's a lifelong process. There's no shortcut to that. You just have to really be dedicated to policing your own thinking.

 

CODY GOUGH: And for that kind of thing, do you recommend-- some people on social media will say, well, you know, I watch CNN and Fox News to get both sides of the story. Do you find that kind of behavior helpful?

 

STEVEN NOVELLA: Absolutely. I mean, I never get to the point where I feel even a little comfortable about any topic unless I've listened to what all sides have to say about it. So my process partly is, all right, so what are the different points of view? What are they saying about it? OK. Well, what does this group say about it? What is the other group saying? What do they say in return?

 

And you try to work all the way through that process until you see OK, who has the most consistent upper hand here? Who's making the better arguments? Who has the last word when it comes to the evidence or logic? And who's like resorting to conspiracy theories or like horrible logic in order to defend themselves when they've essentially been defeated in their position?

 

But you have to work through the whole thing to really understand that. So yeah. I mean, I can't listen to Fox News for that long because-- I do. I watch Fox News more than I'd like actually, but just to try to-- or I read lots of articles on both sides. I do more reading than watching. But I really try to wrap my head around what the "other side," or what both sides, are saying. Because otherwise, you're going to probably be attacking a straw man yourself.

 

So give people the benefit of the doubt. Really try to understand their position, try to understand the best version of their position and go from there. Because otherwise, chances are you're going to be just attacking a straw man to confirm what you want to believe in the first place.

 

ASHLEY HAMER: There's one easy thing I like to do to understand both sides of an issue. Whenever I hear something that's brilliant or mind blowing, or that I just want to tell everybody, I'll hop on Google and search for that thing along with the word myth or wrong or debunked. If the thing is popular enough, you'll usually find someone tearing it down. If their take is nonsense, go ahead. Spread the good news. But if it convinces you, you dodged a bullet.

 

CODY GOUGH: You've spoken a lot about vaccines and autism and homeopathy and AIDS denialism and other conspiracy theories, like 9/11 and things like that. What's the craziest thing you've come across?

 

STEVEN NOVELLA: I mean, it's hard to say, because a lot of them are equivalent in terms of the degree to which they deny reality. I could tell you the most recent extreme belief that I've done a deep dive on is the flat Earth believers.

 

And yeah, the first thing people say when I bring that up is they don't really believe that, do they? It's like no, no. Yeah, they actually do. And I mean, the degree to which they have to distort their perception of reality in order to maintain belief that the Earth is actually flat is astounding.

 

Look, it really is-- and it is a massive conspiracy theory. Because essentially every bit of evidence that the Earth is actually roughly a sphere is a conspiracy. And then you would think, of course, that has to be an extreme conspiracy, and you're right. They think that it's a massive, multi-century, international conspiracy.

 

I mean, it's just the degree of altered reality here, it really is amazing. And I think it does get on the verge of being diagnosable sometimes. I mean, this is more than just a conspiracy theorist. This is somebody who has a difficult relationship with reality.

 

But again, I know people. I know people who are mentally healthy, they don't have a diagnosable mental illness, but they've gone down that rabbit hole and have been convinced that there's this big conspiracy to convince us all that the world is round when in fact it's flat. It's amazing.

 

CODY GOUGH: It's kind of caught on like wildfire. It's like a trend almost, like a fad. How do these catch fire?

 

STEVEN NOVELLA: It catches on. This is where social media makes things happen faster. I don't think it's a different process, just a faster, more widespread process with social media. But I think it comes largely from some motivated person packaging all of the motivated reasoning in one easy-to-consume venue.

 

Like, here's a YouTube video. Watch this YouTube video for an hour and it'll go through hundreds of reasons why the world is really flat. And then people get overwhelmed by that. It's like Loose Change for 9/11 conspiracy theories. It's prepackaged motivated reasoning and confirmation bias. So it really challenges your critical thinking skills.

 

The thing that amazes me is that there are certain claims that don't pass the smell test like the flat Earth thing. It's like, I don't necessarily have to be able to know offhand in fine detail why each factual claim is wrong in terms of the flat Earthers.

 

I could start from the position, well, this is totally ridiculous. I'm going to start from that position and you have to convince me that it's not absolutely absurd and ridiculous. And I think that's a perfectly reasonable default position to take.

 

You should always be able to be convinced, and I do that as well. It's like, yeah, convince me. Give me your best shot, tell me what evidence you think is the most convincing, no matter how absurd I think the belief is. Go right ahead. But it is reasonable to say, but you have the burden of proof. You're making the extraordinary claim here, you have the burden of proof that, whatever, the world is flat.

 

And they don't come anywhere close to that burden, of course. All they really have is just really, really bad arguments and completely extreme conspiracy theories. That's really all that they have as you might imagine, because the world is not flat.

 

But it is a good case study. And again, as a skeptic, I look at it also from an academic point of view, almost like a psychological case study. It's like, wow, this is what the human brain is capable of. Wow. That is interesting, that people can go down that rabbit hole and actually convince themselves that NASA has an armed militia at the border, where the ring of ice around our planet is, and they'll shoot anybody who gets close to the edge of the Earth. Wow, people actually believe that. That's amazing.

 

CODY GOUGH: That's part of it?

 

STEVEN NOVELLA: Oh, yeah. Well, because if you say, well, go to the edge and show me the edge of the Earth? Well, you can't because NASA will shoot you if you get too close.

 

CODY GOUGH: I did not know that was part of the conspiracy theory.

 

STEVEN NOVELLA: Oh, yeah. It really is fascinating if you have a couple of hours to go down that rabbit hole. It is amazing. And of course, you say, what about all the pictures and videos of orbit and of the Earth and everything? It's all faked. Everything's faked. Really? All of it? So any amateur astronomer or anybody with a telescope, they're part of the conspiracy.

 

CODY GOUGH: That's so bizarre. And these people, it's like part of their identity. I mean, you attack this belief and they defend it like you're attacking them.

 

STEVEN NOVELLA: Yeah. Well, because you are kind of attacking them, to be fair. Because when you take an extreme belief, that automatically puts you on the defensive, because then people think you're nuts because you believe something like the Earth is flat. So now you have a huge motivation to justify your belief. So it does become part of your identity, because you make it so that an attack on the belief is an attack on you. You know what I mean?

 

So that's why I think it brings us back to this notion that you can't use beliefs as identity. You have to be able to change-- your identity should be, I'll believe whatever the evidence says. That has to be your identity. I'll believe facts and logic, not any particular belief.

 

Because then otherwise, you're locking yourself into that belief and you're essentially guaranteeing that you're going to have to get defensive and engage in motivated reasoning when people show you that no, actually the facts don't support your belief.

 

CODY GOUGH: Wow. Well, that's really fascinating stuff, and I hope that people can take some takeaways from this, that they should just self-examine a little bit more and ask the questions that are hard. And have you ever, I'm assuming, had something that you believed that you found really, really difficult to change your belief about?

 

STEVEN NOVELLA: Yeah. I mean, we talk a lot about that too about changing belief. And most of the time, there's no Eureka or aha moment where the scales fall from your eyes like, oh, I was wrong about this all the time. It's more of a process and it should be a process.

 

It's just that you incrementally change in reaction to each new bit of evidence and looking at things a certain way. And then over time, your beliefs evolve. And I think my beliefs are constantly evolving, hopefully getting more nuanced, not necessarily changing from one thing to the other. I think I'm probably going to always believe that life on this Earth evolved.

 

At this point, the evidence is so robust it's hard to imagine that we were this wrong or that evidence would come to light that would completely alter that fundamental conclusion. But my understanding of evolution and how it works and how it progresses is constantly evolving and has changed a lot. It's very different than it was 20 or 30 years ago. And same thing with a lot of different beliefs.

 

And then there are things where I sort of suspend my conclusions for a while. Again, if you go back like 20 years or so, I was probably agnostic towards global warming until I really wrapped my head around it enough to understand what the consensus was and how robust it was, and what they were actually saying.

 

And I'm like, OK. Yeah, this is the belief now. They say it's 90 to 95%, so it's 90 to 95%. That makes sense. Then I was able to process all of the denialist positions and I gave them a fair shot. I always do that.

 

And again, if you think about it this way, as a self-identified skeptic and science communicator and critical thinking communicator, if I came to the conclusion-- I've said this in many things. Same thing with vaccines and autism. The first time, I didn't write about it until I did a really deep dive on it.

 

And I went into it thinking-- and the first book I read was a book promoting the idea that there was a connection between the two, Evidence of Harm. And I'm like, OK, I'm going to go into this with an open mind and see what the evidence really has to say. Give them a fair shot.

 

And I'm always thinking to myself, now if I come to the conclusion that there's a massive conspiracy to distort the science and that global warming isn't happening or that vaccines do cause autism, I will become famous. If I could really defend that position, if that's really my position, if that's what the evidence actually shows, nothing would be better for my career as a science communicator if I feel I could really defend that position.

 

But I didn't come to those conclusions. I came to the conclusion like yeah, the evidence is really solid that vaccines are safe and man-made global warming is a thing. It's happening. So that's kind of the boring answer because it goes along with the mainstream consensus.

 

But I always find it amusing that people think, oh, you're just motivated to come to that answer. Actually, the opposite is the truth. It would be massively helpful to my career if I at any point could really defend the position that what most people believe is not true about something. That's always the more interesting position to take.

 

CODY GOUGH: Interesting. Even if you're being censored and the scientific community starts to reject you and the media doesn't want to talk to you, I mean, you really still think it would be a boon for you?

 

STEVEN NOVELLA: Yeah, absolutely. But again, the premise here is that if I'm correct. If the science were on the side of there being a connection between vaccines and autism and I felt that I could make that point, then absolutely. I would fight that fight to the end. No question. But of course, you can't do that because the evidence doesn't support that position.

 

As a critical thinking, skeptical communicator, the best thing for my career always is to take whatever position is actually supported by science or evidence. I have to always defend my process. I always need to do that. Whatever the conclusion is, it's never worth it to me to compromise the process because one conclusion or another is in itself better, or for whatever reason you might think.

 

But if it just so happened that we are being visited by aliens and I could really make a strong case for it or whatever nutty belief you want to include there, yeah, that would be fantastic. I'd love to do that. That would be wonderful.

 

ASHLEY HAMER: One recent example of a high profile figure who changed his mind when he examined the science is Bill Nye. In a 2014 AMA on Reddit, one fan told him that he was disheartened by a 2005 episode of Bill Nye the Science Guy, where he warned of the potential dangers of GMO food.

 

Nye said his views hadn't changed and that led to a flurry of articles by biologists asking him to take a second look at the science. What happened? He took a second look at the science. In a backstage interview for Real Time with Bill Maher in 2015, Bill Nye announced that he would revise the GMO chapter in his recently published book Undeniable.

 

He said, "I went to Monsanto and I spent a lot of time with the scientists there and I've revised my outlook. And I'm very excited about telling the world. When you're in love, you want to tell the world." If a big name like Bill Nye can admit he was wrong, it ought to be pretty easy for the rest of us.

 

CODY GOUGH: Well, thank you so much for spending all this time talking all things pseudoscience. I just want to wrap up with one final segment we call the curiosity challenge. I will try to teach you something or at least quiz you on something that maybe you don't know, but it is kind of in your realm of expertise, so you may know this answer. And then I'll give you the opportunity to do the same for me.

 

STEVEN NOVELLA: OK.

 

CODY GOUGH: There is an airport in the United States that is the home to countless conspiracy theories. Can you tell me which airport, and if you're aware of any of the conspiracy theories?

 

STEVEN NOVELLA: Well, by airport, are you including Area 51?

 

CODY GOUGH: Not Area 51. No.

 

STEVEN NOVELLA: Not Area 51?

 

CODY GOUGH: No.

 

STEVEN NOVELLA: An actual commercial? Like a commercial airport?

 

CODY GOUGH: Yeah, international commercial airport.

 

STEVEN NOVELLA: Let's see. That's a good one. I don't recall any conspiracy surrounding a commercial airport. No, I don't know that one.

 

CODY GOUGH: Oh, wow. Crazy. So you can learn about this on Curiosity, but it's actually the Denver International Airport.

 

STEVEN NOVELLA: OK. What's the conspiracy theory?

 

CODY GOUGH: So there's actually a number of them. So apparently, according to a dedication marker inside the airport, the New World Airport Commission is responsible for building the airport.

 

But the New World Airport Commission doesn't exist. And so some conspiracy theorists believe the mention is a nod to the New World Order or the Nazi party. And theorists say that some of the runways kind of looked like a swastika, so that could be evidence.

 

And there's also a weird creepy-- I don't know if you've been to the Denver International Airport, but there's a weird giant statue that's kind of a creepy mascot. It's called the Blue Mustang. It's a 32-foot-tall blue horse with glowing red eyes, and it's been dubbed Bluecifer by the locals. And some think that the statue has evil energy. And the massive Mustang killed its sculptor after falling on him and severing an artery.

 

Other people claim it's the headquarters of the Illuminati, that its imagery portrays satanic messages, and that it's hiding secret bunkers underneath. It's got quite a few conspiracy theories. So the next time you're in the Denver International Airport, take a look around.

 

STEVEN NOVELLA: I'll take a look. Now that you say it, it sounds vaguely familiar, the New World Order bit does. I think I probably have heard that before. I just forgot about it. I think that we may have even mentioned that on the show 10 years ago or something. Yeah, that's a good one.

 

CODY GOUGH: You've been at this for a long time and you've been-- I mean, your podcast has been around since before any podcast craze was happening.

 

STEVEN NOVELLA: 2005. Yeah, before podcasting was a thing on iTunes.

 

CODY GOUGH: Oh, my God. How did you decide to go into that route?

 

STEVEN NOVELLA: Well, we were already doing the skeptical thing and we were looking for more ways to generate content, looking to get into more online, social media content. A friend of mine said, hey, there's this new thing called podcasting. And I stole that idea and we created The Skeptics' Guide to the Universe, and it just worked out really well.

 

CODY GOUGH: Well, it literally has to be one of the longest-running podcasts, I think, of all time. It's been weekly the whole time?

 

STEVEN NOVELLA: Pretty much, yeah. 636 episodes.

 

CODY GOUGH: It's unbelievable. And I've listened to a lot of it and it is very good. So people should check it out. And we'll also pass the ball to you for your question for me for the curiosity challenge.

 

STEVEN NOVELLA: OK. So you asked me for something outside of my area of expertise, but I'm going to-- this is not outside of an area of interest of mine though. So let me ask you the question this way, what land animal has the largest eyes?

 

CODY GOUGH: Oh, wow. I'm thinking not-- an owl would be a bird.

 

STEVEN NOVELLA: Well, birds count as land animals. In other words, not in the water, so not the whale or anything. So not aquatic.

 

CODY GOUGH: I'm thinking it's either going to be a lizard or an owl. So I will guess with owl being my runner up, I will guess iguana.

 

STEVEN NOVELLA: So you're closer with owl in that it's a bird, so I'll give you that. It's a bird but not an owl. And I'll add another tidbit: its eyes are actually larger than its brain. Each individual eye is larger than its brain.

 

CODY GOUGH: Wow.

 

STEVEN NOVELLA: Five centimeters in diameter.

 

CODY GOUGH: Wow. No idea. I mean, eagle would be my next guess, but I don't think that's--

 

STEVEN NOVELLA: Bigger. What's the biggest bird?

 

CODY GOUGH: What is the biggest bird?

 

STEVEN NOVELLA: An ostrich. An ostrich.

 

CODY GOUGH: Right.

 

STEVEN NOVELLA: Yeah, ostriches have the largest eyes of any land animal. Bigger than their brains. They're massive. They're the size of a billiard ball.

 

CODY GOUGH: Have you ever seen an ostrich?

 

STEVEN NOVELLA: Yeah. Oh, yeah. Yeah I've seen them.

 

CODY GOUGH: In the wild or in zoos?

 

STEVEN NOVELLA: In zoos. I've never been to Africa. I really want to go at some point, but I've never been there. Yeah, I've seen them running wild in like parks and stuff, but never in Africa itself.

 

CODY GOUGH: Wow. How did you find out about that?

 

STEVEN NOVELLA: I'm a birder. I'm interested in birds. That's my hobby.

 

CODY GOUGH: Oh, wonderful.

 

STEVEN NOVELLA: One of my hobbies. Yeah. And there is a myth surrounding ostriches, which we'll mention really quickly as well. The myth, and I don't think a lot of people really believe this, but it's hard to know, is that an ostrich will bury its head in the sand when it senses danger or to try to "escape from a predator." But they don't do that, because obviously that would be very counterproductive. That would never be an adaptive strategy.

 

We don't actually know where that myth came from, but there are a lot of possible sources. It's always interesting to try to trace back a false belief to where it came from. There were observations that-- so ostriches will bury their eggs in-- they don't bury them, but they're in a hole in the ground, a depression.

 

And when they're tending to their eggs, they will be standing over the nest and putting their head down into their nest. So from a distance, it could look like their head is buried under the ground. They also lay their head against the ground when they sleep. And again, that could create the illusion that maybe it's under the ground, but it isn't. But we're not really sure exactly where that comes from.

 

I think it was Pliny the Elder who observed that an ostrich will put its head in a bush and think that it's hidden. So that may be the ultimate source of it. But that was in a bush, not burying it in the ground or in the sand. Interesting, but that is not what they do. They run. They're the fastest bipedal runners on the planet as well.

 

CODY GOUGH: Wow. Well, myth busted on the Curiosity podcast. And you can learn more about the Denver International Airport thing I talked about on curiosity.com or on the Curiosity app. And if you want to hear more from Steve Novella, you can find The Skeptics' Guide to the Universe podcast on iTunes and Stitcher, and everywhere else, I would presume.

 

STEVEN NOVELLA: Yes, pretty much.

 

CODY GOUGH: Pretty much. Also your NeuroLogica blog, which we'll link to in the show notes. It's like neurological, but without the L at the end.

 

STEVEN NOVELLA: Yeah.

 

CODY GOUGH: And you're also the founder of the New England Skeptical Society and my gosh, you are a busy guy.

 

STEVEN NOVELLA: Yeah, that's right. People I talk to-- people tell me.

 

CODY GOUGH: Well, thank you so much again for joining me on the Curiosity podcast. We really appreciate it.

 

STEVEN NOVELLA: All right, thanks. It's been a lot of fun.

 

[MUSIC PLAYING]

 

ASHLEY HAMER: I've got an extra credit question for you, courtesy of the Curiosity app. It's about treating sleep deprivation. When you're really tired, driving your car can be just as dangerous as driving drunk. So please don't do it. But if you have to get somewhere and you need to combat sleep deprivation, what does science say is the best way to do it? Here's a hint: loud music is not your best option. The answer is after this.

 

[MUSIC PLAYING]

 

Have you ever been listening to the Curiosity podcast and wanted to share a clip on Facebook or Twitter? Well, here's some super exciting news: now you can, thanks to gretta.com. That's G-R-E-T-T-A.

 

You can stream our podcast on gretta.com/curiosity, and their podcast player will follow along with a written transcript of each episode while you listen. When you hear a clip you want to share, just find it and click Share. Gretta will build a video for you to share with your friends so you can help spread the word about our podcast. Again, that's gretta.com/curiosity. And drop us a line to let us know what you think of this super cool news service.

 

CODY GOUGH: If you still can't get enough curiosity in your life, then why not check out our newsletter at curiosity.com/email. We've got three bonus stories for you every week plus exclusive features you won't find anywhere else. Just sign up at curiosity.com/email and never stop learning.

 

ASHLEY HAMER: If you ever have a question about anything we discuss on the Curiosity podcast, then please email us your questions at podcast@curiosity.com. You should be rewarded for your curiosity. So give us the opportunity to answer your question, and who knows? We might feature it on our show next week.

 

Again, our email address is podcast@curiosity.com. Don't be shy. We're always here to help. That brings us to today's extra credit answer. If you're sleep deprived and you have to drive somewhere, then what can you do? Well, the answer is not rolling down your window and turning up your music.

 

A 1998 study showed that a blast of cold air and loud music were "temporary expedients to reduce driver sleepiness, but had no significant effect on waking you up." So according to a 2002 study, here are the two best ways to stay awake on the road: stopping for a nap, drinking a caffeinated energy drink, or both.

 

The study suggested that coffee might work in lieu of an energy drink, but the caffeine levels are variable. So beware. But anyway, if you're dozing off behind the wheel right now, then pull over and take a break. For more on this and a whole bunch of other things, visit curiosity.com.

 

CODY GOUGH: That's all for this week. For the Curiosity podcast--

 

ASHLEY HAMER: I'm Ashley Hamer.

 

CODY GOUGH: I'm Cody Gough.

 

[MUSIC PLAYING]