Introduction to Cognitive Bias: Crash Course Scientific Thinking #1

CrashCourse

0:002,000 years ago, people looked up at the
0:02sky and they saw that everything up
0:04there seemed to move. So, naturally, the
0:07Earth was staying still and everything
0:10else was rotating around us. It was a
0:12story that just made sense to millions
0:14of people and it stuck around well into
0:16the 16th century. If I'd been alive
0:19then, I would have believed this story.
0:21I mean, even today, I feel like I must
0:23be at the center of something. That
0:25idea, of course, was wrong, but we
0:27didn't believe it forever. We found ways
0:29to step out of our old stories and find
0:33something much more interesting. Hi, I'm
0:36Hank Green and this is Crash Course
0:38Scientific Thinking.
0:44Science. It is a never-ending quest for
0:47knowledge, a way of interrogating our
0:49universe to figure out how it works, a
0:51tool to guide us when our intuition
0:54isn't enough. And also, it can be quite
0:55fun. Sometimes you get to blow stuff up.
0:57In the years since Copernicus put forth
0:59the theory that the Earth revolves
1:00around the Sun, we've learned that some
1:03questions are just too big, too complex,
1:06or too bizarre to trust our gut with.
1:09When we rely on intuition alone to
1:10answer those big, complicated questions,
1:13our brains fall prey to cognitive
1:15biases, predictable weaknesses in the
1:18way we've evolved to think. Our brains
1:20are very good at finding patterns. We've
1:23evolved this skill because it's super
1:25helpful for survival. It helped our
1:27ancestors spot the telltale signs of
1:29predators and recognize when certain
1:31plants might be poisonous. We have
1:33always paid attention to and learned
1:35from our world. Those pattern
1:38recognition skills have also been linked
1:40to some very special human qualities,
1:42like our ability to imagine and invent.
1:45Like, it's what made me notice that Hank
1:47and Angler fish sound vaguely similar so
1:50that I could invent something called the
1:51Hankler fish. Bad puns are still good
1:54pattern recognition. It's also why we
1:57are so good at telling stories, because
1:59really, that's all a story is: a
2:01recognizable pattern of information. And
2:04more importantly, our highly evolved
2:06pattern recognition skills allow our
2:08brains to apply mental shortcuts or
2:10heuristics that help us solve simple
2:13problems quickly and make life liveable.
2:16They are the brain's way of copy pasting
2:18stories we already have onto new
2:20information so that we don't expend a
2:22bunch of brain power in every direction
2:24all the time. They are why I don't have
2:26to stop and think what will happen if I
2:28touch this hot stove top. My brain
2:30picked up the pattern of "touching hot
2:32things bad" long ago and it keeps
2:34resurfacing it whenever I need it to
2:36keep me safe. And that's all well and
2:38good for avoiding hot things, but those
2:40same mental shortcuts also open us up to
2:43cognitive bias. Now, cognitive bias
2:45isn't inherently bad. And that's good
2:47because everybody's got it. And I'm not
2:50talking about explicit bias here where
2:52someone is consciously aware that they
2:53are discriminating against a person.
2:55Cognitive biases happen unconsciously.
2:58They are implicit biases when our
3:00decision-making is influenced by beliefs
3:03and patterns we aren't even consciously
3:05aware of. Consider this modern-day
3:07example of a cognitive bias skewing many
3:09people's perception of risk. In early
3:122025, after a mid-air collision between
3:15a commercial airplane and a military
3:16helicopter, people started paying a lot
3:19of attention to every near miss and
3:21airport mishap. And it felt like planes
3:23were crashing every day. The media ran
3:26with this and the algorithms amplified
3:28it. At the time, 65% of Americans said
3:31they felt more anxious about flying. But
3:34when we took a look at the actual data,
3:37the number of accidents compared to the
3:39same time in 2024 remained the same and
3:42flying remained an incredibly safe form
3:44of travel. So why did so many of us feel
3:47like it wasn't? Well, for efficiency,
3:49our brains often put more weight on the
3:51most readily available information
3:54around. We call this, wait for it,
3:57availability bias. When people make
3:59judgments based on the information
4:01that's easily available. The truth is,
4:03you're far more likely to be in a car
4:05crash than a plane crash. But when plane
4:07crashes do happen, we hear about them a
4:09lot, especially in today's algorithm
4:12driven news cycle. So that information
4:14is way easier to call to mind than the
4:16fact that over 120 people in the US die
4:19in car accidents every day. Availability
4:22bias is a big way our brains mislead us,
4:24but probably the biggest is confirmation
4:27bias. That's our brain's tendency to
4:29accept information that agrees with
4:31things we already believe and filter out
4:34stuff that contradicts it. Like if
4:36someone already believed that flying was
4:38dangerous, then the news stories about
4:39the 2025 crash likely reinforced that
4:42belief. Or here's an example you might
4:44be familiar with. Have you ever been
4:46told that you are a visual learner or
4:48maybe that you learn best by listening
4:50to other people? In a study published in
4:52the Journal of Educational Psychology,
4:54more than 90% of participants said that
4:56people learn better when they're taught
4:58using the learning style that best suits
5:00them. A similar survey of colleges in
5:02the US revealed that out of the 39
5:04surveyed, 29 of them teach learning
5:06style theory as part of their guidance
5:08for teachers. But here's the kicker.
5:10There is no scientific evidence to
5:13support the idea of personalized
5:15learning styles. So why is this myth so
5:18prevalent? Researchers have pointed out
5:20that it persists at least in part thanks
5:22to confirmation bias. As one researcher
5:25put it, "People are obviously different
5:27and learning styles appear to offer
5:29educators a way to accommodate
5:31individual learner differences." So,
5:33someone who believes they've seen these
5:34methods have a positive impact might
5:36reject evidence against them, just like
5:38I might and sometimes do reject evidence
5:40that the butt is not part of the legs.
5:42And these are just a few of the
5:43cognitive biases we all have. We haven't
5:45even gotten into how we cling to first
5:47impressions. That's anchoring bias. Or
5:49how we tend to believe the events of the
5:51past were predictable. That's hindsight
5:53bias. Ultimately, we all want the world
5:55to make sense. But when we rely on these
5:57simple shortcuts for the wrong things,
5:59they can keep us from being open to
6:01evidence. So, what do we do about this?
6:04Well, over the last few centuries,
6:05humans have developed a new way of
6:07looking at the world. One that doesn't
6:09just explain what feels right, but tests
6:12what is right. This is what people are
6:15usually talking about when they say the
6:16word science. Not the body of knowledge,
6:20but the systems used to interrogate the
6:22universe. From astronomy to zoology,
6:25science is a way of building knowledge
6:27that's durable, communal, and
6:30long-lasting. And to help us learn more,
6:32I think we need a little sage advice.
6:43Let's give it up for science, everyone.
>> Hi, Sage. Everybody, this is Sage.
6:48>> Hello, I'm Sage the Bad Naturalist. I'm
6:50a dork, a painter, a creator of the
6:53YouTube channel Sage the Bad Naturalist.
6:55I make videos about fungi, plants,
6:57research papers, and learning something
6:59new even when science goes wrong. And
7:01I'm here to help Hank spread some
7:04sage advice. I love that we're talking
7:06about cognitive bias today, Hank,
7:07because the process of science is
7:09actually designed to overcome biases
7:12from methods to reliance on evidence and
7:15especially the fact that science is
7:17communal. It gets vetted by a whole
7:20community, not just one guy in a bathtub
7:22shouting, "Eureka!"
7:23>> No shame to Archimedes, of course. I
7:25personally love peer review, which we'll
7:27talk about in a later episode, but Sage,
7:29what is your favorite bias busting
7:31science method?
7:34>> For me, it has to be randomized
7:34controlled trials. It's a
7:35multi-step process for research that's
7:37used a lot in testing new medicine and
7:39each step was designed to reduce chance
7:42of bias. Say scientists want to test a
7:45new diabetes medicine. Well, there's a
7:47lot of potential for bias in that
7:48process. Like they might accidentally
7:50influence the results of the trial by
7:53their selection of the participants.
7:55>> That's why we say everyone has cognitive
7:57bias even scientists.
7:59>> Exactly. So to avoid those biases,
8:01scientists select the members at random.
8:04And when they're testing a new drug,
8:06there's all kinds of potential for
8:07confirmation bias. So they sort some of
8:10the members into a control group that
8:12gets either no treatment, a placebo,
8:14which looks like the real drug but
8:16doesn't actually do anything, or an
8:18older proven drug. That way they can
8:20compare the results. I love a randomized
8:22controlled trial because it is such a good
8:24example of the ways that scientists have
8:25recognized their own potential for bias
8:28and designed their research to reduce it
8:30as much as possible. Like how sometimes
8:32when researchers need to eliminate bias
8:33even further, they do double blind
8:36studies where neither the participants
8:37nor the scientists know which people are
8:40in the trial and which are in the
8:41control.
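The random-assignment step Sage describes can be sketched in a few lines of Python. This is a toy illustration with made-up participant labels, not a real trial protocol (real trials follow vetted randomization and blinding procedures); the function name `randomize_trial` is our own:

```python
import random

def randomize_trial(participants, seed=None):
    """Toy sketch of the random-assignment step of a randomized
    controlled trial: shuffle the participants, then split them
    evenly into a treatment arm and a control arm."""
    rng = random.Random(seed)  # seeded so the split is reproducible
    shuffled = list(participants)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# Hypothetical participant IDs; in a blinded trial, neither group
# would be told which arm it is in.
groups = randomize_trial([f"p{i}" for i in range(10)], seed=42)
print(len(groups["treatment"]), len(groups["control"]))  # 5 5
```

Because assignment depends only on the shuffle, not on anything the researchers know about the participants, selection bias in who ends up in which arm is reduced.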
8:42>> Right. Scientific thinking at work.
8:44We've come so far from thinking we're
8:46the center of the solar system.
8:47>> Science high five.
8:49And that's been your Sage advice.
8:51>> Thanks, Sage. And here's the coolest
8:54part. You can watch out for cognitive
8:56bias in your own thinking, too. One of
8:58the easiest and most important things
9:00you can do to fight back against bias is
9:02just understanding that it's real and
9:04accepting that you have it. Being aware
9:06of bias gives you the chance to look out
9:09for its influence on your decisions.
9:10Anybody who says they don't have any
9:12biases is just waving a huge red flag.
9:15Another way is to interact with lots of
9:17people, especially people who are
9:19different from you. Bias likes to tell
9:22us that our experience is the only
9:24reality. But to really understand the
9:26world, we need community. Scientists do
9:28this too. They are endlessly testing and
9:31vetting each other's claims. Without
9:33expertise from a diversity of
9:35scientists, the scientific process would
9:38fail. And by that same token, you can't
9:40overcome your biases on your own either.
9:43Unlike those gut feelings we get,
9:45science requires evidence. So whenever
9:47we consume science news
9:51that goes against an idea or experience
9:53we believe to be true, it's good to
9:55remember that our cognitive bias might
9:57be working against us. Which ties into
10:00another big bias buster, cognitive
10:02flexibility. Your ability to imagine
10:05options or explanations beyond your gut
10:08reaction. In other words, being able to
10:10say, "You know what? Maybe I was wrong."
10:13I know, right? In this economy, on this
10:15internet, you can do it, though. It's
10:17good for you. Throughout this series,
10:19we're going to talk a lot more about how
10:21the scientific process works, so that
10:23when we see news stories about science,
10:25we'll better understand what's going on
10:27behind the scenes. And that knowledge is
10:29going to help us all respond better to
10:31the science on our social media feeds,
10:33in our group chats, and at our dinner
10:35tables. By the way, uh isn't it just
10:38extremely wild that our brains can think
10:40of ways to outsmart the ways they think?
10:43Remember, we all have cognitive biases.
10:45They're not something to be ashamed of.
10:47They're just our brain's way of solving
10:49problems faster. But the world is very
10:53weird. So often our mental shortcuts
10:55don't work. That's where scientific
10:57thinking comes in. Science relies on
10:59evidence evaluated by a community of
11:01experts, and it has systems that are
11:04designed to reduce bias. It's not
11:06perfect, but it's one of the best tools
11:08we have. Next time, we're going to
11:10explore the wild world of statistics.
11:13I'll see you then. This episode of Crash
11:15Course Scientific Thinking was produced
11:16in partnership with HHMI BioInteractive,
11:18bringing real science
11:20stories to thousands of high school and
11:22undergrad life science classrooms. If
11:24you're a teacher, visit their website
11:26for resources that explore the topics we
11:28discussed in today's video. Thanks for
11:30watching this episode of Crash Course
11:32Scientific Thinking, which was filmed in
11:33Missoula, Montana, and was made with the
11:35help of all of these nice people. If you
11:37want to help keep Crash Course free for
11:39everyone forever, you can join our
11:41community on Patreon.