0:01- [Derek] Friday, the 17th of July, 1992.
0:03Amidst the chaos of the trading floor
0:05at Singapore's international
stock exchange,
0:07one of the junior traders
makes an expensive mistake.
0:11Instead of buying 20 futures
contracts for a client,
0:14she sold them instead,
0:15costing Barings Bank nearly $40,000.
0:19To save her job, her boss, Nick Leeson,
0:21a young trader keen to make his mark,
0:23decides to hide the loss.
0:25He puts it in an obscure error account,
0:28that's account 88888.
0:31It's used by banks to resolve
small discrepancies in trades.
0:35It's a dangerous move
0:36and it should have been picked up
0:38by the central office immediately,
0:40but when nobody from Barings
notices, he gains confidence,
0:43convinced he can win back the loss
0:45and get the team out of trouble.
0:47- I'm operating in the belief
that I can get it back.
0:51You know, maybe it goes back
to the confidence thing.
0:54I mean, I'm confident I can
get it back at that stage.
0:57- But Leeson's staggering overconfidence
0:59would create a debt of
gargantuan proportions
1:02and lead to one of the biggest
collapses in banking history.
1:07- It's easy for me to make
the case for overconfidence
1:09as the most dangerous of the human biases.
1:12Overconfidence gets us
into all sorts of trouble.
1:17It leads us to take
risks, make commitments,
1:22enter contests, try things
that will ultimately fail,
1:26sometimes in costly,
embarrassing, and dangerous ways.
1:30- Overconfidence has been implicated
1:31in almost every big disaster,
1:33from the sinking of the Titanic
1:35to the Chernobyl nuclear disaster,
1:37to the loss of the space
shuttle Challenger.
1:39But overconfidence isn't reserved
1:42for just a few reckless individuals.
1:44We can all fall victim to it.
1:46For example, 93% of us
think we are better drivers
1:49than the median, which
is, of course, impossible.
1:52The scale of the problem was identified
1:54in some now-classic research
1:56with a set of simple quiz questions.
1:58True or false, Australia
is wider than the moon.
2:03I bet they're about the same size.
2:05- It looks small on a map, but it's huge.
2:07- Are there more or less
stars in the Milky Way
2:12- More stars.
- Than trees?
2:14- One teaspoon of pure olive oil,
2:17- Right.
- Four and a half grams,
2:19contains more chemical potential energy
2:22than an equivalent four and
a half gram amount of TNT.
2:27- I'm gonna say true.
2:32- This feels kind of
like a trick question.
2:33- The energy density of butter
2:36is 10 times that of
lithium batteries, right?
2:41I mean, lithium batteries are rubbish.
2:42Everyone's going on about them.
2:44Just run your car on butter, I say.
2:46- True or false, plants are more efficient
2:48than the average solar panel
at absorbing the sun's energy.
2:53- Now, these aren't the exact questions
2:55from the original research,
more on those later,
2:57but the specifics don't really matter.
2:59The interesting bit is
what they asked next,
3:02how confident are you that you are right?
3:05- 100% confident.
(interviewee speaks faintly)
3:06- 100% confident?
- 100, yes.
3:10So, I've also gotta go 90%.
3:14- I don't think we actually
had any expectations.
3:17This was very early on in the research
3:21on confidence assessment,
3:24but we didn't really
know what we would find.
3:27- The results revealed
a surprising disparity
3:29between how accurate
someone thinks they are
3:31and how accurate they actually are.
3:33When people are 90%
certain, they are right
3:37only 75% of the time.
3:40In fact, we ran our own
version of this online
3:42and found an even more
extreme discrepancy.
3:45We asked the Veritasium community a bunch
3:48of science questions and then asked
3:49how confident they were in their answer.
3:51Our results showed that those
who were the most confident,
describing themselves
3:54as 91 to 100% sure, were only
correct 51% of the time.
4:01Since the original research,
4:02these results have been
replicated again and again
4:05in all areas from general
knowledge to motor skills,
4:09and it affects everyone, even experts.
4:12- I have a paper where
we analyze the Survey
4:14of Professional Forecasters.
4:16These are chief economists
at various corporations
4:19and banks who are invited
to forecast the state
4:22of the economy on a quarterly basis.
4:24We find that they are, on average,
4:27too sure that they know
what's going to happen,
4:31say, they're something like 53% sure
4:34they have correctly predicted
4:36what's gonna happen to
inflation, for instance,
4:39and they are right, like, 23% of the time.
4:42- How well what you think you know matches
4:44with what you actually know
is called your calibration.
4:47So, if you're perfectly calibrated
and, say, 80% confident,
4:50you should be right 80% of the time,
4:53but most of us are not well calibrated.
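[That idea of calibration can be made concrete in a few lines of code. This is a minimal sketch with made-up numbers, not data from any study mentioned here: group answers by stated confidence, then compare each group's confidence with its actual hit rate.]

```python
# Calibration: bin predictions by stated confidence, then compare
# each bin's confidence level with its actual accuracy.
# The sample numbers below are invented for illustration.

def calibration_table(predictions):
    """predictions: list of (confidence 0-1, was_correct bool)."""
    bins = {}
    for conf, correct in predictions:
        key = round(conf, 1)          # bin to the nearest 10%
        bins.setdefault(key, []).append(correct)
    return {
        key: sum(v) / len(v)          # hit rate within each bin
        for key, v in sorted(bins.items())
    }

# A perfectly calibrated forecaster at 80% confidence is right 80% of the time.
sample = [(0.8, True)] * 8 + [(0.8, False)] * 2 \
       + [(0.9, True)] * 3 + [(0.9, False)] * 1
print(calibration_table(sample))   # {0.8: 0.8, 0.9: 0.75}
```

[Here the 90%-confident answers were right only 75% of the time, the overconfidence pattern described above.]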
4:56- Yeah.
- There are 200 billion stars
4:59but there are 3 trillion trees on Earth.
5:04- 100% confident?
- 100, yes.
5:07(interviewee mimics mocking tune)
5:09- Most of the time, being
overconfident isn't a huge issue,
5:13but for Nick Leeson, the
stakes soon became very high.
5:17Leeson's plan to recover his team's loss
5:19was to bet that the Japanese
stock market would go up,
5:22so he went long on the top
225 companies in Japan.
5:27Ever since the Japanese asset price bubble
5:29peaked two years previously,
5:30the Nikkei had been dropping
from 38,000 to 16,000,
5:35and he was confident the market
5:36would soon bottom out and start rising,
5:39but it continued to fall.
5:41Over the next few weeks, it
dropped to a low of 14,000,
5:45and the whole time, Nick
was betting the wrong way
with ever bigger and riskier bets.
5:50He would double down,
betting double his losses,
5:52so the next win would
bring him back above water.
5:55This strategy should work
when the next win comes along,
5:59but the market collapse was unrelenting,
6:01and his losses ballooned from
$40,000 to around $3 million.
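[The doubling strategy described above is known as a martingale, and its danger is easy to quantify: the stake grows geometrically with the length of the losing streak. A rough sketch, using hypothetical bet sizes rather than Leeson's actual trades:]

```python
# Martingale doubling: after each loss, bet enough that a single win
# would recover everything. Stakes grow geometrically, so a losing
# streak quickly dwarfs the original loss. Numbers are hypothetical.

def martingale_losses(initial_loss, losing_streak):
    """Total hole after a run of consecutive losing double-down bets."""
    total, stake = 0, initial_loss
    for _ in range(losing_streak):
        total += stake
        stake *= 2        # next bet must cover all accumulated losses
    return total

print(martingale_losses(40_000, 1))   # 40000   -- the original error
print(martingale_losses(40_000, 6))   # 2520000 -- six bad bets later
```

[Six consecutive losing double-downs turn a $40,000 hole into one of roughly $2.5 million, which is why a streak of bad luck is catastrophic for this strategy.]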
6:06In spite of this, Leeson
remained confident,
6:10and eventually, his luck turned around.
6:12In the spring of 1993, the
market rebounded up to 20,000,
6:16and by the summer, that
account was back in credit.
6:21Leeson goes out to celebrate that weekend,
6:23drinking and dancing on tables,
6:25finally free of the hole
he had dug himself into.
6:30But on Monday morning,
6:31he made another trading
error on the futures market,
6:34a damaging loss that Leeson
didn't want to admit,
6:37so he put the loss back
into account 88888,
6:41confident that if he'd got
outta the previous one,
6:43he could do it again.
6:45He began making risky trades
6:47to try to recover that initial loss,
6:48and as the losses mounted
in the five eights account,
6:51it became harder and
harder to make new trades
6:54to cover the last loss.
6:56By the end of 1993, his
losses exceeded $30 million.
7:01Leeson is obviously an extreme example,
7:03but this tendency to
overestimate our abilities
7:06is something we all share.
7:07So, the question is why?
7:10- The obvious explanation that
a lot of people default to
7:13is that we wanna feel good about ourselves
7:18and pretend like we're well-informed,
7:22that we enjoy enormous satisfaction
7:25from being able to say,
7:30And so, we pretend to ourselves and others
7:35that we know when we don't.
7:36That is the motivated egocentric version
7:40of an explanation for excessive
faith in our own judgment.
7:43- [Narrator] Reichelt
had faith in the idea.
7:46- But what if it's something
more uncomfortable, stupidity?
7:51Franz Reichelt was a tailor
with no formal scientific
7:54or engineering training,
but he was obsessed
7:56with solving the problem
of aviation safety.
7:58He spent months designing
a parachute suit,
8:01which repeatedly failed during testing,
8:04but he became convinced the issue
8:06was that he didn't have
enough height for it to work.
8:09So, in February 1912, he
took it to the Eiffel Tower,
8:12and rather than using a dummy
to test it, he jumped himself.
8:17Tragic, but also really dumb.
8:20Do you recognize this graph?
8:22- The, like, Dunning-Kruger graph?
8:25- Are you confident that this
8:26is the actual Dunning-Kruger graph?
8:28- Not at all, in fact, I think
8:29that this is not the Dunning-Kruger graph.
8:31I think that this is a bit
of a communication trick.
8:34- This is what people call
the Mount Stupid Curve,
8:37which will pop up if you search
8:38for the Dunning-Kruger
effect on Google Images.
8:41- As far as I understand
it, it's an effect
8:44that says that when you
know very little about it,
8:46your confidence is high,
8:48and as you know more about it,
8:49your confidence sinks until
you become a master at it.
8:52- But I think that the actual result
8:54is, like, a little bit
more fuzzy than this.
8:59This graph doesn't actually
show the Dunning-Kruger effect.
9:02It's just a meme that
resonates with a lot of people,
9:04and it became conflated with the effect.
9:07In their original research,
9:08Dunning and Kruger tested people
9:10on tasks like grammar, logic and humor,
9:13and then they asked
participants to estimate
9:15how well they performed.
9:17That produced this curve.
9:19You can see that those who performed worse
9:21had the largest mismatch
9:22between their confidence and performance.
9:24They were the most overconfident.
9:26Those who performed the best
9:28were actually slightly underconfident.
9:30This suggests that
overconfidence may be linked to how much we know:
9:34Those who know less think that
they know more than they do,
9:37but to me, this effect looks a little
9:40like a statistical artifact
9:41because while I was doing my PhD,
9:43I asked students some physics questions
9:45and I also asked for their
confidence in their answers.
9:48And when I looked at the results,
9:50accuracy spanned the full range
9:52from nearly 0% to 100%,
9:55but confidence varied over
a much narrower range.
9:59All the confidence scores were roughly in the middle.
10:01This meant that poor performers
were the most overconfident
10:04and the highest performers
10:06were actually slightly underconfident.
10:08This is exactly what
Dunning and Kruger found,
10:11so maybe overconfidence isn't just
10:13about how much we know;
10:15it's also that most of us express
roughly
10:17middle-of-the-road confidence.
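[That statistical artifact is easy to reproduce. In this purely illustrative simulation, accuracy is spread over the full range while everyone's reported confidence sits in the middle, and a Dunning-Kruger-shaped gap appears with no psychology involved at all:]

```python
import random

# If accuracy spans the full range but reported confidence clusters
# in the middle, low scorers look overconfident and high scorers look
# underconfident purely by construction. Illustrative simulation only.

random.seed(0)
people = [(random.uniform(0, 100),     # true accuracy, 0-100%
           random.uniform(55, 75))     # reported confidence, mid-range
          for _ in range(10_000)]

# Confidence minus accuracy: positive means overconfident.
bottom = [conf - acc for acc, conf in people if acc < 25]
top    = [conf - acc for acc, conf in people if acc > 75]

print(sum(bottom) / len(bottom))   # large positive: low scorers "overconfident"
print(sum(top) / len(top))         # negative: high scorers "underconfident"
```

[The worst performers come out far more confident than accurate, and the best slightly less, exactly the shape of the Dunning and Kruger curve, even though no one here actually misjudged themselves.]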
10:20But there is another factor at play here,
10:23which is how much information our brains
10:25are capable of processing.
10:28Monday, the 27th of January, 1986,
10:31it's 5:45 PM at the Kennedy
Space Center in Cape Canaveral,
10:34and the engineers from Morton Thiokol,
10:36who made the Challenger space
shuttle's rocket boosters,
10:39make an emergency conference call.
10:41They've just seen the weather forecast,
10:42predicting temperatures
overnight will plummet
10:44to 25 degrees Fahrenheit,
10:46far colder than any
previous shuttle launch.
10:49They know the rubber
O-rings that seal joints
10:51in the boosters become less flexible
10:53in the cold, but have just
a few hours to gather data,
10:56create charts, and present their case.
10:58Over the next six hours,
10:59the engineers present 13 charts with data
11:01on O-ring temperature, hot gas erosion,
11:04joint rotation, and more.
11:06But the data is scattered, incomplete,
11:08and not synthesized
into a clear narrative.
11:11They'd never tested below
53 degrees Fahrenheit,
11:13so NASA managers were
trying simultaneously
11:15to track historical O-ring
data, erosion patterns,
11:18joint dynamics, seal resiliency,
11:21pressure differentials, and more.
11:23No single chart told the whole story.
11:25So, overwhelmed by
seemingly contradictory data
11:28and confident that their
rocket boosters were safe,
11:31Thiokol management overruled
their own engineers
11:34and approved the launch.
11:37At 11:38 the next morning,
73 seconds after launch,
11:43all seven crew members were killed.
11:46(chiming mysterious music)
11:47- Calibrating your
certainty requires thinking
11:51of all the ways that you could be wrong,
11:53and that's hard for finite,
fallible agents like us
11:58if it means considering
everything that we don't know.
12:01- How many chunks of novel
information you can hold
12:04in your head at one time is
your short-term memory capacity.
12:07In 2008, Hansson, Juslin, and Winman
12:09investigated how
short-term memory capacity
12:11was linked to accuracy and overconfidence.
12:13Participants were asked to give ranges
12:15for factual questions
like the length of a river
12:17or the population of a city.
12:18People's ranges were
consistently too narrow,
12:21which effectively meant they
were being overconfident,
12:24and those who had worse short-term memory
12:26were more often wrong and more
likely to be overconfident.
12:29Another study conducted by
Conte in 2023 asked participants
12:33to keep sequences of letters in their mind
12:35while they judged their own performance.
12:37As the memory load increased,
12:39confidence estimates became less accurate,
12:41even for participants with
higher working memory capacities.
12:44Together, these studies suggest
that assessing your accuracy
12:47is a mentally taxing task.
12:49So, overconfidence isn't
necessarily about arrogance.
12:52It's your brain working at the limits
12:54of what it can track and hold.
12:56And because of this, your brain
can start using shortcuts.
13:00Psychologist Daniel Kahneman describes
13:01these mental shortcuts as
heuristics, that together,
13:04create systemic errors
called cognitive biases.
13:07One shortcut we use a lot is
substituting hard questions
13:10with easier related ones.
13:11Researchers have tested
this by asking students
13:13how happy they were in their life
13:15and how many dates they'd
had in the last month.
13:18Unsurprisingly, these two
questions did not correlate.
13:21There's a lot more to happiness
than matches on Tinder,
13:23at least for most people.
13:25However, if you switch it around
13:27and ask about their dates first,
13:28suddenly, the correlation jumps to 0.66.
13:31Working out how happy you are in your life
13:33is a difficult question.
13:35You have to consider many
things and balance them all out.
13:38So, when you're primed
13:39with the information
about your dating life,
13:41you're very likely to
substitute that hard question,
13:44how happy am I in my life,
13:46with, how many dates have I recently had?
13:48Overconfidence from
misprocessing information
13:51like this can be disastrous.
13:53(low mysterious music)
13:54So, why are our brains wired this way?
13:56I mean, you might expect natural selection
13:58to have wiped out the
confidently incorrect,
14:01but there is evidence that overconfidence
14:04can actually be advantageous.
14:07Overconfidence can massively
improve your status.
14:11In a scientific version
of "The Apprentice,"
14:12scientists in 2012 compared
participants' own assessments
14:15of their skill with objective measures.
14:18They placed them in group
tasks to see who was chosen
14:20as a leader and whose
ideas influenced the group,
14:23tracking status over multiple sessions
14:25and assessing whether the desire
for status led participants
14:27to exaggerate their abilities.
14:29The results were clear,
14:31overconfident individuals
were more likely to lead,
14:33assert themselves, and maintain influence,
14:36even when their actual abilities were mid.
14:39And the evidence certainly
shows people react better
14:41to confident individuals.
14:43Using an fMRI, researchers
at the University of Sussex
14:46measured brain activity of people
14:48after hearing unconfident
versus confident advice,
14:51and those who listened
to the confident advice
14:53had increased activity
14:54in the ventromedial prefrontal cortex.
14:57That's a brain region associated
with processing rewards
14:59and expected satisfaction.
15:01This means that human brains
are biologically tuned
15:04to be influenced by confident individuals.
15:07We literally feel better when
we hear confident people.
15:10- But recognizing that dynamic
15:13highlights a potentially
problematic incentive
15:17for anyone who's contending
for positions that they want.
15:21People who express more confidence
in employment interviews
15:25or political campaigns, for instance,
15:29do earn the confidence of
interviewers and potential voters,
15:34even if they're being overconfident.
15:37They can't actually deliver.
15:39They don't actually know the answer,
15:41but they can talk a good
line to impress others,
15:45even if they can't ultimately deliver
15:46on those grand assertions.
15:48They know the audience
15:51will place more faith in
them, if they express...
15:53Well, they should express
maximal confidence.
15:56- [Derek] Leeson knew this
and he took advantage of it.
15:58While he was secretly racking up
16:00huge losses in the infamous
five eights account,
16:02he was publicly posting huge profits
16:05and everyone believed the illusion.
16:06- They come in and they
don't test any records,
16:10so I can't be happier.
16:12They didn't test one record.
16:14- [Derek] As Leeson's
hidden losses mounted,
16:15he requested huge sums from head office
16:18to keep doubling down up
to $5 million at a time.
16:21Barings management, barely
understanding futures
16:23and believing in their star trader,
16:25granted his requests
16:27and continued to grant future requests.
16:29- Every day that I make
one of these requests
16:31for additional money, I
never expect it to come.
16:33I expect somebody to say,
16:34"Look, mate, what the hell's going on?"
16:38- [Derek] By this point, the account was nearly
$260 million in the red.
16:42To mask this, Leeson began
buying futures from himself,
16:45inflating his perceived profits further.
16:47At SIMEX's 10th anniversary in September,
16:50Barings received two awards,
largely credited to Leeson.
16:53In December, he was flown to New York,
16:55seemingly responsible for $44 million
16:58in Barings turnover that year.
17:00Some traders raised doubts
17:01that these kinds of profits could be made
17:03from the low risk trading
he was supposed to be doing.
17:05But management's faith in
Leeson's talent was unshakeable,
17:08and they dismissed these concerns.
17:11Barings' overconfidence in Leeson
17:12and Leeson's overconfidence in himself
17:15would be the bank's downfall.
17:17The figures are so big,
17:18it seems like the
ultimate confidence trick,
17:21but there's a partial explanation
17:23for the delusion on both sides.
17:25Leeson's and Barings'
overconfidence was amplified
17:28by the complexity and
unpredictability of the market
17:30and the trades he was making.
17:32It's sort of an issue of feedback.
17:36In a controlled environment
where there are clear rules
17:39and guaranteed outcomes
like in a chess match,
17:42there is clear feedback
17:43on whether decisions are good or bad.
17:45With reliable feedback,
professional chess players
17:47are able to learn to make better decisions
17:49with more accurate confidence.
17:51But in a noisy environment
where there is rarely consistent
17:54or timely consequences for predictions,
17:56this feedback is unreliable.
17:58Whether our confidence
is accurate or misplaced
18:01is increasingly hard to
judge, even amongst experts.
18:04It's especially problematic
for political pundits.
18:06- This is going to be a landslide.
18:08I think Romney's gonna win by quite a bit.
18:10- So, right now, we have
Hillary's about a 75
18:14- [Derek] For example, prior to 2024,
18:16political analyst Allan Lichtman
18:17had accurately predicted
the winner of nine
18:19of the 10 previous US
presidential elections,
18:22using his 13 keys to
the White House method.
18:24Using the same strategy in
2024, he predicted that-
18:27- Kamala Harris will be a
precedent-breaking president.
18:32- And look what happened.
18:33Is this crazy?
(crowd cheering)
18:36- He attributed his
miscalculation to the spread
18:38of disinformation that
misled the electorate.
18:40This noisy environment made it difficult
18:42to discern key issues like the
actual state of the economy.
18:46Similarly, the feedback Leeson
18:47had received was inconsistent.
18:49He was making some bad trades,
18:51but he had also won it all back before,
18:53and this significantly
clouded his judgment,
18:55amplifying his overconfidence.
18:57By 1995, Leeson's losses were
in the hundreds of millions,
19:01and Barings had unwittingly
sent him $1 billion.
19:05For a bank with around $700
million in capital base,
19:09they were legally only allowed to lend
19:11around a quarter of that,
19:12but no one questioned it.
19:14They were all blinded
by his apparent success.
19:16At this point, his positions were so big,
19:18he estimates that he was probably half
19:20of the entire Nikkei futures market,
19:22so all he could do was
buy himself some time
19:24and hope that the market went his way.
19:26And for a while, it worked.
19:28The economy was stable and
he couldn't see anything
19:31on the horizon that would change this.
19:33But then disaster struck.
19:36- [Reporter] Japan is tonight
in a state of mourning
19:38and of shock.
(siren wailing)
19:40- On the 17th of January, 1995,
19:41the Great Hanshin Earthquake struck Japan
19:4420 kilometers from the city of Kobe.
19:46With a magnitude of 6.9,
it devastated the city,
19:49which was one of Japan's key ports.
19:52This devastation spread
to the stock market.
19:54The Nikkei index plunged 1055 points.
19:57Leeson, attempting to double down again,
19:59risked even more money.
20:01He bet heavily that the Nikkei
would make a rapid recovery.
20:06And in the end, in today's money,
20:08Leeson had lost $2.8 billion.
20:12On the 23rd of February,
Leeson went on the run.
20:15And three days later, Barings,
20:16one of the oldest and most trusted banks
20:18in the world, collapsed.
20:20Overconfidence was at
least partially to blame.
20:24Realizing the walls were closing in,
20:26Leeson fled to Malaysia and then Thailand,
20:28but his escape was short-lived.
20:30He was eventually arrested in Germany
20:32and extradited to face justice,
20:33marking the end of a
spectacularly destructive gamble.
20:36He was just 28 years old.
20:40Now, we don't all bring down banks,
20:42but we are all vulnerable
to overconfidence.
20:45In a complex world with
unclear, noisy feedback
20:48where our brains are overwhelmed,
20:50a set of simplistic biases can take over.
20:53And we all too often end up thinking
20:55we know more than we do.
20:56So, what can we actually do about this?
20:59- I try to get better at
calibrating my confidence judgments
21:03by keeping track and keeping score.
21:06So, when a colleague asks me,
21:09"How long is it going to take you
21:10to get me comments on this
paper draft we're working on?"
21:15I don't promise I'll do it by Friday.
21:18I'm much more likely
to say something like,
21:20"I think there's a 60% chance
21:22I can get you comments by Friday."
21:23They'll often react with a
quizzical look or a laugh.
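[One standard way to "keep score" on probability forecasts like that is the Brier score: the mean squared difference between the stated probability and what actually happened. A small sketch with invented forecasts:]

```python
# Brier score: 0 is a perfect forecaster; always guessing 50% earns 0.25.
# Lower is better. The logged forecasts below are made up.

def brier_score(forecasts):
    """forecasts: list of (stated probability, outcome as 0 or 1)."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# "There's a 60% chance I can get you comments by Friday" -- log each
# prediction, record whether it came true, then score the whole set.
log = [(0.6, 1), (0.6, 0), (0.9, 1), (0.9, 1), (0.9, 0)]
print(round(brier_score(log), 3))   # 0.27
```

[Tracking this number over many predictions shows whether your stated confidence matches reality, which is exactly the "keeping track and keeping score" practice described above.]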
21:30- Well, practicing and being
aware of our calibration
21:32is the obvious way to improve,
21:34but so is being intellectually humble.
21:37- I think that the best
medicine for overconfidence
21:42is not so much information
as feedback, (chuckles)
21:45and I get plenty of that. (laughs)
21:48Though also, I think people are right,
21:50that sometimes I do have a
little bit of overconfidence.
21:53- If we wanna become more accurate,
21:55we should capitalize on
the wisdom of the crowd
21:57by listening more to others.
21:59In particular, we should listen
22:00to people who disagree with us.
22:02Understanding the best
arguments of your critics,
22:05understanding what information
22:06those who disagree with
you have that you lack
22:09is very helpful for
making better decisions.
22:11- The best calibrated people
aren't those who know the most.
22:15It's those who know what they don't know.
22:17So, true wisdom lies not in being certain,
22:20but in knowing the limits
of your own certainty.
22:23And that's an idea that's
inspired our latest project.
22:27All those questions I asked-
22:28- These are good questions. (laughs)
22:31- They come from a new
board game that we've made.
22:33It's called "Elements of Truth."
22:36The game contains over 800 fascinating
22:38science trivia questions with a twist.
22:40The number of points
you win on each question
22:43depends on how confident you are.
22:46You can bid any number from 1 to 10.
22:48If you're not sure, you
can play a low number
22:51or you can try to follow
the lead of someone else
22:53who you think should know the answer.
22:55We've tested this game with scientists,
22:57teachers, and students,
and what we've found
23:00is that it regularly leads to discussions
23:02that go way beyond the initial questions.
23:06The core game comes with 200 questions
23:08that cover all aspects of science,
23:09and there are five additional
packs on specific topics
23:12like physics, technology,
engineering, and astronomy.
23:16Plus, there's a Veritasium
pack on concepts covered
23:18in many of our most popular videos.
23:21We're launching the
game through Kickstarter
23:23to give you the opportunity
to shape it with us.
23:26Over the next month, you'll
be able to submit questions
23:28for a special community pack.
23:30This is your chance to etch your name
23:32into Veritasium history
23:33with a credited question in the game.
23:36To reserve your copy and get involved,
23:38use this QR code or the
link in the description
23:41to head over to Kickstarter.
23:43It is only because of
you that I've been able
23:45to make this channel and this game,
23:47so as always, I wanna
thank you for your support
23:50and thank you for watching.