The Story You’re Not Hearing About AI Data Centers | Ayșe Coskun | TED
0:04
Right now, the world is in an AI race.
0:08
Companies, governments, universities
0:11
are all racing to build bigger models, smarter systems.
0:16
And behind the scenes,
0:17
they are racing to build more data centers to power AI.
0:22
But there's a problem.
0:24
We are running head first into the limits of our infrastructure.
0:28
The power grid includes all the infrastructure,
0:31
power plants, transmission lines and more,
0:33
to generate and deliver power to our homes, our businesses,
0:37
and now to AI data centers.
0:40
In the United States,
0:41
the grid operators are reporting that new AI data center projects
0:46
are requesting power loads equal to entire cities.
0:50
In some regions, utilities simply can't keep up.
0:55
So when you hear “AI data center,” what comes to mind?
1:00
For many, it's one thing:
1:03
energy hogs.
1:05
And they are not wrong.
1:07
AI is dramatically accelerating the electricity demand of data centers.
1:12
Just training GPT-4
1:14
is estimated to have consumed roughly the annual electricity use
1:19
of thousands of US homes.
1:21
In another striking example, in Ireland,
1:24
nearly 20 percent of the nation's electricity
1:28
is drawn by data centers today.
1:32
And these are not just statistics.
1:35
They are also community stories.
1:37
In the data center alley in Virginia,
1:40
residents recently saw higher electricity bills,
1:44
20 percent higher already compared to just a few years ago,
1:48
as utilities scramble to serve massive new AI facilities.
1:53
So the energy-hog label seems well deserved.
1:59
But that's only half the story.
2:01
Here is the new view.
2:04
These facilities are not just energy-hungry brains.
2:08
They can also be the muscles of the grid, flexing on demand.
2:14
Unlike our homes or hospitals,
2:16
AI data centers run jobs that are predictable,
2:21
controllable and often delayable.
2:24
That makes them ideal to help balance supply and demand on the grid.
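The claim that data-center jobs are predictable, controllable and often delayable can be sketched as a toy dispatcher. This is an illustrative example, not the speaker's actual system: the `Job` fields, the job names and power figures, and the boolean `grid_tight` signal are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_kw: float
    delayable: bool  # can this job wait (batch training, offline analysis)?

def dispatch(jobs, grid_tight):
    """Run urgent jobs always; defer delayable ones when the grid is stressed."""
    run_now, deferred = [], []
    for job in jobs:
        if grid_tight and job.delayable:
            deferred.append(job)   # shift to an off-peak hour
        else:
            run_now.append(job)
    return run_now, deferred

jobs = [
    Job("chat-inference", 50.0, delayable=False),   # users are waiting
    Job("model-finetune", 400.0, delayable=True),   # can slip a few hours
    Job("image-batch",    120.0, delayable=True),
]

run_now, deferred = dispatch(jobs, grid_tight=True)
print([j.name for j in run_now])          # ['chat-inference']
print(sum(j.power_kw for j in deferred))  # 520.0 kW freed for the grid
```

When `grid_tight` is `False`, everything runs immediately; the flexibility only kicks in when the grid needs it.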
2:30
By making AI data centers power-flexible,
2:34
we can connect them much more rapidly to the grid,
2:37
while at the same time making electricity more affordable and resilient.
2:44
What's more, the AI boom is arriving
2:48
just as the renewable boom is also taking off.
2:52
Wind and solar don't follow our schedules,
2:56
but data centers can.
2:58
Which means we can align the rise of AI
3:02
with the rise of clean energy,
3:04
if we are bold enough to rethink their role.
3:08
This transformation to power flexibility
3:12
didn't just come out of thin air.
3:15
It builds on decades of research
3:18
on energy-efficient computing,
3:20
scheduling, optimization and many others.
3:25
I've lived this journey myself.
3:27
Early in my career,
3:29
I asked a question that many found unrealistic.
3:34
Could computer systems adapt their behavior
3:40
depending on power grid needs,
3:42
but without breaking their performance promise
3:46
to their users?
3:49
At the time, this sounded radical
3:51
because why would we ever design a system that would slow itself down
3:57
on purpose?
3:59
But then came the breakthroughs.
4:02
First, we discovered
4:04
not all computing tasks are urgent.
4:07
Some can wait for minutes or hours,
4:10
and some can be slowed down without anyone really noticing it.
4:15
For example,
4:17
a researcher analyzing hundreds of medical images with AI
4:22
may be OK with waiting just a little longer.
4:25
Or, if you are fine-tuning your AI model
4:28
over the course of the next few days,
4:30
you may be OK with slowing it down for just a few hours.
4:35
This inherent flexibility in computing
4:38
gives us the flexibility we need to manage power.
4:40
Second,
4:42
we reframed the problem.
4:45
Instead of asking
4:47
how do we compute as fast as possible,
4:50
we asked,
4:52
how do we make computer systems meet the constraints of the power grid,
4:57
while at the same time still delivering on user performance agreements?
5:02
This shift led to new strategies:
5:04
capping power,
5:06
shifting workloads
5:08
and provisioning the data center as a flexible reserve to the grid.
5:13
A key aspect here is that we do keep the performance promise to users,
5:18
so it's not arbitrary.
5:20
User experience remains a key target.
5:24
And better yet, it becomes more predictable.
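The strategy of capping power while keeping the performance promise can be illustrated with a toy feasibility check: throttle everything uniformly to fit the cap, but refuse any slowdown that would break a deadline. The uniform-slowdown model and the `(power_kw, hours_left, deadline_h)` tuples are simplifying assumptions, not the actual platform.

```python
def apply_power_cap(jobs, cap_kw):
    """Scale all jobs down uniformly to meet a power cap, but refuse any
    slowdown that would break a job's deadline (the performance promise).

    jobs: list of (power_kw, hours_left_at_full_speed, deadline_hours)
    Returns the speed factor to run everything at (1.0 = full speed).
    """
    full_power = sum(p for p, _, _ in jobs)
    if full_power <= cap_kw:
        return 1.0  # no throttling needed
    factor = cap_kw / full_power  # e.g. 0.5 means run at half speed
    for power_kw, hours_left, deadline_h in jobs:
        if hours_left / factor > deadline_h:
            raise ValueError("cap would violate a deadline; shed load instead")
    return factor

factor = apply_power_cap([(300, 2, 10), (200, 4, 12)], cap_kw=250)
print(factor)  # 0.5: run at half speed, both deadlines still met
```

The key point the check encodes is that the cap is not arbitrary: a slowdown is only accepted if every user-facing deadline still holds.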
5:28
So we built prototypes on real data-center servers,
5:33
and they worked.
5:34
Systems that could follow a power target
5:37
while still delivering results.
5:40
But this journey wasn't smooth.
5:42
There were paper rejections, funding rejections,
5:47
colleagues telling me this would never work.
5:51
Well, since I was a kid, I was told I'm a persistent person.
5:55
Perhaps stubborn at times.
5:58
And bold ideas require persistence
6:03
because change almost always looks impossible
6:07
before it looks obvious.
6:09
So you take that feedback, you reframe it again and again,
6:13
and you keep building.
6:15
You keep proving.
6:16
So what began as scribbles on a whiteboard 12 years ago
6:21
is now running on real AI data centers.
6:25
Why does this matter now?
6:26
Because the power grid's challenge
6:29
isn't just to generate more power.
6:32
It's about timing.
6:34
Solar gives us a glut of electricity at noon,
6:39
but demand might peak in the evening.
6:41
Wind might be abundant one day and scarce the next.
6:45
Nuclear takes decades and billions of dollars to build
6:51
and is often hard to locate in urban areas.
6:55
Batteries are critical,
6:57
but scaling them is costly, slow,
7:01
and often not environmentally clean.
7:03
Meanwhile, AI data centers themselves face five to seven-year wait times
7:10
just to connect to the grid
7:12
in places like Virginia.
7:14
In AI time,
7:15
where technologies shift in a major way every six months,
7:18
five to seven years is an eternity.
7:21
So here's the opportunity.
7:23
With the right orchestration,
7:25
AI data centers can be flexible today.
7:28
No waiting, no new massive power infrastructure construction.
7:33
They can soak up excess solar in the afternoon,
7:38
scale down at peak times
7:40
and act as virtual batteries today.
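Soaking up excess solar can be sketched as placing a deferrable job's energy into the cheapest hours of a price forecast, since midday solar gluts typically push prices down. The hourly prices and the greedy strategy here are hypothetical; a real scheduler would also honor job deadlines and ramp limits.

```python
def schedule_flexible_load(prices, energy_mwh, max_mw_per_hour):
    """Place a deferrable load (e.g. an overnight training run) into the
    cheapest hours, which during the day often means the solar glut at noon.

    prices: forecast price per hour ($/MWh); returns MW drawn each hour.
    """
    plan = [0] * len(prices)
    remaining = energy_mwh
    # cheapest hours first; greedy fill up to the per-hour power limit
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        take = min(max_mw_per_hour, remaining)
        plan[hour] = take
        remaining -= take
        if remaining == 0:
            break
    return plan

prices = [90, 40, 15, 20, 80, 120]  # $/MWh; hour 2 is the midday solar glut
plan = schedule_flexible_load(prices, energy_mwh=5, max_mw_per_hour=2)
print(plan)  # [0, 1, 2, 2, 0, 0]
```

The load lands in the cheap midday hours and avoids the expensive evening peak entirely, which is the "virtual battery" behavior in miniature.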
7:43
And the stakes are real.
7:44
Take Texas, August 2023.
7:47
During a brutal heat wave,
7:50
the rising electricity demand pushed the grid to its limits.
7:55
Wholesale electricity prices spiked over 800 percent
8:00
in a single afternoon.
8:02
So flexible loads, if they were widely available,
8:06
could have reduced the costs
8:08
and could have prevented the emergency alerts that went to the consumers.
8:12
So we have two opportunities here.
8:14
One, we can make current data centers flexible
8:19
and help prevent blackouts
8:20
and reduce electricity costs.
8:23
Two, and perhaps the more significant,
8:26
by making future data centers power-flexible,
8:31
we can connect them much earlier
8:33
without waiting for major power grid upgrades.
8:36
If we ignore this opportunity,
8:40
we are not just wasting renewable energy
8:43
and we are not just raising our electricity bills.
8:46
We are also slowing AI adoption,
8:49
delaying it,
8:51
more expensive and less accessible to society.
8:55
But there's a catch.
8:58
Orchestrating this flexibility is not easy.
9:02
Prices change hourly.
9:05
Workloads may arrive unpredictably.
9:08
Grid rules change across states, across countries.
9:12
So no human operator
9:14
and no single fixed data center management policy can keep up.
9:18
This is where AI itself comes back into the story.
9:23
The very technology driving this unforeseen demand
9:27
is also probably the only thing smart enough to tame it.
9:31
AI can learn patterns, anticipate grid needs
9:36
and coordinate across data centers, across utilities,
9:40
even nations in real time.
9:43
Imagine a data center
9:44
or a whole network of them,
9:46
as an orchestra,
9:48
with hundreds of instruments, all playing at once.
9:52
Left on their own, they can sound like chaos.
9:57
But bring in a conductor,
9:59
suddenly all that noise turns into music.
10:02
The conductor in this case is AI.
10:06
AI can direct data center operation
10:10
so that the data center can precisely match power constraints,
10:15
depending on what the grid needs, what power is available
10:19
and what users demand.
10:21
The result is harmony.
10:24
Reliable electricity, efficient computing
10:27
and a system that works beautifully together.
10:30
And that's exactly what we've built.
10:33
We built software that slows down, speeds up,
10:37
or pauses workloads in a data center,
10:40
or shifts workload among data centers.
10:43
Our conductor platform tunes performance and power in real time,
10:49
all the while respecting user and cloud-provider performance needs.
10:54
In this way, by flexing when needed,
10:57
we can connect AI data centers to the grid much faster,
11:03
make better use of the available power,
11:07
and enable faster AI adoption.
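Shifting workloads among data centers, one of the moves described above, can be sketched as a toy placement rule: send a movable job to the site with enough spare power at the lowest price. The site names, spare-capacity numbers and prices below are invented for illustration.

```python
def place_job(job_kw, sites):
    """Toy placement: send a movable job to the site with enough spare
    power at the lowest current price.

    sites: dict of name -> (spare_kw, price_per_mwh)
    Returns the chosen site name, or None if no site can host the job now.
    """
    feasible = {name: s for name, s in sites.items() if s[0] >= job_kw}
    if not feasible:
        return None  # no site has room; defer the job instead
    return min(feasible, key=lambda name: feasible[name][1])

sites = {
    "virginia": (200, 90),  # constrained and expensive right now
    "texas":    (500, 30),  # windy afternoon: cheap, plenty of headroom
    "oregon":   (400, 45),
}
print(place_job(300, sites))  # texas
```

A real orchestrator would also weigh data locality, network cost and carbon intensity, but the core idea is the same: the computation moves to where the power is.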
11:11
I've been inside this story
11:12
from an idea that once seemed impossible
11:15
to prototypes in a lab,
11:17
to systems now running in the field,
11:19
and I believe this is just the beginning.
11:21
AI is already reshaping how we compute,
11:24
but it could also reshape how we power the world.
11:28
So the question isn't how much energy AI consumes.
11:33
The real question is how much flexibility, resilience
11:38
and clean power can AI unlock?
11:41
If we are bold enough to rethink AI data centers,
11:44
the very machines that now seem like a burden
11:48
could be our greatest assets
11:50
in building a sustainable AI future.
11:54
Thanks.
11:55
(Applause)