Episode Transcript
0:01
TED Audio Collective. Hello
0:12
there, I'm Chris Anderson. This is the
0:14
TED Interview. Now this
0:16
season we're expanding on an idea that
0:19
I believe can offer a response to
0:21
many of the issues we're currently facing
0:23
as a society. It's
0:25
an idea I wrote a book about
0:28
called Infectious Generosity. In
0:30
the spirit of generosity, we're offering free copies
0:32
of both the e-book and the audio book
0:34
to Ted interview listeners. You can
0:37
go to ted.com/generosity,
0:40
fill out the short form there to claim yours. Now
0:43
this podcast series is designed to amplify
0:45
the themes of the book by
0:48
bringing some of its main characters to
0:50
life before your very ears. Today
0:53
we're going to focus on how
0:55
we make our generosity more thoughtful
0:57
and more effective. And the
1:00
guest I'm about to introduce you to has
1:02
probably spent as much time thinking about this
1:04
topic as anyone on the planet. He
1:07
is Will MacAskill, a
1:09
leading moral philosopher, co-founder
1:12
of the Effective Altruism
1:14
Movement, EA for short. And
1:17
he's the author of some
1:19
amazingly influential books like Doing
1:21
Good Better and What
1:24
We Owe the Future. Now
1:28
EA, you may have noticed, has been the subject
1:30
of a lot of discussion in the last
1:32
couple of years, some of it quite heated.
1:36
I can't wait to get Will's perspective on
1:38
recent controversies. But even more important
1:40
than that is a true understanding of his thinking about generosity. I really think
1:46
the questions we'll be asking each other today,
1:49
are right up there among the most important
1:51
questions anyone can ask of their own life.
1:55
What's the wise way to be kind? How
1:58
can we maximise our impact on the world, and on the future, for
2:00
good? How much money should
2:02
we give away? And how can we do that
2:04
effectively? And what are
2:06
the risks if we get
2:09
this wrong? Okay, let's dig
2:11
in. Canva
2:20
presents unexplained appearances. It was
2:22
an ordinary work day until...
2:25
That presentation appears out of
2:27
thin air. No, it's eerily
2:29
on brand. Wait, did that
2:31
agenda just write itself? Words
2:34
appear, making this unexplainable case...
2:36
Unexplainable? That's Canva's A.I. tools.
2:39
I can generate flies and rose in
2:41
seconds. Really? The real
2:43
mystery is why I'm only learning
2:46
this now. canva.com. Designed for work.
2:53
Will MacAskill, welcome to the TED Interview. Thanks
2:55
for having me on. Well, let's
2:57
start with a bit about you. I mean, I'm
3:00
just curious what it was about you that took
3:02
you on this journey to
3:04
some pretty exciting, pretty radical, pretty
3:06
provocative ideas. Sure.
3:09
So I grew up in Glasgow in
3:11
Scotland. And even from a
3:13
pretty early age, I wanted to
3:15
try and make a difference in the
3:17
world. So I worked
3:19
at an old folks' home for elderly people who had severe disabilities. I helped run a scout group for disabled children. I volunteered
3:28
at a local school. I donated some amount
3:30
of money to charity and so on. This
3:33
was all kind of a bit haphazard, a bit ad
3:35
hoc. And it was
3:37
once I started learning about global
3:39
poverty and the extremity of
3:42
poverty that people live in, that
3:44
this started to change. So
3:46
I remember first when I was, I think,
3:48
17, learning that 46 million people at the
3:51
time had died of AIDS. And
3:53
honestly, I just thought, how are we
3:55
not talking about that more? How
3:57
is that not on the front page of every
3:59
newspaper? This is not that typical
4:01
behavior for a kid. I mean, did
4:04
your friends think you were weird?
4:06
Did they admire you?
4:08
How do you think this actually happened? Like,
4:11
why did you care about these things? Not
4:13
that many kids spend their time volunteering like
4:15
that. Yeah, so I did
4:17
have friends who were also, you know, just
4:20
concerned to make a difference. So
4:23
actually, all my close friends, literally 100%
4:25
of my close friends became doctors. And,
4:28
you know, we were doing some of these volunteering
4:30
activities together. So I certainly
4:32
had some social encouragement from some really
4:34
kind people who I'm still friends with
4:36
today. But I do
4:38
think there's part of this that was just,
4:41
you know, quite innate and really
4:43
just something I think I was born with. So
4:49
you think of that almost as good
4:51
fortune, not as something
4:54
to boast about necessarily, but just this is
4:56
just who you are. Yeah, I mean,
4:58
there's certainly a huge amount of good fortune in the
5:00
sense that I've been born with
5:03
so many privileges. Born into a
5:05
rich country, middle class, I got to go to a
5:07
private school, and then from there to
5:10
Cambridge for my undergraduate. I really just
5:12
had all the benefits you could hope
5:14
for in life. And so it
5:16
was very salient to me then
5:19
that I should be
5:22
thinking about, well, given that I've got all of
5:24
this privilege, how can I take that privilege and
5:26
turn it into a way of making the world
5:28
better? It feels
5:31
like there's always been a bit of a radical side
5:33
to you, someone willing to go against the grain. I
5:35
mean, I think MacAskill wasn't the
5:37
surname you were born with. Yeah,
5:40
that's right. I definitely think that many
5:43
of the decisions I feel happiest with
5:45
in my life are ones where I've
5:47
gone against societal norms. Sometimes
5:49
that's to do with effective altruism, so I'm sure
5:52
we'll talk about my giving and so on. But
5:55
in other cases, it's not. So when I got
5:57
married, I — she's now my ex-wife, we're still good friends — yeah, I took my ex-wife's grandmother's maiden name, as did she, where
6:07
the underlying thought was just what is with
6:09
this tradition, where when a
6:11
man and a woman get married, they by default
6:14
take the man's name. And so instead I thought,
6:16
well, why not just choose a name that we
6:18
have some connection with that we really like? And
6:21
I think that was one of the best decisions I ever made. So
6:23
I love this. So you're someone who's just
6:25
willing to look at the world and say, huh, that
6:29
makes no sense. The fact that
6:31
everyone else is doing it isn't
6:33
a reason necessarily to do it. I'm going to do
6:35
what I think makes sense. That is who you are
6:37
at your core, I think. Yeah.
6:40
And I think that has been
6:42
something that I
6:45
found consistently. And in fact, even over
6:47
time, I've learned more and more that
6:50
very often, if there
6:52
are really good arguments for something,
6:55
whether that's arguments for the
6:57
value and importance of charitable giving, for
7:00
the importance of focusing on effectiveness, for
7:02
changing your name, or for catastrophic risks
7:04
from AI even, if
7:06
there are good arguments, even if society
7:08
hasn't yet caught up to those arguments,
7:11
that doesn't mean the arguments are wrong.
7:13
And in fact, you have potential to
7:15
have an outsized impact in the
7:17
world by focusing on things where
7:19
the arguments do make sense, but the world
7:21
has not yet caught up. Yeah.
7:24
So you studied philosophy at university,
7:26
I think, as did I.
7:28
Most people I know who study philosophy at
7:31
university didn't do anything with it. In some ways, it's the least practical topic you can study. You
7:38
did. You ended up becoming a
7:41
co-founder of this, what do we call
7:43
it, a field, a form of thought,
7:45
a movement, effective altruism. What is effective
7:48
altruism? So effective
7:50
altruism is about using your
7:52
time and money to
7:55
try to make the world better, but using those things as effectively as possible. So rather than just donating to whatever charity stops you in the street and asks you to donate, instead it's thinking really carefully and really going with what the best evidence and arguments say you should donate to — the organization where you think you'll have the biggest positive social impact. Or, when you're thinking about your time, in particular your career, where you have a huge amount of time that can potentially be used for good, really thinking carefully about: okay, where, with my scarce hours on this planet, can I have the biggest positive impact? And then it's about not only thinking about it, but really going and putting it into practice as well. So this all began with an organization called Giving What We Can, which encourages people to give at least ten percent of their income to whichever charities they believe will do the most good. And from there the idea kind of built out and became what's now known as the effective altruism movement.
8:59
So you've encouraged us to ask three questions to guide us towards more effective giving. Can you remember what they are? I think one asked about which problems are biggest in scale. Yes — which problems are biggest in scale; which are most tractable, so that when you actually put in effort you can actually make a difference; and then finally, which are most neglected, where if a problem is unusually neglected, that suggests that additional resources going into that problem, whether that's your time or money, can have an outsized impact. So I quote
9:34
the three questions in the book. They struck me as a very good combination, encouraging someone to move beyond the way we normally get involved with causes — oh, I know someone who's suffered from that — to trying to imagine what it would take to be most effective. And those certainly, each in their different way, help guide you. So for you, asking those three questions, what did that encourage
10:00
you to focus on. So for
10:03
me I think that's led me
10:05
to a particular focus on global
10:07
catastrophic risks, and in particular risks from new technologies, emerging technologies. So in my TED Talk I talked about risks from pandemics and also risks from artificial intelligence. Those were highly neglected, and that was in part because they occur so rarely: really large-scale pandemics come every thirty years or every hundred years, so as a society we aren't prepared in the way that we should be. And when we're looking to the future, the possibility of man-made pandemics, due to advances in biotechnology, means that we might get much larger and much scarier pandemics than we've ever had in the past. And then similarly with artificial intelligence: this is something where, at least up until a couple of years ago, the technology was not quite there yet. You could see the early development within labs being enormously impressive, but it hadn't yet hit the mass consumer. And so we were concerned about these things, and because they were getting so little attention, that meant that if you were to work on these areas, you were one of just a handful of people taking seriously a concern that was not on other people's radar.
11:20
A particular focus of this, I think, comes because in your thinking you've made a specific decision that not everyone makes. I think most people, when they think about their moral obligations, think about their family, their community. A lot of the early work of effective altruism encouraged people to think more globally, to think about suffering on a global scale and the moral obligation we have there. But you've gone further still, to future generations and the continuation of human civilization, arguing that there is a kind of moral obligation to countless millions of future humans, or sentient beings descended from us. Can you
12:03
make that case? Sure, I'm happy to. So if you think about effective altruism as a plan to do as much good as you can with your time and money — what does "good" mean? Well, we think about that in terms of how much benefit you're providing to people, but in particular taking everyone's interests equally, thinking everyone has an equal moral claim upon us. At least for this part of morality — I'm not claiming it's the whole of morality — but for this part, we should take everyone's interests and weigh everybody equally. And that naturally means that you start looking at how to help the people who are the very poorest in the world, because national boundaries don't seem morally important from this point of view. But it also means you should start looking to the future as well, where I think the fact that someone is born next week rather than tomorrow makes no moral difference to the claims they have upon me. And in fact, if someone is born in a hundred years' time, or even a thousand years' time, the fact that they can experience joy or suffer, and that we can make a difference to their life, is still morally important. And that becomes so crucial because I think we're at a really pivotal moment in time, when there are new technologies in particular that have the potential to completely derail civilization, send us back to the stone age, or lead to some sort of dystopian future. I'm very worried at the moment about the capability for AI to massively empower dictators and would-be dictators. And that could be a future that we have indefinitely. Democracy is by no means an inevitable part of a technologically advanced society. And so I really do think that the things we do today can have an impact not just on the present generation, though the impact there is very great, but an impact for hundreds of years to come, or thousands of years to come, or even much longer.
14:05
Yes. I'd like to start, actually, by challenging, or at least asking a question about, the idea that all lives have equal value no matter how far away they are, either in distance or in time. Part of me totally resonates with that; it seems like a pure philosophical position. And certainly, if you were a god who could step back from the present day and look at the whole of the universe and imagine different versions of it, you would want one where humans lived for countless generations into the future with as much joy and thriving and so forth as possible; you would say that that was a better outcome. And yet a lot of people might nod their heads when you say every human life is worth the same, but then if you said to them, are you willing to sacrifice your children's interests for those of children on the other side of the world, they would say, oh no, absolutely not. Or if they didn't say that, it's almost certainly how they would act. It's how I would. So is this a Darwinian bug, that we care more for our children than for children elsewhere? You know, if the house is burning, you go in and you would rescue your own child first. In your analysis, is that a bug or a feature?
15:23
some analysis is a bug or feature.
15:26
So I think it's totally reasonable to
15:28
care more about your near and dear
15:30
your family and friends than. Distance.
15:33
Ranges and in the world would be
15:35
a much darker place to be honest
15:37
if you had no special affinity to
15:40
your loved ones. And so
15:42
in what we are the future Had
15:44
you talk about special. Moral.
15:47
reasons we have so these of
15:49
partiality also reasons of reciprocity to
15:51
so people can benefit us and
15:53
that gives us a reason to
15:55
be pay them however the world
15:57
is today is very well attuned
16:00
to those special relationships. You know, people
16:02
are just very generous to their
16:05
friends and family and very caring
16:07
towards them, which is a wonderful
16:09
thing. But
16:11
I'm saying that at least part of
16:13
a good life should involve taking that
16:16
impartial perspective and actually thinking at least
16:18
if you're lucky, like I am, to
16:20
be in the middle class
16:22
of a rich country with the ability
16:25
to choose a career or to donate
16:27
some amount of your resources, at
16:29
least part of your life should be
16:32
about trying to make the world better from
16:34
this impartial perspective. I
16:37
think that's powerful. I mean, definitely, you know,
16:40
we don't want our children or grandchildren
16:42
to go through a horrific dystopian future.
16:44
But I wonder whether there's almost more
16:46
power in saying, look,
16:48
what would be lost if this went?
16:52
We've been many, many millions of years in the
16:54
making. And over the
16:56
last few thousand years, it's absolutely
16:58
extraordinary what humanity
17:00
has built. We
17:03
cannot let all of this go
17:06
for naught. Like part of me thinks
17:08
that that's a more viscerally felt
17:10
argument to a lot of people
17:12
than the possibility that, oh, if
17:16
all this goes in 200 years
17:18
time, there's some person who I can't
17:21
fully picture who won't ever enjoy the
17:23
beauty of life, even if
17:25
it doesn't pass the philosophical test, just
17:27
as a sort of persuasive
17:29
human argument. Is there is there a case
17:31
that it's almost more powerful to focus on
17:34
what is lost than the loss of what
17:36
might be? Yeah, I think there is a
17:38
powerful argument here. And it's actually something that
17:40
my colleague Toby Ord discusses in his book,
17:42
The Precipice, which is that
17:45
you can see human history is like a
17:47
relay race. Every generation
17:49
passing the baton on to the next
17:51
generation. And in particular, when
17:54
I think about my life and
17:56
all the good things in
17:58
my life, so many of them are owed to the efforts of previous generations. Whether that's fruits and plants that have been selectively bred over hundreds of years, or whether that's technology like medicine, again the product of hundreds of years of slow and often faltering technological progress, or whether that's the kind of moral and political landscape I live in — well, I think my life would be worse if I didn't live in an egalitarian society that takes the interests of women and minorities and people of all backgrounds seriously. And so I have a kind of responsibility to the past as well as to the future: to ensure that we continue that relay race, that there is a next generation we can pass the baton on to, and that they have as happy and flourishing lives as possible.
18:55
Part of this discussion, I think, leads to some of the key criticisms of effective altruism that have emerged in the last couple of years. I think it's along the lines of: the more distance you get in terms of the moral obligations, or the moral calculations, that you encourage people to make, the more risk there is of things going horribly wrong. So in theory, if you believe that there's a possibility of a future in which, say, a trillion humans live and thrive, and you think there is some act you could take today that would have a tiny percentage impact on reducing the risk of that future not happening — say one percent — then anything that made even the tiniest difference to those prospects you could justify. In that utilitarian-calculation way, or basically just saying we should act in a risk-adjusted way to maximize the overall risk-adjusted probability of good for the whole universe, calculations like that can end up with some wild and crazy decisions.
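To make the arithmetic behind that worry concrete, here is a minimal sketch of the kind of expected-value calculation being described. The trillion-person future and the one percent risk reduction are the hypothetical figures from the question itself, not estimates from either speaker:

```python
# Illustrative expected-value arithmetic for the hypothetical above.
# Both numbers are placeholders taken from the question, not real estimates.

future_people = 1_000_000_000_000   # a possible future with a trillion humans
risk_reduction = 0.01               # an act assumed to cut the risk of losing
                                    # that future by one percentage point

# Naive expected number of future lives preserved by the act:
expected_lives = future_people * risk_reduction
print(f"{expected_lives:,.0f} lives in expectation")   # 10,000,000,000
```

On those placeholder numbers the act is "worth" ten billion lives in expectation, which is exactly why this style of reasoning can appear to license wild and crazy decisions.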
20:03
And arguably, a certain
20:05
person, Sam Bankman-Fried, was
20:07
guilty of some of this kind of
20:10
thinking. Do you think part
20:13
of him was making
20:17
EA-type calculations? Was
20:19
he just confused? Or was he
20:21
always... What's your view on him?
20:24
So, yeah, Sam committed the most
20:26
horrific fraud. A million people lost money,
20:29
some of whom lost their life savings.
20:31
So the prosecution recently released some messages
20:34
that Sam had received on Twitter during the
20:37
collapse. And it's really harrowing to read, like
20:39
a man who thinks
20:42
he's going to be made homeless has four
20:44
children, another man who was fleeing Ukraine and
20:46
put his savings onto FTX.
20:49
It's really just quite hard to read. And so
20:51
what he did was enormously harmful. Was
20:54
it the result of some careful calculation,
20:58
like some gamble that they made sense as a bet?
21:00
And I think the answer is quite clearly no. I
21:02
think there's no perspective
21:04
on which what happened
21:07
to FTX made sense. Part
21:09
of the EA logic
21:11
has been that there
21:13
are many ways to make a difference in the world. Some of
21:15
them are to give away money now.
21:17
But for some people, if they have
21:20
extraordinary earning potential for the future, it's
21:22
actually better for them to focus
21:24
on accumulating wealth. And
21:27
then at some point in the future,
21:29
then they direct that wealth
21:31
to doing good. And I wonder whether to
21:34
the extent that you say that there was some good intent in his
21:36
mind that
21:38
part of him was saying to himself,
21:40
I can justify taking any kind of
21:43
risk here for the prospect of making
21:45
countless billions of dollars. Because I know
21:47
that one day I'm going
21:49
to spend that money well.
21:53
And therefore the usual rules don't apply to me the way they apply to other people. And certainly when the
22:02
collapse happened, I was very worried that, wow, maybe
22:05
what had happened was exactly this sort of
22:07
calculation. Now that we've gotten
22:09
the evidence that's come out over the last year
22:12
and a half, including at the trial and books
22:14
that have been written about the topic and so
22:16
on, that's at least in my interpretation, not what
22:18
happened. I think there was a
22:20
combination of them being criminally
22:23
and recklessly negligent. And
22:26
literally, I mean, it comes up, they had
22:28
a meeting in June of 2022 where they
22:31
thought that Alameda had borrowed $16 billion from
22:33
FTX, but it turned out it was a
22:35
bug in the code and it was only
22:37
$8 billion. They did not know where all
22:39
the money they had was. They had complete
22:42
absence of corporate controls, even the most basic
22:44
sorts of risk management. And
22:47
my understanding of what happened, that
22:50
gross and criminal negligence put them into
22:52
a hole that they only discovered they
22:54
were in, in June of 2022. And
22:57
it was then at that point that they
22:59
start very seriously engaging in fraud
23:01
to try and get themselves out. But
23:03
the kind of key thing that happens is not
23:05
some calculated decision, which
23:07
would have made no sense, like
23:10
no sense doing the maths
23:12
on it. Instead, it
23:14
was just something kind of mindless. And over
23:16
time, I've actually learned quite a lot
23:19
about other sorts of white collar crimes, in
23:21
particular from Eugene Soltes's book, Why They Do
23:24
it. And that's something he points
23:26
to over and over again. White collar crime
23:28
is not a result of some careful calculation.
23:30
It's a failure of intuition. It's kind of
23:32
these mindless, reckless mistakes that people make. Give
23:35
us a sense of what
23:37
this whole thing was like for you personally, because before
23:40
the fall of Sam Bankman-Fried, he
23:42
was regarded by many as the sort
23:45
of poster child of effective altruism. I
23:47
think he credited you with changing his
23:49
philosophy of life, and at least on
23:51
the surface, he was a very
23:54
visible proponent of EA. And
23:57
then this happened. It must have felt like the most awful betrayal. It's hard to imagine, for you, for all that you've built — you must have felt for a bit that the whole thing was coming tumbling down. What
24:10
was it like on the inside? Yeah
24:13
Absolutely. I mean, I felt a lot of emotions. One was just, yeah, the absolute horror at the harms that had been caused. Second, you're absolutely right, a feeling of betrayal. You know, I admired this person. I respected him for his desire to do a huge amount of good, and, yeah, it turns out I was a fool. And then the final thing is confusion, honestly, and that continues to this day. What I felt at the time was like I'd been punched or stabbed or something. I remember as a kid in Glasgow, eight years old or something, I was playing in a nearby school — and Glasgow has a problem with violence — and a gang of kids just came up to me and beat me up. And I remember at the time not fighting back and just asking, why are you doing this? And that was the same feeling I had: just, why, why on earth would you have done this? It makes no sense to me. And so I think I was, maybe even while the collapse was happening, slow on the uptake compared to the rest of the world to appreciate that it was fraud, because it felt so incongruous and inconsistent with the experiences I'd had with Sam and with the others who were high up at FTX.
25:30
His fall gave many people license to pile on and criticize EA from all angles. Some of it was justified; some of it, from my viewpoint, was frankly ugly piling-on, possibly, I would argue, because it relieves people of the responsibility to ask any difficult questions of themselves. But nonetheless there was legitimate criticism. At this point I'm curious as to what your takeaways are, and to what extent you have felt that EA needed to reframe a bit how it should be thought of. So, yeah, we've always
26:10
Of yeah, we've always
26:13
emphasized that. Effective
26:15
altruism. Does. Not entail ends
26:17
justify the means reasoning. you know there's
26:19
good reason for that. Non consequences to
26:22
reasons as and just. It's intrinsically Ivanka
26:24
do harm for the greater good, but
26:26
also just it doesn't work. This has
26:28
been known for hundreds of yes there
26:31
are certain models liberals that has evolved
26:33
in our culture for reasons like don't
26:35
inflict harm for the purportedly to good
26:38
fats. In terms of communication about the
26:40
a going forward, historically we've talked that
26:42
was distinctive about living and good life
26:44
which like I said is using. More
26:47
of your these or says the time
26:49
your money to help others and with
26:51
that trying to do as much good
26:53
as you can. But now that Yea
26:56
has gotten more successful and certainly in
26:58
the light of the Stx scandal. I
27:01
think we need to emphasize more of
27:03
was wholly virtuous life looks like were
27:05
that involves all a common sense that
27:08
is being honest, being cooperative, being high
27:10
integrity, being kind and what we're saying
27:12
is is take all of that know
27:14
so away but crank up the dial
27:17
on these other virtues of benevolence. how
27:19
much does care about others for their
27:21
own sake whenever they on the wound
27:23
kind of. the dialogues with seeking as
27:25
well and like the bigger than your
27:28
thinking especially when applied to that attempt
27:30
to help. Us is said he of that's
27:32
something I think I'm gonna be emphasizing much
27:34
more going forward. Welcome
27:45
to the Canva guided meditation for
27:47
stress at work. Impending
27:49
deadline? Generate Canva presentations
27:52
in seconds. So fast.
27:55
Brainstorm got too big? Summarize
27:58
with AI in a click.
28:02
Writer's block? Release
28:04
with Canva MagicWrite. Magical.
28:08
Stress less and save time at
28:10
canva.com. Designed for work.
28:13
Canva. Will.
28:18
You said something to me a few months ago that I think a lot of people don't get about EA, which is that they think of EA as a set of specific moral prescriptions: you know, these are the organizations you should support, these are the causes you should care about. And you described that as a complete misunderstanding — that actually EA was intended as a process, as a way of people thinking, of people asking the right questions. I found that powerful, and I'm interested. Take the basic question, which I would put to any pile-on critic of EA: would you rather your altruism was effective or ineffective? The answer is right there. Most people, I think, want their altruism to be effective. Yes, this is exactly
29:14
right. And so the moral inspiration — I would take this to be like science. So what is science? It's not a body of pieces of knowledge that we have, nor even a body of widely accepted theories; it is primarily a process: the use of experiments and formal reasoning to help us get to the truth. That is the core of what science is. Likewise, effective altruism is not a set of recommended charities and recommended career paths. I'm very confident that we're still in the dark about lots of things; there's an enormous amount we don't know. I've changed my mind on a huge number of topics in the last fifteen years. In the next fifteen, I expect to change my mind many times again. Instead, what it is, is a question. The question is: with the time and money that I'm willing to put towards doing good, how can I make that as effective as possible? How can I do as much good as possible? And I'm going to really just look at the evidence as best I can, I'm going to engage with all the arguments, and I'm going to take this seriously and work it through, because it's so important that I really want to get the right answer. So let's talk about AI: what alarms you most, and what do you want done about it? Sure,
30:27
And want to be do about it. Sure,
30:30
where I want to place most of
30:32
the emphasis would affect the ai is
30:35
on the idea of explosive growth and
30:37
capabilities. So. This is a way
30:39
in which A I is different I think
30:41
than any other technology is that at some
30:43
level of capability which we might well
30:45
death in. The. Next year's next
30:48
decade we will have Ai that can
30:50
build better ai that is a I
30:52
that significantly helping with a process of
30:54
the certain development of ai systems themselves.
30:57
And those better ai systems will be
30:59
able to build better, they on and
31:01
so on. And so this is an
31:03
argument goes all the way back to
31:06
I J Good, a computer science pioneer
31:08
because it intelligence explosion and more recently
31:10
much more indepth work has been done
31:12
to scrutinize the argument embedded into former
31:15
models of economic growth. And
31:17
I really think that the argument
31:19
is surviving. And that's really quite
31:21
a dizzying thought. Because what it
31:24
means is that. What?
31:26
You would naturally think of as
31:28
many centuries of technological progress. So
31:31
everything that I happen technologically speaking
31:33
between now and the year Two
31:35
Thousand Five hundred Let's say. All.
31:38
Of that might occur within the course
31:40
of just a few. Yes, because you
31:43
get Ai that can create better a
31:45
I and quite soon you have. Billions.
31:48
Of Ai scientists working twenty
31:50
four hours a day to
31:52
creates better than more powerful
31:54
technology and. I think
31:56
by default had says he go very well.
31:59
It means that. Inventing. New
32:01
weapons of Mass Destruction. Including new
32:03
weapon to me that we haven't even conceived of.
32:05
Yeah, It means that were
32:08
able to automate military power such
32:10
that's a single person in principle
32:12
could console all military force. Basically
32:14
a robot Ami it means potentially
32:16
dictating beings sets have model status
32:18
themselves that she was gifts and
32:20
model consideration to and then finally
32:22
if we are creating in such
32:24
a short period of time a
32:26
I systems that are far far
32:28
more intelligent than us and more
32:30
capable than us if things that
32:32
the risk that we could lose
32:34
control of those systems to. And
32:37
so the way I think of ai
32:39
is like. In analogy
32:41
to the industrial Revolution, where the industrial
32:44
revolution just meant that the pace of
32:46
technological saints increased by a factor of
32:48
about thirty. and I actually think a
32:51
I could increase that pace of change
32:53
by a factor thirty even even more
32:55
than that again. And so it's like
32:57
this accelerator that brings a whole suite
33:00
of different concerns that we need to
33:02
pay attention to and and carefully govern.
33:07
It ready to sit as if. Oh. That's
33:09
or off and it also feels as if
33:11
it's in gonna be incredibly hard to stop.
33:14
The Train. Now A lot of people
33:16
in the space say. That.
33:18
There's a five percent chance that
33:21
seems to go horribly wrong, but
33:23
they'll probably go right. they'll probably
33:25
be better. and but part of
33:27
the for hunts can include consigning
33:29
all of humans to irrelevance. And
33:31
that stephanie the outcome. I at
33:33
the amongst as says that we're
33:35
not going to be the main
33:37
game in town months longer and
33:39
what was com will be amazing,
33:41
an astounding and who knows what
33:43
we're unlocks. I also. Worry.
33:46
About the sap that. Any
33:48
good thing that happens in the world. You
33:51
know when it happens? As an
33:53
intentional battling against accounts at Universe
33:55
in Oak, we build things. Kathleen,
33:57
Soliah New, put them together and.
34:00
You make gradual progress and then bad things
34:02
happen very quickly and said of blow up
34:05
part of that progress than you try again
34:07
try again. but the more powerful the things
34:09
we built the bigger those fans. Can.
34:11
Be and the fact that were. Creating.
34:15
Technological. Powers that could take
34:18
out eight billion humans very quickly
34:20
is so kings. Is there a
34:22
pathway to too good at home
34:25
watching this? Avoiding this without without
34:27
sounding like complete luddites. We
34:30
had become luddites. So. I
34:32
think that is a pathway and I
34:34
think Ai has. Potential. To do
34:36
enormous amount of good as well and I
34:39
think part of the solution in fact will
34:41
be using a I to help us with
34:43
the problems we face, including the problems of
34:45
a I alignment. So.
34:48
One thing I think we can do. Is.
34:52
Try. To accelerate the helpful parts of
34:54
Ai and push back the kind of
34:56
scary and more dangerous parts. So there's
34:58
an idea called to Ai which is
35:00
a I that just you ask you
35:02
to do something and helps. Yeah it's
35:04
kind of just like an input output
35:06
Sunset Cbt is like this contest with
35:08
the density I agency hi that's where
35:10
you can tell it to do something
35:13
and is not just giving you an
35:15
answer like a kind of article instead
35:17
of actually going out in the wild
35:19
and making a lot changes. That.
35:21
Is a lot more dangerous and upon
35:23
the release of Sympathy for A one
35:25
year ago now many people sides to
35:27
turn it into an agent including some
35:30
people who created what they called chaos.
35:32
Tp T that was explicitly and succeeds
35:34
to take over the world and achieve
35:36
digital immortality. A Thankfully it wasn't very
35:38
good but what we could do is
35:41
to say look to Lay I is
35:43
amazing to see I can really help
35:45
us out. A dense okay I we
35:47
are not prepared for so let's maybe
35:49
even subsidize Accelerate to Lay. I that
35:51
really pretty intense regulation and the
35:54
brakes on a density I sat
35:56
I think would help an awful
35:58
lot like that the such. conversation.
36:00
And if this was a cause that
36:02
someone wanted to support and get into,
36:05
is there any
36:07
resource that you can point them
36:09
to? So for effective altruism in general,
36:12
the single place I'd most love to
36:14
point people is Giving What We Can.
36:16
So that's the organization that I helped set up
36:18
15 years ago now. And it
36:21
encourages people to give at least 10% of
36:23
their income, to take a pledge to give
36:27
10%. And I believe you're a member, Chris?
36:30
I am a member. And this is actually what I
36:32
want to turn to you now. It is a big
36:34
and bold thing where you're asking people
36:37
to do what
36:39
certainly religions in a way have asked people
36:41
to do for a long time, but which
36:44
has fallen out of fashion: to make a pledge. Pledging anything, I
36:48
think shifts you from
36:50
impulsive charitable giving
36:52
to thoughtful charitable giving. And that to me is
36:55
almost like the biggest single thing. If you know
36:57
that each year you're going to give away a
37:00
material amount, then it
37:03
just makes it natural, completely natural to start to
37:05
think, well, okay, what should I give it to?
37:08
Yeah, absolutely. And it's certainly the case for
37:10
me that I don't think I would be
37:12
giving nearly as much as I am if
37:14
I hadn't made those earlier statements. And I
37:17
didn't have a community of people around me
37:19
who are encouraging and think this is like
37:21
a cool thing to do rather than something
37:23
weird or abnormal. I mean,
37:25
look, in the book, I argue for
37:27
this pledge, and specifically I think Giving What We Can actually is the
37:32
best place you can go to join
37:34
a community, make the pledge public. They've
37:36
got a lot of great tools that
37:38
allow you to potentially step up to
37:40
10% if you're not able to
37:43
do that initially. But I
37:45
think for a lot of people, we need more
37:47
sort of life hack type motivations
37:50
to get there. Maybe you need
37:52
to see a picture of the
37:55
child from the organization who you're supporting. Maybe you need
37:57
to go on a trip with them from time to
37:59
time, and actually see the
38:01
work in person so that you
38:03
can feel it. Maybe you need
38:05
to hook up with a community
38:07
of supporters who will help you
38:09
feel, okay, we're doing this as a team effort. Yeah, you know,
38:13
when I was thinking about how
38:15
much am I going to pledge,
38:17
am I going to stick at ten percent, am I going to go even more — I actually did just go through photos of children in poor countries suffering from neglected tropical diseases. And what I thought to myself was: can I look these children in the eye — admittedly it's photos of them, but looking them in the eye — and say, look, I can justify keeping this money for myself? And I remember in particular one child who had an unusual condition, lymphatic filariasis of the face, also known as elephantiasis, a condition that makes parts of your body swell up, so their face was incredibly swollen — a horribly debilitating disease. And honestly, I just thought: look, if my donations, this part of my income or whatever, can prevent this one child from having a condition like that, it will have been worth it — it will have been well worth it. Let alone the fact that, actually, it's hundreds of times more than that.
39:19
hundreds of times more than the. Who.
39:24
Saw think it's completely legitimate to do
39:26
that, to try and find, to think
39:28
of yourself as as complex make sense
39:31
hadn't heart and to say I need
39:33
to find the ways that the brig
39:35
my along on the journey here because
39:37
the head is important but and story
39:40
not minutes to be going the hallway.
39:42
Speaking of which, I don't worry that.
39:46
For some people, but you eat. You
39:48
describe yourself as in a you're not
39:50
getting more than fifteen cents. Hearing color
39:52
mean? that's that's incredible. Some.
39:54
People will share that and go oh my god
39:57
is this is the logical outcome of the journey
39:59
on the scale. I don't want
40:01
to hear any more. And Sir
40:03
Peter Singer's arguments is so powerful
40:05
and yet. They. Scared
40:07
a lot of people inside. I.
40:10
Tried in the book and I'm
40:12
I'd like to bounce this off
40:14
the or philosophical critique I tried
40:16
in the books to. Put. Together
40:18
an argument that said that if you
40:20
embark on the senate, it's actually not
40:23
a journey towards unlimited giving that a
40:25
pledge can be a ceiling. As
40:27
well as a flaw in terms of
40:29
what we require from each other and
40:32
the argument with assess what it was
40:34
tix found the pledge so that it
40:36
wasn't just an income pledge for the
40:38
very rich and income pledge is at
40:41
the scene that that challenging many of
40:43
the billion as for example don't have
40:45
much income well as to that well
40:47
I'm and for the very rich I
40:50
think the the traditional muslim. Pledge.
40:52
as I thought of turn off the Sense
40:54
of Your net wealth. And. Nearly
40:56
as a much more challenging and a
40:59
much more important has to make. So
41:01
I then did a calculation well that
41:03
that said what would happen if. A
41:06
meaningful number of the people
41:08
who can afford to do
41:10
so gave the higher odds
41:12
either. Ten. Percent of
41:15
their income. Or. Two and a half
41:17
percent of than that was. Annually, How
41:19
much would that actually race? And even
41:21
if only a third of the people
41:23
who could do that did that, I
41:25
think on the numbers we came up
41:27
with his three or four trillion dollars
41:29
annually. And working with another
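As a rough sketch of how that calculation is structured: the pledge rule (the higher of ten percent of income or two and a half percent of net worth) and the one-third participation rate come straight from the conversation, but the example figures below are purely illustrative placeholders, not the actual data behind the three-to-four-trillion estimate.

```python
# Sketch of the pledge arithmetic described above. The max() rule and the
# one-third participation rate come from the conversation; the sample
# people are made-up placeholders, not real data.

def annual_pledge(income: float, net_worth: float) -> float:
    """Higher of 10% of annual income or 2.5% of net worth."""
    return max(0.10 * income, 0.025 * net_worth)

# (income, net_worth) in dollars -- illustrative only
population = [
    (80_000, 300_000),         # comfortable professional: income term dominates
    (250_000, 2_000_000),      # high earner
    (1_000_000, 50_000_000),   # very wealthy: the wealth term dominates
]

participation = 1 / 3          # "even if only a third of the people ... did that"
total = participation * sum(annual_pledge(i, w) for i, w in population)
print(f"${total:,.0f} raised per year from this toy group")
```

Scaled across everyone who could afford it, that same higher-of-the-two rule is what produces the trillions-per-year figure quoted above.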
41:31
And working with another colleague of yours, Natalie Cargill, we calculated what that amount could achieve. You could argue — I'm not sure whether she would argue for it, but I would — that it counts as almost maximum philanthropy: that at this level, the bottleneck to making a better future is no longer the money, it's the execution of that huge amount of philanthropy. Therefore it's reasonable to say that if people commit to pledging the higher of ten percent of their income or two and a half percent of their net worth, they don't have to feel guilty beyond that. They have done it; they have fulfilled their obligation. And we actually want obligations that a typical human can reasonably fulfill, because if we set moral obligations that are too high, they just get ignored. So, in a clumsy way, what I'm trying to articulate is the thesis that moral principles need to take
42:35
into account human limitations. Yeah, I
42:37
mean, I think it's just an
42:39
excellent point. So you know you
42:41
mentioned Peter Singer who had the
42:43
most hardcore argument: give until you're at the poverty line. Even he doesn't do that — he gives about forty percent or something of his income, which is a good income. I don't give as much as I could, and I don't give until I feel miserable. And, I mean, it feels embarrassing to say that. I used to feel like that: there are really huge problems in the world, the world is going to hell in a handbasket, I'm not doing anything about it, I don't know what to do, and I just felt really bad. And once you start giving ten percent — okay, it's not as much as you could in principle do, but it's more than most people are doing, where typically people give around one percent — it's meaningful. It's really making a meaningful difference in the world, and, you know, that feeling of guilt significantly does go away, and instead you start to think more like: okay, I'm actually embarking on this practical project of making the world better. That feels inspiring rather than demoralizing.
43:41
rather than demoralizing. whom.
43:44
Are consistent with us in our different ways.
43:46
Saying we can do two things at once:
43:48
We can embrace the full power of the
43:50
Peter Singer arguments that we have an obligation
43:53
actually to every. Person. On
43:55
the planet, snippets of repetitive to seats
43:57
at every facet of the pods we
43:59
have us. Huge, unlimited moral obligation,
44:01
but we also can be human,
44:03
we can we can live within
44:06
reasonable human capabilities and it it
44:08
is. A moral objectives and
44:10
lead a life of permanent guilt. It
44:12
really isn't. And
44:14
I love. The. Site.
44:17
He. Built their those A given what we
44:19
tend to organ on us out at
44:21
any one, the sense of this is your
44:23
persuaded by any of this to consider heading
44:26
on over there. I'm using
44:28
some the tools they have and you
44:30
have to start with ten percent or
44:32
two and up since net worth one
44:34
of these it's it's start with something,
44:36
start with something and then use it
44:38
as an excuse to have in our
44:40
a regular discussion with us cause to
44:42
you about how to spend that money
44:44
as sectors he skates and space and
44:46
be amazed the joy that that actually
44:49
brings with it. Is. There
44:51
any final thing you'd like to say
44:53
to people. I just
44:55
thought I was yet. Utterly
44:57
to acts and be the cranes
44:59
filings this and I just wanna
45:01
thank you as well for your
45:03
support. So both Britain giving into
45:05
practice and then not only that
45:07
but going out to bats to
45:09
find get people to donate more
45:11
we all have this huge amazing
45:14
opportunities to get in. The world's
45:16
just by moving a fraction of
45:18
our money spending has a different
45:20
way and that's something that should
45:22
be celebrated and I'm so glad
45:24
that you're being part that celebration.
45:26
Well with other travellers. Here and
45:28
recently welcome and it's help us who
45:30
wants to come. Ambience: son or tablet.
45:33
Come. On and he part of the
45:35
semi. It's an amazing journey. well
45:37
even an incredible thought leader for
45:39
a long time and even some
45:41
a lot And it's than very
45:43
powerful to hear your version of
45:45
the story here of the last
45:47
couple years. So thanks so much
45:50
Feel Black and Forks! This time
45:52
was composition out and Texas. Okay,
45:55
that's about all for today. If you liked this discussion, please consider sharing it with others. I mean, you could take that as your own act of infectious generosity: you know, one person you mention this to, or someone they might mention it to, might be inspired, as I was, by Will MacAskill to do something big, and if they do, that will be because of you.
46:20
Next week we're talking to Hamdi Ulukaya, the founder of Chobani, the most popular Greek yogurt brand in the USA. Despite his success, he prefers to refer to himself as an anti-CEO, because of his unique approach to bringing a spirit of generosity into his business strategy. I can't wait to introduce
46:42
you to his work! For
46:45
more, follow along in my book, Infectious Generosity — you can access a free copy of the audiobook at ted.com/generosity. The TED Interview is part of the TED Audio Collective, a collection of podcasts dedicated to sparking curiosity and sharing ideas that matter. This episode was produced by our team, which includes Constanza Gallardo, Grace Rubenstein, BanBan Cheng, Michelle Quint, Roxanne Hai Lash, and Daniella Balarezo. Thank you so much.