Episode Transcript
0:00
Hello and welcome to this podcast
0:02
from the BBC World Service. Please
0:04
let us know what you think and tell other
0:06
people about us on social media. Podcasts
0:10
from the BBC World Service are supported
0:12
by advertising.
0:16
On the Tools and Weapons podcast, hosted
0:18
by Microsoft Vice Chair and President Brad
0:20
Smith, global leaders discuss the promise and the
0:23
peril of the digital age. From environmental sustainability
0:26
to cyber security, the need to develop
0:28
AI in a principled and ethical way so that
0:30
it serves all humanity. Guests like businessman
0:32
Strive Masiyiwa, journalist Kara
0:34
Swisher, and Microsoft CEO Satya
0:37
Nadella share lessons from their past to
0:39
reframe some of society's toughest challenges
0:41
and seek new solutions. Find Tools and Weapons
0:43
wherever you like to listen.
0:45
Surgeons keep our hearts beating.
0:48
They do the amazing, help save lives,
0:50
and so can you. Your CSL Plasma
0:53
donation can help create 24 critical
0:55
life-saving medicines that can give Grandpa the
0:57
chance for his heart to swell when he meets his
0:59
new grandson or give a bride the chance
1:02
for her heart to skip a beat on her wedding
1:04
day. Every plasma donation helps
1:06
more than you know. Do the amazing,
1:09
help save lives. Donate today
1:11
at your local CSL Plasma Centre and
1:13
be rewarded for your generosity. Hi,
1:18
Zoe Kleinman here. For the next few weeks,
1:20
we're bringing you a brand new podcast from
1:22
the BBC World Service here in the Digital
1:25
Planet feed. It's called Tech
1:27
Life, and each week it brings you the top
1:29
technology news and trends. If
1:31
you want to keep getting the podcast, please search
1:33
for Tech Life wherever you get your podcasts and
1:36
hit subscribe or follow so you never miss
1:38
an edition. But for now, on with
1:40
the show.
1:41
The rapid development of artificial intelligence
1:43
is mind-boggling for most of us, but
1:46
you might expect Google boss Sundar Pichai,
1:49
speaking to CBS News, to be a little
1:51
more ahead of the curve.
1:52
There is an aspect of this which we call,
1:55
all of us in the field, call it as a black box.
1:58
You know, you don't fully understand. You can't
2:00
quite tell why it said this
2:02
or why it got wrong. Maybe
2:05
not then. This week, we ask if even the
2:07
boss of Google can't fully comprehend how his
2:09
own chatbot works. Is it time to
2:11
press pause on the AI race? I'm
2:14
Zoe Kleinman and this is Tech Life.
2:32
Artificial intelligence feels like the hottest
2:34
and certainly the most lucrative topic
2:36
in tech right now. The competition
2:38
between the world's biggest firms to develop
2:41
products and services powered by it
2:43
is fierce, with Elon Musk becoming
2:45
the latest to promise a game-changing chatbot.
2:48
It's both an exciting era of rapid
2:50
innovation and also a little bit terrifying,
2:53
as some experts are questioning whether we fully
2:55
understand the power of what we're creating.
2:58
And now this has happened. A Google AI
3:00
programme has taught itself Bengali,
3:03
the language of Bangladesh, despite not
3:05
being trained in it. I'm not kidding. Google's
3:08
Bard learned another language, seemingly
3:10
off its own bat. Have a listen to James
3:13
Manyika, who leads on AI at the firm, and
3:15
Google boss Sundar Pichai. They're both
3:17
speaking to Scott Pelley on 60 Minutes on
3:19
CBS News.
3:21
We discovered that with very few
3:23
amounts of prompting in Bengali,
3:25
it can now translate all of Bengali.
3:28
There is an aspect of this which we call,
3:31
all of us in the field, call it as a black box.
3:33
You know, you don't fully understand. And
3:36
you can't quite tell why it said
3:39
this or why it got wrong. We have some
3:41
ideas and our ability to understand
3:43
this gets better over time.
3:45
But that's where the state of the art is. You don't
3:47
fully understand how it works, and
3:50
yet you've turned it loose on society?
3:53
Let me put it this way. I don't think
3:55
we fully understand how a human mind
3:57
works either. Ian Hogarth,
4:00
co-founder of the tech firm Plural
4:03
and co-author of the State of AI report.
4:06
Ian, how worried should we be that AI
4:09
like Google's Bard is doing things that
4:11
we don't fully understand? I
4:13
mean, first of all, I'm an investor in
4:15
frontier technology, so you know my
4:17
day job is investing in things that are
4:20
really cutting-edge technology, like
4:22
nuclear fusion or quantum computers or
4:24
applications of machine learning, and so I'm
4:26
extremely excited about new technology.
4:29
I think it has remarkable potential
4:32
to transform every aspect of our lives, and I think
4:34
AI is in some ways
4:37
maybe the most powerful technology. The
4:40
main thing that I believe is that we should be having
4:42
a much more public discussion about
4:45
how rapidly the systems are progressing and how different
4:47
it is to prior generations of software.
4:49
Because a lot of people
4:51
discovered it really with the launch of ChatGPT
4:53
in November last year,
4:56
but obviously there have been years of research
4:58
getting us to this point and then when you
5:00
look at something like ChatGPT that's so powerful
5:02
and it's only the fourth generation and
5:05
you know what are the next steps where are we going with
5:07
this?
5:09
I think that's exactly right, Zoe. So ChatGPT
5:11
is an incredible achievement
5:13
of engineering but it really builds on
5:15
an exponential that's been running for
5:17
about the last decade and so if
5:20
you kind of go back to 2012 and
5:22
you compare the systems we were building then and the
5:24
systems we're building now, we've been very
5:27
consistently increasing
5:29
the amount of data and the amount of computing
5:31
power we use to develop the largest and most capable
5:33
AI models. And just as a snapshot
5:36
for you, we've gone from
5:38
you know a relatively small number of
5:40
images being fed into the largest AI
5:42
systems in 2012 to now
5:45
feeding you know a large amount
5:47
of the internet into the most powerful systems like
5:49
a system like GPT-4 and we've increased
5:51
the amount of computing power consumed
5:54
by these models by about a factor of a hundred
5:56
million in the last decade. And so although
5:59
it feels like
5:59
ChatGPT kind of came out of nowhere
6:02
to most people, in practice this
6:04
is a very long-running trend that is
6:06
going to continue. And
6:08
it's kind of not building
6:10
on what we might consider
6:12
to be existing tech, it's a whole new lane, isn't
6:15
it, really? Yeah, that's right.
6:17
When we write traditional software we have
6:19
quite an explicit understanding of how and
6:21
why the inputs relate to the outputs;
6:24
sometimes people would describe that as a white box
6:27
and these large AI systems are really quite different,
6:30
you know they're closer to a black box in lots of ways
6:32
where you don't really understand what's happening inside. We
6:35
don't so much, you
6:37
know,
6:38
program them to do very explicit
6:41
things. I like to think that we grow
6:43
them; that feels like the best
6:46
framing I've heard of how we kind of build these systems
6:48
today. And the tricky
6:50
thing is as we grow them you
6:52
get these big sharp jumps in capabilities.
6:55
So you add you know 10 times more computing
6:57
power or 10 times more data and suddenly
6:59
the system can do something that it couldn't do before.
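As a rough sanity check of the scaling figures quoted in this interview (a hundred-million-fold increase in compute over roughly a decade), the implied growth rate can be worked out in a few lines. This is a back-of-the-envelope sketch, assuming the increase is spread evenly across ten years:

```python
import math

# Figures quoted in the interview (assumed spread evenly over the decade)
total_factor = 1e8   # "a factor of a hundred million"
years = 10           # "in the last decade"

annual_factor = total_factor ** (1 / years)      # growth multiple per year
doublings = math.log2(total_factor)              # total doublings over the decade
months_per_doubling = years * 12 / doublings     # implied doubling time

print(f"~{annual_factor:.1f}x per year")                        # ~6.3x per year
print(f"one doubling every ~{months_per_doubling:.1f} months")  # ~4.5 months
```

On those assumptions, compute roughly sextuples every year, which is consistent with the "10 times more computing power" jumps Ian Hogarth describes arriving every year or so.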
7:02
The challenge with AI is there are lots of
7:04
different ways in which it's going to
7:06
affect us, and we've talked a lot,
7:08
I think,
7:08
as a society about the potential
7:12
impacts on the economy. We've talked
7:14
about the potential impact on military
7:16
technology but we've had less discussion
7:18
I think of these
7:22
broader risks of disrupting us as
7:24
a species.
7:25
Are you on team Musk then
7:28
in terms of
7:30
being of the view that we really need to slow this down for
7:32
a minute and figure out how we're
7:34
going to live with it and where it's going to go? So
7:37
I think a pause does seem
7:39
like a sensible idea but for
7:41
me the the real question is how do we have
7:43
a public conversation about this because the
7:46
reason that personally I think a pause would be good is it
7:48
would give us time to have a public conversation
7:50
about what we should be doing with this. You know
7:52
I think that most of your listeners would not be
7:54
aware that this small number
7:56
of companies are trying to build a kind of
7:59
superintelligent, god-like machine, and
8:01
I think we need to know what people think about
8:03
this broadly before we race
8:06
to develop it any further, to avoid
8:08
a situation where the public is really left behind.
8:11
Ian Hogarth there.
8:18
Coming up, a sound to represent
8:21
all online human knowledge. And
8:27
a rather chilling claim. I could
8:30
come up with a date predicting
8:32
when you would die. I have a way to calculate
8:35
it. Can you tell me? No.
8:37
And a reminder that we want your thoughts. Hopefully
8:40
a bit less morbid than that please. Email techlife
8:42
at bbc.co.uk or
8:45
WhatsApp us on 0330 123 0320.
8:54
Next, Meta is embroiled
8:56
in a legal dispute in Kenya over
8:58
the moderation of Facebook content. That
9:01
legal battle has intensified in recent
9:03
days and it's left people wondering who, if
9:06
anyone, is reviewing harmful
9:08
and hateful material on the platform. Our
9:11
reporter Michael Kaloki has been following the case
9:13
and he joins us from Nairobi. Michael,
9:15
hello.
9:16
Hello. Michael, just remind
9:18
us of exactly what this moderator
9:20
dispute is all about. So
9:23
there are a number of different elements to this
9:25
dispute. So first, let me try and give you
9:27
a bit of background to this. Basically, Meta
9:29
had contracted a company called Sama as
9:31
a content review partner to offer
9:33
it content review services here in the sub-Saharan
9:36
Africa region. Sama has a hub
9:38
here in Kenya and it had a number of people it had hired
9:40
in the region to carry out content
9:42
moderation on Meta's platforms. Now,
9:45
at some point it seems Sama
9:47
decided to wind down its content moderation
9:49
arm to concentrate on labeling,
9:51
that is, computer vision data
9:54
annotation. It seems a decision
9:57
may have been made by Sama to lay off over 200
10:01
content moderators working from its hub
10:03
here in the country. So some of the moderators
10:05
filed a petition at a court here in Kenya
10:07
earlier in March this year claiming illegal
10:10
firing by Sama and the
10:13
court blocked Sama from undertaking
10:15
the layoffs. Now while all this was
10:17
taking place on one end, on the other
10:19
end, Meta was apparently looking for a new content
10:22
review partner and seemed to settle
10:24
on one called Majorel. However,
10:27
a Kenyan court barred Meta from
10:29
engaging its new moderation partner
10:31
Majorel and reserved that role
10:34
for Sama. So
10:36
as you can see, it is a bit of a mix of
10:38
different issues going on but perhaps
10:41
to sum it up, some of the content
10:43
moderators at Sama would like
10:45
to continue with their jobs as content
10:48
moderators for Meta and they also
10:50
claim they have been blacklisted by Meta
10:52
and Majorel. That is the new content
10:54
review partner for Meta that Meta seems
10:57
to have settled on. So basically it
10:59
seems that content moderators are worried
11:01
also about possibly not being hired
11:03
by Majorel or Meta if they
11:06
lose their jobs at Sama. Now
11:08
on Meta's side they are basically saying
11:10
that hey, you know, we are not part of this
11:13
issue. What is happening at Sama
11:15
has really nothing directly
11:17
to do with us and Meta
11:20
feels they should be allowed to get on
11:22
with carrying out their regular business.
11:24
So who or perhaps I should ask
11:27
what is actually moderating
11:29
Facebook content at the moment? Is it
11:31
purely being done
11:33
by computers, by algorithms or are there
11:35
some human moderators too?
11:38
Well, it seems unclear at this
11:40
point who is moderating Facebook
11:43
at this stage. You know, from my point I think
11:45
it would be quite difficult for
11:47
them to operate without content
11:50
moderators but at this
11:52
stage they have not revealed who
11:54
or what might be handling
11:56
their content moderating at this point
11:59
in time.
11:59
I should say it might be
12:02
seen by many perhaps as
12:04
a fringe part
12:06
of the working sector.
12:09
However, it is essential,
12:11
because content moderation
12:14
has been viewed as playing a vital
12:16
role here in Kenya and Sub-Saharan Africa.
12:19
Misinformation has been singled out
12:21
as one of the key elements that have fueled some
12:24
of the conflicts we have witnessed here in the region
12:26
and hate messaging has of course been attributed
12:29
to this as well. So
12:29
definitely content
12:32
moderation plays a key role in
12:34
trying to stop misinformation and hate
12:36
messaging.
12:37
Also, as well as that, there is the issue
12:39
of culturally acceptable language
12:41
and visuals for example. So content
12:44
moderators would also play a crucial role
12:46
in this area as well.
12:47
What's the next step? Well
12:50
the next step is that the judge is
12:52
expected to rule on
12:55
whether this case can actually
12:58
continue in a Kenyan court and
13:00
whether a Kenyan court has the
13:02
jurisdiction to hear this case or
13:05
not. So a ruling
13:07
on this matter is expected this
13:10
week. Both sides would likely
13:12
be earnestly waiting to hear what the
13:14
ruling is because it will likely determine
13:16
what direction this case will take.
13:19
Michael, thank you so much for joining us here
13:21
on TechLife.
13:23
Thank you. And
13:27
now Wikipedia is adding to that soundscape with something a little more
13:29
challenging.
13:35
A
13:40
single sound to represent all
13:43
human knowledge. They could have asked ChatGPT,
14:45
but they didn't. They turned to the public
13:47
worldwide for ideas. We've chatted
13:49
on the show before about some of the more
13:51
left-field suggestions, and guess what? Well,
13:54
now there's a winner. Our reporter Alistair
13:56
Keen has taken a listen.
13:57
14:00
I've just arrived at a huge black
14:02
gate and behind it a big
14:05
black door. This is the home of
14:07
massive music. It's a recording
14:10
studio but they're not recording a new
14:12
song for a pop artist. There's
14:14
something very different going on in here. Hello,
14:19
hi, thank you. Hi, Alistair.
14:22
I'm being met at the door by Emma
14:24
Byford from the company.
14:26
We're a global creative music agency
14:28
and essentially we deliver anything
14:30
that a brand could need in the world of audio,
14:32
in music and sound and one of those key
14:34
things is sonic branding and part of sonic
14:37
branding is audio logos and I mean
14:39
they're known as many different things so it could be you know a
14:41
sonic logo, audio logo, sound logo,
14:43
even a jingle but yeah essentially that's
14:45
what we do.
14:47
These often subtle sounds are
14:49
found across the tech we use
14:52
every day. We really are
14:54
living in such an audio driven world
14:56
now and there's things like podcasts,
14:58
voice assistant devices, audio
15:01
streaming, apps and the like, and kind of who
15:03
knows where the internet will evolve to in the future. The
15:05
latest
15:05
logo being worked on here is the
15:07
result of an international competition. More
15:10
than 3,000 entries from 135 countries
15:12
all wanting to be the sound of Wikimedia,
15:16
most well known for the information sharing
15:19
website Wikipedia. The competition
15:21
winner...
15:25
Thaddeus Osborne. I'm a nuclear
15:27
engineer based out of Norfolk,
15:30
Virginia in the United States of America and I
15:33
have a hunger for knowledge and I also
15:36
have a hunger for music so
15:38
I listen to music as much as I can. It's
15:40
mostly as a hobby so it's quite
15:42
a shock to wind up here. But why does
15:44
this organization want a sonic logo?
15:47
Tass Elias from Wikimedia is
15:49
here for the recording. We understood
15:52
that a lot of the information
15:54
people around the world were receiving through
15:57
voice assistants and smart speakers was
15:59
coming from our free knowledge projects.
16:02
So we felt it was a great
16:05
time to start working towards
16:07
identifying that content as
16:09
coming from Wikimedia.
16:15
We've just come into a recording studio which
16:17
looks kind of exactly as you would expect a recording
16:19
studio to look. There's a keyboard in front
16:22
of us, a big computer screen showing different
16:24
waveforms of the different sounds that are recorded.
16:26
There's an arm coming out of the desk with a
16:28
microphone attached to it. This is
16:30
the kind of room where you would expect maybe to be
16:32
recording music or
16:34
some songs but actually in here
16:36
they've been recording the sounds of clicks and
16:38
keyboards and books being shuffled.
16:41
How easy or hard has that task been
16:43
for you? It's kind of
16:45
funny. It's much the same kind
16:48
of task though. You're trying to create a nice
16:50
product and I do want this to be emotional
16:52
too. I want it to be welcoming and kind
16:55
of friendly. Shall we have a go at
16:57
doing some of this stuff?
16:58
Here
17:02
we're flipping a book. We are
17:04
just trying to capture how it can
17:06
sound like a book which is
17:08
not something that's easy to do without the
17:11
visual impression of a book. The
17:14
page turn, it's kind of ubiquitous. That
17:16
sounds about right but
17:18
sometimes you flip the book and it sounds like
17:23
a floppy fish. I wanted
17:26
the first thing that people hear
17:28
to be
17:29
very recognizable. I didn't just want
17:31
it to be books because a lot
17:33
of times we interface with Wikimedia from
17:35
a computer so I threw a couple
17:38
of keyboard clicks and typing
17:40
sounds in there
17:42
and I thought all of those together wouldn't really
17:44
be much of a memorable jingle.
17:46
It just sounded like a library or something
17:49
so I threw in a little bit of
17:52
musicality.
17:58
logos
18:00
are definitely the ones that are unique. They've got
18:02
to be super memorable, they've got to represent
18:04
a brand so you hear the phrase earworm
18:07
stick around when we talk about sonic
18:08
branding a lot. Now I'll leave
18:10
you with a challenge. How many
18:12
sound logos do you hear in your day?
18:15
Here's
18:15
our own Tech Life Jingle to start
18:17
you off.
18:21
Now from human knowledge to human life
18:23
itself, would you like to live
18:25
a longer and healthier life? Sounds alright,
18:27
doesn't it? For generations, people have sworn
18:30
by various concoctions which they believed
18:32
would make a difference, and now that timeless
18:35
preoccupation is increasingly going high-tech,
18:37
as researchers, laboratories and wealthy
18:39
individuals seek ways to postpone
18:42
illness and death. This particular
18:44
tech scene, you perhaps won't be surprised to hear,
18:47
is focused in California, and BBC
18:49
Click's Lara Lewington has been there to check it out.
18:52
She told me about her trip, and she started by
18:54
describing Bryan Johnson, an eccentric
18:56
tech multi-millionaire who perhaps more
18:58
than anyone else embodies this ever
19:01
more extreme quest to remain in
19:03
good health for as long as possible.
19:05
Oh it was all quite an experience. I went
19:07
to his house where he had lunch prepared,
19:09
but when I say lunch, he gets up
19:11
at 5am, eats breakfast
19:13
at 6am, and has this lunch
19:16
at 11am and that's his final
19:18
meal of the day, can you imagine? So
19:21
he only has a window of eating between
19:23
5am and 11am, oh my word. That's
19:26
right, like everything, he's taking fasting
19:28
to an absolute extreme. He takes 54
19:31
supplements and off-label tablets
19:33
a day, so that's proper medication
19:35
that he's taking not for its real purpose, as
19:37
well as the supplements, and he is
19:40
really pushing the boundaries on what you can do here, and
19:42
he showed me what's going on in the clinic
19:44
inside his house.
19:46
Where to start in here, what's
19:48
this? This is high frequency
19:51
electromagnetic stimulation. We got this
19:53
machine because we were trying to solve a problem. Every
19:55
night I was getting up one
19:57
time per night to go to the bathroom. So
20:00
it produced less quality sleep and so
20:02
I wanted to see if I could get up zero times.
20:05
So we got this machine to trial:
20:07
could we strengthen my pelvic floor
20:09
and my bladder?
20:10
So that I wouldn't get up at night to go to the bathroom. If you
20:12
want to try it out, if you want to just sit down. Okay, so
20:15
I'm going to turn it on. Okay
20:19
Tickling
20:24
but tickling in a quite hard aggressive
20:27
way
20:29
The session is 30 minutes,
20:31
but actually it's been successful. Do you think he's on a
20:34
slightly destructive path here in a way and
20:36
also do you think it's kind of addictive? You know, when
20:38
you first get a health tracker and you become
20:40
a bit obsessed with it, don't you, and you're constantly
20:42
checking it and constantly looking at how
20:45
you're getting on with whatever it is that you're tracking. Do
20:47
you think he's in that sort of mindset,
20:49
but he's also got the money to really
20:52
get into it
20:53
Absolutely totally addicted
20:55
when I spoke to him briefly off-camera
20:58
about it. And I said to him, when did you last
21:00
slip up? When did you not keep this regime
21:02
properly, as you're meant to? And he said it's been a year.
21:05
So it's been a year since he has gone wrong
21:07
with any of this He has 30 scientists
21:09
working with him and he says because
21:11
it's being monitored by these doctors, it
21:13
is pretty safe. Whereas actually
21:15
you talk to other experts and they say well, no, he
21:17
is a human experiment. He signed his life away
21:20
here to test all of this stuff But it's
21:22
interesting when you talk to him about
21:23
his reasons for doing it. When
21:26
I was 21 years old, I decided
21:28
that I wanted to spend my life doing something meaningful
21:30
for the world, and I didn't know what to do. So I said,
21:33
I'm going to make a whole bunch of money being an entrepreneur; by
21:35
the age of 30, then I'll decide what to do. And so,
21:37
since I sold my company,
21:40
Braintree Venmo, I made 300 million dollars,
21:42
and this question was: what could
21:44
I do that would matter in the year 2500? And
21:48
right now, it could be aging. If we
21:50
slowed the speed of aging and even reversed
21:53
it, it would change what it means
21:55
to be human
21:55
I understand what
21:58
he's trying to do. I mean, you know I don't think there
22:00
are many of us who wouldn't want to remain younger
22:03
for as long as we possibly can but of course there
22:05
are all sorts of things that can happen to you in life that
22:07
could completely throw a grenade
22:10
into this research. You could get run over by a bus,
22:12
you could get ill, you know there's
22:14
any number of unknown quantities here.
22:16
Of course there are and he is an extreme
22:19
example and actually when I had the vision for going
22:21
out and making this show on increasing your
22:23
health span, increasing the healthy number of years
22:25
of our lives, I wanted to make sure
22:28
that I actually stayed away from the whole sensationalist
22:31
idea that's out there of tech billionaires wanting
22:33
to throw loads of money at being able to live forever
22:36
because really the premise of the programme isn't that.
22:38
Bryan Johnson makes great telly and
22:40
he is experimenting with lots of
22:43
things that are interesting to people who are working
22:45
in the aging field. But really
22:47
there are two sides to this whole
22:49
health span idea. First of all
22:51
I went to visit a biotech company and went
22:53
to see a lot of research that's going into what could
22:56
potentially be a new frontier of medicine.
22:58
The idea being whether we can stop
23:00
or reverse the aging process at a cellular
23:03
level. Now the reason for doing this is
23:05
that age is the greatest risk factor
23:07
for most diseases, cancer, heart disease,
23:10
type 2 diabetes. So if
23:12
there was a way in the future of us being
23:14
able to track our health better, so being
23:17
able to predict that problems were coming, then
23:19
we could also preempt them possibly
23:21
with these medicines. So there's this
23:23
one side which is the biotech companies and
23:26
what could potentially be a change to
23:28
the way we do medicine one day or at least an
23:30
extra way of doing it. But then
23:33
there's also this whole trend going on
23:35
in various
23:35
parts of the world but particularly a
23:37
Silicon Valley thing on longevity.
23:41
And the irony is that most people who are taking part
23:43
in this are probably extremely healthy
23:45
to start with. What do the doctors
23:47
say about whether this stuff really
23:50
works?
23:51
Well yes that is a very good question
23:53
because there is a lot of research happening
23:55
even into what types of exercise are
23:58
good for us, because what is known is that
23:59
exercising well and
24:02
eating well matter. There are some questions about fasting.
24:04
It seems that general opinion is fasting is good
24:07
for most people for 14 to 16 hours
24:09
a day. There are some people it's not good
24:11
for but that's a separate issue
24:14
and sleeping well is important. So all of
24:16
this, the doctors and the experts
24:18
all agree on, and they're researching what type
24:20
of exercise counts as exercising
24:22
well, for example. But of course there
24:24
are people taking it to extremes and doing things
24:27
that aren't proven. So I spoke to one
24:29
leading Silicon Valley doctor to find out what
24:31
he had to say on the matter. Here's Dr. Jordan
24:34
Shlain.
24:34
I don't know why people want to gamble
24:36
with their health. I understand
24:38
gambling with your money. So if it's safe and
24:41
it's not effective like a vitamin, some vitamins,
24:44
you know, then you're gambling with your money. But
24:46
if it's not proven safe and
24:48
it's not proven effective, you're gambling with your
24:50
health. And then there's people spending a lot
24:53
of time doing these things and they're not
24:55
living their life. They're living the version
24:57
of their life to give themselves a future
24:59
life. But what happens if that future
25:02
life never materializes?
25:03
I mean the search for, you
25:06
know, the elixir of eternal youth has been the stuff
25:08
of fiction for hundreds of years, hasn't
25:10
it? We've all been chasing it. But realistically,
25:13
as you say, this is currently
25:15
something that sort of excludes a lot
25:17
of people, doesn't it? It does.
25:20
Of course it does. And it has become
25:22
a trend generally amidst the
25:24
pretty wealthy to get interested
25:26
in a lot of this longevity stuff. Now, of course,
25:29
the hope is that if medicines are developed,
25:32
many people will benefit from them.
25:33
As always, the wealthy are
25:36
putting money into medical research that should hopefully
25:38
eventually benefit the masses.
25:40
But this is still something which
25:43
is really going to be more beneficial for developed
25:45
nations, where these sorts of diseases (the
25:47
cancer, the heart disease, kidney disease, which
25:50
happen later in life) are relevant,
25:52
in countries where there is a higher
25:54
life expectancy. So in
25:57
any part of the world, we can know
25:59
that living
25:59
a better life is going to be better for our health,
26:02
how we feel, we can feel better for longer.
26:04
If people have the luxury of being able to exercise,
26:07
being able to eat good food, not everybody
26:10
has that luxury. And the idea of fasting
26:12
for 14 to 16 hours a day, well, if you're
26:14
struggling to afford to eat or you're working shifts, obviously
26:17
that's not plausible, however good
26:19
it may be for your longevity. Lara,
26:21
has making this programme changed the
26:23
way you live your life? Is
26:25
there anything that made you think, I need to do
26:27
that differently actually? Well,
26:29
I think it has. When I set out to make
26:31
this program, I had a very different vision about
26:33
what it was going to be. I'd read
26:35
the stories about tech billionaires throwing
26:38
their money at this and I was expecting
26:40
it to be a far more
26:42
sensational kind of idea than
26:44
it was, especially the biotech stuff. But
26:46
what I actually learned from it was that we're at
26:48
a point now where we really understand
26:51
where the science of aging is at.
26:53
We know what the target is, but how to
26:55
treat it is still proving quite difficult. And
26:57
even if there is success in that, we're likely
27:00
to see incremental change, drugs that may
27:03
increase lifespan by six months,
27:05
a year, two years, and maybe
27:07
healthspan by more than that. Obviously,
27:10
in the bigger picture, that is absolutely huge.
27:12
It's just when it comes down to each individual, it doesn't sound much.
27:15
But the thing that I really came away
27:17
feeling more strongly than anything was how
27:19
important it is to lead our healthiest lives.
27:21
So if you can eat as well as you
27:23
can, sleep as well as you can.
27:25
I've definitely come back feeling more
27:27
aware that it's important that we look after ourselves
27:30
because it seems that that
27:32
is the thing that is really proven right
27:34
now. And if there's anything that we can do to delay
27:36
the onset of serious disease in our lives, then
27:39
of course, we probably all want to do that. Lara,
27:42
I am now dreaming of eight hours sleep myself.
27:44
Thank you so much for talking to us. And
27:46
where will we see this programme?
27:49
It'll be on BBC News across the weekend.
27:51
It has various showings. It's called Forever Young.
27:54
Thank
27:55
you. There's
27:57
no way to extend this programme, I'm afraid, but we will
27:59
be back with more next week. You can follow our work
28:01
in the meantime at bbc.com slash
28:04
technology and on Twitter at BBC
28:06
Tech and me at ZSK.
28:26