#2044 - Sam Altman

Released Friday, 6th October 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

Joe Rogan podcast, check it out.

0:03

The Joe Rogan Experience.

0:06

Train by day, Joe Rogan podcast by

0:08

night, all day. Hello,

0:13

Sam. What's happening? Not much, how are you? Thanks for coming

0:15

in here. Appreciate it. Thanks for having me. So,

0:18

what have you done? Like

0:20

ever? No, I mean, what have you done with

0:22

AI? I mean, it's

0:25

one of the things about

0:28

this is, I mean, I think

0:30

everyone is fascinated by it. I mean, everyone

0:33

is absolutely blown away

0:35

at the current capability and

0:38

wondering what the potential for the future is

0:40

and whether or not that's a good thing.

0:42

I

0:45

think it's going to be a great thing, but I

0:47

think it's not going to be all a great thing. And

0:49

that is where I think

0:54

that's where all of the complexity comes in for people. It's

0:56

not this like clean story of we're going to do this

0:58

and it's all going to be great. It's we're going to do this,

1:01

it's going to be net great, but it's going to be like

1:03

a technological

1:04

revolution. It's going to be a societal

1:06

revolution. And those

1:09

always come with change. And even

1:11

if it's like net wonderful, you

1:13

know, there's things we're going to lose along the way. Some

1:15

kinds of jobs, some kind of parts of our way of life, some

1:17

parts of the way we live are going to change or go away. And

1:21

no matter how tremendous the upside

1:24

is there, and I believe it will be tremendously good. You

1:26

know, there's a lot of stuff we got to navigate through to

1:28

make sure. That's

1:33

a complicated thing for anyone to wrap their heads

1:35

around. And there's deep and super understandable

1:38

emotions around that. That's a very honest answer

1:40

that

1:42

it's not all going to be good. But

1:45

it seems inevitable at this point. It's,

1:48

yeah, I mean it's definitely inevitable. My view

1:51

of the world, you know, when you're like

1:53

a kid in school, you learn about this technological

1:55

revolution and then that one and then that one. And

1:58

my view of the world now, sort of looking backwards,

1:59

is that this

2:02

is like one long technological revolution.

2:05

And we had, sure, like first we

2:07

had to figure out agriculture so that we had

2:09

the resources and time to figure out how to

2:11

build machines, then we got the industrial revolution. And

2:14

that made us learn about a lot of stuff, a lot of other

2:16

scientific discovery too, let us do the

2:18

computer revolution. And that's now letting us

2:20

as we scale up to these massive systems do the

2:23

AI revolution. But it really is just

2:25

one long story of humans

2:28

discovering science and technology and co-evolving

2:30

with it. And I think it's the most

2:32

exciting story of all time. I think it's how we

2:34

get to this world of abundance. And

2:37

although, you know,

2:39

although we do have these things to navigate, there will

2:41

be these downsides.

2:42

If you think about what it means for the

2:44

world, for people's quality of lives,

2:46

we can get to a world where

2:50

the cost of intelligence

2:53

and the abundance that comes with that,

2:57

the cost dramatically falls, the abundance goes

2:59

way up. I think we'll do the same thing with energy. And

3:02

I think those are the two sort of key inputs to

3:04

everything else we want. So if we can have abundant

3:06

and cheap energy and intelligence,

3:09

that will transform people's lives largely

3:11

for the better. And I think it's gonna

3:14

in the same way that if we could go back now 500 years and look

3:17

at someone's life, we'd say, well, there's some great things, but

3:19

they didn't have this, they didn't have that. Can you believe they didn't have

3:21

modern medicine? That's

3:23

what people are going to look back at us like, but

3:25

in 50 years. When you

3:27

think about the people that currently rely

3:30

on jobs that AI will replace,

3:33

when you think about whether it's truck drivers

3:36

or automation workers, people that work

3:38

in factory assembly lines, what,

3:42

if anything, what strategies can

3:44

be put in place to mitigate the negative

3:47

downsides of those jobs

3:49

being eliminated by AI?

3:51

So I'll

3:55

talk about some general thoughts, but I

3:58

find making very specific predictions

4:00

difficult because the way the technology

4:03

goes has been so different than

4:05

even my own

4:06

intuitions, or certainly my own intuitions.

4:08

Maybe we should stop there and back up a

4:11

little. What were your

4:13

initial thoughts? If you

4:15

had asked me 10 years ago, I would have said

4:18

first AI is going to come for

4:21

Blue-collar labor, basically, it's gonna drive

4:23

trucks and do factory work and you

4:25

know, it'll handle heavy machinery Then

4:28

maybe after that it'll do like some

4:31

kinds of cognitive labor, kind

4:34

of, you know, but it won't be off doing

4:36

what I think of personally as the really hard stuff.

4:38

It won't be off proving new mathematical

4:41

theorems, won't be off discovering new

4:43

science, won't

4:45

be writing code. And then eventually,

4:48

maybe but maybe last of all maybe never

4:50

because human creativity is this magic

4:53

special special thing

4:57

That's what I would have said

4:59

now A

5:00

it looks to me like and

5:03

for a while AI is much better at doing

5:05

tasks than doing jobs It can

5:07

do these little pieces super well,

5:09

but sometimes it goes off the rails. It can't

5:11

keep like very long coherence So

5:14

people are instead just able to

5:16

do their existing jobs way more productively

5:19

But you really still need the human there today, and

5:21

then B, it's going exactly the other direction: do

5:24

the creative work first

5:24

stuff like coding second They

5:27

can do things like other kinds of cognitive

5:29

labor third, and we're furthest

5:31

away from, like, humanoid robots.

5:34

hmm

5:35

so back to the initial question

5:39

If we do have Something

5:42

that completely eliminates factory

5:44

workers completely eliminates truck

5:47

drivers delivery drivers things

5:50

along those lines That creates

5:52

this massive vacuum in

5:54

our society. So I

5:57

think there's things that We're gonna

5:59

do

5:59

that are good

6:02

to do but not sufficient. So I think at some point

6:04

we will do something like a UBI or

6:06

some other kind of like

6:08

very long-term unemployment

6:10

insurance something, but we'll have some way of

6:13

giving people like redistributing

6:15

money in society as a

6:17

cushion for people as people figure out the new jobs.

6:21

And maybe I should touch on that. I'm

6:24

not a believer at all that there won't be

6:26

lots of new jobs. I think human

6:29

creativity, desire for status, wanting

6:31

different ways to compete, invent new things,

6:34

feel part of the community, feel valued, that's

6:37

not going to go anywhere. People have

6:39

worried about that forever. What happens

6:41

is we get better tools and

6:44

we just invent new things and more amazing

6:46

things to do. And there's a big universe out there. And

6:48

I think, I mean that like literally

6:51

in that there's like space is really big, but

6:53

also there's just so much

6:55

stuff we can all do if we do get to this

6:57

world of abundant intelligence

6:59

where you can sort of just think of a new idea and it gets

7:02

created. But

7:06

again, that doesn't –

7:08

to the point we started with that

7:11

doesn't provide like great solace to people who

7:13

are losing their jobs today.

7:15

So saying there's going to be this great indefinite

7:17

stuff in the future,

7:18

people like what are we doing today? So

7:21

you know.

7:22

I think we will as a society do

7:25

things like UBI and other

7:27

ways of redistribution, but I don't think that gets

7:29

at the core of what people want. I think what

7:31

people want is like agency, self-determination,

7:35

the ability to play a role in architecting

7:37

the future along with the rest of society, the

7:40

ability to express themselves and

7:43

create

7:44

something meaningful to them. And

7:49

also I think a lot of people work jobs they hate.

7:52

And I think there's we as a society

7:54

are always a little bit confused

7:55

about whether we want to work more or work

7:57

less. But somehow –

8:02

We all get to do something meaningful and

8:05

we all get to

8:06

play our role in driving the future

8:08

forward. That's really important. And

8:10

what I hope is as those truck driving, long

8:13

haul truck driving jobs go away, which

8:16

people have been wrong about predicting

8:18

how fast that's going to happen, but it's going to happen. We

8:21

figure out not just a way to solve

8:25

the economic problem by

8:28

giving people the equivalent of money

8:30

every month, but that there's a way that –

8:34

and we have a lot of ideas about this. There's a way that

8:36

we share

8:37

ownership and decision making

8:39

over the future.

8:41

I think I say a lot about AGI's that

8:45

everyone realizes we're going to have to share the benefits

8:48

of that, but we also have to share

8:50

the decision making over it and access

8:52

to the system itself. I'd be more

8:54

excited about a world where we say rather

8:57

than give everybody on earth like one eight billionth

8:59

of the AGI money, which we should

9:01

do that too, we say you get like one

9:04

eight billionth of the

9:05

system.

9:09

You can sell it to somebody else. You

9:11

can sell it to a company. You can pool it with other people. You

9:13

can use it for whatever creative pursuit you

9:16

want. You can use it to figure out how to start some new business.

9:19

And with that, you get sort of like

9:22

a voting right over how this is all going

9:24

to be used. And so the better the AGI

9:26

gets, the more your little one eight

9:28

billionth ownership is worth to you. We

9:32

were joking around the other day on the podcast where I was saying that

9:34

what we need is an AI

9:36

government.

9:37

We should have an AI president and have AI right there.

9:41

Just make all the decisions? Yeah. Have

9:43

something that's completely unbiased,

9:45

absolutely rational, has the

9:48

accumulated knowledge of the entire

9:50

human history at its disposal,

9:52

including all knowledge

9:55

of psychology and psychological

9:57

study, including UBI, because that comes

9:59

with a host

9:59

of pitfalls and issues

10:03

that people have with it. So I'll say something there.

10:06

I think we're still very far away from a system that

10:09

is capable enough and reliable

10:11

enough

10:12

that any of us would want that, but I'll

10:14

tell you something I love about that. Someday,

10:17

let's say that thing gets built. The fact that

10:19

it can go around and talk to every person

10:22

on earth, understand their exact

10:24

preferences at a very deep level, how

10:26

they think about this issue and that one and how they also

10:28

trade off and what they want, and then understand

10:31

all of that and

10:33

collectively optimize for

10:35

the collective preferences of

10:37

humanity or of citizens of the US, that's

10:40

awesome.

10:41

As long as it's not

10:43

co-opted,

10:44

our government currently is co-opted.

10:47

That's for sure. We know for sure

10:49

that our government is heavily influenced by special

10:51

interests. If

10:53

we could have an artificial

10:56

intelligence government that has no

10:58

influence, nothing has influence on

11:00

it. What a fascinating idea. It's possible.

11:03

And I think it might be the only way where

11:06

you're gonna get completely objective,

11:10

the absolute most intelligent

11:12

decision for virtually

11:15

every problem, every dilemma

11:17

that we face currently in society. Would you

11:19

truly be comfortable handing over like

11:21

final decision making and say, all right, AI,

11:24

you got it fair? No, no, but I'm not comfortable

11:26

doing that with anybody. I

11:30

was uncomfortable with the Patriot Act. I'm uncomfortable

11:32

with many decisions that

11:34

are being made. It's just there's

11:37

so much obvious evidence

11:39

that decisions that are being made are

11:41

not being made in the best interests of the overall

11:43

well-being of the people. It's being made

11:45

in the interests of whatever

11:49

gigantic corporations that

11:51

have donated to and whatever

11:53

the military industrial complex and

11:56

pharmaceutical industrial complex and then just

11:58

the money. That's really what

12:00

we know today, that money has a massive

12:03

influence on our society and the choices

12:05

that get made and the overall good

12:07

or bad for the population. Yeah,

12:09

I have no disagreement at all that

12:12

the current system is super broken, not

12:14

working for people, super corrupt

12:17

and for sure unbelievably

12:19

run by money. I

12:23

think there is a way to

12:25

do a better job than that

12:27

with AI in some way,

12:30

but

12:30

this might just be like a factor of sitting

12:33

with the systems all day and watching all of the ways they

12:35

fail. We've got a long way to go. A

12:37

long way to go, I'm sure. But

12:39

when you think of AGI,

12:42

when you think of the possible

12:44

future like where it goes to, do

12:47

you ever extrapolate? Do you

12:49

ever sit and pause and say, well,

12:51

if this becomes sentient and

12:53

it has the ability to make

12:55

better versions of itself, how

12:58

long before we're literally dealing with a God?

13:03

The way that I think about this is

13:05

it used to be that AGI was this

13:07

very binary moment. It was before and after,

13:09

and I think I'm totally wrong about that.

13:12

The right way to think about

13:14

it is this continuum

13:17

of intelligence, this smooth exponential

13:19

curve back all the way to that sort of

13:21

smooth curve of technological revolution.

13:25

The amount of

13:27

compute power we can put into the system, the

13:29

scientific ideas about how to make it more

13:31

efficient and smarter to give it

13:34

the ability to do reasoning, to think about how to

13:36

improve itself,

13:38

that will all

13:39

come. But my model

13:41

for a long time, I think if you

13:44

look at the world of AGI thinkers, there's

13:47

sort of two,

13:48

particularly around the safety issues you're talking about, there's

13:50

two axes that matter. There's

13:53

what's called short timelines or long timelines to

13:56

the first milestone of AGI,

13:58

whatever that's going to be. Is that going to happen

14:01

in a

14:01

few years, a few decades, maybe even

14:04

longer, although at this point I think

14:05

most people are a few years, a few decades, and

14:07

then there's takeoff speed? Once we get there,

14:09

from there to that point you were talking about where it's capable

14:12

of the rapid self-improvement, is

14:14

that a slow or a fast process? The

14:17

world that I think we're heading, that

14:19

we're in,

14:20

and also the world that I think is the most

14:22

controllable and the safest, is

14:25

the short timelines

14:27

and slow takeoff quadrant.

14:32

I think we're going to have, there

14:34

were a lot of very smart people for a while, and they

14:36

were like, the thing you already talked about happens in a day or

14:38

three days. That

14:41

doesn't seem likely to me given the shape of the

14:43

technology as we understand

14:45

it now. Now even if that happens

14:47

in a decade or three

14:49

decades, it's still like the blink of

14:51

an eye from a historical perspective, and

14:55

there are going to be some real challenges to getting

14:57

that right, and the decisions

14:59

we make, the

15:02

safety systems and the checks

15:05

that the world puts in place, how we think

15:07

about global

15:10

regulation

15:11

or rules of the road from a safety perspective

15:13

for those projects, it's super important

15:15

because you can

15:16

imagine many things going horribly wrong.

15:19

I

15:21

feel cheerful about the progress

15:24

the world is making towards taking this seriously,

15:26

and it reminds

15:28

me of what I've read about the conversations

15:31

that the world had around the development of nuclear

15:33

weapons.

15:35

It seems to me that this is, at least

15:37

in terms of public consciousness, this has emerged

15:40

very rapidly, where I

15:42

don't think anyone was really aware, people

15:45

were aware of the concept of artificial

15:48

intelligence, but they didn't think

15:50

that it was going to be implemented so

15:53

comprehensively, so quickly. Chat

15:57

GPT is on 4.5 now? Four.

16:01

Four. And with 4.5, there'll be

16:03

some sort of an exponential increase in its

16:05

abilities. It'll be somewhat better.

16:09

Each step, you know, from each like half

16:11

step like that, you kind

16:13

of, humans have this ability to like

16:15

get used to any new technology so quickly.

16:18

The thing that I think was unusual about the launch

16:20

of chat GPT 3.5 and 4 was

16:21

that people

16:25

hadn't really been paying attention. And

16:27

that's part of the reason we deploy.

16:29

We think it's very important that people and institutions

16:31

have time to gradually

16:34

understand this and co-design

16:36

the society that we want with it. And

16:38

if you just build AGI in secret in a lab and

16:40

then drop it on the world all at once, I think that's a really

16:42

bad idea. So

16:44

we had been trying to talk to the world about

16:46

this for a while.

16:48

People – if you don't give people something

16:50

they can feel and use in their lives, they don't quite

16:53

take it seriously. Everybody's busy. And

16:55

so there was this big overhang from where

16:57

the technology was to where public consciousness was.

17:00

Now, that's caught up. We've deployed.

17:03

I think people understand it. I don't

17:05

expect that we'll feel that jump from

17:08

like 4 to whenever we finish 4.5, which will be a

17:10

little while. I don't expect that to be

17:12

the

17:14

crazy – I think the crazy switch, the crazy

17:17

adjustment that people have had to go through has

17:19

mostly happened. I think most

17:21

people have gone from thinking that AGI was

17:23

science fiction and very far off to

17:25

something that is going to happen.

17:27

And that was like a one-time reframe.

17:29

And now every year you get a new iPhone.

17:32

Over the 15 years or whatever since the launch,

17:34

they've gotten dramatically better. But iPhone

17:37

to iPhone, you're like, yeah, it's a little better. But

17:39

now if you go hold up the first iPhone to the 15 or

17:41

whatever, that's a big difference.

17:43

GPT 3.5 to AGI, that'll be a

17:45

big difference.

17:47

But along the way, it'll just get incrementally better.

17:49

Do you think about the convergence

17:52

of things like Neuralink

17:54

and there's a few competing technologies

17:57

where they're trying to implement some – sort

18:02

of a connection between the

18:04

human biological system and

18:07

technology.

18:10

Do you want one of those things in your head? I

18:12

don't until everybody does. Right.

18:15

And I have a joke about it, but it's like the idea

18:17

is once it gets done, you have to

18:20

kind of, because everybody's going to have it. So

18:22

one of the hard questions

18:25

about

18:25

all of the related merge

18:28

stuff is exactly what you just said. Like

18:30

as a society, are we going to let

18:32

some people merge with AGI

18:35

and not others? And

18:38

if we do, then... And

18:40

you choose not to. Like, what does that mean for you? Right.

18:44

And will you

18:46

be protected? How you get that moment right?

18:48

You know, if we like imagine like all the way

18:51

out to the sci-fi future,

18:53

there have been a lot of sci-fi books written about how

18:55

you get that moment right. You know, who gets to do that first?

18:57

What about people who don't want to? How do you make sure that

18:59

people that do it first like actually help lift

19:01

everybody up together? Yeah. How

19:03

do you make sure people who want to just like live their very human life

19:06

get to do that? That stuff is really

19:08

hard and honestly so

19:11

far off from my problems of the day that I don't

19:13

get to think about that as much as I'd like to, because I do think

19:15

it's super interesting. But

19:21

yeah, it seems like if we just think logically,

19:25

that's going to be a huge challenge at

19:27

some point and people

19:29

are going to want

19:33

wildly divergent things.

19:36

But

19:37

there is a societal question about how we're

19:39

going to like,

19:40

questions of fairness that

19:43

come there

19:44

and what it means for the people who don't

19:46

do it. Like,

19:47

super, super complicated. Anyway,

19:50

on the neural interface side, I'm

19:52

in the short term, like before we figure out how

19:54

to upload

19:56

someone's consciousness into a computer, if that's even

19:58

possible at all, which I think there's...

20:01

plenty of sides you could take on why

20:03

it's not. The

20:06

thing that I find myself most interested

20:08

in is what we can

20:11

do without drilling a

20:13

hole in someone's head.

20:15

How much of the inner monologue can we

20:17

read out with an externally mounted device? And

20:20

if we have an

20:21

imperfect low bandwidth, low

20:23

accuracy

20:24

neural interface,

20:26

can people still just learn how to use it really well

20:29

in a way that's like quite powerful for

20:32

what they can now do with a new computing platform?

20:34

And my guess is we'll figure that out. I'm

20:36

sure you've seen that headpiece. There's

20:38

a demonstration where there's

20:40

someone asking someone a question. They

20:43

have this headpiece on. They think the question and

20:45

then they literally Google the question and get the answers

20:48

through their head. That's the kind of thing we've been in. That's the kind

20:50

of direction we've been exploring. Yeah. That

20:53

seems to me to be step one. That's

20:55

the pong of the eventual

20:57

immersive 3D video games. Like

21:00

you're going to get these first steps

21:02

and they're gonna seem sort of crude and slow.

21:05

I mean it's essentially slower

21:07

than just asking Siri. I think if

21:09

someone built

21:12

a system where you could

21:14

think

21:15

words doesn't have to be a question. It could just be your

21:17

passive rambling inner monologue. That certainly

21:19

could be a question. And that was

21:21

being fed into GPT-5 or 6. And

21:24

in your field of vision, the

21:26

words in response were being displayed.

21:28

That would be the pong.

21:29

Yeah. That's still soup. That's a very

21:32

valuable tool to have. And that seems like

21:34

that's inevitable. There's

21:37

hard work to get there on the neural interface side but

21:39

I believe it will happen. Yeah. I

21:41

think so too. My concern is that the

21:43

initial adopters of this will have such

21:45

a massive advantage over the general population.

21:48

Well that doesn't concern me because that's like a...

21:51

You know that's not... You're not... That's just like better...

21:54

That's a better computer.

21:56

You're not like jacking your brain

21:58

into something in a high-risk thing. You know what you do

22:00

when you don't want them? You take off the glasses. So

22:03

that feels fine.

22:05

Well this is just the external device

22:07

then. Oh I think we can do the

22:09

kind of like read your thoughts with an external

22:11

device at some point. Read your

22:13

internal monologue. Interesting.

22:16

And do you think we'll be able to communicate with an external

22:19

device as well telepathically or

22:21

semi telepathically through technology? I

22:24

do. Yeah. Yeah

22:26

I do. I think so too. My

22:29

real concern is that

22:32

once we take the step

22:34

to use an actual neural

22:36

interface, when there's an actual operation

22:39

and they're using

22:41

some sort of an implant and then that implant

22:44

becomes more sophisticated. It's not the iPhone 1,

22:46

now it's the iPhone 15. And

22:48

as these things get better and better, we're

22:52

on the road to cyborgs. We're

22:55

on the road to like why would you want

22:57

to be a biological person? Do

22:59

you really want to live in a fucking log cabin when

23:01

you can be in the matrix? It seems

23:04

like we're not – we're

23:06

on this path.

23:10

We're already a little bit

23:12

down that path, right? Like if you take away someone's

23:14

phone and they have to go function in the world

23:16

today, they're at a disadvantage relative

23:18

to everybody else.

23:20

So that's like maybe

23:22

that's like the lightest weight version of a merge

23:24

we could imagine. But I think it's worth like –

23:27

if we go back to that earlier thing about the one

23:29

exponential curve, I think it's worth saying

23:31

we've like lifted off the X-axis already

23:33

down this path for the tiniest bit. And

23:38

yeah, even if you don't go all the

23:40

way to like a neural interface, VR

23:42

will get so good that some people just don't want to

23:44

take it off that much. Right.

23:49

That's fine for them as

23:52

long as we can solve this question of

23:54

how do we like think about what a balance of power

23:57

means in the world. I think there will be many people.

24:00

I'm certainly one of them like actually the human

24:02

body and the human experience is pretty great, that

24:05

log cabin in the woods is pretty awesome. I don't want

24:07

to be there all the time. I'd love to go play the great video

24:09

game

24:09

But like

24:11

I'm really happy to get to go there at the right

24:13

times. Yeah, there's still human

24:16

experiences that are just

24:19

Like great human experience just laughing

24:21

with friends You know kissing someone

24:24

that you've never kissed before, that you're

24:26

on a first date with, those kind

24:28

of things are real moments.

24:31

Just laughs. Yeah, having a glass

24:34

of wine with a friend just laughing

24:36

not quite the same in VR Yeah now not

24:38

when the VR goes super far so you can't you know

24:40

it's like you are jacked in on your brain and You

24:43

can't tell right what's real and what's not

24:45

and then everybody gets like

24:48

super deep on the simulation hypothesis or

24:50

the like Eastern religion or whatever and I Don't

24:52

know what happens at that point Do you ever fuck around with

24:54

simulation theory because the real problem

24:57

is when you combine that with probability theory

24:59

and you talk to the people that say well If

25:02

you just look at the numbers That

25:04

the probability that we're already in a simulation

25:06

is much higher than the probability that we're not

25:12

It's never been clear to

25:14

me what to do about it

25:16

It's like, okay, I buy into that, and

25:19

it actually makes a lot of sense. Yeah, I think

25:21

probably, sure, it's convincing, but

25:23

now what? My reality is my life

25:26

and I'm gonna live it and I've

25:32

You know from like

25:34

2 a.m. In my college Hmm

25:40

what seems like one of

25:42

those It's

25:45

there's no You know, it's

25:48

if it is a possibility if it is

25:50

real First of all, once it happens,

25:53

what are you gonna do? I mean that that is the

25:55

new reality and in many ways our

25:57

new reality is

25:59

as

25:59

alien to you know

26:03

hunter-gatherers from 15,000 years ago as that

26:07

would be to us now. I mean we're already

26:09

we've already entered into some very

26:12

bizarre territory where you

26:14

know I was just having a conversation with my kids were asking

26:17

questions about something and You know

26:19

I always say let's guess What percentage

26:21

of that is this and then we just google

26:23

it and then just ask Siri and we pull it up

26:25

like look at Like that alone.

26:28

Yeah, so bizarre compared

26:30

to how it was when I was 13, and

26:33

you had to go to the library But

26:35

hope that the book was accurate totally I

26:38

was very annoyed

26:40

this morning. I was reading about how horrible

26:42

systems like chat GPT and Google

26:44

are from an environmental impact because it's you know using

26:46

like

26:47

some extremely tiny amount of energy for each query

26:50

and

26:50

you know, how we're ruining the world. And I was

26:52

like, before that, people drove to the library.

26:55

Yeah, how much carbon did they burn? What did

26:57

it take? Now come on. But that's just

27:00

people looking for some reason why

27:02

something's bad. That's not a logical, total

27:04

perspective. Well, what we should be looking

27:06

at is the spectacular

27:09

changes that are possible through this

27:11

and All the problems

27:13

the insurmountable problems that we have with resources

27:16

with the environment with cleaning

27:18

up the ocean Climate change

27:20

there's

27:20

so many problems that we need this

27:22

to solve, all of everything else, and that's

27:25

why we need an AI president. If

27:29

if AI could make every scientific

27:31

discovery,

27:32

but we still had human presidents, do you think that would be okay? No,

27:35

because those creeps would still be pocketing

27:38

money, and they'd have offshore accounts And

27:41

it would always be a weird thing

27:43

of corruption and how to mitigate that

27:45

corruption Which is also one of the fascinating

27:47

things about the current state of technology is that we're so

27:49

much more aware of corruption We're

27:52

so much more. There's so much independent

27:54

reporting and we're so much

27:57

more cognizant of the actual

27:59

problem

27:59

problems that are in place. This is really

28:02

great.

28:04

One of the things that I've observed, obviously

28:06

many other people too, is

28:08

corruption is such an incredible

28:10

hindrance to getting anything done in a

28:12

society to make it forward progress. My

28:15

worldview had been

28:17

more US-centric when I was younger,

28:20

and as I've

28:21

just

28:22

studied the world more and had to work in more

28:24

places in the world, it's amazing how much corruption there

28:26

still is. But the shift

28:29

to a technologically-enabled world, I

28:31

think, is a major force against it because

28:33

everything is ... It's harder to hide stuff.

28:37

I do think corruption in the world will

28:39

keep trending down.

28:41

Because of its exposure through

28:44

technology. It

28:47

comes at a cost, and I think the

28:49

loss ... I am very worried

28:51

about how far the surveillance state could go

28:54

here. But in

28:57

a world where payments,

29:00

for example, are no longer like bags

29:02

of cash but done somehow digitally,

29:05

and somebody, even if you're using Bitcoin,

29:07

can watch those flows. I

29:09

think that's a corruption-reducing thing.

29:12

I agree, but I'm very

29:14

worried about central bank digital currency

29:17

and that being tied to a social credit score.

29:20

Super against. Yeah, that scares the

29:22

shit out of me. Super against. And

29:24

the push to that is not ... That's

29:27

not for the overall good of society. That's for control.

29:30

Yeah, I think

29:33

there's many things

29:35

that I'm disappointed that the US government

29:38

has done recently, but

29:39

the war on

29:41

crypto, which I think is a, like, we

29:43

can't give this up, we're going to

29:46

control this.

29:48

That's a thing that makes me quite sad about

29:50

the country. It makes me quite sad about the

29:52

country too, but then you also see with things like

29:54

FTX, like, oh, this can

29:57

get without regulation and

29:59

without someone overseeing it, this

30:01

can get really fucked Yeah,

30:05

I'm not anti-regulation like I think

30:08

there's clearly a role for it

30:10

and

30:11

I

30:13

Also think

30:14

FTX was like a sort of Comically

30:16

bad situation. Yeah Yeah,

30:21

but it's a fun one like it's totally fun and

30:23

I love that story I mean you clearly

30:26

I Really

30:29

do I love the fact that they were all doing drugs and

30:31

have sex. Yeah. No. No, I had every

30:33

part of the drama It's like a it.

30:36

I mean it's a gripping story cuz yeah had everything

30:38

there They did their taxes with like

30:41

what was the the program that they used?

30:44

Quickbooks Billions

30:48

of dollars. I don't know why I think the word polycule

30:50

is so funny. Polycule, that was what they,

30:53

like, when you call a relationship, like,

30:55

a poly but closed, like

30:57

polyamorous and molecule put together.

30:59

Oh, I see so they were like this is our polycule

31:02

So there's nine of them in their polycule, however many of them or

31:04

whatever. Yeah, and you call that a polycule. I don't

31:06

know, it's funny. Like, that became like a meme

31:08

in Silicon Valley for a while that I thought was hilarious Um,

31:11

you clearly want enough regulation that that

31:14

can't happen but

31:16

they're

31:18

Like I'm not against that happening. I'm

31:20

against them doing what they did. No,

31:22

that's what I mean. I'm a cool Go

31:24

for it. No. No, I mean you want enough thing that like

31:27

FTX can't lose all of its depositors

31:29

money Yes, but but I think there's an important

31:31

point here, which is you have all of this other Regulation

31:35

that people and and it didn't keep us safe

31:38

and the basic thing which was

31:40

like

31:41

You know, let's do that. That was not all of

31:43

the crypto stuff people were talking about. Hmm

31:46

Yes, I

31:48

mean the real fascinating crypto

31:50

is Bitcoin to me I mean, that's

31:53

the one that I think has the most

31:56

likely possibility of becoming a Universal

32:00

survival currency. It's

32:04

limited in the amount that there can be. People

32:09

mine it with their own computers. That

32:11

to me is very fascinating and I love the

32:13

fact that it's been implemented. I've

32:17

had Andreas Antonopoulos

32:19

on the podcast and when he talks

32:21

about it, he's

32:25

living it. He's spending all of

32:27

his money. Everything he has paid is in Bitcoin.

32:29

He pays his rent in Bitcoin. Everything

32:31

he does is in Bitcoin. I helped start

32:34

a project called Worldcoin a few years ago.

32:38

I've gotten to learn more about

32:41

the space. I'm

32:43

excited about it for the same reasons. I'm excited about Bitcoin

32:45

too. I think this idea that

32:47

we have a

32:49

global currency that is outside

32:51

of the control of any government is

32:54

a

32:55

super logical and important step

32:57

on the tech tree. Yeah, agreed. I

33:00

mean, why should the government control currency? I

33:02

mean, the government should be dealing with all the pressing

33:05

environmental, social, infrastructure

33:08

issues, foreign policy issues, economic

33:10

issues, the things that we

33:12

need to be governed in order

33:15

to have a peaceful and prosperous society

33:17

that's equal and equitable.

33:20

What do you think happens to money and currency after

33:22

AGI?

33:25

I've wondered about that because I feel

33:27

like with money, especially when money

33:29

goes digital, the bottleneck is

33:31

access. If we get to

33:33

a point where all information

33:36

is just freely shared

33:38

everywhere, there are no secrets, there are

33:41

no boundaries, there are no borders. We're

33:43

reading minds. We have complete

33:46

access to all of the

33:48

information of everything you've

33:51

ever done, everything everyone's ever said. There's

33:53

no hidden secrets. What

33:55

is money then? Money is this digital

33:58

thing. How can you possess it? how

34:00

can you possess this digital thing if

34:02

there is literally no bottleneck?

34:05

There's no barriers to anyone

34:07

accessing any information, because

34:09

essentially it's just ones and zeros. Yeah,

34:12

I mean, another way, I think the information frame makes

34:14

sense. Another way is that like money is like

34:19

a sort of way to trade labor

34:21

or to trade like a limited number of hard

34:24

assets like land and houses and whatever. And

34:28

if you think about a world where like intellectual

34:32

labor is just readily available and

34:34

super cheap,

34:36

then

34:38

that's somehow very different.

34:41

I think there will always be goods that we want

34:43

to be scarce and expensive,

34:45

but it'll only be those goods that we want to be scarce

34:48

and expensive and services that still are. And

34:51

so money in a world like that I think is just

34:54

a very curious idea. Yeah,

34:56

it becomes a different thing. I mean, it's not

34:58

a bag of gold in a leather pouch that

35:00

you're carrying around. Not going to do you much good probably.

35:04

Yeah, it's not going to do you much good. But

35:06

then the question becomes how is that money distributed and how do

35:08

we avoid some horrible Marxist

35:11

society where there's one totalitarian

35:14

government that's just doling it out? That

35:16

would be bad. I

35:18

think you've got to like – my current best idea, and maybe

35:20

there's something better, is I think you – if

35:23

we are right – a lot of reasons we could be

35:25

wrong. If we are right that like

35:27

the AGI systems of which there will be

35:30

a few become the high order bits

35:32

of

35:33

sort of influence whatever in the world, I think

35:36

you do need like

35:39

not to just redistribute the money, but the access

35:41

so that people can make their own decisions about how

35:43

to use it and how to govern it.

35:45

And if you've got one idea, you get to

35:47

do this. If I've got one idea, I get to do

35:49

that. And I have like

35:52

rights to basically do whatever I want with my

35:54

part of it. And if I come up with better ideas

35:56

than you, I get rewarded for that by whatever

35:58

the society is or vice versa. Yeah. You

36:01

know, the hardliners, the people

36:03

that are against like welfare and

36:05

against any sort of universal

36:10

basic income, UBI, what

36:12

they're really concerned with is human nature, right?

36:15

They believe that if you remove

36:17

incentives, if you just give people free money,

36:19

they become addicted to it, they become

36:22

lazy. But isn't that a

36:24

human biological and psychological

36:27

bottleneck? And

36:29

perhaps

36:31

with the implementation of artificial

36:35

intelligence combined with

36:37

some sort of neural interface,

36:40

whether it's external or internal, it

36:43

seems like that's a problem

36:46

that can be solved, that

36:49

you can essentially, and this

36:51

is where it gets really spooky, you can re-engineer

36:54

the human biological system, and

36:57

you can remove all of these

36:59

problems that people have that

37:01

are essentially problems that date back to

37:03

human reward systems when we were tribal

37:05

people, hunter-gatherer people, whether

37:08

it's jealousy, lust, envy,

37:10

all these variables

37:13

that come into play when you're dealing

37:15

with money and status and social

37:17

status. If those are eliminated

37:20

with technology, essentially we

37:23

become a next version

37:25

of what is possible for the human species. Like

37:29

we're very, very far removed from

37:32

tribal, brutal societies

37:36

of cave people. We all agree

37:38

that this is a way better way to live, it's

37:42

a way safer, you know, we

37:44

were like I was talking about this at my comedy club

37:46

last night, because my wife

37:48

was, we were talking about DNA,

37:52

and my wife was saying that look, everybody

37:54

came from cave people, which is kind of a fucked up thought,

37:56

that everyone here is here because of cave people.

37:59

Well that... All that's still in our DNA.

38:02

All that's still. And these reward

38:05

systems can be hijacked, and they

38:07

can be hijacked by just giving people

38:09

money. And like, you don't have to work, you don't have to do anything,

38:11

you don't have to have ambition, you'll just have money,

38:14

and just lay around and do drugs.

38:17

That's the fear that people

38:19

have of giving people free money. But

38:22

if we can figure out how

38:25

to literally engineer

38:28

the human biological vehicle

38:31

and remove all those

38:33

pitfalls, if we can

38:36

enlighten people technologically,

38:38

maybe enlighten is the wrong word, but

38:41

advance the human

38:44

species to the point where those

38:46

are no longer dilemmas, because those

38:48

are easily solvable through

38:50

coding. They're easily solvable

38:52

through enhancing the human

38:55

biological system, perhaps raising dopamine

38:58

levels to the point where anger, and

39:00

fear, and hate are impossible.

39:03

They don't exist. And if you

39:06

just had everyone on Molly, how

39:09

many wars would there be? There'd be zero wars.

39:11

I mean, I think if you could get

39:12

everyone on Earth

39:14

to all do Molly once on the same day,

39:16

that'd be a tremendous thing. It would be. If

39:18

you got everybody on Earth to do Molly every day, that'd

39:20

be a real loss. But what if they did a

39:23

low dose of Molly, where you just get

39:25

to, ah, where

39:27

everybody greets people with

39:29

love and affection, and there's

39:31

no longer a concern about competition.

39:34

Instead, the concern is about

39:36

the fascination of innovation and creation

39:40

and creativity. Man,

39:42

we could talk the rest of the time about this one topic.

39:45

It's so interesting. I think

39:50

if I could push a button to remove

39:53

all human striving and

39:56

conflict, I wouldn't

39:58

do it, first of all. I think

39:59

I think that's a very important part of

40:02

our story and experience. And

40:06

also, I think we can see both

40:08

from our own biological history

40:10

and also from what

40:13

we know about AI, that very

40:15

simple

40:17

goal systems, fitness functions,

40:20

reward models, whatever you want to call it, lead

40:22

to incredibly

40:23

impressive

40:25

results

40:26

if the biological imperative is survive

40:28

and reproduce.

40:30

Look how far that has somehow gotten

40:33

us as a society. All of this, all this

40:35

stuff we have, all this technology, this building, whatever

40:37

else, like

40:43

that got here

40:45

through an extremely simple

40:48

goal in a very complex environment

40:50

leading

40:51

to all of the richness and complexity

40:53

of people for

40:56

doing this biological imperative to some degree

40:59

and wanting to impress each other.

41:03

So I think evolutionary fitness is

41:05

a simple and unbelievably powerful idea.

41:07

Now,

41:09

could you carefully edit out every

41:12

individual manifestation of that?

41:16

Maybe, but I don't want to live

41:18

in a society of drones where everybody is

41:21

just sort of

41:22

on Molly all the time either. That

41:25

doesn't seem like the right answer.

41:28

Like I want us to get you to strive. I want us to

41:30

continue to push back the frontier

41:33

and go out and explore. And I actually think something's

41:36

already a

41:38

little off track in society about

41:41

all of that and we're, I don't

41:45

know, I think like I'm,

41:48

I don't,

41:48

I thought I'd be older by the time I felt like

41:50

the old guy complaining about the youth. But

41:54

I think we've lost something

41:57

and I think that we... need

42:00

more

42:02

striving, maybe more risk-taking, more

42:05

like

42:06

explorer spirit. What do you mean by you think we've

42:09

lost something?

42:17

I mean here's like a version of it

42:20

very much from my

42:23

own lens. I was a startup investor for a

42:25

long time, and

42:27

it often was the case that

42:30

the very best startup founders were

42:32

in their early or mid-20s, late

42:35

20s maybe even, and now they skew much

42:37

older.

42:38

And what I want to know is in the

42:41

world today, where are the super great 25-year-old

42:43

founders?

42:44

And there are a few, it's not fair to say there are none, but there

42:46

are less than there were before. And

42:49

I

42:53

think that's bad for society at all levels.

42:56

Tech company founders is one example,

42:58

but people who go off and

43:00

create something new, who push on a disagreeable

43:03

or controversial idea, we need

43:05

that to drive forward.

43:08

We need that

43:09

sort of spirit. We need people

43:11

to be able to put out ideas

43:14

and be wrong

43:15

and not be ostracized from society

43:17

for it or not have it be something

43:19

that they get cancelled for or whatever. We

43:22

need people to be able to take a risk

43:24

in their career because they believe in

43:26

some important scientific quest that may not

43:28

work out or may sound like really controversial

43:31

or bad or whatever.

43:34

Certainly when we started OpenAI and we were

43:36

saying

43:37

we think this AGI thing

43:38

is real and could

43:41

be done unlikely but so important if it happens.

43:44

And all of the older scientists in our field were

43:46

saying those people are irresponsible,

43:49

shouldn't talk about AGI. That's like they're

43:51

selling the scam or they're

43:54

kind of being

43:57

reckless and it's going to lead to an AGI winter.

43:59

We said we believed. We said at the time we knew

44:02

it was unlikely, but it was an important quest.

44:04

And we were going to go after it and kind of like fuck

44:06

the haters.

44:08

That's important to a society.

44:11

What do you think is the origin? Why

44:14

do you think there are less young people

44:17

that are doing those kind

44:19

of things now as opposed to

44:21

a decade or two ago?

44:24

I am so interested in that

44:26

topic.

44:28

I'm tempted to blame the education system,

44:31

but I'm sure that I think that interacts

44:34

with society in

44:37

all of these strange ways.

44:40

It's funny. There was this thing all over

44:42

my Twitter feed recently trying to talk about what

44:46

caused the drop in testosterone in American

44:48

men over the last few decades. And no

44:51

one was like, this is a symptom, not a cause.

44:55

And everyone

44:56

was like, oh, it's the microplastics. It's

44:58

the birth control pills. It's the whatever. It's the whatever. It's the whatever.

45:01

And I think this is like

45:02

not at all

45:04

the most important

45:07

piece of this topic, but it was just interesting

45:09

to me sociologically

45:12

that

45:13

there was only talk about it being – about what

45:16

caused it, not

45:17

about it being an

45:20

effect of some sort of change in society.

45:24

But isn't what caused

45:26

it – well, there's biological

45:29

reasons why – like when we talk

45:32

about the phthalates and the microplastic

45:34

pesticides, environmental factors,

45:36

those are real. Totally.

45:38

And I don't – again, I'm so far out of my depth and

45:40

expertise here. This is – it was just

45:42

interesting to me that the only talk was about

45:45

biological factors and not that somehow society

45:47

can have some sort of –

45:48

Well, society most certainly has an effect. Do

45:51

you know what the answer to this is? I

45:54

don't. I mean, I've had a

45:56

podcast with Dr. Shanna Swan who

45:59

wrote the book Count Down, and that

46:01

is all about the introduction of petrochemical

46:03

products and the correlating drop

46:06

in testosterone, rise

46:08

in miscarriages, the fact

46:10

that these are ubiquitous endocrine

46:12

disruptors that when they do

46:15

blood tests on people, they find some

46:18

insane number. It's like 90

46:20

plus percent of people have phthalates

46:22

in their system. I

46:24

appreciate the medical. Yeah, we

46:26

try to mitigate it as much as possible.

46:30

You're getting it. If you're microwaving food, you're

46:32

fucking getting it. You're just getting it.

46:34

If you eat processed food, you're getting it. You're getting a certain

46:37

amount of microplastics in your diet and estimates

46:39

have been that it's as high as a credit

46:42

card of microplastics per week. Like flow

46:44

neuron. In your body. You consume a

46:46

credit card of that a week. The

46:49

real concern is with mammals because

46:52

the introductions, when they've done studies with

46:54

mammals and they've introduced phthalates

46:56

into their body, there's a correlating.

47:01

One thing that happens is these

47:04

animals, their taints shrink. The

47:06

taint of them, the mammal, when

47:08

you look at males, it's 50 percent

47:11

to 100 percent larger than the females. With

47:13

the introduction of phthalates on the males, the taints

47:16

start shrinking, the penises shrink, the testicles

47:18

shrink, sperm count shrinks. We

47:20

know there's a direct biological

47:23

connection between these

47:26

chemicals and how they interact with

47:29

bodies. That's

47:32

a real one. It's also the

47:35

amount of petrochemical products that

47:37

we have, the amount of plastics that we use,

47:41

it is such an integral part of our culture

47:43

and our society, our civilization.

47:46

It's everywhere. I've wondered

47:50

if you think about how

47:53

these territorial

47:56

apes evolve into

47:59

this new advanced

48:02

species. Wouldn't

48:04

one of the very best ways be

48:07

to get rid of one of the

48:09

things that causes the most problems which is

48:11

testosterone. We need testosterone.

48:14

We need aggressive men and protectors

48:16

but why do we need them? We need them because

48:18

there's other aggressive men that are evil,

48:21

right? So we need protectors from

48:24

ourselves. We need the good

48:26

strong people to protect us from the bad

48:28

strong people. But if we're

48:31

in the process of integrating with technology,

48:34

if technology is an inescapable

48:37

part of our life, if it is everywhere,

48:39

you're using it, you have the Internet of Everything

48:42

that's in your microwave, your television,

48:45

your computers, everything you use.

48:48

As time goes on, that will be more

48:51

and more a part of your life and as these

48:53

plastics are introduced into the human biological

48:56

system, you're seeing a feminization

48:59

of the males of the species. You're

49:01

seeing a downfall in birth

49:04

rate. You're seeing all these correlating

49:06

factors that would sort

49:08

of lead us to become this more

49:12

peaceful, less

49:14

violent, less aggressive,

49:17

less ego-driven thing.

49:20

Which the world is definitely becoming all the

49:23

time and

49:25

I'm all for less violence obviously. But

49:29

I don't ...

49:35

Look, obviously testosterone has many

49:37

great things to say for it and some

49:40

bad tendencies too. But I don't

49:42

think a world ...

49:43

If we leave that out of the equation and just say like

49:45

a world that has a

49:48

spirit that

49:51

we're going to

49:53

defend ourselves, we're going to ... We're

49:58

going to find a way to ...

50:00

protect ourselves and

50:02

our tribe and our society

50:04

into this future, which you can get with lots

50:07

of other ways. I think that's an important impulse.

50:10

More than that though, what I meant

50:15

is about – if we go back to the issue

50:17

of like where are the young founders, why

50:20

don't we have more of those? And

50:23

I don't think it's just the tech startup industry. I

50:25

think you could say that about like young scientists

50:28

or many other categories. Those are maybe just the ones

50:30

that I know the best.

50:35

In a world with any

50:39

amount of technology,

50:41

I still think we've got

50:43

to –

50:44

it is our destiny in some sense

50:45

to stay on

50:47

this curve. And we still

50:50

need to go figure out what's next and after the

50:52

next hill and after the next hill.

50:54

And it would be – my

50:59

perception is

50:59

that there is some long-term societal

51:02

change happening here and I think it makes us less happy

51:04

too. Right.

51:08

It may make us less happy.

51:10

But what I'm saying is if the human species

51:13

does integrate with technology, wouldn't

51:16

it be a great way to facilitate that

51:19

to kind of feminize the

51:21

primal apes and to

51:24

sort of downplay the role –

51:26

You mean like the tech – like should the AGI fall into

51:29

the world? Well, maybe – I don't know if it's AGI. Maybe

51:31

it's just an inevitable consequence

51:34

of technology because especially

51:37

the type of technology that we use which

51:39

does have so much plastic in it and

51:42

then on top of that, the technology that's involved

51:44

in food systems, preservatives,

51:46

all these different things that we use to make sure that people don't

51:48

starve to death. We've made incredible

51:51

strides in that. There are very few people

51:53

in this country that starve to death. Yeah. It

51:56

is not a primary issue but violence

51:58

is a – primary issue.

52:01

But our concerns about

52:03

violence and our

52:05

concerns about testosterone and strong

52:08

men and powerful people is

52:10

only because we need to protect against

52:12

others. We need to protect against others doing the

52:14

same things. Is that true? Is that really the

52:16

only reason? Sure. I

52:19

mean, how many like incredibly violent women

52:21

are out there running gangs? No, no, that part for

52:23

sure. Yeah, there aren't

52:25

many. What I meant more

52:27

is that the only reason that

52:29

society values like strong masculinity.

52:32

Yeah, I think so. I think it's a biological

52:34

imperative, right? And I think that biological

52:37

imperative is because we used to have to defend

52:39

against incoming tribes and predators

52:41

and animals and we needed

52:44

someone who was stronger than most

52:47

to defend the rest. And like

52:49

that's the concept of the military. That's why

52:52

Navy SEAL training is so difficult. We want

52:54

the strongest of the strong to be

52:56

at the tip of the spear. But that's only

52:58

because there's people like that out there

53:00

that are bad. If artificial

53:04

general intelligence and the implementation

53:06

of some sort of a device that changes

53:08

the biological structure of

53:11

human beings to the point where that is

53:13

no longer a concern. Like if you

53:15

are me and I am you and I know this

53:17

because of technology, violence

53:19

is impossible. Yeah. Look, by the time if

53:22

this goes all the way down the sci-fi path and we're all like merged

53:24

into the one single like planetary

53:27

universal whatever consciousness,

53:30

then yes, you don't. You don't need testosterone.

53:32

You need testosterone, but you still- Especially

53:34

if we can reproduce through other methods.

53:38

Like this is the alien hypothesis, right? Like

53:40

why do they look so spindly and without any

53:42

gender and you know when they have these big

53:44

heads and tiny mouths. They don't need physical strength. They don't need

53:46

physical strength. They have some sort of

53:48

a telepathic way of communicating. They

53:51

probably don't need sounds with their mouths and

53:54

they don't need this

53:56

urge that we have to conquer

53:58

and to spread our DNA.

53:59

that's so much of what people do

54:02

is these reward systems that

54:05

were established when we were territorial apes.

54:08

There's a question to me about how much you can

54:12

ever get

54:14

rid of that if you make an

54:17

AGI

54:18

and it decides actually

54:21

we don't need to expand. We don't need more territory. We're

54:23

just like happy. We at this point, you,

54:25

me, it, the whole thing all together, all merged. We're

54:28

happy here on Earth. We don't need

54:29

any bigger. We don't need to reproduce. We don't need to grow. We're

54:33

just going to sit here and run.

54:36

A, that sounds like a boring life. I don't agree

54:38

with that.

54:39

I don't agree that that would be the logical conclusion.

54:42

I think the logical conclusion would be they

54:46

would look for problems

54:48

and frontiers that are

54:51

insurmountable to our current existence

54:54

like intergalactic communication and transportation.

54:57

What happens when it meets another AGI, the other galaxy

54:59

over? What happens when it meets an AGI that's

55:01

a million years more advanced? What

55:04

does that look like?

55:06

That's what I've often wondered if we are –

55:09

I call ourselves the biological caterpillars that

55:11

create the electronic butterfly that

55:14

we're making a cocoon right now and we don't even know

55:16

what we're doing. I think it's also tied into

55:18

consumerism because

55:20

what does consumerism do? Consumerism

55:23

facilitates the creation of newer and better

55:25

things because you always want the newest,

55:27

latest, greatest. So you have more advanced

55:30

technology and automobiles and computers

55:32

and cell phones and all

55:35

of these different things including medical

55:36

science.

55:40

That's all for sure true.

55:44

The thing I was like reflecting on as you were saying that is

55:47

I don't think I – I'm

55:51

not as optimistic that we can or

55:54

even should

55:56

overcome our biological

55:58

base.

55:59

to the degree that I think you think we can. And

56:04

to even go back one further level, I

56:07

think society is happiest

56:10

where there's roles for

56:12

strong femininity and strong masculinity

56:14

in the same people and in different people.

56:24

And I don't think a lot of these

56:26

deep-seated things are

56:29

going to be able to get pushed

56:32

aside very easily and still have

56:34

a system that works. Sure,

56:37

we can't really think about what, if there

56:39

were consciousness in a machine someday or whatever, what

56:42

that would be like.

56:44

And maybe I'm just thinking too

56:47

small-mindedly. But

56:49

I think there is something

56:53

about us that has worked

56:55

in a super deep way.

56:58

And it took evolution a lot of search

57:00

space to get here.

57:01

But I wouldn't discount it too easily. But

57:04

don't you think that cave people would probably have

57:06

those same logical conclusions about

57:09

life and sedentary lifestyle and sitting in front

57:11

of a computer and not interacting with each other

57:14

except through text?

57:19

I mean, isn't that like what you're saying is correct?

57:21

How different do you think our motivations are

57:23

today? And what really brings

57:26

us

57:26

genuine joy in how we're

57:28

wired

57:28

at some deep level differently than

57:31

cave people? Clearly, lots of other things have changed.

57:33

We've got much better tools. But how

57:35

different do you think it really is? I think

57:38

that's the problem is that genetically,

57:41

at the base level, there's not much difference.

57:44

And that these reward systems are

57:47

all there. We interact

57:49

with all of them, whether it's

57:52

ego, lust, passion,

57:55

fury, anger, jealousy, all

57:58

these different things. And you think we'll be. Some

58:00

people will upload and edit those out. Yes.

58:03

Yeah. I think that our concern with

58:06

losing this aspect of what

58:08

it means to be a person, like the

58:11

idea that we should always have conflict and struggle

58:13

because conflict and struggle is how we facilitate

58:16

progress, which is true, right?

58:18

And combating evil is how the

58:20

good gets greater and stronger if the

58:22

good wins. My concern

58:25

is that that is all predicated

58:27

on the idea that the biological

58:29

system that we have right now is

58:32

correct

58:34

and optimal.

58:36

And I think one of the things

58:38

that we're dealing with, with the heightened

58:40

states of depression and anxiety

58:42

and the lack of meaning and existential angst

58:45

that people experience, a lot

58:47

of that is because the biological

58:50

reality of being a human animal

58:53

doesn't really integrate

58:55

that well with this world that we've created.

58:58

That's for sure. Yeah. And

59:01

I wonder if the

59:03

solution to that is not

59:05

find ways to find

59:08

meaning with the biological, you

59:11

know, vessel that you've been given, but

59:13

rather engineer those

59:16

aspects. That are problematic

59:19

out of the system

59:21

to create a truly enlightened being.

59:24

Like one of the things, if you ask someone today, what

59:26

are the odds that in three years

59:28

there will be no war in the world?

59:30

That's zero. Like nobody

59:32

thinks, no, there's never been a time in human

59:35

history where we haven't had war. If

59:37

you had to say, what is our number one

59:39

problem as a species? Well,

59:41

I would say our number one problem is

59:44

war. Our number one problem is

59:46

this idea that it's okay

59:48

to send massive groups of people who don't know

59:51

each other to go murder massive groups of people

59:53

that are somehow opposed because of the

59:55

government and because of the lines

59:57

in the sand and the territorial borders. That's clearly the same thing. The

1:00:00

same thing. How do you get rid of that?

1:00:02

Well, one of the ways you get rid of that is to completely

1:00:05

engineer out all the human reward

1:00:07

systems that pertain to the acquisition

1:00:10

of resources. So what's

1:00:12

left at that point? Well, we're a new thing.

1:00:15

I think we've become a new thing. And what does that thing

1:00:17

do once? I think that new thing

1:00:19

would probably want to interact with

1:00:22

other new things that are even more advanced than it.

1:00:25

I do believe that scientific

1:00:30

curiosity can drive

1:00:32

quite – that can be a great frontier

1:00:34

for a long time. Yeah.

1:00:38

I think it can be a great frontier for a long time as well.

1:00:40

I just wonder if what we're

1:00:42

seeing with the drop in testosterone,

1:00:46

because of microplastics, which sort of just

1:00:48

snuck up on us, we didn't even know that it

1:00:50

was an issue until people started studying. How certain

1:00:52

is that at this point that that's what's happening? I don't know.

1:00:55

I'm going to go study after this. It's a very

1:00:57

good question. Dr. Shanna Swan believes

1:00:59

that it's the primary driving

1:01:01

factor of the sort

1:01:04

of drop in testosterone and all miscarriage

1:01:06

issues and low birth weights. All

1:01:09

those things seem to have a direct – there

1:01:11

seems to be a direct factor environmentally.

1:01:14

I'm sure there's other factors too. The

1:01:17

drop in testosterone, I mean, it's been

1:01:19

shown that you can increase male's

1:01:21

testosterone through resistance training

1:01:23

and through making – there's certain things

1:01:26

you can do. Like, one of the big ones they found

1:01:28

through a study in Japan is cold water

1:01:30

immersion before exercise

1:01:33

radically increases testosterone. So

1:01:36

cold water immersion and then exercise post

1:01:38

that. I wonder why. Yeah, I don't know.

1:01:40

Let's see if we can find that. But

1:01:44

it's a fascinating field of study, but

1:01:46

I think it has something to do with resilience

1:01:48

and resistance and the fact that your body has to combat

1:01:51

this external factor

1:01:53

that's very extreme, that causes

1:01:55

the body to go into this state

1:01:58

of preservation and – the

1:02:01

implementation of cold shock proteins and

1:02:03

the reduction of inflammation

1:02:05

which also enhances the body's endocrine

1:02:07

system but then on top of that this imperative

1:02:10

that you have to become more resilient to

1:02:12

survive this external factor

1:02:15

that you've introduced into your life every single

1:02:17

day. So

1:02:20

there's ways obviously

1:02:22

that you can make a human being

1:02:25

more robust. We know

1:02:27

that we can do that through strength training and that all

1:02:29

that stuff actually does raise testosterone.

1:02:32

Your diet can raise testosterone

1:02:34

and a poor diet

1:02:37

will lower it and will hinder your endocrine

1:02:39

system, will hinder your ability to produce

1:02:41

growth hormone, melatonin,

1:02:44

all these different factors. That

1:02:46

seems to be something that we can fix

1:02:49

in terms or at least mitigate with

1:02:52

decisions and choices and effort. But

1:02:55

the fact that these petrochemical

1:02:58

products – like there's a graph that Dr.

1:03:00

Shanna Swan has in her book that shows

1:03:03

during the 1950s when we started using

1:03:06

petrochemical products and everything, microwave,

1:03:09

plastic, Saran wrap, all that different stuff.

1:03:11

There's a direct correlation between

1:03:14

the implementation and the dip and

1:03:17

it all seems to line up

1:03:19

like that seems to be a primary factor.

1:03:24

Does that have an equivalent impact on estrogen

1:03:27

related hormones? That's a good question. Some

1:03:30

of them actually – I know some

1:03:33

of these chemicals that they're talking about actually

1:03:35

increase estrogen in men. I

1:03:38

don't know – but I do know that it increases

1:03:41

miscarriages. So I

1:03:43

just think it's overall disruptive

1:03:45

to the human body. Definitely a societal wide

1:03:48

disruption of the endocrine system in a short period

1:03:50

of time. I think

1:03:53

it's just bad and difficult to wrap our heads around. And then

1:03:55

pollutants and environmental

1:03:58

toxins on top of the – pesticides

1:04:00

and herbicides and all these other things and microplastics.

1:04:03

There's a lot of factors that are leading our systems

1:04:05

to not work well.

1:04:08

But I just really

1:04:11

wonder if this – are

1:04:14

we just clinging on to this monkey body?

1:04:16

Are we deciding? I like my monkey body. I

1:04:18

do too. Listen, I love it. But

1:04:22

I'm also – I try to be very objective.

1:04:24

And when I objectively look at it in terms

1:04:26

of like if you take where we are

1:04:29

now and all of our problems and

1:04:31

you look towards the future and like what

1:04:34

would be one way that

1:04:36

you could mitigate a lot of these? And

1:04:38

it would be the implementation of some sort of a telepathic

1:04:41

technology where you couldn't

1:04:44

just text someone or tweet

1:04:47

at something mean because you would literally feel

1:04:49

what they feel when you put that

1:04:51

energy out there. And

1:04:54

you would be repulsed. And

1:04:57

then violence would be – if

1:04:59

you were committing violence on someone

1:05:02

and you literally felt the reaction

1:05:05

of that violence in your own being, then

1:05:09

you would also have no motivation for

1:05:11

violence. If we had no

1:05:14

aggressive tendencies, no primal

1:05:16

chimpanzee tendencies – You know, it's

1:05:19

true that violence in the world has

1:05:21

obviously gone down a lot over the decades

1:05:23

but

1:05:24

emotional violence is up a lot and

1:05:26

the internet has been horrible for that. Like

1:05:28

I don't walk – I'm not going to walk over there and punch you because you're like a big

1:05:30

strong guy. You're going to punch me back and

1:05:32

also there's a societal

1:05:33

convention not to do that. If

1:05:37

I didn't know you, I might like send a mean tweet

1:05:39

about you and I feel nothing on that. And

1:05:43

clearly that has become like

1:05:45

a mega epidemic in society

1:05:48

that we did not evolve

1:05:50

the biological constraints

1:05:54

on somehow. And

1:05:58

I'm actually very –

1:06:00

worried about how much that's already destabilized

1:06:03

us and made us all miserable. It certainly

1:06:05

accentuated it. It's exacerbated

1:06:08

all of our problems. I mean if

1:06:10

you read Jonathan Haidt's book, The Coddling of

1:06:12

the American Mind, Great book. Yeah it's a great

1:06:14

book and it's very damaging to

1:06:16

women, particularly young girls. Young

1:06:18

girls growing up there's a direct correlation between

1:06:21

the invention of social media, the

1:06:23

introduction to the iPhone, self-harm,

1:06:25

suicide, online bullying.

1:06:28

You know like people have always talked

1:06:30

shit about people when no one's around. I think

1:06:32

the fact that they're doing it now openly

1:06:35

to harm people. Horrible

1:06:37

obviously. I think it's super damaging to men

1:06:40

too. Maybe they just like talk about it less but I don't

1:06:42

think any of us are like set up for

1:06:44

this. No, no one is set up for it and

1:06:46

you know I think famous people know that

1:06:48

more than anyone. We all get used to it. Yeah

1:06:51

you just get numb to it and or if you're wise you

1:06:53

don't engage. I don't engage. I don't

1:06:55

even have any apps on my new phone.

1:06:57

Yeah. I've got a new phone and I said

1:07:00

okay nothing.

1:07:00

That's really smart. No Twitter. So I have

1:07:02

a separate phone that if I have to

1:07:04

post something I pick up but

1:07:07

all I get on my new phone is text messages.

1:07:10

And is that more just to like keep

1:07:12

your mind pure and unpolluted?

1:07:14

Yeah. Not tempt myself. You know how many fucking

1:07:18

times I've got up to go to the bathroom first

1:07:20

thing in the morning and spent an hour just

1:07:23

sitting on the toilet scrolling through Instagram

1:07:25

like for nothing does zero for

1:07:27

me and there's this thought that

1:07:29

I'm gonna get something out of it. I

1:07:32

was thinking actually just yesterday

1:07:34

about how you

1:07:36

know we all have talked for so long about these algorithmic

1:07:39

feeds are gonna manipulate us in these big

1:07:41

ways and that

1:07:43

will happen but in the small ways already

1:07:46

where like

1:07:47

scrolling Instagram is not even that

1:07:49

fulfilling like you finished that hour and you're like

1:07:51

I know that was a waste of my time

1:07:55

but it was like over the threshold where you couldn't

1:07:57

quite it's hard to put the phone down. Right.

1:08:00

hoping that the next one's gonna be interesting.

1:08:02

And every now and then, the problem is every

1:08:05

30th or 40th reel that I

1:08:08

click on is wild. I wonder,

1:08:10

by the way, if that's more powerful

1:08:12

than if everyone was wild,

1:08:14

if everyone was great. Sure. You

1:08:17

have some mine for gold. You don't just go out

1:08:19

and pick it like daisies. That's so interesting. If

1:08:22

the algorithm is intentionally feeding you some

1:08:24

shit along the way. Yeah. Well,

1:08:27

there's just a lot of shit out there, unfortunately.

1:08:30

But it's just, in terms of, I

1:08:33

was talking to Sean O'Malley,

1:08:35

who's this UFC fighter, who's, obviously,

1:08:37

has a very strong mind. Really interesting guy. But

1:08:39

one of the things that Sean said is, I

1:08:41

get this low-level anxiety

1:08:44

from scrolling through things, and I don't know why.

1:08:47

What is that? And I think it's part

1:08:49

of the logical mind realizes this is a

1:08:52

massive waste of your resources. I

1:08:54

also deleted a bunch of that stuff off my phone

1:08:57

because I just didn't have the self-control. I mean,

1:08:59

I had the self-control to delete it, but not

1:09:01

to stop once I was scrolling through. And

1:09:05

so I think we're just like, yeah,

1:09:09

we're getting attention hacked in

1:09:12

some ways. There's some good to it, too, but

1:09:14

we don't yet have the

1:09:16

stuff in place.

1:09:18

The tools, the societal norms, whatever

1:09:21

to modulate it well. Right. And

1:09:23

we're not designed for it. Certainly. This is

1:09:26

a completely new technology that, again,

1:09:28

hijacks our human reward systems

1:09:31

and hijacks all of

1:09:33

the checks and balances

1:09:35

that are in place for communication,

1:09:38

which historically has been one-on-one.

1:09:40

Historically, communication has been one person

1:09:43

to another. And when people write letters

1:09:45

to each other, it's generally things

1:09:48

like if someone writes a love letter

1:09:50

or they miss you. They're

1:09:53

writing this thing where they're kind of exposing

1:09:55

a thing that maybe they have a difficulty in expressing

1:09:58

in front of you. And generally,

1:10:01

unless the person was a psycho, they're not

1:10:04

hateful letters. Whereas

1:10:06

the ability to just communicate, fuck that guy,

1:10:08

I hope he gets hit by a bus, is so

1:10:11

simple and easy and you

1:10:14

don't experience. Twitter seems

1:10:17

to be particularly horrible for this, the way the

1:10:19

mechanics work. It

1:10:22

really rewards

1:10:23

in ways that I don't think anybody fully understands.

1:10:26

It taps into something about human psychology.

1:10:29

But

1:10:32

that's how you get engagement, that's how you get followers,

1:10:36

that's how you get the

1:10:38

dopamine hits or whatever.

1:10:45

The people who I know that spend all day

1:10:47

on Twitter,

1:10:48

more of them are unhappy about it than happy. Oh

1:10:51

yeah.

1:10:51

They're the most unhappy. There's

1:10:53

quite a few people that I follow that I only

1:10:56

follow because they're crazy and then I'll

1:10:58

go and check in on them and see what the fuck they're

1:11:00

tweeting about. Some of them are on

1:11:02

there eight, 10 hours a day. I'll

1:11:04

see tweets all day long

1:11:07

and I know that person cannot

1:11:09

be happy. They're unhappy and they cannot stop. They

1:11:12

can't stop. It seems like

1:11:15

it's their life. They

1:11:19

get meaning out of it in terms of

1:11:22

reinforcement. They

1:11:24

get short term meaning out of it. I

1:11:26

think maybe each day you go to bed feeling like you

1:11:29

accomplished something and got your dopamine and at the end of

1:11:31

each decade, you probably are like, where'd that decade

1:11:33

go? I was talking to a friend of mine who was having a real problem

1:11:35

with it and he's saying he would be literally walking down

1:11:37

the street and he'd have to check his phone to see who's replying

1:11:40

and he wasn't even looking where he's walking. He

1:11:42

was just caught up in the anxiety

1:11:45

of these exchanges. And it's not because of the

1:11:47

nice things people say. No, no, no, no.

1:11:50

And with him, he was recognizing that

1:11:53

he was dunking on people and then seeing

1:11:55

people respond to the dunking. Yeah,

1:11:58

I stopped doing that a long time. I stopped interacting

1:12:01

with people on Twitter in a negative way. I just won't

1:12:03

do it. Yeah, just even if I disagree with something someone

1:12:05

else says, I engage as

1:12:07

little as possible. I have like more of an internet

1:12:09

troll streak than I would like to admit. And

1:12:12

so I try to just like not give myself too

1:12:14

much temptation, but I slip up sometimes. Yeah,

1:12:17

it's so tempting Totally. It's

1:12:19

so tempting to and it's fun. It's fun

1:12:21

to say something shitty I

1:12:23

mean again, whatever this biological system we're

1:12:26

talking about earlier, that gets a positive

1:12:28

reward Well, it's a moment. There's a

1:12:30

reaction and, you know, there's reactions. You

1:12:32

say something outrageous and someone's going to react

1:12:35

and that reaction is like energy

1:12:37

and there's there's all these other human

1:12:39

beings engaging with your idea but

1:12:43

Ultimately, it's just not

1:12:45

productive for most

1:12:47

people and it's psychologically

1:12:50

It's just fraught with peril.

1:12:53

There's just so much going on. I don't know anybody

1:12:56

who engages all day long that's happy. Certainly

1:12:59

not. I don't... like, I'm...

1:13:03

I think I've watched it, like... destroys is too strong

1:13:05

of a word, but like

1:13:06

knock off track the careers

1:13:09

or lives or happiness or human

1:13:11

relationships of people that are

1:13:14

like good, smart, conscientious

1:13:16

people.

1:13:17

Yeah, like, God, they couldn't fight

1:13:19

this demon. Yeah, like they got hacked. And

1:13:21

COVID really accentuated that because people

1:13:23

were alone and isolated and

1:13:26

that made it even worse because

1:13:28

then they felt They

1:13:30

felt even better saying shitty things

1:13:33

to people. I'm unhappy. Yeah, even

1:13:35

worse things about you And then there was the psychological

1:13:38

aspect of it like the angst that came from

1:13:40

being socially isolated

1:13:42

and terrified about this invisible disease.

1:13:45

It's gonna kill us all and You

1:13:47

know and so you have this like rah and

1:13:49

then you're interacting with people on Twitter and then you're caught

1:13:51

up in that Anxiety and you're doing it all

1:13:53

day and I know quite a few people especially

1:13:56

comedians That really lost

1:13:58

their minds and lost the respect

1:14:00

of their peers by doing that.

1:14:02

I have a lot of sympathy for people who lost their

1:14:04

minds during COVID because what

1:14:07

a natural thing for us all to go through and isolation

1:14:10

was just brutal. But a lot

1:14:12

of people did.

1:14:13

And I don't think the internet,

1:14:15

and particularly not the kind of like social

1:14:18

dynamics of things like Twitter,

1:14:19

I don't think that like brought on anyone's best. Yeah.

1:14:23

Well, I mean, some people I think if

1:14:25

they're not inclined

1:14:28

to be shitty to people, I think some people did seek

1:14:31

comfort and they did interact

1:14:33

with people in positive ways. I see there's

1:14:35

plenty of positive. I think the thing

1:14:37

is that the negative interactions are so much more

1:14:40

impactful. Yeah. Look,

1:14:42

I think there are a lot of people who use

1:14:44

these systems for wonderful things. I didn't mean to imply

1:14:47

that's not the case, but that's not what drives

1:14:51

people's emotions after getting off the platform at

1:14:53

the end of the day. Right. Right.

1:14:56

Right. And I think that's a pie chart of

1:14:58

the amount of interactions on

1:15:00

Twitter. I would say a lot of them

1:15:03

are shitting on people and being angry

1:15:05

about them. How many of the people that you know that

1:15:07

use Twitter those eight or ten hours a day are

1:15:09

just saying wonderful

1:15:10

things about other people all day versus the

1:15:12

virulent?

1:15:13

Very few. Yeah. Very

1:15:16

few. I don't know any of them. I know.

1:15:20

But then again, I wonder with

1:15:22

the implementation of some

1:15:25

new technology that makes communication

1:15:27

a very different thing than what's currently occurring. What

1:15:30

we're doing now with communication is less

1:15:33

immersive than communicating

1:15:35

one-on-one. You and I are talking. Yeah.

1:15:38

We're looking into each other's eyes. We're getting social cues. Yeah.

1:15:41

We're smiling at each other. We're laughing. It's

1:15:43

a very natural way to talk. I

1:15:45

wonder if through the implementation

1:15:47

of technology, if it

1:15:50

becomes even more immersive

1:15:53

than a one-on-one conversation, even

1:15:56

more interactive and you

1:15:58

will understand even more about

1:16:01

the way a person feels about what you say, about

1:16:04

that person's memory, that person's

1:16:07

life, that person's history,

1:16:10

their education, how it comes

1:16:12

out of their mind, how their

1:16:15

mind interacts with your mind and

1:16:17

you see them, you really see

1:16:19

them.

1:16:20

I wonder if that, I

1:16:22

wonder if what we're experiencing now is

1:16:25

just like the first time people invented

1:16:27

guns and just started shooting at things. Yeah,

1:16:30

if you can like feel what I feel

1:16:32

when you say something mean to me or nice to me,

1:16:34

like that's clearly gonna change

1:16:38

what you decide to say. Yes, yeah,

1:16:41

yeah, unless you're a psycho. Unless you're a psycho.

1:16:43

And then what causes someone to

1:16:45

be a psycho and can that be

1:16:47

engineered out? Imagine

1:16:50

what we're talking about when

1:16:52

we're dealing with the human mind, we're dealing with various

1:16:55

diseases, bipolar, schizophrenia.

1:16:58

Imagine a world

1:17:01

where we can find the root

1:17:03

cause of those things and

1:17:05

through coding and some

1:17:08

sort of an implementation

1:17:10

of technology that elevates

1:17:13

dopamine and serotonin and

1:17:15

does some things to people that eliminates

1:17:18

all of those problems and

1:17:21

allows people to communicate in a very

1:17:23

pure way. It

1:17:27

sounds great. It sounds great but you're not gonna have

1:17:29

any rock and roll, stand up comedy will die.

1:17:33

You'll have no violent movies, you

1:17:36

know, there's a lot of things that are gonna go out the window

1:17:38

but maybe that is also part of the process

1:17:40

of our evolution to the next stage of

1:17:43

existence. Maybe

1:17:47

I feel genuinely confused on

1:17:49

this. Well, I think you should be. We're

1:17:51

gonna find out. Yeah. I mean,

1:17:53

to be sure how it's gonna... That's the same. But I

1:17:55

don't even have like... Two versus Beyond Belief. Right.

1:17:58

I mean, you just... From

1:18:00

the... when did OpenAI... when

1:18:02

did you first start this project? At

1:18:05

like the very beginning, end of 2015, early 2016. And when

1:18:09

you initially started this project?

1:18:12

What kind of timeline

1:18:14

did you have in mind and

1:18:17

how's it stayed on that timeline? Or is it

1:18:19

just wildly out of control? I? Remember

1:18:23

talking

1:18:24

with John Schulman one of our co-founders

1:18:30

And I

1:18:30

was like yeah, that's about right to me and

1:18:34

I've always sort of thought since then now

1:18:36

I no longer think of like AGI as quite the endpoint.

1:18:39

But to get to the point where we like accomplish

1:18:42

the thing we set out to accomplish

1:18:44

You know that would take us to like 2030, 2031

1:18:49

has felt

1:18:50

to me like

1:18:51

all the way through kind of a

1:18:55

reasonable estimate with huge error bars, and

1:18:58

I kind of think we're on the trajectory

1:19:00

I sort of assumed and

1:19:02

What did you think

1:19:04

the impact on

1:19:07

society would be? Like, when

1:19:09

you first started doing this and

1:19:12

you said, okay, if we are successful and

1:19:14

we do create some massively

1:19:17

advanced AGI, what

1:19:20

is the implementation and

1:19:22

what is the impact on society?

1:19:25

Did you sit there and

1:19:27

have like a graph, like you had

1:19:29

the pros on one side, the cons on the other? Did

1:19:32

you just sort of abstractly consider it?

1:19:35

Well...

1:19:35

We definitely talked a lot about the cons

1:19:37

you know

1:19:39

Many of us were super worried

1:19:42

about

1:19:42

and still are, about

1:19:46

safety and alignment. And if we build these

1:19:48

systems

1:19:49

we can all see the great future that's easy to imagine But

1:19:52

if something goes horribly wrong, it's like really horribly

1:19:54

wrong

1:19:55

And so there was a lot of discussion

1:19:58

about it, and really

1:19:59

A big part of the founding spirit of this is like, how

1:20:02

are we going to solve this safety problem? What does

1:20:04

that even mean?

1:20:06

One of the things that we believe is that the

1:20:08

greatest minds in the world cannot sit there and solve

1:20:10

that in a vacuum.

1:20:12

You've got to have contact reality.

1:20:14

You've got to see where the technology

1:20:15

goes. Practice

1:20:17

plays out in a stranger way than theory,

1:20:20

and that's certainly proven true for us.

1:20:23

We had a long list

1:20:25

of cons. We had a very

1:20:28

intense list of cons because there's

1:20:31

like all of the last decades of sci-fi telling

1:20:33

you about how

1:20:34

this goes wrong. Aren't you supposed to shoot me right now?

1:20:38

I'm sure you've seen the John Connor

1:20:41

chat GPT memes. I haven't. What

1:20:43

is it? It's

1:20:46

like John Connor from The Terminator,

1:20:48

the kid, looking at you

1:20:50

when you open up chat GPT. Yeah,

1:20:55

so

1:20:56

that stuff we were very clear

1:20:59

in our minds on. Now, I think we

1:21:01

understand. There's a lot of work to do,

1:21:03

but we understand more

1:21:05

about how to make AI safe in

1:21:07

the –

1:21:09

AI safety gets overloaded. Does

1:21:12

it mean don't say something some people find offensive? Or does it mean

1:21:14

don't destroy all of humanity or some continuum?

1:21:18

I think the word has gotten

1:21:20

overloaded, but in terms of the

1:21:22

not destroy all of humanity version of

1:21:24

it, we have a lot of work to do. But

1:21:27

I think we have finally more ideas

1:21:29

about what can work, and given

1:21:32

the way the systems are going,

1:21:33

we have a lot more opportunities available

1:21:35

for us to solve than I thought we would have,

1:21:39

given the direction that we initially thought the technology

1:21:41

was going to go. So that's good. On

1:21:43

the positive side,

1:21:46

the thing that I was most excited about then

1:21:48

and remain most excited about now is

1:21:52

what if this system can

1:21:54

dramatically increase

1:21:56

the rate of scientific knowledge in society?

1:21:59

And that is a – I

1:22:02

think that kind of like all real

1:22:05

sustainable economic growth,

1:22:07

the future getting better,

1:22:09

progress in some sense

1:22:11

comes from increased

1:22:13

scientific and technological capacity

1:22:15

until

1:22:16

we can solve all the problems. And if

1:22:18

the AI can help us do that,

1:22:20

that's always been the thing I've been most excited about. Well,

1:22:23

it certainly seems like that is the greatest

1:22:25

potential – greatest positive potential

1:22:28

of AI. It is to solve

1:22:31

a lot of the problems that human beings

1:22:33

have had forever, a lot of the societal

1:22:36

problems that seem to

1:22:38

be – I mean that's what I was talking about with an AI president.

1:22:41

I'm kind of not joking because I feel like if

1:22:43

something was hyper intelligent

1:22:46

and aware of all the variables with

1:22:48

no human bias and no

1:22:51

incentives, none other than:

1:22:53

here's your program, the

1:22:55

greater good for the community of the

1:22:58

United States and the greater

1:23:00

good for that community as

1:23:02

it interacts with the rest of the world. The

1:23:07

elimination of these

1:23:10

dictators, whether

1:23:12

they're elected or

1:23:14

non-elected who impose their

1:23:16

will on the population

1:23:18

because they have a vested

1:23:21

interest in protecting special interest groups

1:23:23

and industry. I

1:23:26

think as long as –

1:23:29

the thing that I find scary when you say that is

1:23:31

it does – it feels like it's

1:23:33

humanity not in control

1:23:36

and I reflexively don't like

1:23:38

that. But

1:23:41

if it's instead like

1:23:43

it is the collective will of humanity being

1:23:45

expressed without the mistranslation

1:23:48

and corrupting influences along the way, then

1:23:50

I can see it.

1:23:51

Is that possible? It seems like it would be.

1:23:54

It seems like if it was programmed in that regard

1:23:57

to do the greater good for humanity.

1:24:00

And take into account the

1:24:02

values of humanity the needs

1:24:04

of humanity There's something about the phrase

1:24:07

do the greater good... For you? I know, terrifying.

1:24:09

Very Orwellian. Oh yeah. But

1:24:12

also so is artificial general

1:24:14

intelligence. For sure, for sure. Open the door... I

1:24:16

wish, I wish I had worked on, you know, something

1:24:18

that was less

1:24:19

morally fraught. But do you?

1:24:21

Because... no, no, no. I mean, I can't

1:24:23

imagine that. I cannot imagine a cooler thing

1:24:26

to work on. I feel... unbelievable. I feel like the luckiest

1:24:28

person on earth. Now, it is not...

1:24:32

It's not on easy mode. No,

1:24:36

no no no I mean you are at the

1:24:39

forefront of one of the most spectacular

1:24:42

changes in human history,

1:24:44

and I would say... no,

1:24:47

I would say more spectacular than

1:24:49

the implementation of the internet I

1:24:52

think the implementation of the internet

1:24:54

was the first baby steps toward this,

1:24:57

and that artificial general intelligence

1:25:00

is Yeah, it is

1:25:02

the internet on steroids. It's the

1:25:04

internet in, you know, hyperspace.

1:25:09

What I would say is it's the next step and there'll be more

1:25:11

steps after yeah, but it's our most exciting step yet Yeah,

1:25:14

My wonder is what

1:25:17

are those next steps after. Isn't that

1:25:19

so exciting to think about? It's very exciting.

1:25:21

I think we're the last people. I really

1:25:24

do. I think we're the last of the biological

1:25:26

people with all the biological problems. I

1:25:29

think there's a... You're very

1:25:31

excited about that? I just

1:25:34

think that's just what it is. You're just fine

1:25:36

with it? It is what it is, you know what I mean?

1:25:39

That

1:25:40

I don't think you can control it at this point other than

1:25:43

some massive natural disaster that

1:25:45

resets us back to the Stone Age Which

1:25:47

is also something we should be very concerned with yes

1:25:49

It seems like that happens a lot We're not aware of

1:25:51

it because the timeline of a human body is so

1:25:53

small You know the timeline of the human

1:25:55

existence as a person is a hundred

1:25:58

years if you're lucky, but yet the time of

1:26:01

the Earth is billions of years and

1:26:03

if you look at how many times life

1:26:06

on Earth has been reset by comets

1:26:09

slamming into the Earth and just completely

1:26:12

eliminating all technological

1:26:15

advancement. It seems like it's happened multiple

1:26:17

times in recorded history.

1:26:22

I do think I always think we don't think about

1:26:24

that quite enough. We

1:26:29

talked about the simulation hypothesis earlier. It's had this

1:26:32

big resurgence.

1:26:33

In the tech industry recently,

1:26:35

one of the new takes on it as we get closer

1:26:37

to AGI is that if

1:26:39

we are ancestor simulations,

1:26:41

the time they want to simulate again and again is right

1:26:43

up to the creation

1:26:44

of AGI.

1:26:45

Yeah.

1:26:48

So it seems very crazy we're living through this time. It's

1:26:50

not a coincidence at all. This

1:26:52

is the time that is after we

1:26:54

had enough cell phones out in the world recording

1:26:56

intensive video to train the video model of the world

1:26:59

that's all being jacked into us now via brain

1:27:01

implants or whatever. And before

1:27:03

everything goes really crazy with AGI.

1:27:05

And it's also this interesting time to simulate

1:27:07

like, can we get through?

1:27:10

Does the asteroid come right before we get there

1:27:12

for dramatic tension? Do we figure out how to make this

1:27:14

safe? Do we figure out how to societally agree

1:27:16

on it? That's led to like a

1:27:18

lot more people believing it than before, I think. Yeah,

1:27:21

for sure. And

1:27:23

again, I think this is just where it's going.

1:27:26

I mean, I don't know if that's a good thing or

1:27:29

a bad thing. It's just a thing. But it's

1:27:31

certainly better to live now. I

1:27:33

would not want to live in the

1:27:35

1800s and be in a covered wagon trying to

1:27:37

make my way across the country. Yeah,

1:27:39

we got the most exciting time in history yet. It's the best.

1:27:42

It's the best, but it also has the most

1:27:44

problems, the most social problems, the

1:27:47

most awareness of social,

1:27:50

environmental infrastructure,

1:27:52

the issues that we have. We get to go solve them. Yeah.

1:27:56

And I intuitively,

1:27:58

I think I feel something.

1:27:59

something somewhat

1:28:02

different

1:28:03

than you, which is I think humans in

1:28:07

something close to this form are

1:28:09

going to be around

1:28:12

for a lot longer than I don't

1:28:14

think we're the last humans. How long

1:28:17

do you think we have?

1:28:19

Like

1:28:23

longer than a time frame I can reason about. Really?

1:28:26

There may be like I could totally imagine

1:28:28

a world where some people decide

1:28:31

to merge and go off exploring the universe with

1:28:34

AI and there's a big universe out there, but

1:28:37

like can I really imagine a world

1:28:39

where short of a natural disaster there

1:28:41

are not humans pretty similar

1:28:44

to humans from today

1:28:45

on earth doing human-like

1:28:47

things

1:28:49

and the sort of spirit of humanity

1:28:51

merged into these other things that are out

1:28:53

there doing their thing in the universe. It's

1:28:56

very hard for me to actually

1:28:59

see that happening

1:29:01

and maybe that means I'm like going to turn out to be a dinosaur

1:29:03

and be

1:29:04

horribly wrong in this prediction, but

1:29:06

I would say I feel it more over time

1:29:08

as we make progress with AI, not less. Yeah,

1:29:10

I don't feel that at all.

1:29:12

I feel like we're done. In like

1:29:14

a few years? No, maybe

1:29:17

a generation or two. It'll probably

1:29:19

be a gradual change, like the wearing

1:29:21

of clothes. I don't think everybody

1:29:24

wore clothes and they invented clothes. I think it probably

1:29:26

took a while. When someone figured out shoes,

1:29:28

I think that probably took a while. When they

1:29:30

figured out structures, doors, houses,

1:29:33

cities, agriculture, all those

1:29:35

things were slowly implemented over

1:29:37

time and then now become everywhere. I

1:29:40

think this is far more

1:29:43

transformative. It's part of that because

1:29:45

you don't think there will be an option for some people

1:29:47

not to merge. Right. Just

1:29:49

like there's not an option for some people to not have telephones

1:29:51

anymore. I used to have friends

1:29:54

like, I don't even have email. Those

1:29:56

people don't exist anymore. They all have email. Everyone

1:29:58

has a phone, at least a flip phone. I know

1:30:00

some people that they just can't handle

1:30:03

social media and all that jazz. They went to a flip

1:30:05

phone. Good. I don't know if this is true or not. I've heard

1:30:07

you can't like walk into an AT&T store anymore. I

1:30:10

heard they can't... You can? Okay, really? Someone...

1:30:13

maybe, but I don't know if it's Verizon. They still have them.

1:30:15

I was just there. Hey, they still have flip phones.

1:30:17

I was like I like it I like this fucking

1:30:20

little thing that you just call people and I always

1:30:22

like romanticized about going to that

1:30:24

but my first step was to go to a phone

1:30:26

that has nothing on it but just messages. And that's

1:30:29

been a few days Feeling good

1:30:31

so far. Yeah, it's good. You know I still

1:30:34

have my other phone that I use for

1:30:36

social media but when I pick that motherfucker

1:30:38

up I start scrolling through YouTube and watching

1:30:41

videos and scrolling through tik-tok

1:30:43

or Instagram and I

1:30:45

don't have tik-tok but I've

1:30:47

I have I tried threads for a little

1:30:49

while but fucking ghost

1:30:52

town Went right back to

1:30:54

X. I Live on a ranch

1:30:56

during the weekends, and there's Wi-Fi

1:30:58

in the house, but there's no cell phone coverage

1:31:01

anywhere else hmm and

1:31:04

It's

1:31:06

Every week I forget

1:31:08

how nice it is and what a change

1:31:11

it is to go for a walk. Yeah,

1:31:13

no cell phone coverage. Good for your mind.

1:31:15

It's unbelievable for your mind,

1:31:17

and I think we have, like,

1:31:19

so quickly lost something.

1:31:21

Hmm.

1:31:23

Like, out of service just doesn't happen. That doesn't even happen

1:31:25

on airplanes anymore. We don't, like... but

1:31:28

that...

1:31:30

Like,

1:31:31

hours where your phone just cannot

1:31:33

buzz. Yeah,

1:31:35

no text message either, nothing. I

1:31:39

think that's a really healthy thing

1:31:41

I dropped my phone once when I was in Lanai

1:31:44

and I think it was the last time I dropped the phone the

1:31:46

phone was Like we're done, and it just

1:31:48

started calling people randomly Like

1:31:50

it would just call people and I'd hang it

1:31:52

up and call another person I'd hang it up, and

1:31:54

I was showing my wife. I was like look at this. This is crazy It's

1:31:57

just calling people and so the phone

1:31:59

was broken. And so I had to order a

1:32:01

phone, and we were on vacation for like eight days. And

1:32:04

it took three days for Apple to

1:32:06

get me a phone. I bet you had a great three days. It was

1:32:08

amazing. It was amazing. Because

1:32:11

when I was hanging out with my family, I was totally

1:32:13

present, there was no options. And

1:32:16

I wasn't thinking about checking my

1:32:18

phone, because it didn't exist, I didn't have

1:32:20

one. And there was an alleviation

1:32:23

of again, what Sean

1:32:25

was talking about, that low level of anxiety.

1:32:28

It's sort of like, ehh, ehh, that

1:32:32

you have when you always wanna check your phone. Yeah,

1:32:35

I think that thing, it's so bad. We

1:32:38

have not figured out yet, like

1:32:41

the technology has moved so fast, biology

1:32:43

moves very slowly. We have not figured out

1:32:45

how we're gonna function in a society

1:32:47

and get those occasional times when

1:32:50

your phone has broken for three days, or you

1:32:52

go for a walk with no service. But

1:32:55

it's like,

1:32:59

I very much feel like my phone controls

1:33:02

me, not the other way around. Uh-huh.

1:33:04

And I hate it,

1:33:06

but I haven't figured out what to do about it.

1:33:08

Well, that's what I'm worried about with

1:33:10

future technology, is that

1:33:12

this, which was so unanticipated,

1:33:15

if you'd imagine a world, when you, can you

1:33:17

imagine going up to someone in 1984

1:33:20

and pointing to a phone and saying, one day that'll

1:33:22

be in your pocket, it's gonna ruin your life. Like,

1:33:25

what? Yeah, one day

1:33:27

people are gonna be jerking off to that thing. You're

1:33:29

like, what? One day people are gonna be

1:33:31

watching people get murdered on Instagram. I've never seen

1:33:33

so many murders on Instagram over the last few months.

1:33:36

Really, I've never seen one. Oh, been a bad timeline.

1:33:38

Me and my friend Tom Segura, every

1:33:41

morning we text each other the worst things that

1:33:43

we find on Instagram. Why? For

1:33:45

fun, he's a comedian. Okay. We're both

1:33:47

comedians. That's fun to you? Yeah. This

1:33:50

is fucking, just ridiculous. I mean, just

1:33:53

crazy car accidents, people get gored

1:33:55

by bulls and like every,

1:33:57

like we try to top each other. So every day.

1:34:00

He's sending me the most every day when I wake up and

1:34:02

I check Tom fuck when you know, I

1:34:04

think... can you explain what's fun about

1:34:06

that? Well? He's

1:34:08

a comic and I'm a comic and

1:34:10

comics like chaos. We like

1:34:13

we like ridiculous Outrageous

1:34:16

shit that is just so

1:34:19

far beyond the norm of what you experience

1:34:21

in a regular day Got it and

1:34:23

also the Understanding

1:34:26

of the wide spectrum of

1:34:28

human behavior if you're a nice person

1:34:31

and you surround yourself with nice people You

1:34:33

very rarely see someone get shot

1:34:36

you very rarely see People

1:34:38

get stabbed for no reason randomly

1:34:40

on the street, but on Instagram you see that every day And

1:34:45

there's something about that which just reminds

1:34:47

you: oh, the world's crazy,

1:34:50

like the human species, like there's

1:34:52

a certain percentage of us that are just off

1:34:55

the rails and just out there

1:34:58

just causing chaos and Jumping

1:35:01

dirt bikes and landing on your neck and all

1:35:04

that stuff. Even to hear that makes

1:35:06

me, like, yeah, physically... like, I know that

1:35:08

happens of course mm-hmm And

1:35:12

I know like not looking at it doesn't make it not happen

1:35:14

right, but it makes me so uncomfortable And I'm

1:35:16

not happy to watch it. Oh yeah, it makes me uncomfortable too,

1:35:19

but yeah, we do it to each other every day And

1:35:23

it's not good It's definitely not good,

1:35:25

but it's also fun. I'm not gonna stop. But

1:35:28

why is it fun? It's fun because it's my friend Tom,

1:35:30

and we're both kind of the same in that way which Just

1:35:33

look at that, look at this that I get, look at this. It

1:35:36

is just a thing we started doing a few months ago. It's

1:35:38

just can't stop And

1:35:40

Instagram has like learned that you do that so it just keeps showing

1:35:43

you more. Instagram knows. When I... my

1:35:45

search page is a mess. When

1:35:47

I go to the discover page stuff. Oh, it's

1:35:49

just crazy, but the thing is it shows up

1:35:51

in your feed, too That's what I

1:35:53

understand about the algorithm: it knows

1:35:55

you're fucked up So it shows up in your feed

1:35:58

of things like even if They're

1:36:00

not people I follow, but Instagram

1:36:03

shows them to me anyway. I

1:36:05

heard an interesting thing a few days ago about Instagram

1:36:07

and the feed, which is if you use it at off

1:36:09

hours when they have more

1:36:12

processing capability available because less people are using

1:36:14

it, you get better recommendations. Your

1:36:16

feed will be better in the middle of the night. What

1:36:19

is better though? Isn't it just your feed being

1:36:21

more addictive to you or whatever? Right.

1:36:24

For me, better would be more murders, more animal

1:36:26

attacks. Sounds horrible. It's horrible.

1:36:29

It's just, it seems to

1:36:31

know that's what I like. It

1:36:34

seems to know that that's what I interact with, so it's just

1:36:36

sending me that most of the time.

1:36:40

Yeah. That probably has all kinds of crazy

1:36:43

psychological. I'm sure. Yeah,

1:36:45

I'm sure that's also one of the reasons why I want to get

1:36:47

rid of it and move away from it. Yeah,

1:36:49

so maybe, maybe, maybe it went

1:36:51

too far. I don't even know if it's too

1:36:54

far, but what it is is it's

1:36:56

showing me the

1:36:58

darker regions of society,

1:37:01

of civilization, of human

1:37:03

behavior. But you think we're about to edit all that

1:37:05

out. I wonder if that is a solution.

1:37:09

I really do because I don't

1:37:11

think it's outside the realm of possibility.

1:37:14

If we really, truly can engineer that ...

1:37:16

Like one of the talks about Neuralink that's

1:37:19

really promising is people

1:37:21

with spinal cord issues, injuries,

1:37:24

people that can't move their body, and

1:37:26

being able to hotwire that, where

1:37:30

essentially it controls all

1:37:32

these parts of your body that you couldn't control

1:37:34

anymore. That would be an amazing

1:37:37

thing for people that are injured, for

1:37:39

amazing things for people that are

1:37:43

paralyzed, they have all sorts of neurological

1:37:45

conditions. That

1:37:48

is probably one of the first, and that's

1:37:50

what Elon has talked about as well, when the first

1:37:52

implementations, the restoration

1:37:55

of sight, cognitive

1:37:57

function enhancement for people that have brain

1:37:59

issues.

1:38:01

That's tremendously exciting. Yeah.

1:38:04

And like many other technologies, I don't think

1:38:06

we can stop neural interfaces, nor

1:38:09

because of the like great good that's going to happen along

1:38:11

the way. I don't think we know where it goes.

1:38:14

It's Pandora's box for sure.

1:38:16

And I think when we open it, it's

1:38:19

just we're not going to go back, just

1:38:21

like we're not going to go back to no computers without

1:38:23

some sort of natural disaster. By

1:38:25

the way, I mean, this is a great compliment.

1:38:28

You are one of the most neutral people I have

1:38:30

ever heard talk about the merge

1:38:32

coming. You're just like, yeah, I think it's going

1:38:34

to happen. You know, it'd be good in these

1:38:36

ways, bad in these ways. But you

1:38:39

seem like unbelievably

1:38:41

neutral about it, which is always something I admire. I

1:38:43

try to be as neutral about

1:38:45

everything as possible, except for corruption,

1:38:48

which I think is just like one of the most

1:38:50

massive problems with the

1:38:52

way our culture is

1:38:55

governed. Then corruption, the

1:38:58

influence of money is just a giant,

1:39:01

terrible issue. But in terms

1:39:03

of like social issues and in terms

1:39:05

of the way human beings

1:39:08

believe and think about things, I

1:39:10

try to be as neutral as possible. Because

1:39:13

I think the only way to really

1:39:15

truly understand the way other people think about

1:39:18

things is try to look at it through their mind. And

1:39:20

if you have this inherent bias and

1:39:22

this, you have

1:39:24

this like very

1:39:27

rigid view of what's good

1:39:29

and bad and right and wrong, I

1:39:31

don't think that serves you very well

1:39:34

for understanding why people differ.

1:39:36

So I try to be as neutral and

1:39:39

as objective as possible when I look at

1:39:41

anything. This is a skill that I've

1:39:43

learned. This is not something I had in 2009 when

1:39:45

I started this podcast. This

1:39:47

podcast I started just fucking around with friends and

1:39:49

I had no idea what it was going

1:39:52

to be. I mean, there's no way I could have

1:39:54

ever known. But also I had no idea what

1:39:56

it was going to do to me as far

1:39:58

as the evolution of me. as a human being. I

1:40:01

am so much nicer. I'm so

1:40:03

much more aware of things.

1:40:05

I'm so much more conscious

1:40:07

of the way other people think and feel. I'm

1:40:10

just a totally different person than

1:40:12

I was in 2009, which is hard

1:40:14

to recognize. It's hard to believe.

1:40:16

That's really cool. But it's

1:40:19

just an inevitable consequence

1:40:21

of this unexpected education

1:40:24

that I've received. Did the empathy kind of like come

1:40:27

on linearly? Yes. That was not

1:40:29

a… No, it just came on… Well,

1:40:32

first of all, it came on recognizing that

1:40:35

the negative interactions on social

1:40:38

media that I was doing, they didn't

1:40:40

help me. They didn't help the person. And then

1:40:42

having compassion for this person that's fucked

1:40:44

up or done something stupid, it's not

1:40:47

good to just dunk on people. There's

1:40:49

no benefit there other than to give

1:40:51

you some sort of social credit and get a bunch of likes.

1:40:54

It didn't make me feel good. That's not

1:40:57

good. And also a lot of psychedelics, a ton

1:40:59

of psychedelic experiences from 2009 on and with

1:41:03

everyone a greater understanding

1:41:05

of the impact. I had one recently. And when

1:41:08

I had the one recently, the

1:41:11

overwhelming message

1:41:14

that I was getting through this was that everything

1:41:17

I say and do ripples

1:41:21

off into all the people that I interact

1:41:23

with. And then if I'm not doing

1:41:26

something with at least the goal

1:41:29

of overall good or

1:41:31

overall understanding that

1:41:34

I'm doing bad and that that

1:41:36

bad is a real thing, as much as you

1:41:38

try to ignore it because you don't interface with

1:41:40

it instantly and you're

1:41:43

still creating

1:41:45

unnecessary negativity and

1:41:48

that I should avoid that as much as possible.

1:41:50

It's like an overwhelming message

1:41:53

that this psychedelic experience was

1:41:55

giving me. And I took

1:41:59

it because I was just... particularly

1:42:01

anxious that day about the state of the

1:42:03

world, particularly anxious about Ukraine

1:42:06

and Russia and China

1:42:09

and the political

1:42:11

system that we have in this country and this

1:42:13

incredibly polarizing

1:42:16

way that the left and the right

1:42:18

engage with each other and God just

1:42:20

it just seems so just

1:42:24

tormented and so I was

1:42:26

just ugh some days I just get

1:42:28

I think too much about it I just I'm like I need something

1:42:30

yeah crack me out of this so I I

1:42:33

took the psychedelics

1:42:35

are you surprised psychedelic therapy

1:42:37

has not made

1:42:39

from what you thought would happen in the

1:42:41

early 2010s are you surprised it has not

1:42:44

made more progress, sort of on a path to legalization as a

1:42:46

medical treatment?

1:42:47

no no I'm not because there's

1:42:50

a lot of people that don't want it

1:42:52

to be in place and those people have tremendous

1:42:54

power over our medical system and

1:42:57

over a regulatory system and

1:42:59

those people have also not experienced

1:43:01

these psychedelics it is very few people

1:43:04

that have experienced profound

1:43:06

psychedelic experiences that don't think there's

1:43:08

an overall good for those things so

1:43:11

you're pro... The problem is you have these laws

1:43:15

and these rules implemented

1:43:17

by people who are completely ignorant about

1:43:20

the positive effects of these things and

1:43:23

if you know the history of psychedelic

1:43:27

prohibition in this country it all took

1:43:29

place during 1970 and it was really to stop

1:43:32

the civil rights movement and it was really to

1:43:34

stop the anti-war movement and they

1:43:36

they tried to find a way to make

1:43:39

all these things that these people were doing that

1:43:41

was causing them to think in these very

1:43:43

different ways—this turn on, tune in,

1:43:45

drop out—that they just wanted to put

1:43:48

a fucking halt to that. What better way than

1:43:50

to lock everyone who participates in that

1:43:52

in cages, find the people who are producing

1:43:55

it lock them in cages put them in jail

1:43:57

for the rest of their life, make sure it's illegal,

1:43:59

arrest people, put the busts on television,

1:44:01

make sure that people are aware. And

1:44:04

then there's also you connect it to drugs

1:44:06

that are inherently dangerous for society

1:44:08

and detrimental, the fentanyl crisis,

1:44:11

the crack cocaine crisis that we experienced

1:44:13

in the 90s. Like all of those things, they're

1:44:16

under the blanket of drugs. Psychedelic

1:44:19

drugs are also talked

1:44:21

about like drugs, even though

1:44:23

they have these profound spiritual

1:44:27

and psychological changes that they, you

1:44:29

know, I remember

1:44:31

when I was in elementary school and I was in like drug

1:44:34

education, they talked about, you know, marijuana

1:44:36

is really bad because it's a gateway to these other

1:44:39

things. And there's this bad one, that bad one, heroin,

1:44:41

whatever. And the very end of the line, the worst possible

1:44:43

thing is LSD. Did

1:44:47

you take LSD and go, Oh, they're lying.

1:44:50

Psychedelic therapy was definitely

1:44:52

one of the most important things in my life.

1:44:55

And I

1:44:56

assumed, given how

1:44:59

powerful it was for me, like I

1:45:01

struggled with like all kinds of anxiety

1:45:04

and other negative things and to like,

1:45:06

watch all of that go away. And

1:45:09

like,

1:45:10

I traveled to another country for like a week, did

1:45:12

a few things, came back.

1:45:15

Totally different person. Yeah, like I've

1:45:17

been lied to my whole life. Yeah, I'm

1:45:19

so grateful that this happened to me now.

1:45:21

Talked to a bunch of other people, all similar experiences.

1:45:24

I assumed, this was a while ago, I assumed

1:45:26

it was, and I was like, you know, very interested in

1:45:29

what was happening in the US.

1:45:31

I was like, particularly like looking

1:45:34

at where MDMA and psilocybin

1:45:36

were on the path. And I was like, all right, this is going to

1:45:38

get through like this is, and this is going to like change

1:45:41

the mental health

1:45:42

of a lot of people in a really positive way. And

1:45:45

I am surprised we have not made faster progress

1:45:47

there, but I'm still optimistic we will. We

1:45:49

have made so much progress

1:45:52

from the time of the 1990s. In

1:45:55

the 1990s, you never heard about

1:45:57

psychedelic retreats. You never heard

1:45:59

about people taking these vacations. You never heard about

1:46:02

people getting together in groups and

1:46:04

doing these things and coming back with these

1:46:06

profound experiences that they relate to other

1:46:08

people and literally seeing people

1:46:10

change, seeing who they

1:46:12

are, change, seeing people become

1:46:15

less selfish, less

1:46:17

toxic, less mean, and more

1:46:21

empathetic and more understanding. Yeah.

1:46:26

I mean, I can only talk about it from a personal experience.

1:46:28

It's been a radical change in my life, as

1:46:30

well as, again, having all these conversations

1:46:33

with different people. I feel so fortunate to be able to do

1:46:35

this, that I've had so many different

1:46:37

conversations with so many different people that think

1:46:40

so differently and so many exceptional

1:46:42

people that have accomplished so many incredible

1:46:44

things and you get to sort of understand

1:46:46

the way their mind works and the way

1:46:49

they see the world, the way they interface with things.

1:46:52

It's awesome.

1:46:53

It's pretty fucking cool. And that is one

1:46:55

of the cooler things about being a human,

1:46:57

that you can find a way

1:47:00

to mitigate all the negative aspects

1:47:02

of the monkey body. There

1:47:04

are tools that are in place, but unfortunately,

1:47:07

in this very prohibitive society,

1:47:10

this society of prohibition,

1:47:13

we're denied

1:47:15

those and we're denied ones that

1:47:17

have never killed anybody, which is really

1:47:19

bizarre when OxyContin

1:47:22

can still be prescribed. What's the deal

1:47:24

with why we

1:47:25

can't make –

1:47:28

leave that aside—like why we can't get these medicines

1:47:30

that have transformed people's lives

1:47:33

more available, what's the deal with

1:47:35

why we can't stop the

1:47:37

opioid crisis? Or

1:47:39

like Fentanyl seems like an unbelievable

1:47:42

crisis for San Francisco. You remember when at the

1:47:44

beginning of the conversation, we said that

1:47:46

AI will do a lot of

1:47:49

good, overall good, but also

1:47:51

not no harm. If

1:47:53

we legalize drugs, all

1:47:56

drugs, that would do

1:47:58

the same thing. Would you advocate for that, to legalize

1:48:00

all drugs? It's a very complicated question

1:48:03

because I think you're gonna have a lot of addicts

1:48:06

that wouldn't be addicts. You're gonna have a lot of people's

1:48:08

lives destroyed because it's legal. There's

1:48:10

a lot of people that may not be psychologically

1:48:13

capable of handling things. Maybe they

1:48:15

already have, like that's the thing about psychedelics.

1:48:17

They do not ever recommend them for people that

1:48:19

have a slippery grasp on reality as it is.

1:48:22

People that are struggling, people that are already

1:48:25

on a bunch of medications that allow

1:48:27

them to just keep a steady state

1:48:30

of existence in the normal

1:48:32

world. If you just fucking bombard

1:48:34

them with psilocybin, who knows

1:48:38

what kind of an effect that's gonna have, whether or not

1:48:40

they're psychologically too fragile

1:48:42

to recover from that? I mean there's many

1:48:44

many stories of people taking too much acid and never

1:48:47

coming back. Yeah,

1:48:50

these are like...

1:48:55

Powerful doesn't seem to begin to cover it.

1:48:57

Right.

1:48:57

Yeah. But there's also what

1:49:00

is it about humans that are constantly

1:49:02

looking to perturb their normal state of

1:49:05

consciousness? Constantly, whether

1:49:07

it's we're both drinking coffee, you

1:49:09

know, people smoke cigarettes, they do

1:49:12

all, they take Adderall, they do all sorts of different

1:49:14

things to change and enhance their

1:49:16

normal state of consciousness. It seems like whether

1:49:19

it's meditation or yoga, they're always

1:49:21

doing something to try to get

1:49:23

out of their own way or get

1:49:25

in their own way or distract themselves

1:49:28

from the pain of existence. And

1:49:30

it seems like a normal part of humans

1:49:33

and even monkeys, like vervet monkeys get

1:49:35

addicted to alcohol. They get

1:49:37

addicted to fermented fruits and alcohol and they

1:49:39

become drunks and alcoholics.

1:49:43

What do you think is the deep lesson there? Well,

1:49:46

we're not happy exactly, you

1:49:48

know, and then some things can make you happy

1:49:50

sort of—a couple of drinks makes you

1:49:52

so happy for a little bit until

1:49:55

you're an alcoholic, until you destroy your liver,

1:49:57

until you crash your car, until you're, you know...

1:50:00

You're involved in some sort of a

1:50:02

violent encounter that you would never

1:50:04

be involved with if you weren't drunk You

1:50:08

know, I love caffeine which clearly

1:50:10

is a drug

1:50:12

Alcohol like

1:50:14

I like but I often am like yeah, this

1:50:16

is like yeah, you know This

1:50:18

is like dulling me and I wish I hadn't had this drink

1:50:21

and then other stuff like I

1:50:24

Mostly would choose to avoid But

1:50:27

that's because you're smart And

1:50:29

you're probably aware of the pros and cons

1:50:31

and and you're also probably aware of

1:50:33

how it affects you And what's doing

1:50:36

good for you and what is detrimental

1:50:38

to you? But that's a decision

1:50:40

that you can make as an informed human

1:50:42

being that you're not allowed to make

1:50:44

if everything's illegal, right? Yeah,

1:50:48

right and also when

1:50:50

things are illegal Criminals

1:50:52

sell those things, because you're not

1:50:55

tempering the desire by making it

1:50:57

illegal. You're just making access to

1:51:00

it much more complicated. What I was gonna

1:51:02

say is if fentanyl is really great. I

1:51:04

don't want to know Apparently it

1:51:06

is apparently it is. Yeah, Peter Berg

1:51:08

was on the podcast and he produced

1:51:10

that painkiller documentary for Netflix Yeah,

1:51:13

the the docudrama about the Sackler

1:51:15

family. It's an amazing piece. But

1:51:18

he said that he took Oxycontin

1:51:21

once recreationally and it was like oh

1:51:23

my god. It's amazing He's

1:51:26

like keep this away from me. It feels so

1:51:29

good Yeah, and that's part of

1:51:31

the problem is that yeah, it will wreck your

1:51:33

life. Yeah, it will it will capture

1:51:35

you But it's just so unbelievable But the feeling

1:51:38

like what how did Lenny Bruce describe it? I

1:51:40

think he described heroin as getting a warm hug

1:51:42

from God

1:51:44

Yeah,

1:51:45

I think the feeling that it gives you

1:51:47

is probably pretty spectacular. I don't

1:51:51

know if Legalizing

1:51:53

that is gonna solve the problems,

1:51:56

but I do know that another problem

1:51:59

that we're not paying attention to Is the rise of the cartels

1:52:01

and the fact that right across our border,

1:52:03

where you can walk, there are these enormous,

1:52:07

enormous organizations that

1:52:09

make who knows how much

1:52:11

money—untold, incalculable

1:52:14

amounts of money selling drugs and

1:52:16

bring them into this country and one of the things they do

1:52:18

is they put fentanyl in everything. They make things

1:52:20

stronger and they do it for

1:52:23

like street Xanax There's people that have overdosed

1:52:26

thinking they're getting Xanax and they fucking die. Whoa. That's

1:52:28

all. Yeah, they do it with

1:52:30

cocaine, of course They

1:52:32

do it with everything There's so many

1:52:34

things that have fentanyl in them and they're

1:52:36

cut with fentanyl because fentanyl is cheap

1:52:39

and insanely potent And

1:52:42

that wouldn't be a problem if things were legal.

1:52:45

So would you net out towards saying, all right, let's

1:52:47

just—? Yeah, I would net out towards

1:52:49

that But I would also put into place

1:52:51

some serious mitigation efforts like

1:52:54

in terms of counseling, drug addiction, and

1:52:56

Ibogaine therapy, which is another thing that someone

1:52:59

was just telling me about how transformative this was

1:53:01

for them Yeah, I haven't experienced that personally but

1:53:03

Ibogaine, for many of my friends that

1:53:05

have had pill problems and I have

1:53:08

a friend my friend Ed Clay who Started

1:53:11

an Ibogaine Center in Mexico

1:53:13

because he had an injury

1:53:16

and he got hooked on pills and he couldn't kick

1:53:18

it. Did Ibogaine

1:53:20

one time. Done. One time, done.

1:53:22

24-hour, super uncomfortable experience. It's supposed to be a

1:53:24

horrible experience. Yeah, it's supposed to be not

1:53:26

very recreational Not exactly

1:53:29

something you want to do on the weekend with friends it's

1:53:31

something you do because you're fucked and you need

1:53:33

to figure out how to get out of this fuckness and

1:53:36

That like we think about how

1:53:38

much money is spent on rehabs in this country

1:53:40

and what's the relapse rate?

1:53:43

It's really high. I mean, I have

1:53:45

many friends that have been to rehab for drug

1:53:47

and alcohol abuse and Quite

1:53:50

a few of them went right back to it quite

1:53:52

a few. It doesn't seem to be that effective It

1:53:55

seems to be effective for people when

1:53:57

they have, like, really hit rock bottom

1:53:59

and they have a strong will and then they get involved

1:54:01

in a program, some sort of a 12-step program,

1:54:03

some sort of a Narcotics Anonymous program

1:54:06

and then they get support

1:54:08

from other people and they eventually build

1:54:10

this foundation of other types of behaviors

1:54:13

and ways to find other things

1:54:15

to focus on to whatever

1:54:18

aspect of their mind that allows them to be addicted

1:54:20

to things. Now it's focused on exercise,

1:54:23

meditation, yoga, whatever it is, that's

1:54:25

your new addiction and it's a much more positive

1:54:27

and beneficial addiction. But the reality

1:54:30

of the physical addiction is that

1:54:33

there are mitigation efforts, like there's

1:54:35

so many people that have taken psilocybin and completely

1:54:38

quit drugs, completely quit cigarettes,

1:54:40

completely quit a lot because they realize like oh,

1:54:43

this is what this is, this is why I'm

1:54:45

doing this. Yeah,

1:54:48

that's why I was more optimistic that

1:54:50

the world would have made faster progress towards

1:54:53

acceptance of, you hear so many

1:54:55

stories like this. So I would say like all

1:54:57

right, clearly a lot of our existing mental health

1:54:59

treatment

1:55:00

at best doesn't work. Clearly our addiction

1:55:02

programs are

1:55:04

ineffective.

1:55:05

If we have this thing that in every

1:55:08

scientific study or most scientific studies

1:55:10

we can see is delivering these like unbelievable

1:55:12

results,

1:55:14

it's going to happen. And

1:55:18

yeah, I still am excited for

1:55:20

it. I still think it'll be a transformatively positive development

1:55:22

but...

1:55:24

It'll change politics. It'll

1:55:27

absolutely change the way we think of other

1:55:29

human beings. It'll absolutely change the way we think

1:55:31

of society and culture as a whole. It'll

1:55:34

absolutely change the way people interact with each other

1:55:36

if it becomes legalized

1:55:39

and it's slowly becoming legalized. Like think of

1:55:41

marijuana, which is like the gateway

1:55:43

drug. Marijuana is now

1:55:45

legal recreationally in how

1:55:47

many states? And

1:55:51

then medically in many more. And

1:55:56

it's really easy to get a license medically.

1:56:00

I got one in 1996. You

1:56:02

used to be able to just go somewhere and go, I got a headache.

1:56:06

That's it. Yeah, I get headaches.

1:56:08

I'm in pain a lot. I do a lot of martial arts.

1:56:10

I'm always injured. I need some medication. I

1:56:13

don't like to get pain pills. Bam.

1:56:15

You got the legal prescription for weed. I used to have to go to Inglewood

1:56:17

to get it. I used to have to go to

1:56:20

the Inglewood Wellness Center. I

1:56:23

was like, this is crazy. Marijuana

1:56:25

is now kind of legal. And

1:56:29

in 2016, it became legal in California

1:56:31

recreationally. It just changed everything.

1:56:34

I had all these people that were like right wing

1:56:37

people that were taking edibles to

1:56:39

sleep. Now

1:56:42

that it's legal, they thought about it in a different way.

1:56:45

I think that that drug,

1:56:47

which is a fairly mild psychedelic,

1:56:50

also has enhancing effects.

1:56:53

It makes people more compassionate. It makes people

1:56:55

more kind and friendly. It's

1:56:57

sort of the opposite of a drug that enhances

1:57:00

violence. It doesn't enhance violence at all.

1:57:03

Alcohol does that. Cocaine does that.

1:57:05

To say a thing that will make me very unpopular, I hate

1:57:08

marijuana. It does not sit well with me at

1:57:10

all. What does it do to you that you don't like? It makes

1:57:12

me tired and slow for a long time after

1:57:14

it. Well, I think also there's biological

1:57:17

variabilities, right?

1:57:18

Like some people, like my wife, she

1:57:21

does not do well with alcohol. She can get drunk off one

1:57:23

drink. But it's biological. She

1:57:26

got some sort of a genetic test. I forget what it's

1:57:28

called, something about bilirubin, like something

1:57:30

that her body just doesn't process

1:57:33

alcohol very well. So she's a cheap date. Oh, all

1:57:35

I meant is that genetically I got whatever the mutation

1:57:37

is that makes it an unpleasant experience. Yeah.

1:57:40

But what I was saying is for me, that's the opposite. Alcohol

1:57:42

doesn't bother me at all. I could drink

1:57:44

three, four drinks and I'm sober in 20 minutes. My

1:57:47

body, my liver is just like a blast.

1:57:50

It just goes right through it. I can sober up real quick.

1:57:52

But I also don't need it. Like

1:57:55

I'm doing sober October for

1:57:57

the whole month. I don't drink. Feel good? Great.

1:58:00

No problems. Not having alcohol

1:58:03

doesn't seem to bother me at all, but

1:58:05

I do like it. I do like a glass

1:58:08

of wine, a nice thing at the end. Yeah, I like

1:58:10

it

1:58:11

speaking of that and psychedelics

1:58:13

in general I you know many cultures

1:58:16

have had a place for some sort

1:58:18

of psychedelic

1:58:19

time in someone's life or rite of passage, but

1:58:22

as far as I can tell, most of them—

1:58:26

There's some sort of ritual about it, and

1:58:28

I do worry that—

1:58:31

and I think these

1:58:33

things are so powerful that I worry about them

1:58:35

just being, like, kind of

1:58:37

Yeah, used all over the place all the time. Yeah, and

1:58:40

I

1:58:41

Hope that we as a society because I think this

1:58:43

is now gonna happen, even if it's slow—

1:58:45

find a way to treat this with the

1:58:48

respect that it needs

1:58:49

Yeah, we'll

1:58:51

see how that goes. Agreed. Yeah, I

1:58:54

think set and setting is

1:58:56

very important and Thinking about

1:58:59

what you're doing before you're doing it while

1:59:01

you're doing it Like I was saying the other night

1:59:03

when I had this psychedelic experience. I was just

1:59:05

like Sometimes I

1:59:07

just think too much about the world and that

1:59:10

it's so fucked and you have kids

1:59:12

and you wonder, like, what kind of a world are they

1:59:14

gonna grow up in and what is and

1:59:16

it was just one of those days where I was just like God

1:59:18

There's so much anger and there's so much this and

1:59:20

that and then it's just

1:59:23

It took it away from me the rest of the day like

1:59:25

that night I was so friendly and so happy

1:59:27

and I just want to hug everybody. I just

1:59:30

really got it. I go, oh my god,

1:59:32

I was thinking about it wrong. Do

1:59:34

you think the anger in the world is way higher

1:59:36

than it used to be, or is it just

1:59:39

all these dynamics, social media, we were talking

1:59:41

about I think the dynamics and social media certainly

1:59:44

Exacerbated anger in some people, but

1:59:46

I think anger in the world is

1:59:48

just a part of frustration,

1:59:52

inequality, problems

1:59:55

that are so clear but are not solved

1:59:58

and all the issues that people have I

2:00:00

mean it's not a coincidence

2:00:02

that a lot of the mass violence that

2:00:04

you're seeing in this country mass looting and

2:00:06

all these different things are being done by poor people. Do

2:00:09

you think AGI will be an equalizing force for

2:00:11

the world or further inequality? I think it

2:00:13

depends on how it's implemented.

2:00:17

I my concern is again what

2:00:19

we're talking about before with some

2:00:21

sort of a neural interface That

2:00:23

it will increase your ability to be productive

2:00:26

to a point where you can control resources So

2:00:29

much more than anyone else and you

2:00:31

will be able to advance your

2:00:34

economic portfolio and your influence

2:00:36

in the world through that, the amount of power that

2:00:39

you can acquire. It

2:00:41

will happen before the other

2:00:43

people can get involved because I would

2:00:45

imagine Financially, it'll be like

2:00:47

cell phones in the beginning. You remember when

2:00:50

the movie Wall Street when he had

2:00:52

that big brick— Oh, yeah. —cell phone. It's like, look

2:00:55

at him. He's out there on the beach with a phone.

2:00:57

That was crazy No one had one of those things back

2:00:59

then and they were so rare I

2:01:01

got one in 1994 when I first

2:01:03

moved to California and I thought I was living in the fucking

2:01:05

giant thing? No, it was a Motorola Star

2:01:07

TAC. That was a cool phone. I actually had one

2:01:10

in my car in 1988 I

2:01:12

was one of the first people to get a cell phone. I

2:01:15

got one in my car and it was great because

2:01:17

my friend my friend Bill Blumenreich who

2:01:20

runs the Comedy Connection, he

2:01:23

would call me because he knew

2:01:25

he could get a hold of me like someone got sick or

2:01:27

fell out I could get a gig because he could call

2:01:29

me. So I was in my car, he's like, Joe, what are you

2:01:31

doing? Do you have a spot tonight, and I'm

2:01:33

like no I'm open He's like fantastic,

2:01:35

and so he'd give me gigs So I got a bunch

2:01:38

of gigs through this phone. It kind

2:01:40

of paid for itself. But I got

2:01:42

it just because it was cool like I could drive

2:01:44

down the street and call people dude I'm driving

2:01:46

and I'm calling you like it was nuts To

2:01:49

be able to drive and I had a little antenna little

2:01:51

squirrely thing, with a little antenna

2:01:53

on my car on the roof of the car, but

2:01:57

now everyone has one You

2:02:00

know, you can go to a third world country and

2:02:02

you know people in small villages have phones.

2:02:05

It's super common. It's

2:02:07

everywhere, essentially. More people have phones

2:02:10

than don't have phones. There's more phones than there

2:02:12

are human beings, which is pretty

2:02:14

fucking wild and I think that

2:02:17

that initial

2:02:19

cost problem

2:02:21

It's going to be prohibitively expensive

2:02:23

initially And the problem is the wealthy

2:02:26

people are going to be able to do that Yeah And then

2:02:28

the real crazy ones that wind up getting

2:02:30

the holes drilled in their head and if

2:02:32

that stuff is effective. Maybe it's not—maybe

2:02:34

there's problems with generation one, but generation

2:02:37

two is better There's

2:02:39

gonna be a time where you have to enter

2:02:41

into the game There's gonna be a time where you have to sell your

2:02:43

stocks, like, you don't—don't wait too long

2:02:46

hanging on there. And once

2:02:49

that happens My concern

2:02:51

is that the people that have that will have such

2:02:55

a massive advantage over everyone

2:02:57

else that the gap between the haves

2:02:59

and have nots will be even further

2:03:02

And it'll be more polarized. This is something

2:03:05

I've changed my mind on. I, you know,

2:03:08

someone

2:03:15

Because it could be such an incredible

2:03:18

Distortion of power and then we're going to have

2:03:20

to have some sort of societal discussion

2:03:22

about this. Yeah,

2:03:24

that seems real That

2:03:26

seems like yeah especially

2:03:29

if it's as effective as

2:03:32

AGI is or as uh, excuse me

2:03:34

ChatGPT is. ChatGPT

2:03:37

is so amazing. When

2:03:39

you enter into it information you ask

2:03:41

it questions and it can give you answers and

2:03:43

you could ask it to code a website for you and it

2:03:45

does it instantly and it solves problems that, like,

2:03:48

literally you would have to take decades to

2:03:50

try to solve and it gets to it right

2:03:53

away This is the dumbest it will ever be yeah,

2:03:56

that's it's crazy. That's it's crazy. So

2:03:58

imagine something like that but even

2:04:01

more advanced. Multiple stages

2:04:04

of improvement and innovation forward

2:04:07

and then it interfaces

2:04:09

directly with the mind but it only does

2:04:11

it with the people that can afford it. Those

2:04:14

people are just regular humans. They

2:04:16

haven't been enhanced. We

2:04:21

haven't evolved physically. We

2:04:23

still have all the human reward systems in place.

2:04:25

We're still basically these territorial primates

2:04:29

and now we have – you

2:04:31

just imagine some fucking psychotic

2:04:34

billionaire who now gets

2:04:36

this implant and decides

2:04:39

to just completely hijack

2:04:41

our financial system, acquire all

2:04:43

the resources, set into place

2:04:46

regulations and influences that only benefit

2:04:48

them and then make sure that they

2:04:50

can control it from there on out. Does this

2:04:52

actually, though, even require like a physical

2:04:55

implant

2:04:56

or like a physical merge versus

2:04:58

just some people have access to

2:05:01

GPT-7 and can spend a lot on the

2:05:03

inference compute for it and some don't.

2:05:06

I think that's going to be very transformative too

2:05:09

but my thought is that

2:05:11

once – I mean we

2:05:13

have to think of what are the possibilities

2:05:16

of a neural enhancement. If you think

2:05:19

about the human mind and how the human

2:05:21

mind interacts with the world, how you

2:05:23

interact with language and thoughts

2:05:25

and facts and

2:05:28

something that is exponentially

2:05:31

more powerful than that but it

2:05:34

also allows you to use

2:05:36

the same emotions, the

2:05:38

same ego, the same desires

2:05:40

and drives, jealousy, lust,

2:05:43

hate, anger, all of those

2:05:45

things but with this godlike

2:05:48

power when one person

2:05:51

can read minds and other people can't,

2:05:53

when one person has a

2:05:56

completely accurate forecast of

2:05:59

all of the trends. in terms of stocks

2:06:01

and resources and commodities,

2:06:03

and they can make choices

2:06:06

based on those. I totally see all

2:06:08

of that. The only thing I feel

2:06:11

a little confused about is

2:06:16

human

2:06:18

talking and listening bandwidth or typing

2:06:20

and reading bandwidth is not very high. But

2:06:22

it's high enough where if you can just say like tell

2:06:25

me everything that's going to happen in the stock market if I

2:06:27

want to go make all the money, what should I do right now?

2:06:29

And then it just shows you on the screen.

2:06:32

Even without a neural interface, you're

2:06:34

kind of a lot of the way there you're describing.

2:06:37

Sure, with stocks. Or

2:06:39

with like tell me how to like

2:06:42

invent some new technology that will change

2:06:44

the course of history.

2:06:46

Yeah.

2:06:48

Yeah. All those things.

2:06:51

I think what somehow matters is access

2:06:53

to massive amounts of computing power,

2:06:56

especially like differentially massive amounts, maybe

2:06:59

more than the interface itself. I think that

2:07:02

certainly is going to play a massive factor

2:07:04

in the amount of power and influence

2:07:07

a human being has, having access to

2:07:09

that. My concern

2:07:11

is that what neural interfaces

2:07:14

are going to do is now

2:07:16

you're not a human mind

2:07:19

interacting with that data. Now

2:07:21

you are some massively

2:07:24

advanced version of

2:07:26

what a human mind is. And

2:07:30

it has just

2:07:32

profound possibilities

2:07:35

that we can't even imagine.

2:07:38

We can imagine, but we can't

2:07:42

truly conceptualize them because we

2:07:44

don't have the context. We don't have that

2:07:47

ability and that possibility currently.

2:07:50

We can just guess, but when it does

2:07:53

get implemented, that you're

2:07:55

dealing with a completely different being.

2:08:02

The only true thing I can say is I don't know.

2:08:05

Yeah. Do you wonder

2:08:08

why it's you? Do you ever

2:08:10

think like how am I at the

2:08:13

forefront of this spectacular

2:08:15

change?

2:08:21

Well, first of all, I think it's very much like...

2:08:24

I think you could make the statement from

2:08:26

many companies, but none

2:08:28

as

2:08:29

true force open AI. The

2:08:32

CEO is far from the most important person

2:08:34

in the company.

2:08:35

In our case, there's a large handful

2:08:37

of researchers, each of which are individually

2:08:40

more critical to the success we've had so

2:08:42

far than we will have in the future than me. But

2:08:46

even

2:08:47

that – and I bet those

2:08:49

people really are like, this is weird

2:08:51

to be them. But it's

2:08:54

certainly weird enough for me that it

2:08:57

like ups my simulation hypothesis probability

2:08:59

somewhat.

2:09:03

If you had to give a guess,

2:09:08

when you think about the possibility of simulation

2:09:10

theory, what kind of percentage do you –

2:09:13

I've never known how to put any number on

2:09:15

it.

2:09:16

It's the every argument that

2:09:18

I've read, written, explaining why it's like super

2:09:20

high probability. That all seems reasonable to

2:09:22

me. It feels impossible to reason about

2:09:24

though.

2:09:26

What about you? Yeah, same

2:09:28

thing. I'd go maybe,

2:09:30

but it's still what it is. I'm going to have

2:09:33

to do this. That's the main thing is I think it doesn't matter.

2:09:35

I think it's like okay. It

2:09:38

definitely matters,

2:09:41

I guess, but there's not a way to know

2:09:44

currently. What matters

2:09:46

though? Well, if this really

2:09:48

is – I mean, our inherent understanding

2:09:50

of life is that we are these biological

2:09:52

creatures that interact with other biological

2:09:55

creatures. We mate and

2:09:57

breed, and that this creates more

2:09:59

of us. And then hopefully as society

2:10:02

advances and we acquire more information, more

2:10:04

understanding and knowledge, this next version

2:10:06

of society will be superior to

2:10:08

the version that preceded it, which

2:10:10

is just how we look at society today.

2:10:13

Nobody wants to live in 1860 where

2:10:16

you died of a cold and there's no cure

2:10:18

for infections. It's much better

2:10:20

to be alive now.

2:10:23

Just

2:10:24

inarguably. Unless

2:10:27

you really do prefer the simple

2:10:29

life that you see on Yellowstone or something, it's

2:10:32

like what we're dealing with now,

2:10:35

first of all, access to information,

2:10:38

the lack of ignorance. If

2:10:41

you choose to

2:10:43

seek out information, you have so much

2:10:45

more access to it now than ever before. That's so

2:10:47

cool. And over time – like

2:10:49

if you go back to the beginning of written history

2:10:52

to now, one

2:10:54

of the things that is clearly

2:10:56

evident is the more access to information,

2:10:59

the better choices people can make. They don't always make

2:11:01

better choices, but they certainly have much

2:11:03

more of a potential to make better choices with

2:11:05

more access to information. We

2:11:09

think that this is just this biological

2:11:12

thing, but imagine if that's not what's

2:11:14

going on. Imagine if this is a program

2:11:17

and that you are just consciousness that's

2:11:19

connected to this thing

2:11:22

that's creating this experience

2:11:25

that is indistinguishable

2:11:28

from what we'd like to think of as

2:11:30

a real biological experience from

2:11:32

carbon-based life forms interacting

2:11:34

with solid physical things. It's still unclear to me

2:11:36

what I'm supposed to do differently or think

2:11:41

differently. Yeah, there's no answer. Yeah, you're 100% right. What can you do differently?

2:11:47

I mean, if you

2:11:49

exist as if it's a simulation,

2:11:51

if you just live your life as if it's a simulation, is that –

2:11:54

I don't know if that's the solution. I don't –

2:11:56

I think – I

2:12:00

mean it's real to me no matter what. Mm-hmm. It's

2:12:03

real. Yeah. I'm going

2:12:05

to live it that way. Mm-hmm. And that will be the problem

2:12:07

with an actual simulation if and when it does

2:12:09

get implemented. Yeah. If we

2:12:12

do create an actual simulation

2:12:14

that's indistinguishable from real life, like

2:12:18

what are the rules of the simulation? How

2:12:20

does it work? Is that simulation fair

2:12:23

and equitable and much more reasonable

2:12:25

and peaceful? Is there no war

2:12:27

in that simulation? Should we all agree

2:12:30

to hook up to it because

2:12:32

we'll have a completely different experience

2:12:34

in life and all the angst

2:12:37

of crime and violence and

2:12:39

the things that we truly are terrified of, they

2:12:42

will be non-existent in this simulation.

2:12:46

Yeah.

2:12:46

I mean if we keep going, it

2:12:49

seems like if you just extrapolate

2:12:51

from where VR is now. Did you see the

2:12:53

podcast that Lex

2:12:56

Fridman did with Mark Zuckerberg? I

2:12:58

saw some clips, but I haven't got to watch it all. Bizarre,

2:13:00

right? So they're essentially using very

2:13:04

realistic physical avatars in

2:13:06

the metaverse.

2:13:07

Like

2:13:09

that's step one. That's

2:13:11

Pong. Maybe that's step three. Maybe it's a little

2:13:13

bit beyond Pong at that point. Yeah. Maybe

2:13:16

it's Atari. Maybe you're playing Space Invaders now. But

2:13:18

whatever it is, it's on the path

2:13:20

to this thing that will be indistinguishable.

2:13:23

That seems inevitable. Those two things

2:13:25

seem inevitable to me. The inevitable

2:13:28

thing to me is that we will create a life form

2:13:31

that is an artificial, intelligent

2:13:35

life form that's far more advanced than us.

2:13:38

Once it becomes sentient, it will be able to

2:13:40

create a far better version of itself. And

2:13:43

then as it has better

2:13:45

versions of itself, it will keep

2:13:47

going. And if it keeps going, it

2:13:49

will reach god-like

2:13:52

capabilities. The complete

2:13:55

understanding of every aspect

2:13:57

of the universe and the structure

2:14:00

of it itself, how to manipulate

2:14:02

it, how to travel through it, how to

2:14:04

communicate. And

2:14:07

that you know if we keep going if we survive

2:14:10

a hundred years a thousand years ten

2:14:12

thousand years and we're still on this same

2:14:15

technological exponential increasing

2:14:17

and capability path That's

2:14:20

God We become

2:14:23

something like a God and

2:14:25

that might be what we do that

2:14:28

might be what intelligent, curious,

2:14:30

innovative life actually does: it creates

2:14:34

something that creates the very universe

2:14:36

that we live in. Yeah,

2:14:40

maybe that's the birth of the universe itself

2:14:43

is creativity and intelligence and

2:14:46

that it all comes from that I have

2:14:48

this joke about the Big Bang Like

2:14:51

what if what if the Big Bang is

2:14:53

just a natural thing like humans get

2:14:55

so advanced that they create a Big Bang machine And

2:14:58

then you know we're so autistic and

2:15:00

riddled with Adderall that we had no

2:15:03

concept or worry of the consequences,

2:15:05

and someone's like I'll fucking press it and

2:15:08

they press it and shh Oh, we

2:15:10

start from scratch every 14 billion

2:15:12

years and then that's what a

2:15:14

Big Bang is I Mean

2:15:18

I don't know where it goes But

2:15:20

I do know that if you looked at

2:15:22

the human race from afar if you were an

2:15:24

alien lifeform completely detached

2:15:27

from any understanding

2:15:29

of our culture any other understanding

2:15:31

of our biological imperatives,

2:15:34

and you just looked at like what is this one

2:15:37

dominant species doing on this planet?

2:15:39

It makes better things. That's what it does.

2:15:42

That I agree with. It goes to war, it, you

2:15:45

know, steals, it does a bunch of things that

2:15:47

it shouldn't do, it pollutes. It does

2:15:49

all these things that are terrible, but

2:15:51

it also Consistently

2:15:54

and constantly creates better things whether

2:15:56

it's better weapons going from the catapults

2:16:00

to the rifle, to the cannonballs,

2:16:02

to the rocket ships, to the hypersonic

2:16:04

missiles, to nuclear bombs. It creates

2:16:06

better and better and better things. That's

2:16:09

the number one thing it does. It's never

2:16:11

happy with what it has. You

2:16:15

add that to consumerism, which is

2:16:17

baked into us, and this desire,

2:16:20

this constant desire for newer, better things,

2:16:23

well, that fuels that innovation because that

2:16:25

gives it the resources that it needs to consistently

2:16:27

innovate and constantly create newer and

2:16:29

better things. If I was an alien

2:16:32

life form, I was like, oh, what is it doing? It's

2:16:34

trying to create better things. What is the forefront

2:16:37

of it? Technology.

2:16:39

Technology is the most transformative, the most spectacular,

2:16:42

the most interesting thing that we create, and

2:16:44

the most alien thing. The fact

2:16:47

that we just are so comfortable that you can FaceTime

2:16:49

with someone in New Zealand, like instantly.

2:16:53

We can get used to anything pretty quickly. Take

2:16:55

it for granted. If

2:16:59

you were an alien life form and you were looking at us,

2:17:01

you're like, what is it doing? It keeps making

2:17:03

better things. It's going to keep making better things.

2:17:06

Well, if it keeps making better things, it's going to

2:17:08

make a better version of a thinking thing. It's

2:17:11

doing that right now. You're a part of that. It's

2:17:13

going to make a better version of a thinking thing.

2:17:15

With that better version of a thinking thing, it's basically

2:17:18

now in the amoeba stage, just in

2:17:20

the small multicellular life form

2:17:22

stage. What if

2:17:24

that version becomes a

2:17:27

fucking Oppenheimer? What if that version,

2:17:29

if it scales up so

2:17:31

far that it becomes so

2:17:34

hyper intelligent that it is completely

2:17:37

alien to any other intelligent

2:17:39

life form that has ever existed here before, and

2:17:42

it constantly does the same thing, makes better

2:17:44

and better versions of it. Well, where does that go?

2:17:47

It goes to a god. It goes to something

2:17:49

like a god, and maybe god is

2:17:51

a real thing, but maybe it's a

2:17:54

real consequence of this process

2:17:57

that human beings have of consistently

2:18:00

constantly innovating and constantly

2:18:03

having this desire to

2:18:05

push this envelope of creativity

2:18:09

and of technological power. I

2:18:12

guess it comes down to maybe a definitional

2:18:14

disagreement about

2:18:15

what you mean by it becomes a god. Like

2:18:18

I can totally...

2:18:20

I think it becomes something much, like

2:18:23

unbelievably much smarter and more capable

2:18:25

than we are. And what does that thing become

2:18:28

if that keeps going?

2:18:30

And maybe the way you mean it as a god-like

2:18:33

force is that that thing can then go create, can

2:18:35

go simulate a universe. Yes. Okay,

2:18:39

that I can resonate with. Yeah. I think whatever

2:18:41

we create will still be subject to the laws of

2:18:43

physics in this universe. Right.

2:18:45

Yeah, maybe that is the overlying fabric

2:18:48

that God exists in. The

2:18:50

God word is a fucked up word because it's just

2:18:52

been so co-opted. But you know,

2:18:54

I was having this conversation with Stephen Meyer

2:18:57

who is... He's a physicist.

2:19:00

I believe he's a physicist? He is a physicist. And

2:19:02

he's also religious. It

2:19:04

was a real weird conversation. Very fascinating

2:19:06

conversation. What kind of religion? Believer in Christ. Yeah,

2:19:09

he even believes in the resurrection, which

2:19:12

I found very interesting. But

2:19:14

you know, it's interesting

2:19:17

communicating with him because he has these little pre-

2:19:22

designed speeches that

2:19:24

he's encountered all these questions so many

2:19:26

times. That he has these very well-worded,

2:19:29

very articulate responses to these things

2:19:32

that, I sense, are like bits. You

2:19:34

know, like when I'm talking to a comic and like a

2:19:36

comic like, oh, I got this bit on train

2:19:38

travel. And they just tell you the bit.

2:19:41

Like that sort of is like he has bits on

2:19:43

why he believes in Jesus and

2:19:45

why he believes in him. And very very

2:19:47

intelligent guy. But I propose the question

2:19:51

when we're thinking about God, what if, instead

2:19:53

of God creating the universe, the universe

2:19:55

is God and the creative

2:19:57

force of all life and all... everything

2:20:00

is the universe itself. Instead

2:20:03

of thinking that there's this thing that

2:20:05

created...

2:20:06

This is like close to a lot of the Eastern religions.

2:20:08

I think it's an easier thing to wrap my mind

2:20:10

around than any other religions for me. And

2:20:13

that

2:20:13

is... when I do psychedelics

2:20:16

I get that feeling. I get that feeling like

2:20:18

there's this insane soup

2:20:21

of like innovation and

2:20:24

connectivity that exists all

2:20:26

around us. But our minds

2:20:29

are so primal. We're this fucking

2:20:31

thing. This is what we used to

2:20:33

be. And that... What is that? There's

2:20:36

a guy named Shane

2:20:38

Against the Machine who's this artist who

2:20:40

created this. It's a chimpanzee skull

2:20:42

that he made out of Zildjian cymbals. See that?

2:20:45

He left that on the back

2:20:48

and he just made this dope art piece. Cool. It's

2:20:51

just cool. But I

2:20:53

wonder if our limitations are

2:20:56

that we are an advanced version of

2:20:59

primates. We're still... We still have

2:21:01

all these things we talked about. Jealousy, envy, anxiety,

2:21:03

lust, anger, fear,

2:21:06

violence. All these things that are detrimental

2:21:09

but were important for us to survive

2:21:11

and get to this point. And that

2:21:14

as time goes on we will figure out

2:21:16

a way to engineer those out. And

2:21:19

that as intelligent life

2:21:21

becomes more intelligent and we create

2:21:24

a version of intelligent life that's

2:21:26

far more intelligent than what we are,

2:21:28

far more capable than what we are. If

2:21:30

that keeps going, if it just keeps

2:21:33

going. I mean, ChatGPT. Imagine

2:21:35

if you take ChatGPT and go back to Socrates

2:21:39

and show him that. Explain that and

2:21:41

show him a phone and you know and put it

2:21:43

on a phone and have access to it. He'd be like, what

2:21:46

have you done? Like what is this? I

2:21:48

bet he'd be much more impressed with the phone than ChatGPT. I

2:21:51

think he'd be impressed with the phone's abilities

2:21:53

to communicate for sure. But then the

2:21:56

access to information would be so profound.

2:21:58

I mean back then... I mean,

2:22:00

look, you're dealing with a time when Galileo

2:22:03

was put under house arrest because

2:22:05

he had the gumption to say that

2:22:08

the Earth is not the center of the universe. Well

2:22:10

now we fucking know it's not. Like

2:22:12

we have satellites. We send literal

2:22:15

cameras into orbit to take photos of things.

2:22:18

No, I totally get that. I just meant that

2:22:21

we kind of know what it's like to talk to a smart

2:22:23

person. And so in that sense, you're like, oh, all

2:22:25

right. I didn't think you could like talk to a

2:22:28

not-person and have them be person-like

2:22:30

in some responses some of the time on

2:22:32

a phone. Man, if you just like woke up

2:22:34

after 2000 years and there was like a phone

2:22:36

that would—you'd have no model for that. You didn't

2:22:38

get to get there gradually. Yeah, no,

2:22:40

you didn't get. My friend

2:22:42

Eddie Griffin has a joke about that. It's

2:22:45

about how Alexander Graham Bell had to be

2:22:47

doing Coke. He goes, because only someone

2:22:50

on Coke would be like, I want to talk to

2:22:52

someone who's not even here. And

2:22:57

that's what a phone is. Is that something Coke makes people

2:22:59

want to do? I don't know. I've never done Coke, but

2:23:01

I would imagine it is. I

2:23:04

mean, it just makes people angry

2:23:06

and chaotic. Yeah, a little of that, but they also

2:23:08

have ideas. Yeah,

2:23:12

I mean, but back to this, where does

2:23:14

it go? If it keeps

2:23:17

going, it has to go

2:23:19

to some impossible level

2:23:21

of capability. I mean, just think of

2:23:24

that. I believe it's going to happen. What

2:23:26

we're able to do now with nuclear power and nuclear

2:23:29

bombs and hypersonic

2:23:32

missiles, just the insane

2:23:35

physical things that we've been able to

2:23:37

take out of the human creativity

2:23:39

and imagination and through engineering

2:23:42

and technology, implement these

2:23:44

physical devices that

2:23:47

are indistinguishable from magic

2:23:49

if you brought them back 500 years. Yeah.

2:23:52

I

2:23:54

think it's quite remarkable. So keep

2:23:56

going. Keep going 100,000 years from now, if

2:23:58

we're still here. If something like us

2:24:00

is still here, what can it do? In

2:24:04

the same way that I don't think Socrates would have predicted

2:24:07

the phone, I can't predict that. No,

2:24:09

I'm probably totally off. But maybe

2:24:11

that's also why comets exist.

2:24:13

Maybe it's a nice reset. Just like

2:24:15

leave a few around, give

2:24:18

them a distant memory of the

2:24:20

utopian world that used to exist, have

2:24:23

them go through thousands of years of barbarism,

2:24:25

of horrific behavior,

2:24:28

and then reestablish society. This

2:24:30

is the Younger Dryas Impact Theory, that around 11,800

2:24:32

years ago at the end of the ice

2:24:34

age, we were hit by multiple

2:24:37

comets that caused the instantaneous

2:24:41

melting of the ice caps over North America.

2:24:43

Flooded everything. Flooded everything, the

2:24:46

source of the flood myths from the Epic

2:24:48

of Gilgamesh and the Bible and all those

2:24:50

things. And also there's physical

2:24:53

evidence of it when they do core samples. There's

2:24:55

high levels

2:24:56

of iridium,

2:24:57

which is very common in space, very rare on Earth.

2:25:00

There's micro diamonds that are from impacts

2:25:02

and it's like 30% of the Earth has evidence of

2:25:06

this. And so it's very likely that

2:25:08

these people that are proponents of this theory are

2:25:10

correct and that this is why they

2:25:13

find these ancient structures that they're

2:25:15

now dating to like 11,000, 12,000 years

2:25:17

ago when they thought people were hunter gatherers. And

2:25:19

they go, okay, maybe our timeline is really off

2:25:22

and maybe this physical evidence of the impacts.

2:25:25

Interesting. Yeah. Randall

2:25:27

Carlson is the greatest guy to pay attention to. Randall Carlson. Yeah.

2:25:30

He's kind of dedicated his whole life to it, which by the

2:25:32

way happened because of a psychedelic experience.

2:25:35

He was on acid once and he was looking

2:25:38

at this immense canyon and

2:25:41

he had this vision that it was created by instantaneous

2:25:44

erosions of the polar caps

2:25:47

and that it just washed this wave

2:25:49

of impossible water through

2:25:51

the Earth. It just carved these

2:25:53

paths. And now

2:25:55

there seems to be actual physical evidence

2:25:57

of that. That is probably what took

2:25:59

place

2:26:01

and that we're just the

2:26:03

survivors and that we have re-emerged

2:26:07

and that society and human civilization occasionally

2:26:10

gets set back to a primal

2:26:13

place. Yeah. Who knows?

2:26:16

If you're right that

2:26:17

what happens here is we kind of

2:26:19

edit out all of the impulses in ourselves that

2:26:21

we don't like, the world we get to

2:26:23

seems kind of boring, so maybe that's when we have to make a

2:26:25

new simulation to watch people. I think

2:26:28

they're going through some drama or something. Or maybe it's

2:26:30

just we get to this point

2:26:32

where we have this power but the haves

2:26:35

and the have-nots, the divide is too great and

2:26:37

that people did get ahold

2:26:39

of this technology and use it to oppress people

2:26:41

who didn't have it and that

2:26:44

we didn't mitigate the human

2:26:47

biological problems,

2:26:49

the reward systems that we have. That I got to have

2:26:51

more and you got to have less. This is this

2:26:54

sort of natural inclination that we have

2:26:56

for competition and that someone

2:26:59

hijacks that. I think this is

2:27:01

going to be such a hugely important issue

2:27:03

to get ahead of before the first people push that

2:27:05

one. Yeah. What do you

2:27:08

think about like when Elon was calling for a pause

2:27:10

on AI?

2:27:13

He was like starting an AGI company while he was doing

2:27:16

that.

2:27:17

Yeah. Didn't

2:27:19

he start it like after he was calling for the pause?

2:27:23

I think before that. I don't remember. In any case.

2:27:26

Is it one of those you can't beat them, join them things?

2:27:30

I think the instinct of saying like

2:27:32

we've really got to figure out how to

2:27:37

make this safe and good

2:27:40

and like widely good is really important.

2:27:43

But I think calling

2:27:47

for

2:27:49

a pause is like naive at

2:27:51

best. I

2:27:57

kind of think you can't make progress

2:27:59

on

2:27:59

the safety part of this, as we mentioned earlier, by

2:28:02

sitting in a room and thinking hard, you've got to

2:28:04

see where the technology goes. You've got to have contact with

2:28:06

reality. And then when you like,

2:28:09

we're trying to make progress towards

2:28:11

AGI, condition on it being safe

2:28:13

and condition on it being beneficial. And

2:28:15

so when we hit any kind of block,

2:28:18

we try to find a technical or a

2:28:20

policy or a social solution to overcome

2:28:23

it. That could be about the limits of the technology

2:28:25

and something not working. And, you know, models

2:28:27

are not getting smarter or whatever. Or

2:28:29

it could be there's this like safety issue, we've got to like,

2:28:32

redirect our resources to solve that. But it's all

2:28:34

like, for me, it's all this

2:28:37

same thing of like, we're trying to solve the

2:28:39

problems that emerge at each step, as

2:28:42

we get where we're trying to go.

2:28:43

And, you

2:28:44

know, maybe you can call it a pause if you want to

2:28:47

pause on capabilities to work on safety.

2:28:49

But in practice, I think the field

2:28:52

has gotten

2:28:53

a little bit wrapped around the axle there, and

2:28:56

safety and capabilities are

2:28:58

not these two separate things.

2:28:59

This is like, I think one of the dirty secrets

2:29:01

of the field. It's like we have this one way

2:29:03

to make progress.

2:29:04

We can understand and push on deep

2:29:06

learning more. And that

2:29:10

can be used in different ways.

2:29:12

But I think it's that same technique

2:29:14

that's going to help us eventually solve

2:29:16

the safety.

2:29:18

That all of that said,

2:29:20

as like a human,

2:29:22

emotionally speaking, I super

2:29:24

understand why it's tempting to call for

2:29:26

a pause.

2:29:27

That happens all the time in life, right? This is moving too fast. Right. We

2:29:30

got to pause here.

2:29:32

Yeah. How much of a concern

2:29:35

is it in terms of national security

2:29:37

that we are the ones that come

2:29:39

up with this first?

2:29:43

Well, I would say that if an

2:29:46

adversary of ours comes up with it first

2:29:49

and uses it against us and we don't have

2:29:52

some level of capability, that feels really bad.

2:29:56

But I hope that what happens is

2:29:58

this can be a moment where we can do it,

2:29:59

where, to tie

2:30:03

it back to the other conversation, we kind of come together

2:30:05

and overcome our base impulses and say like, let's

2:30:08

all do this as a club together.

2:30:09

That would be better. That would

2:30:11

be nice. And maybe through

2:30:14

AGI

2:30:15

and through the implementation

2:30:17

of this technology, it will make translation

2:30:21

instantaneous and easy. Well, that's

2:30:23

already happened. But I mean

2:30:26

it hasn't happened in real time, the

2:30:28

point where you can accurately

2:30:31

communicate very soon. Very soon.

2:30:34

Very soon.

2:30:35

Yeah.

2:30:36

I do think

2:30:38

for what it's worth that

2:30:41

the world is going to come together here.

2:30:44

I don't think people have quite realized the stakes, but

2:30:47

this is like – I don't think this is a geopolitical –

2:30:50

if this comes down to like a geopolitical fight or race,

2:30:52

I don't think there's any winners.

2:30:55

And

2:30:56

so I'm optimistic about people

2:30:58

coming together.

2:30:59

Yeah, I am too. I mean

2:31:02

I think most people would

2:31:04

like that if you asked the

2:31:06

vast majority of the human beings that are alive, wouldn't

2:31:09

it be better if everybody got along? You

2:31:13

know, maybe you can't

2:31:16

go

2:31:17

all the way

2:31:19

there and say we're just going to have one global effort.

2:31:22

But I think at least we can get to a point

2:31:24

where we have one global set

2:31:26

of rules, safety standards,

2:31:29

organization that makes sure everyone is following the rules. We

2:31:31

did this for atomic weapons

2:31:33

and similar things in the world of biology. I

2:31:35

think we'll get there. That's a good

2:31:38

example, nuclear weapons. Because

2:31:42

we know the destructive capability

2:31:45

of them, and because of that we haven't

2:31:47

detonated once since 1947. Pretty

2:31:50

incredible. Pretty incredible, other

2:31:52

than tests. We haven't used

2:31:55

one in terms of war. 45 or 47? When

2:31:58

was the end of World War II? Wasn't

2:32:01

it 47

2:32:03

when they dropped the bombs?

2:32:05

I think that was 45. I was wondering if there's more after

2:32:07

that I didn't know about it. No, it might be 45. I

2:32:10

think it was, yeah. 45. So from 1945,

2:32:12

which is pretty extraordinary. That's right. It's remarkable. I

2:32:15

would not have predicted that I think if I could teleport

2:32:17

back to 45. No.

2:32:19

I would have thought, oh my God, this is just going to

2:32:22

be something that people do, just launch

2:32:24

bombs on cities. Yeah. I

2:32:26

mean I would have said like, we're

2:32:28

not going to survive this for very long. And

2:32:30

there was a real fear of that. For sure.

2:32:33

It's pretty extraordinary that they've managed to stop that,

2:32:35

this threat of mutually assured destruction,

2:32:38

self-destruction, destruction of the

2:32:40

universe. I mean the whole world. We have enough weapons

2:32:42

to literally make the world uninhabitable.

2:32:46

Totally. And because of that

2:32:48

we haven't done it, which

2:32:50

is a good sign. I

2:32:52

think that should give some hope. It should. I

2:32:54

mean Steven Pinker gets a lot

2:32:56

of shit for his work because he just sort of

2:32:59

downplays violence

2:33:02

today. But it's not that he's downplaying violence

2:33:04

today. He's just looking at statistical trends.

2:33:07

If you're looking at the reality of life today

2:33:09

versus life 100 years ago, 200 years ago, it's far

2:33:11

safer.

2:33:14

Why do you think that's a controversial thing? Like

2:33:16

why can't someone say, sure, we still have problems, but

2:33:18

it's getting better? Because people don't want to say that.

2:33:21

Especially people who are activists. They're

2:33:23

completely

2:33:25

engrossed in this idea that there's

2:33:27

problems today, and these problems are huge,

2:33:29

and there's Nazis and there's –

2:33:32

But no one's saying there's not huge problems today. Right.

2:33:34

No one's saying there's not. But just to say things are better

2:33:36

today. Yeah, I get that. Some people, they just don't want to hear

2:33:38

that. Right. But those are also people that are addicted

2:33:41

to the problems. The problems become

2:33:43

their whole life. Solving those problems becomes their identity.

2:33:46

Being involved in the solutions or what they

2:33:49

believe are solutions to those problems becomes

2:33:51

their life's work. And someone comes

2:33:53

along and says, actually, life is safer than

2:33:55

it's ever been before. Interactions and beliefs

2:33:57

are safer. Yeah, that's deeply invalidating. Yeah. But

2:34:01

also true. And

2:34:03

again, what is

2:34:05

the problem? Why can't people recognize that? Well,

2:34:07

it's the primate brain.

2:34:09

It's all the problems that we highlighted

2:34:11

earlier. And the

2:34:14

solution to overcoming that might be

2:34:16

through technology.

2:34:19

And that might be the only way we can do it without a long

2:34:21

period of evolution. Because

2:34:24

biological evolution is so relatively

2:34:26

slow in comparison to

2:34:28

technological evolution. And

2:34:31

that might be our bottleneck.

2:34:34

We just still are dealing with this primate body. And

2:34:39

that artificial general intelligence or

2:34:41

something like some implemented

2:34:43

form of engaging with it, whether

2:34:46

it's a neural

2:34:49

link, something that shifts

2:34:52

the way the mind interfaces with other minds.

2:34:55

Isn't it wild that speaking of biological evolution,

2:34:57

there will be people, I think, who are alive

2:35:00

for...

2:35:02

The invention or discovery, whatever you want to call it, of

2:35:04

the transistor, who will also be alive

2:35:06

for the creation of AGI. One human lifetime.

2:35:09

Yeah. You want to know a wild

2:35:11

one? From

2:35:13

Orville and Wilbur Wright flying

2:35:16

the plane, it was less than 50 years before

2:35:18

someone dropped an atomic bomb out of it. That's wild.

2:35:21

That's fucking crazy.

2:35:24

That's crazy. Less than 40, right? That's crazy. Yeah.

2:35:28

Bananas.

2:35:32

I mean... 60-something years to land

2:35:34

on the moon. Nuts. Nuts.

2:35:37

Where is

2:35:38

it going? I mean, it's just

2:35:40

guesswork. But it's

2:35:43

interesting. For sure. I mean,

2:35:45

it's the most fascinating thing of our time, for sure. It's fascinating

2:35:48

intellectually, and I also think it is one

2:35:50

of these things that will be...

2:35:53

Tremendously beneficial. Yeah.

2:35:57

We've been talking a lot about

2:35:59

problems in the world.

2:35:59

I

2:36:00

think that's just always a nice reminder of how

2:36:03

much we get to improve and we're gonna get to improve

2:36:05

a lot, and I think this will be the most powerful

2:36:08

tool

2:36:10

we have yet created to help us go

2:36:12

do that. I think you're right and

2:36:16

this is an awesome conversation. Thanks for having me. Thank

2:36:18

you for being here. I really appreciate it and thanks

2:36:20

for everything. Keep us posted and if

2:36:22

you create HAL? I'll give you a call.

2:36:25

Let us know.

2:36:25

All right. Thank you. Bye,

2:36:28

buddy.

2:36:30

Bye.
