E164: Zuck’s Senate apology, Elon's comp package voided, crony capitalism, Reddit IPO, drone attack

Released Friday, 2nd February 2024
Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

All right, everybody, welcome back to your

0:02

favorite podcast, the all in podcast. It's

0:04

episode 164. I'm down

0:06

here in Miami with

0:08

me again. Of course,

0:10

the dictator chairman himself, Chamath

0:13

Palihapitiya, and the Rain Man.

0:16

Yeah, burn baby, David Sacks.

0:18

Unfortunately, we had

0:20

a little bit of a chat once this week. We don't know

0:23

where Friedberg is. He somewhere lost in his Apple

0:25

Vision Pros, but he'll be back next week. As

0:27

you guys know, I'm incredibly

0:29

generous with my friends. So I sent all

0:32

the besties the Apple Pro goggles.

0:35

And so these Apple Pro goggles are

0:37

amazing. But you

0:39

bought me a pair of the Apple Pro. Yeah,

0:42

we talked about this. Yeah, guys, you guys actually

0:44

you were using them. You just forgot. But Friedberg's

0:46

been using them. Nobody can find Friedberg right now.

0:49

Because apparently, he went

0:51

to Uranus. I recorded in all of these.

0:54

Sacks, what's happening inside

0:57

each of our Apple goggles? A

1:00

vision. Yeah. And so here

1:02

they are. We actually took a picture

1:04

of you wearing them. Do you

1:06

actually want to see what Chamath was

1:08

doing in his goggles? Yeah, let's see. I recorded

1:10

it. Oh, yeah, look

1:12

at that. See, he imagined that

1:15

he did leg day. Look at that. He's

1:20

still reveling in that thirst trap that he posted

1:22

in the I know, but you see those legs.

1:25

The Apple Vision Pro, Tim

1:28

Cook. Can I say something funny about

1:30

this, which is that my legs

1:32

are actually darker than

1:34

my torso and my upper body. It's the weirdest

1:36

thing. And so you have it in reverse. But

1:39

it is true that my legs are a different

1:41

shade than my trunk and my arms and my

1:43

body. Okay, well here's Sacks. By the way, Sacks,

1:46

you know, he loves his goggles. Yeah, Chamath, do you have

1:48

any interest in seeing what Sacks was doing with his goggles?

1:50

Oh my god, I can't imagine. There

1:52

it is. Sacks was speedrunning. He

1:54

was doing the speedrun on D-Day.

1:58

Absolutely getting in there. That's Saving Private

2:00

Ryan, right? Yeah, that's you. You were speed-

2:02

running Saving Private Ryan. Oh, I

2:05

got them too, yes. But I didn't record

2:07

myself. I didn't record myself. I maybe nicked

2:09

it. Oh, am I in there? Oh, look,

2:11

what am I doing? Oh, I was waiting

2:13

to come. You arrived off. Well, you're

2:16

telling me that they

2:19

didn't have the third or

2:21

fourth investor in

2:23

Uber, but

2:36

ringing the bell with them at the original moment. I

2:38

could tell you the backstory. I was invited

2:40

by TK to come to the

2:43

ringing of the bell. TK

2:46

was disinvited. So he was there on the

2:48

floor. It was very awkward. And

2:51

then they didn't have him go up and ring

2:53

the bell or be even there when Dara rang

2:55

the bell. It was because it was controversial. They

2:57

didn't they banned him. No,

3:00

it was really, it was really sad.

3:02

It was super sad. But anyway, everybody,

3:05

I hope you're enjoying your three hours there

3:07

or not. I didn't go

3:09

because okay, he was like, we're gonna have a

3:11

party. But we're not going

3:13

to be able to ring the bell. And it was just all

3:15

like very did you go to the party? I

3:18

didn't. I regret not going, but it was unclear if

3:20

there was even going to be a party or TK was going

3:22

to go because of all the drama. And I don't know if

3:24

you remember on CNBC, they were like, TK

3:26

is in the building, but he's not on the thing. And that

3:28

became a big brouhaha. But

3:31

yes, I missed my window. I mean, if you

3:33

hadn't known that this was going to be the

3:36

big exit in your life, really the only one,

3:38

then you would have made every

3:40

effort to attend everything, right? Absolutely.

3:42

Absolutely. Yeah. I mean, you're still

3:45

riding on that. Yeah. PayPal. No,

3:48

he got it. I was I was talking

3:50

to somebody about this the other day, most

3:52

people's careers, you have, like, you

3:55

know, it's not like a smooth upward

3:57

trajectory. There's like maybe a few pops that you

3:59

get if you're lucky. Actually, if

4:01

you get a few, because most people only have

4:03

one or maybe two. Yes. It's like a power

4:05

law. It's like anything else in venture, right? Absolutely.

4:07

Like I'm sure if you think back on like

4:09

the big outcomes in your life, it's not like

4:11

there's one every year and like some sort of

4:13

smooth gradient. It's basically

4:15

there's like one, two, or three

4:17

over the course of your entire career that you

4:19

remember. Right. Yeah, absolutely. Yeah. I

4:22

mean, it's, it's good. It's

4:24

good to consider that. Uh, because you have

4:26

to enjoy every sandwich. You

4:28

have to enjoy every day what you're doing because

4:30

those pops are out of your control. You don't

4:33

know when they're going to happen, how they're going

4:35

to manifest. So, but you

4:37

guys didn't get these, you didn't

4:39

get the goggles, but these goggles came out this week. I don't

4:41

know if you saw it. What I will say is that the

4:43

poker game, three of the guys, talking

4:45

about these goggles, Sammy

4:48

woke up at five in the morning on the

4:50

first day. He was, he got

4:52

them. And then Kuhn and Robl were telling me

4:54

that they just bought them randomly in

4:56

the next couple of days. And

4:58

it was easy. So those are civilians. There was no

5:01

like waiting in line. They were able to just order

5:03

them. That's what they did. Yes. But they're civilians. So

5:05

just to be clear, like in Silicon Valley, we're kind

5:07

of like, eh, whatever,

5:10

you know, we've, we've played with Oculus for so long, but

5:12

now these are out and there's a bunch of demos going

5:14

around. This one I thought was interesting. I don't know if

5:16

you saw this, but watching a basketball game, it actually is

5:18

quite compelling to have all the stats on the screen with

5:20

you. And you can kind of move them around. I

5:23

maybe could see myself doing this. I don't

5:26

know. Do you think you would do this watching a

5:28

game? Maybe. So are you actually watching

5:30

the TV through the goggles? Like what

5:32

part is real world and what part

5:35

is augmented? Obviously all the stats are

5:37

augmented reality. Yeah. Is this all

5:39

itself? Is that a stream?

5:41

That's a stream coming into the goggles. Yeah. It's not

5:43

your TV. So, you know, I

5:45

think he's just putting it on the TV there, but

5:47

you could literally be outside on your

5:50

deck. You could be on an airplane and do this. So

5:52

that's what you can see through the goggles, right?

5:54

Is that the idea? You can, you want

5:56

to, I think you can see through to the

5:58

goggles. Yeah. You know,

6:01

you can see the fact that you

6:03

live in a tattered apartment without a

6:05

girlfriend dressed poorly. You

6:07

can see the dishes in your sink. You can see

6:10

the cupboard being bare. You can

6:12

see the spoiled milk in your refrigerator, your

6:14

half-used bong that you use to soothe

6:16

yourself to sleep at night. Whatever all of

6:18

these 20 and 30 year olds do to

6:20

cope, you'll see all of that while

6:22

still being in an immersive environment. It's amazing.

6:24

Yes. I mean, it is so

6:27

dystopian. I just think like if

6:30

you take out your phone, your spouse is like, what are you

6:32

doing on your phone? Can you

6:34

imagine the audacity of being with your spouse or your family

6:36

and be like, hey guys, I'll be right back. And you

6:38

put the goggles on to watch the Knicks game or the

6:40

Warriors? I mean, I think that's

6:42

a snap divorce. I don't know. You said it very

6:46

well about the Oculus

6:48

one, which is you had a turn of phrase, which

6:50

I really like. But basically, it's like you

6:52

were very quick to try it and then

6:54

there was like a period and then you

6:56

lost interest. I think that's just going to be the

6:58

key thing with this. And

7:00

the fact that it costs $3,500, they have

7:02

to get the price down fast enough for

7:04

average folks to want and be able to

7:07

buy it. Right. Yeah. I

7:09

call this experience the try. Oh,

7:11

my. Goodbye. You try it. You're

7:13

like, oh, my God, this is incredible. And then you're like,

7:15

you put it in your drawer. You never use it again

7:17

because there's not an application. It's really a

7:19

proof of concept. Look, I think it's a

7:21

great thing for innovation that they're starting with

7:23

this, you know, like you said, expensive headset

7:26

that maybe is not that ergonomic, but

7:28

eventually it'll come down and the form

7:30

factor will be glasses or sunglasses. Facebook,

7:33

I don't know if the product's out yet, but

7:35

do you see their product where the Ray-Ban? Yeah,

7:38

it has the AI built into it and

7:40

you can ask it questions and it will do

7:42

computer vision and give you answers based on

7:44

what it's seeing. It was a

7:46

phenomenal demo. I don't know if that's actually

7:49

real yet, but do you see that

7:51

demo? Yeah, these are the Facebook

7:53

Ray-Ban glasses. And I do think these

7:56

are Meta's smart glasses. They work particularly well

7:58

taking pictures and sharing them on Instagram,

8:00

so they're kind of single function. What

8:02

Zuckerberg did is, he cribbed what

8:04

Evan Spiegel did a couple years ago

8:07

with Snapchat. Right, but he put out a demo

8:09

that was a little different. The demo was like

8:11

he was in his closet, he was wearing this,

8:13

and he had picked out a shirt or something

8:15

that said, give me, tell

8:18

me what I should match this with. Yeah,

8:20

what goes with a gray shirt that I've

8:22

worn for 14 years? Did you

8:25

guys see this thing where they all had to

8:27

testify in front of the Senate? Yeah. So

8:29

I guess I'll just jump right to that. It was like a

8:31

public flogging. Yeah. Once

8:33

again, a public flogging. And we'll just jump to it

8:35

real quick since we didn't want to go too deep

8:37

on that one. But Josh Hawley made Zuck turn

8:40

around and apologize to people

8:42

in the audience. It was brilliant. Yeah, so

8:44

this is, the name of this

8:47

hearing was Big Tech and the Online Child Sexual

8:49

Exploitation Crisis. Zuckerberg, he was questioned, along

8:51

with the CEOs of TikTok, Discord, X,

8:53

and Snap. But obviously,

8:56

this is all around kids'

9:00

online safety and also section 230, which

9:03

I think these senators are, this is one

9:05

of the few bipartisan moments, I think. They're

9:07

homing in on it. Yeah, they

9:10

realize this is kind of like a winning- They're gonna

9:12

take a run at this. They're taking a run at

9:14

it, for sure. They realize it's a winning ticket. Let's

9:17

just play the clip and then I'll get your thoughts,

9:19

Sacks and Chamath. Let me ask you this. There's families

9:21

of victims here today. Have you apologized to the victims?

9:25

Would you like to do so now? Well, they're

9:27

here. You're on national television. Would you like now

9:30

to apologize to the victims who have been harmed

9:32

by your problems? Show them the pictures. Would

9:35

you like to apologize for what you've done to these

9:37

good people? Because

9:42

everything that you've all gone through is

9:44

horrible and no one should have to

9:46

go through the things that your families

9:48

have suffered. And this is

9:50

why we invest so much and are going

9:52

to continue doing industry-leading effort to

9:55

make sure that no

9:57

one has to go through the types of things that your families

9:59

have suffered. Thanks. This

10:02

is a powerful moment. And for

10:04

Zuckerberg to get up and actually face the parents,

10:06

he turned around, he faces the parents, very dramatic

10:08

moment. And he apologized. Did

10:11

you respect him for doing

10:13

that? And you think that was like a powerful

10:16

moment as well? And where does this

10:18

all end up? This was a kangaroo

10:20

court. I mean, this was basically all

10:22

theatrics. This is basically a bipartisan moral

10:24

panic where all these senators are basically

10:26

grandstanding. And these are the same

10:28

types of accusations that we've been hearing for years. Remember,

10:30

this goes back to the whole Frances Haugen claims where

10:33

she says that Facebook wasn't doing enough to

10:35

prevent various kinds of online harms. And

10:39

I think that we're going to regret

10:41

where this all leads, because where it's going to lead

10:43

if they do repeal Section 230 is

10:45

towards greater censorship. All these companies

10:47

are going to spend even more resources

10:50

restricting what we

10:52

can say and hear online, which is

10:54

not the right direction. Do

10:57

some harms occur online? Yes. Do

10:59

I believe that Facebook is taking

11:02

substantial measures to stop them? Yes. I mean,

11:04

but edge cases are always going to get

11:06

through. When you're

11:08

operating at that kind of scale, there are going to be

11:10

these edge cases of kids who

11:13

got harassed or content that

11:15

shouldn't have getting through. It's just

11:18

part of the fact that the internet operates

11:20

at gigantic scale. And these harms

11:22

have always been out there. I think that these companies

11:24

do their best to try and stop them, but

11:26

they're always going to get through. And you

11:28

can't make every aspect of our

11:30

society perfectly safe

11:33

and harm free. Somehow we have this expectation that

11:35

we can eliminate 100% of every harm that occurs.

11:39

And I do think that these online companies have been

11:41

unfairly picked on in a sense. I mean, if you're

11:43

going to talk about these types of

11:46

harms, why aren't you targeting the music

11:48

industry for all their incendiary lyrics that

11:50

basically encourage all sorts of violent

11:53

or sexist behavior? Why don't you target

11:56

the advertising industry for creating unrealistic body

11:59

image expectations? Why don't you

12:01

target the Kardashians for setting

12:04

unrealistic expectations around image?

12:08

And you could go on down the list. I mean,

12:10

why don't you target Hollywood for releasing a show like

12:12

Euphoria, which is a hit? It seems

12:14

to me that the problem in our culture is not

12:16

coming from the edge cases. It's coming from the

12:19

mainstream entertainment that

12:22

is fully allowed and is popular and is

12:24

our hit shows and hit records and hit

12:26

products. I mean, that's where the

12:28

toxic pollution is coming from in our

12:30

culture. So to turn around and

12:33

now blame the online companies for creating all

12:35

of this, I think is just, basically,

12:37

they're being scapegoated. I mean, again, this is

12:39

a moral panic. Sacks, you think it's a

12:41

moral panic, or, you know, there have been

12:43

statistics and studies done about

12:46

what is viral on social media, the

12:48

algorithms targeting users, the addictive nature of

12:50

it. You spoke earlier about the addictive

12:52

nature of just gamification on watches. Social

12:55

media is a little bit different than music and

12:58

some of these other things because they have

13:00

these algorithms to increase watch time and

13:02

engagement. So I think that's what the other side

13:04

would say. What do you say? What do you say, Chamath? Where do

13:06

you stand on this? Let me just give a coda to a couple

13:08

of things that Sacks said. It

13:10

is true that we've taken turns attacking

13:13

other forms of media when they were ascending

13:15

in their popularity. So in the 1990s, if

13:17

you guys remember, politicians

13:20

and their censorship attempts around

13:22

gangster rap and NWA

13:25

and 2 Live Crew and certain songs.

13:27

Al Gore's wife, right? What was her

13:29

name? Tipper Gore. Tipper Gore, right.

13:31

Tipper Gore had a whole tirade against rap lyrics. What

13:33

was the NWA song? You know, **** the Police.

13:36

That whole thing just sent off a huge

13:38

fear about people potentially, David,

13:41

being motivated to kill cops or something, right?

13:43

In the 80s, there was a trial. Judas

13:45

Priest went on trial. Because

13:47

if you played one of their records backwards, supposedly

13:49

it promoted devil worshiping. Supposedly it was exactly devil

13:51

worshiping. I don't know if it was playing records

13:54

backwards, but if you did, then it promoted devil

13:56

worship. I think a kid committed suicide and so

13:58

they basically prosecuted Judas Priest for

14:00

it. So that's comment number one, which is

14:03

this is not new. And the

14:06

reason why social media is in the crosshairs is

14:08

because instead of having this really

14:12

diverse ecosystem of

14:14

many small players, you have three or four folks.

14:17

And so it's easier to bring

14:19

them up on stage and sort of pillory them. Second

14:21

is I actually thought that Zuck had a lot of moral

14:23

clarity, because it's like, that's a tough position

14:26

to be in. And the fact that he had the courage

14:28

to turn around and actually apologize to those people shows

14:31

he's trying to do the right thing. But the reality

14:33

is, and Sacks is right, if

14:35

you apply a very, very small error rate

14:38

to an incredibly large number, so they have a

14:40

network of three and a half billion people monthly,

14:42

right, or daily or whatever the thing is, even

14:45

if you say that there's one tenth of 1% of

14:50

an error rate, meaning things that are unintended,

14:52

well, that's three and a half million unintended consequences,

14:55

right? That's a lot of unintended

14:57

consequences. And so there's

14:59

this massive law of large numbers at

15:01

play. So what do we do, I

15:03

guess, is the question. And

15:06

I think that there is enough knowledge

15:10

that we have to know

15:12

that the ability for a 35-year-old

15:17

to use certain products today is very

15:20

different than the ability for a 12-year-old

15:22

to use that same product because of

15:24

where they are physiologically, right?

15:26

I think we all know that to be scientifically

15:28

true on the dimension of many products.

15:30

And I think what we need to decide as a

15:32

society is whether software

15:34

and electronic products fall into that

15:36

categorization. And if so, what

15:38

does it mean? So in the case of China, they

15:41

mandate top-down what products

15:43

can be used and how many hours

15:47

you can use them for. Specifically, video games you're

15:49

referring to. Yeah, and David's right, which is that

15:51

if we go there and we rewrite

15:53

the law, then there's gonna be

15:56

a different set of unintended consequences. That's gonna

15:58

create, I think, a much

16:01

poorer business landscape, frankly, to innovate, and a bunch of other things. And building on your comments,

16:10

there's clearly an age at which a kid shouldn't be on these systems, and an age where, maybe with some guidance, they can. And then I think

16:21

the third thing is around the Section 230 thing itself. I think that, Sacks, I'll give you a slightly different take. I don't think that the Section 230 rewrite is

16:32

gonna be broad and sweeping. What I noticed from a bipartisan perspective, by both Democrats and Republicans, is that the one single narrow issue that they all seem to align on is

16:44

not necessarily all of the different rules around censorship, but that the lack of liability for these folks should be released. And

16:56

I think that if you were to write a narrow amendment to Section 230 that said that social media companies, or other organizations that hit certain characteristics,

17:08

were more liable, where today they have no liability. I'm not saying that it's right, but my read of the temperature in that room was that that's the very narrow change in Section 230 that I

17:20

think they all seem to want to make. And so that seems like a very likely thing that will happen in the next two or three years. Can you unpack, for the audience who might not

17:29

know, how would they do that? Section 230 says, if you're a publisher, a common carrier, you're not responsible for what people post on your system: blogs,

17:40

web hosts, or social media companies. But where the social media companies move from being just a common carrier (you know, like a paper might be, or a website hosting company like WordPress or

17:52

Squarespace) is when they flip over and they have an algorithm and they start picking and choosing. So once you start doing editorial like the New York Times, and you have editors, then you're liable. If you're CNN, you're liable; if you're Fox, as we

18:00

saw in the Dominion case, that's where the liability comes in.

18:02

So I guess the question to you, Sax, is,

18:05

you know, at the end of the day,

18:07

now that we've seen these things at scale,

18:09

is there not an argument that when you

18:12

start editorializing through an algorithm and you start

18:14

promoting certain content, that you have some level

18:16

of responsibility like Fox News, CNN, or the

18:18

New York Times has, where would you

18:20

stand on that issue? Some liability if you're picking

18:23

winners and losers in terms of what gets promoted

18:25

in the system. Well, apparently, I'm

18:28

the last person in America who thinks

18:30

that Section 230 was a good idea

18:32

and a visionary piece of legislation that

18:34

actually enabled the creation of user-generated content

18:36

platforms. Just to kind of slightly modify

18:38

your description of how it works, I

18:41

would analogize it to a newsstand where

18:43

there's magazines on the newsstand, there are publishers, and

18:46

then there's the newsstand itself, which is a distributor.

18:49

If a magazine

18:52

engages in defamation, they're liable

18:54

for it, but the newsstand is

18:56

not. The newsstand can't be sued. So

18:58

the question is, when you have these

19:00

massive user-generated content platforms, are they operating

19:02

as a publisher or as a distributor?

19:05

And I think what Section 230 made clear is, look,

19:07

if you don't write the content, if

19:10

the content is generated by users, you're

19:12

a distributor. And that is, I believe,

19:14

the better analogy to make

19:16

for these huge UGC platforms. Now, at

19:18

the same time, what Section 230 said

19:22

is that if you take good Samaritan

19:24

actions to reduce things like

19:26

sex and violence on your platforms, then

19:29

we won't make you liable. Because what happens in a

19:31

lot of cases is that you

19:33

can waive your protection legally

19:35

by basically getting involved. And

19:38

so the legislation didn't want to deter these

19:41

platforms from taking, again, good Samaritan

19:43

steps. I think it's a

19:45

pretty good combination of legislation. And that's what

19:47

you see right now, is that Zuckerberg

19:49

doesn't want to let these edge cases

19:52

through. I actually believe that they are

19:54

taking huge efforts at scale. They have

19:56

40,000 people, to give him some credit.

19:58

There's 40,000 people moderating stuff. I mean,

20:00

edge cases do get through. And by the way, you have to wonder where the parents were when

20:06

all this stuff happened. I mean, they're acting like they're victims in the audience, and I'm sorry for their particular cases, but at the end of the

20:13

day, we do need the parents to step up here. If we want to have social media at scale, the parents have to play a more active role. But if you

20:19

go back to Section 230, I just think that Republicans in particular are gonna really regret

20:24

getting rid of Section 230, because it's only going to lead to more censorship. I think what they're going to do, if I had to

20:33

bet, is that they are going to write a very narrow amendment to that law. And during

20:37

some budget process or some other thing where you have a big Christmas tree bill, this will get in there. And

20:43

I think you will have bipartisan support that effectively removes the liability protection that these companies have. Is

20:51

that a small change, or does it gut the entirety of Section 230? I think, like, these companies will not be able to use

20:57

that. It's a massive change. Listen, there are plaintiffs' lawyers. The plaintiffs'

21:01

lawyers' bar, the trial lawyers' bar, is salivating

21:06

over the possibility. That's why this is gonna happen, you know? This

21:11

is America, we're so litigious, all of these personal injury lawsuits lined up in every jurisdiction in the States. And

21:18

here's the thing: because Facebook and all these other sites operate

21:22

across the entire nation, and across the entire world, they can be sued in every single jurisdiction if

21:29

you allow these types of lawsuits. Let me state my position

21:36

in as unopinionated a way as possible. Whether we like it or not, there

21:40

is an element of American capitalism that takes companies through

21:45

seasons. And there are seasons where you're growing, and then there are seasons where you're over-earning. And

21:51

then there are seasons where, if you see that you are over-earning

21:58

for a long time, the machinery of the economy comes and kind of pulls you back

22:03

down to earth. You've won too much

22:05

and you're perceived as too powerful. Yeah. And

22:07

I'm not saying this whether

22:10

or not it's right. I'm just saying if you look back

22:12

in history, these chapters

22:14

have been written umpteen times. And

22:17

I think, David, what you said is

22:20

the absolute single most important thing if

22:22

you had to figure out where this

22:24

was going to go is exactly that.

22:27

The plaintiffs' lawyers, the

22:29

class action lawsuits, the amount of

22:32

money that they think they

22:34

can extract and

22:36

they compare it to the amount of money that

22:38

they were able to extract in two different kinds

22:40

of cases. One was

22:43

tobacco and then the second was

22:45

pharma. And I think that they

22:47

look at this class of app and

22:50

the lack of empathy or the

22:53

lack of popularity that the leaders

22:55

of these companies have in

22:57

Washington as a reason why they will probably

22:59

be able to get this done to create

23:02

this. Again, I'm not saying I think that's

23:04

good. I'm saying that I think it's

23:06

likely. And I think when that does

23:08

happen, you will see a replay. Again, it'll

23:10

be slightly different in terms of how it

23:12

plays out, but exactly the kinds

23:14

of plaintiffs' lawsuits that we saw in pharma

23:17

and in tobacco. And I think it's going to play

23:19

out here. And, David, you are right. That hearing

23:22

to me was a setup for that.

23:25

Yeah. And if you look at this through

23:27

that lens, there will be

23:29

some sort of negotiation and it might be

23:31

age. Because when you look at

23:33

this, really what Americans are upset about is

23:35

the impact this is having on kids. We

23:38

have a limit for the age of smoking, vaping, et

23:40

cetera. And when these things happen, I think an

23:43

easy concession that Zuckerberg and others will make is, hey,

23:45

these products will be for 16 years old and up.

23:48

That's the age I think is appropriate. 15

23:50

or 16 seems to be. Well, he also

23:52

said, by the way, Jason, to your point,

23:54

Lindsey Graham was the one that brought this

23:56

up. And Lindsey Graham made the connection to

23:58

tobacco and also to firearms. And

24:01

then Mark at some point in there basically said,

24:03

well, listen, like, let's look at Apple and

24:05

Google. We should expect them

24:07

to do the actual age verification, not us.

24:10

Right. Because they have the devices and they have

24:12

the credit cards, etc. That's actually a very reasonable

24:15

thing for Americans to come to. You know, and

24:17

that breaks down a little bit, Sacks, in your argument,

24:19

and listen, there's no perfect analogies here. But

24:22

if there was a repeated offense of a

24:24

magazine on a newsstand, for example, if

24:27

there was some magazine that had underage, you

24:29

know, an adult magazine that had underage kids

24:31

in it, and people knew

24:33

that and a newsstand continued to carry

24:35

it, they would have liability for trafficking

24:37

in child pornography or whatever it happens

24:39

to be. And so the newsstand

24:41

does get some liability. So there's, again, no

24:44

perfect analogies here. But I think...

24:46

But Facebook and all these other sites are trying to remove the

24:48

child porn or whatever. I don't think much of that gets through

24:50

at all. You know,

24:52

I think maybe you have a better argument and

24:54

there is the argument that people make is that because

24:56

of the feed, they're making editorial judgments and that's

24:58

obviously not just distributing. However, my counterargument to that is

25:01

that the feed just gives you more of what

25:03

you want. I mean, it just looks at what

25:05

you're clicking on, what you're viewing, the time you're

25:07

spending, and they just give you more of that.

25:09

I don't think it's editorializing. Now back

25:11

to Chamath's point about this is the way things are

25:13

headed, that may well be right, but I think we're

25:15

going to regret it. I mean, first of all, the

25:18

Democrats' interest, one of their biggest donors

25:20

is the trial lawyers bar, and they

25:22

generally will support any legislation

25:25

that opens up causes of action, and that's

25:27

where this is headed. And what's going

25:29

to happen is, if they get rid of Section 230,

25:31

is that every time there's an

25:33

alleged harm that occurs, every time a

25:36

kid gets bullied or beat up in

25:38

school, every time something goes wrong

25:41

in their life, they're going to try and pin it

25:43

on social media and try and show that

25:46

they imbibe something on social media that led

25:48

them down this dark path. And

25:50

these types of companies are going

25:53

to get sued in every jurisdiction in America.

25:55

Recently, we've seen huge

25:57

judgments related to defamation, where

26:01

if you have say a

26:03

politically red defendant in a

26:05

blue jurisdiction, huge awards, I

26:07

think we could probably see the opposite as

26:09

well, that basically you'll start seeing blue defendants taken

26:12

on in red jurisdictions. We've

26:14

seen completely disproportionate judgments and

26:16

again around defamation, disproportionate relative

26:19

to the harm that actually took place.

26:21

You're going to see that on steroids if we get rid of Section 230.

26:24

Now, historically, it was the job of

26:27

Republicans to oppose Democrats on this stuff

26:29

because they knew that Democrats were shilling

26:31

for the plaintiffs' bar. Republicans

26:35

have not done that because they're so mad at

26:37

these social media companies for censorship. So remember when

26:40

I talked about Good Samaritan liability, these

26:42

companies created content moderation

26:45

to basically try and remove

26:47

the violent material, the pornographic

26:49

material, the sexual material, the

26:52

harassing material. But

26:54

in the process of doing that,

26:56

they started making political judgments and

26:58

they started engaging in political censorship.

27:00

And that has made the Republicans so angry that

27:03

they have now turned against these companies and they

27:05

are willing to remove Section 230. My

27:08

point is, I think Republicans at the end of the day

27:10

are going to regret that because if you remove Section 230,

27:12

it's going to open up this flood of

27:14

litigation. It's going to be a free-for-all. It will be a

27:16

free-for-all. And what's going to happen is

27:18

that these companies, just

27:21

driven by simple corporate risk aversion are

27:24

going to clamp down even more. I mean,

27:26

the content moderation is going to be even

27:28

stricter. And because the content

27:30

moderators in these companies basically are liberals, if

27:32

you empower them to take down even more

27:35

content, they're going to take down

27:37

Republican stuff even more. It

27:39

will be very easy for the plaintiffs

27:41

to target that type of content. They'll

27:43

say that, oh, that, you

27:46

know, all of that Republican or conservative

27:48

content that influenced people in a very

27:50

negative direction, that created all of these

27:52

harms, there'll be lawsuits targeting that

27:54

sort of content, and Facebook and others

27:56

will respond in the economically rational way,

27:58

which is to shut it down completely. So I

28:01

think senators like Josh Hawley are not going to

28:03

get what they want. They're not thinking

28:05

straight. Yeah, it's going to backfire. Chamath? The

28:07

reason why this is going to happen, if it

28:10

does happen, and you wanted

28:12

to try to be probabilistic about it, is

28:15

because when you look back at the tobacco settlement, the

28:17

original settlement in today's dollars is about $370 billion. If

28:19

you were the trial lawyers and you're

28:24

looking at a combination of Facebook

28:27

and TikTok and all

28:29

of that money, I suspect that

28:32

they probably think that the potential

28:35

that they can extract from these companies is

28:38

going to be multiples of that number. And

28:41

then as a result, their fees will

28:43

be between 20 and 50 percent of that. So

28:46

you're talking about hundreds of billions of

28:48

dollars of

28:52

revenue potential that

28:55

will motivate, I think, these folks to

28:58

get the law

29:02

changed, and then the

29:04

byproduct will not be framed in terms

29:06

of dollars. These are highly

29:08

kinetic issues when you're talking about sexual

29:11

exploitation and young people and mental

29:13

health and suicide and

29:15

bullying. These are very kinetic issues, right?
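[A rough sanity check on the incentive math Chamath lays out above. All inputs are his on-air estimates; the multiple is a hypothetical placeholder, since he only says recoveries could be "multiples of that number."]

```python
# Back-of-the-envelope version of the trial-lawyer incentive math
# described in the episode. Every figure here is an on-air estimate
# or a hypothetical placeholder, not a verified number.
tobacco_settlement = 370e9   # original tobacco settlement, ~$370B in today's dollars
assumed_multiple = 2         # hypothetical: social-media recoveries at 2x tobacco
potential_recovery = tobacco_settlement * assumed_multiple

# Contingency fees he cites: 20% to 50% of any recovery.
fee_low = 0.20 * potential_recovery
fee_high = 0.50 * potential_recovery

print(f"potential recovery: ${potential_recovery / 1e9:,.0f}B")
print(f"trial-lawyer fees:  ${fee_low / 1e9:,.0f}B to ${fee_high / 1e9:,.0f}B")
```

[Even at a modest 2x multiple, the fee range lands in the hundreds of billions, which is the "hundreds of billions of dollars of revenue potential" he describes.]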

29:17

And so bringing these to jury trials

29:19

all across the United States, I

29:22

think that they probably think that

29:24

they're on the right side of history in

29:26

winning those things. So, you know, again, I'm

29:28

not saying it's right. These are highly emotional issues.

29:30

I mean,

29:32

listen. It's going to win. But if people's kids... The stuff

29:35

is going to win. They're going to bring a case

29:37

of, say, teen suicide, okay? Horrible

29:40

that it happens, but

29:43

every time something like that happens, there's going to

29:45

be a huge temptation. I'm sure there'll be trial

29:47

lawyers who specialize in this, to bring

29:49

a case against Facebook or some other

29:51

social media company, and they're going to

29:53

scour through these accounts

29:55

and try to point to examples

29:59

that could have led to... to this result. And

30:01

the truth of the matter is that maybe

30:03

social media contributed a little, but what about

30:05

popular culture and popular entertainment? What about all

30:07

the messages? I'm not debating any of that.

30:09

I know, I'm just I'm pointing out what's

30:11

gonna happen. What about all the messages they

30:13

received? Not through

30:16

the edge cases that got through on social media, but

30:18

through the mainstream entertainment. I mean, all the shows they're

30:20

watching on television, all the music they're listening to, the

30:22

things that happened in their schools and day to day

30:24

conversations with other kids. But you can't

30:27

really sue any of those other things, but

30:29

you can sue social media. The

30:31

crazy thing about all of this is that all

30:33

of these lawsuits are funded in part

30:35

by these hedge funds who will do

30:38

litigation finance. And part of putting

30:40

together a well

30:42

performing litigation finance fund is underwriting

30:44

the probability of success. And

30:46

I think when you flow through the probabilities,

30:49

and you apply it to these companies, largely

30:51

because of their profitability and their

30:54

ability to over earn and

30:56

generate profits, I suspect

30:58

that Wall Street is

31:00

probably already involved. If not, they'll

31:02

probably get involved in due course.
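[The litigation-finance underwriting Chamath describes reduces to a simple expected-value calculation. The function and every number below are hypothetical, purely to illustrate the mechanic of "underwriting the probability of success."]

```python
# Hedged sketch of how a litigation-finance fund might underwrite a case:
# expected profit = win probability * the fund's share of the recovery,
# minus the capital committed. Illustrative numbers only.
def expected_value(p_win: float, damages: float,
                   fee_share: float, cost_to_fund: float) -> float:
    """Expected profit of backing one case under these assumptions."""
    return p_win * damages * fee_share - cost_to_fund

# Hypothetical case: 30% chance of winning, $10B in potential damages,
# the fund keeps 25% of any recovery, and commits $500M up front.
ev = expected_value(p_win=0.30, damages=10e9,
                    fee_share=0.25, cost_to_fund=500e6)
# Positive expected value -> the fund would finance the case.
```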

31:06

But it's an unfortunate thing, David, I agree, because

31:08

this is sort of like, hey, the rules on

31:11

the field were X, and folks operated

31:13

by those and they are clear, they are

31:15

trying to do their best. But again, this

31:17

is where capitalism, the part of capitalism that

31:20

can be awkward and uncomfortable is when industries

31:23

over earn for long periods of time, other folks

31:25

say, I'm going to compete away those returns somehow,

31:27

and I want a share of those profits. And

31:29

I think that that's going to be the large

31:31

motivator. And it's just going to

31:33

result in, I think, these things changing and a

31:35

plethora of lawsuits. And at the end

31:37

of the day, this is about children

31:39

and protecting children. So the obvious solution here

31:42

is society has to come up with a

31:44

number. Sadly, I think that could be a

31:46

veneer case. You know what I mean? Hold

31:48

on, let me finish my thought here. The key thing

31:50

is there's some age in which we all agree it's

31:52

reasonable for kids to be using social media. And there's

31:54

a certain age when we think it's not reasonable. And

31:57

back to capitalism, I think a very good point you made,

31:59

Chamath. These companies are going to

32:01

have to say, well, if we lose the 12 to 15-year-olds, is

32:05

that better for society and

32:07

better for our business? And we just all agree

32:09

that social media should start at 15 or

32:11

16, and then the handset manufacturers and

32:14

the social sites all have to get

32:16

permission from your parents to use them, period, full

32:18

stop. And that's it. And that may

32:20

be where this all winds up, I think. And then

32:22

also explaining the algorithms, I think, is the next thing that's

32:24

going to happen. People are going to

32:26

have to disclose how these algorithms work. I think you're making

32:29

a very good point, which is that is the right conversation

32:31

to have. My point was that

32:33

instead of having that conversation, which is more societal,

32:35

it involves, David's right, parental responsibility.

32:38

What is our role? Absolutely. By

32:41

being actively involved. And by the way, the trends around

32:43

family formation and the fact that there's way

32:46

more single-parent families make this problem even harder

32:48

because now there's only one person to check

32:50

in and not two people to check in.

32:53

So all these things societally build on themselves. And

32:55

for that reason, that is the absolute right

32:58

conversation to have. My point is that's not going

33:00

to be why the rules

33:02

need to get rewritten. The rules will get

33:04

rewritten because there's an economic argument by a

33:06

different sector of the economy, in this case,

33:08

the trial lawyers and other folks, that say

33:10

there's a trillion dollars to be had if

33:12

we get this law changed. They are motivated

33:14

enough to do that. Yeah. A

33:17

parasitic sector. So there is a bill

33:19

right now working its way through the

33:21

California Assembly that would go to Gavin

33:23

Newsom that would prohibit the use of

33:26

social media by under 16-year-olds. So that

33:28

is actually happening. Yeah. I

33:30

agree with you. That's a better debate to have than changing

33:32

Section 230 in a way that's going to lead to more

33:34

censorship. By the way, just on that point, I

33:37

think Republicans need to understand this in particular

33:39

is that the anger

33:41

towards these companies is bipartisan. The

33:44

outrage is bipartisan. The moral panic is

33:46

bipartisan. You saw in that

33:48

hearing, you couldn't really tell the difference between

33:50

Republicans and Democrats. Okay. Full

33:53

display. I wouldn't be surprised if

33:55

they agreed, like Chamath said, on

33:57

some sort of change to Section 230. The problem is

34:01

that Republicans and Democrats have fundamentally

34:03

different objectives. Fundamentally, Democrats

34:05

want there to be more censorship. They say

34:07

this all the time. We want you taking

34:09

down more content, not less. Republicans want there

34:12

to be less censorship, okay? So if they

34:14

agree on the same piece of legislation, only

34:17

one of them can be right, okay? And

34:19

the question is, who's gonna be right? My

34:22

guess is that if Democrats and Republicans agree

34:24

on a Section 230 modification, the Democrats know

34:26

what they're doing, and the Republicans generally

34:28

being the stupid party who get outsmarted all the

34:30

time by Democrats, are going to agree to something

34:33

that they later regret. So at the end of

34:35

the day, I don't think that bipartisan

34:37

legislation should be possible. The

34:39

anger is bipartisan, but the objectives are

34:41

not, and if something gets through, it's

34:43

going to be because Republicans make a huge mistake. And

34:46

just to give a tangible example of this, okay, take

34:49

the Second Amendment, okay? Do

34:51

you think that in this world where there's no

34:53

Section 230, that Republicans are

34:55

still going to be able to have conversations online

34:58

about, say, gun enthusiasm? No

35:00

way, because every time some harm

35:02

happens, every time there's a shooting

35:04

of some kind, a plaintiff's lawyer

35:08

is gonna sue, not the person who

35:10

posted the content, talking about how

35:12

much they love their guns, and you

35:14

know, look, it could be totally innocuous, okay?

35:16

It could be a forum on

35:19

Facebook or Reddit where people are just

35:21

having conversations about gun reviews,

35:23

Yeah, it could

35:26

be totally innocuous conversations, okay? People

35:28

having the right kind of conversations about guns,

35:30

okay? But you know

35:32

that every one of those

35:35

websites that hosts those conversations

35:37

will be targeted in relation to any

35:39

harm that occurs in the real

35:42

world. And very soon, Reddit and

35:44

Facebook and all the rest will

35:46

feel compelled to ban any conversation

35:48

related to even Second Amendment rights. Okay,

35:51

this is where it will lead if Republicans get rid of

35:54

Section 230. You will be

35:56

the ones targeted, not liberals.

35:58

All right, great discussion. You

36:00

know, it's important to have the right discussion. This

36:02

reminds me of the abortion discussion where nobody would

36:04

ever talk about the number of weeks. Like,

36:07

that's at the core of the issue. If we could agree

36:09

on the number of weeks, we can agree on the age

36:11

here, you know, for kids to use these things, maybe we

36:14

can move forward. Let's move forward on the

36:16

docket here. We got so much to talk about. And

36:18

I think the number one story of the

36:20

week was Elon's pay

36:23

package and this ruling that occurred in

36:25

Delaware. Let me just tee this up here.

36:27

Many of you probably know about this

36:29

already, but in 2018, Tesla's board approved

36:32

a performance-based compensation package for

36:34

Elon. It was approved by 73% of

36:36

shareholders. Elon

36:39

and his brother, Kimbal, would have put that at

36:41

80%, but they were excluded, obviously.

36:43

This is lower than maybe some other

36:46

support levels. According to Reuters, you

36:49

typically see 95% for

36:51

an executive compensation package. But

36:53

this one was very unique. It was all stock. There's no

36:55

cash bonus, no salary. 12 tranches of

36:58

stock was very creative in how this

37:00

was put together because Elon got nothing if

37:02

he doubled the value of Tesla, but then if he

37:06

increased the value of the top line

37:08

revenue and the market cap increased

37:11

by $50 billion, he

37:13

got 1% more of the outstanding shares, which

37:15

is an amazing deal for shareholders, obviously, because

37:17

the market cap on the company went up

37:20

50 billion. The initial plan

37:22

was only worth about 2.6 billion,

37:24

but since Tesla crushed it from 2018 to 2023, we'll

37:27

throw up a chart here. It's one of the great runs

37:29

in the history of capitalism, how

37:31

revenue and sales grew at this company. So

37:35

it made it the largest comp package in the history of

37:37

public markets. And if you compare Tesla

37:39

to Apple, the second highest increasing stock price

37:41

during that same time period, Apple went up

37:43

345%. Tesla

37:45

went up 800%. In 2018, a

37:48

Tesla shareholder sued Elon and Tesla's board,

37:50

claiming the pay package was unfair. The guy had

37:53

nine shares, a full nine shares, not 10,

37:55

nine, worth $2,500. His

37:57

stake went 10X in those six years. So he made a

38:00

fortune on that bet. And

38:02

then on Tuesday, a Delaware judge voided Elon's pay

38:04

package siding with the investor. Elon

38:07

can appeal it to

38:09

the Delaware Supreme Court. Sacks, your thoughts on

38:12

this ruling? I teed it up. I think I

38:14

got all the details in there. If I

38:16

missed any, please add them. Well,

38:19

I think in order to reach

38:21

this ruling, the judge had to find three

38:23

things and all of them had to be

38:25

the case in her opinion. Number one, that

38:27

the pay package was excessive. Number two, that

38:30

the process by which they came up with

38:32

the pay package was not

38:34

fair, meaning it was not sufficiently adversarial

38:36

enough that the directors in

38:38

her opinion had too many ties to

38:40

Elon and didn't again take

38:43

enough of an antagonistic

38:47

role in negotiating that package. And

38:49

number three, and I think most importantly, that

38:52

the shareholder vote was invalid because even if

38:54

the first two had been true, the shareholders

38:56

approved it. And that would have been good

38:58

enough, but she says that the shareholders weren't

39:01

sufficiently informed. And

39:03

specifically, I think this argument hung on a

39:05

few internal emails where people said that they

39:07

thought that they could hit the numbers. I

39:10

think that of the three legs of

39:12

this, well, all three have been challenged

39:14

by opponents of this verdict. I mean, number

39:17

one, yes, the pay package ended up being

39:19

a gargantuan amount, but you have to look

39:21

at it ex ante, not ex post. Nobody

39:23

thought Elon could hit all these numbers back

39:26

at the time this package was negotiated. Let's

39:28

be frank. It was absurd. The

39:30

idea that he would 10x it... There was

39:33

a great clip with Andrew Ross Sorkin, where they're all

39:35

laughing at the idea that he's going to hit these

39:37

numbers. Remember, this is at a time when Elon was

39:39

going through what was called production hell, where he was

39:41

sleeping on the floor of the factory. Yeah, with their...

39:43

The Model 3 hadn't come out yet, and nobody, nobody

39:45

believed that the Model 3 was going to

39:47

be the hit that it was. In fact, all

39:49

the stories were poo-pooing that idea and basically saying

39:51

that Tesla is basically screwed because they can't get

39:53

the production line working correctly. Yeah.

39:56

You want to play this? Tesla now announcing a

39:59

radical new comp plan. It could be

40:01

perhaps the most radical compensation plan in

40:04

history. The executive will receive no

40:06

guaranteed compensation of any kind at

40:08

all. He gets no salary,

40:10

he gets no bonus. He only

40:12

gets equity that vests over time, but

40:14

only if he reaches these

40:16

hurdle rates which are, dare I say,

40:18

crazy. The only part of it that

40:21

I think is really relevant is where

40:23

Sorkin says that the milestones are crazy,

40:25

meaning that everyone thought it

40:27

was a pipe dream that the

40:29

company would ever hit these numbers. Okay, so that's

40:32

point number one on magnitude. On

40:34

the second part of the ruling about the

40:36

process, it is true that

40:40

like in most venture backed startups, there's

40:42

a long standing relationship between the founder

40:44

and the investor because they work collaboratively

40:47

to try and make the company a

40:49

success. There were emails that

40:51

came out where Elon says, I'm

40:54

not trying to go for the maximum here. Did

40:56

the investors go in with a hostile antagonistic attitude that

40:58

we're going to try and pay you the least? No.

41:02

But did Elon go in with the attitude that I'm going to try and take

41:04

the most? No. The email showed that.

41:06

What they tried to do was come up

41:08

with something that they thought was fair, that

41:11

would fairly reward him for

41:13

outsized performance. And if

41:15

he had merely increased the value of Tesla

41:18

from 59 billion to 100 billion, he would

41:21

have gotten nothing. Let's just keep that in

41:23

mind. So they tried to come up with

41:25

something that would reward him for outsized performance and

41:27

give him absolutely nothing for merely decent

41:30

or good performance. The

41:32

third point about the

41:34

shareholder vote. I don't think

41:36

that there was anything about

41:38

the shareholder vote that the shareholders didn't know. I

41:40

don't think that the company didn't release. Elon

41:43

always said, yeah, we're going to do this. We're

41:45

going to be one of the most valuable companies

41:47

in the world. He's always been super optimistic about

41:49

their ability to reach these targets. But if you

41:51

looked at all the Wall Street analysts, including Sorkin

41:53

there, they thought

41:55

that these targets were unreachable.
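[A rough sketch of the tranche mechanics described earlier in the segment: 12 tranches of roughly 1% of outstanding shares each, unlocked at $50B market-cap steps from a first $100B milestone. The real plan also paired each step with revenue/adjusted-EBITDA milestones, which this toy model ignores.]

```python
# Toy model of the 2018 Tesla award as described on the show.
# Market-cap milestones only; operational milestones are omitted.
BASE_MARKET_CAP = 59e9   # market cap around plan approval, per the episode
FIRST_MILESTONE = 100e9
STEP = 50e9
TRANCHES = 12
TRANCHE_PCT = 0.01       # each tranche ~1% of outstanding shares

def tranches_earned(market_cap: float) -> int:
    """Tranches unlocked at a given market cap (milestone math only)."""
    if market_cap < FIRST_MILESTONE:
        return 0
    return min(1 + int((market_cap - FIRST_MILESTONE) // STEP), TRANCHES)

# Going from $59B to just under $100B pays zero tranches, while the
# final milestone sits at $100B + 11 * $50B = $650B, roughly a 10x.
final_milestone = FIRST_MILESTONE + (TRANCHES - 1) * STEP
```

[This is why the hosts call the plan asymmetric: merely good performance from the $59B base pays nothing, and the full payout requires roughly a 10x.]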

41:58

Well, also to add to that, Sacks. This

42:01

had the largest short position, I believe, at

42:03

that time of any company ever. People

42:06

were betting with their dollars that this company was going

42:08

to zero. There were a ton of

42:10

people who the narrative was, they'll never deliver the

42:12

Model 3. It was two years late, right, or

42:14

something in that range. They

42:16

kept trying to get the Model 3 out. It was taking forever. So,

42:19

yeah, it's absurd. Also in that case,

42:21

Sorkin said in that same clip, there's been a lot

42:24

of speculation of Elon stepping down after the Model 3

42:27

is in production. In the judge's ruling, she said the

42:27

exact opposite, that there was no risk of Elon

42:29

leaving, except that he was running like

42:33

two or three other companies. It was actually quite

42:35

possible that he would leave. He never wanted to

42:38

be CEO of Tesla. People forget that, too. He

42:40

had tried three CEOs of Tesla, and

42:42

he only took over Tesla, I remember,

42:44

because he said, Jason, this

42:46

thing's going to fail if I don't take it over. He

42:49

tried three different CEOs in the beginning. People

42:51

forget that fact. And

42:53

there was Scuttlebutt that he would hire somebody. I

42:55

remember there was rumors about Sheryl Sandberg maybe getting

42:57

offered the job or something like

42:59

that. I remember he was going through production

43:02

hell. He was sleeping on the factory floor. And

43:04

he was talking in interviews about how miserable his

43:06

life was at that time. I can confirm.

43:09

Yeah, exactly. So, look, did he have leverage to

43:11

basically say, you know what, let's hire a CEO?

43:13

Yeah, absolutely. Okay, Shamath, any

43:15

steelman you can do of the other

43:18

side here, like as an investor, public

43:20

market investor sometimes? When

43:22

you saw that pay package, what did you think? I

43:25

don't know if you were a shareholder of Tesla at the time

43:27

or not. In the mid teens,

43:30

I started investing in

43:32

public stocks as well

43:34

as private tech companies. And

43:37

I got invited to give a presentation

43:39

at the Ira Sohn conference, which is like

43:41

the most prestigious conference

43:43

of public market investors. In May, you show

43:45

up at the Lincoln Center,

43:48

and everybody in the audience is paying like $10,000

43:50

a ticket or something. And all the proceeds go

43:52

to a foundation

43:55

in support of this gentleman, Ira Sohn, who passed away.

43:58

But in any event, it's like, Ackman... And

44:04

I picked Tesla. And I was a very big

44:06

supporter in many ways and still

44:08

am. And

44:11

I think that I knew the company, frankly, better

44:13

than most people, except for him,

44:16

obviously. But I think that I studied

44:18

this company quite deeply. When I, and

44:20

I'm just setting the context, when I saw the

44:22

pay package, I thought, he's

44:25

making a mistake. This is unachievable.

44:29

I thought the probabilities were in the

44:31

low single digits. And

44:34

then he did it, which just

44:36

kind of shows how incredibly adept

44:38

he is as a CEO

44:41

and a manager and an executor.

44:46

So then, you know, to go back five or six

44:48

years later, after he actually

44:50

does something that

44:53

so massively disproportionately

44:56

positively impacted investors, and

44:59

then to just rescind it and unwind it,

45:02

I think is really un-American and unfair.

45:05

And I think it sets a very poor

45:07

standard for why anybody

45:09

should actually build a company governed in

45:11

Delaware. It makes no sense anymore. And

45:14

just to give you that example, he and

45:16

I have both now done this, but like these

45:18

incremental companies that I've started are

45:21

in Nevada. They're in different places

45:23

because I find the Delaware court

45:25

slightly and increasingly unpredictable

45:29

and acting with other mandates that they

45:31

weren't ever given. So what

45:33

do you think that mandate is? You had

45:35

a place where there

45:37

was highly predictable governance, and

45:40

they had very narrow ways in which

45:42

they would act and opine.

45:46

And I think in a situation like this, where

45:48

you had every opportunity to

45:51

actually vote this thing down, and

45:54

what little of the documentation that I saw

45:56

about the communication back and forth doesn't

45:59

seem to support her theory that

46:01

he rammed it through. Nobody rams anything

46:03

through over nine months where he

46:05

takes months-long breaks and

46:08

tells the GC, this is actually more than

46:10

I wanted. Nobody does that. Yeah, that's

46:12

the opposite of ramming something through. Right.

46:15

And I suspect if you really

46:17

asked him, and I haven't, but I would guess, he

46:20

probably thought it was

46:22

largely crazy. And so I think

46:24

a lot of people thought that week that we,

46:26

as shareholders, and I'll tell you that I

46:28

felt this way, were getting his

46:30

hard work, and that he may

46:32

have just mathematically mispriced it. So yes,

46:35

he got that package, or

46:37

a package worth fifty-five point eight billion,

46:39

but you're missing the point that every

46:42

other investor made five hundred billion dollars.

46:45

Right. And investors as small as this guy

46:47

did approve it. It will have a chilling

46:49

effect, I think, on how people think about

46:52

compensation. It will cause companies to

46:54

be even more constipated and

46:56

sclerotic and unimaginative

46:59

as a result of this, because

47:01

the most talented individual

47:03

entrepreneurs now have even

47:05

more of an incentive for incorporating

47:08

in other places and also

47:10

staying private. And I

47:12

think what that deprives is

47:15

the broader shareholders, including this gentleman. Look,

47:17

we live in America. He had the

47:19

right to sue. He had nine

47:21

shares and he was able to

47:23

bring this lawsuit. Nine shares. Imagine.

47:26

David and Goliath. Yeah, and the idea that

47:28

his nine shares ten-x'd their money, but,

47:31

you know, it's unimaginable.

47:33

I don't know if he remained a shareholder

47:35

through this whole period, but even those nine shares

47:38

ten-x'd in value.

47:40

But he would now

47:42

be deprived of that in the next

47:44

iteration of the Elon Musks, because

47:46

why would they ever go through this,

47:48

put that much work

47:51

into something, to be so at risk

47:53

personally, your own mental and physical health?

47:55

We saw him in those periods. And

47:59

then to have it taken away, I think, is

48:01

deeply, deeply unfair. Sacks? You're

48:03

right. This deal was a win-win.

48:05

I mean, if Elon could achieve these numbers,

48:07

it was good for him and it was

48:09

great for shareholders and that's why I think

48:11

the key point is that 73 or

48:15

80 percent, depending on how you want to count it,

48:17

approved this deal. I think they knew

48:19

everything relevant that they needed to know when they

48:21

approved it. This is

48:23

the deal that most shareholders and most

48:25

companies would want for the CEO.

48:28

The deal is you get nothing unless

48:31

you deliver an outsized return for shareholders.

48:34

Most CEOs won't sign up for this deal. Most

48:37

CEOs work their way up through the corporate ladder.

48:39

They get into the CEO chair and then they

48:41

pay themselves huge amounts of money regardless of

48:43

whether the company succeeds or fails. That's

48:46

the deal they want because they don't really have confidence

48:49

in themselves to deliver what

48:51

Sorkin called the crazy outcome.

48:54

Elon had the confidence in himself to deliver the

48:56

crazy outcome and nobody was really complaining.

48:58

Nobody was complaining about this until, like you said, this

49:01

small shareholder who's really basically

49:03

just a name

49:05

planer for the trial lawyer's bar or somebody

49:08

who wants to get Elon to bring this

49:10

suit. Six

49:12

percent to create 600 billion

49:15

in value. I mean, it's quite a bargain,

49:17

folks. I think if you went to Ford

49:19

or GM and said, hey, would you like Elon to be

49:21

your CEO? I think they'd offer them

49:23

half the company. Mary Barra doesn't want this deal.

49:26

Yeah, no, J-Cal, Sacks said something really important

49:28

that you just mentioned as well, which is that

49:30

if you actually look at the average compensation plan

49:32

of most public company CEOs,

49:35

it actually is very much counter

49:37

to shareholder value. I'll give you

49:39

one simple example. If

49:41

you look at the number of CEO comp packages

49:43

that are tied to earnings per share growth, but

49:47

then if you actually look at how these

49:49

CEOs achieve their EPS targets, they

49:52

do it by raising debt. So, indebting

49:54

the company, increasing the enterprise value

49:56

by loading the company up with

49:58

debt and then driving repurchase plans. And

50:02

what do those do? I mean, look, if you look at

50:04

Disney, where do their repurchases come from? From debt. So

50:06

does debt help an equity shareholder?

50:09

It categorically does not. Under no world

50:11

does it do that. However,

50:13

for the CEO and for the handful

50:16

of investors that can hold on for

50:18

long periods of time, or what have you, they

50:20

benefit from a lower share count, they benefit from

50:23

increased EPS, and then the CEO gets compensated. And

50:26

so to say that tacitly, what

50:28

you disapprove of are

50:30

performance incentives, and what you

50:32

are actually approving of are

50:35

mechanics that saddle a

50:37

company with debt and allow

50:40

basically gaming of numbers is

50:43

what you've implicitly also said.
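[The debt-funded-buyback mechanic Chamath is describing is easy to see with toy numbers; everything below is hypothetical.]

```python
# Toy illustration of hitting an EPS target via debt-funded buybacks.
# The business earns nothing extra; only the share count changes.
earnings = 1_000_000_000        # $1B net income, unchanged throughout
shares_before = 1_000_000_000   # 1B shares outstanding
price = 10.0                    # $10 per share

eps_before = earnings / shares_before   # $1.00 per share

# Borrow $2B and repurchase shares at the market price.
debt_raised = 2_000_000_000
shares_bought = debt_raised / price     # 200M shares retired
shares_after = shares_before - shares_bought

eps_after = earnings / shares_after     # $1.25: the EPS "target" is hit,
# yet the equity now sits behind $2B of new debt, and an EPS-linked
# comp plan would pay out anyway.
```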

50:46

And this is where I think the Delaware

50:48

court used to be

50:50

known for a level of

50:52

intellectual clarity that would have prevented that

50:54

implicit assumption. But that's now what's left

50:56

on the table. And I think it will have

50:58

a ripple effect across how

51:01

so many other companies design their

51:03

compensation plans, how CEOs think about

51:05

risk. No CEO, as

51:07

Sacks said, will ever want

51:10

an incentive-laden plan like this, ever. They will want to

51:12

get- Right, because 10 years later, you could do all

51:14

the work and then it gets canceled. Canceled.

51:17

So why would you ever do it? Instead, you'll want

51:19

something that is totally gameable, right,

51:22

where you'll have 90 plus percent

51:24

support and approval because

51:26

of how vanilla and benign it is

51:28

on the surface, but

51:31

it will actually be quite a terrible

51:33

plan underneath the surface. And what I

51:35

mean specifically are these EPS targets for

51:37

CEOs. So Elon did the one

51:39

thing that was crazy, which was I'm just going

51:41

to do it based on pure profitability and performance,

51:45

and he gets punished. And all these

51:47

CEOs in this other class were like, let

51:49

me saddle these companies with debt that actually

51:52

undermines shareholders, and they have been rewarded.

51:54

There is already a mechanism for somebody who disagrees

51:56

with the comp package. This person who owned the nine

51:58

shares could have sold their shares; it's a

52:00

liquid market. At any point in time, that person can

52:02

say, I don't agree with this. I'm taking my nine shares

52:05

and $2,500, I'm going to put it in Apple. I'm

52:07

going to put it into, I like Tim Cook's

52:09

pay package better, I like Benioff's package better and

52:12

I'll make my best bet. The person had choice,

52:14

yes, Sacks? This

52:16

person was not a victim. He's not a victim

52:18

because he could have sold the shares and bought

52:20

other shares, and he's not a victim because he

52:22

10Xed his shares and beat the market. Yeah. I

52:24

mean, look, I don't place the blame on the

52:27

shareholder per se because

52:29

this is really about a judge's interpretation of Delaware

52:31

law and what companies are allowed to do. So

52:35

whether the shareholder was harmed or not

52:37

or had one share or a million shares,

52:39

that's just the way that this

52:41

case gets into court. The question is the interpretation

52:43

of Delaware law. And again, the part of

52:46

this that I would go back to that I

52:48

think was the mistake is that I think the

52:50

shareholder vote was valid. I

52:52

think the process was valid as well. I don't

52:54

know that the process has to

52:56

be this extremely adversarial process where one side's

52:58

pulling for the most and one side's pulling

53:00

for the least. I don't think either side

53:02

operated that way. But again, I think shareholders

53:04

knew what they needed to know. And the

53:06

evidence of that is in all

53:08

of the public coverage at that time, nobody

53:11

thought Elon was getting a good deal, right?

53:14

No one thought he was getting an excessively good deal. They thought he was

53:16

getting a delusional

53:19

deal, meaning delusional for him. And

53:22

everybody seemed to be okay with the

53:24

idea that if somehow Elon

53:26

could pull off this miracle, that he would

53:28

be entitled to this compensation, and he would

53:30

get nothing if he didn't. Now

53:33

again, I would go back to, do you

53:35

think Mary Barra would have wanted

53:37

this deal? While Elon was

53:39

spending the last five, six years

53:42

making Tesla go 10x, let's look at GM.

53:45

GM stock price was trading more or less

53:48

in a flat range. I

53:51

don't even think the share price doubled. Yeah,

53:54

compare it to you see the compare to button there,

53:56

but compare to and then put Tesla in there. Right.

53:58

So if Mary Barra at GM had signed up for that

54:00

comp package, she would have gotten absolutely nothing,

54:02

which is why I'm sure that the thought

54:04

never crossed their minds of having an

54:07

all incentive-based comp package that doesn't

54:09

even start until you at least double the

54:11

value of the company. By the way, on those

54:13

milestones, it wasn't just

54:16

the share price. It was share price

54:18

and revenue or profit targets being met.

54:21

So in other words, if the stock

54:23

just rallied because of macroeconomic conditions, like

54:26

for example, interest rates go down and then all

54:28

of a sudden the whole stock market goes up,

54:30

that was not good enough. It was also tied

54:32

to the combination of stock

54:35

value increases with revenue

54:38

and profit targets being met. It was a comp package

54:40

that couldn't be gamed. I mean, you

54:43

have to hit the numbers in order to get the

54:45

comp package. That's what she got. And I don't want

54:47

to pick too much on Mary Barra here. I guess

54:49

I'm picking her out because Joe Biden said that she

54:52

created the EV revolution. Well

54:54

done. They sold 17 cars. They

54:57

canceled the car. Congratulations.

54:59

Yeah. I mean, by the way,

55:01

there was a remarkable article in the Wall Street Journal

55:03

just in December where they

55:06

finally admitted that this whole idea

55:08

that GM had been leading any kind

55:10

of revolution or had been a transformational company

55:12

was revealed as basically a ruse. But

55:16

look, you have to wonder how much

55:18

of this is political. I mean, Delaware

55:20

is Joe Biden's state. He's the senator.

55:23

There were articles describing

55:26

how this judge was connected to

55:28

a law firm that had helped

55:30

Joe Biden get elected. And you

55:32

just kind of wonder whether Biden's

55:34

directive from the White House podium that we got

55:37

to get this guy or we got to look

55:39

into this guy, that he's an enemy, which

55:42

has been reflected through all of

55:44

these different administrative agencies' sudden actions

55:46

against Elon's companies for the Glass

55:49

House. Starlink and the FCC.

55:52

Yeah. There's been a whole bunch of these

55:54

issues. And you just wonder, is this another

55:56

manifestation of that? Yeah.

55:58

I mean, the... The conspiracy

56:01

theories are quickly coming

56:04

closer to reality. And we definitely need

56:06

to investigate that for sure, because the

56:08

FCC thing is crazy: spending

56:11

$15,000 putting fiber into people's homes

56:13

when you could spend $1,500 giving them Starlink

56:15

makes no sense. And those are the same people who

56:18

have to wait for fiber, which, as I said in a previous

56:20

episode, doesn't matter: they'll get Starlink anyway.

56:22

They're going to buy Starlink while they're waiting for

56:24

the government fiber for 10 years. It's absurd. I

56:26

do want to uplevel this just back to what

56:29

I was saying. And I'll try to make the

56:31

point better. I think it's

56:33

really, really unfair what's

56:35

happening to Elon. But I

56:37

want to take a step back

56:39

and think about just the bigger picture. If

56:43

we want an

56:45

economy of vibrant companies that do

56:47

great things, we're going

56:49

to need to reward people to work at those companies.

56:53

In order for the United

56:55

States to continue to exert some

56:59

amount of dominance in the areas that we think are

57:01

important, we need to be economically

57:03

vibrant. And fair. And

57:05

fair. And the problem is that this

57:07

really perverts incentives. And it's going to exacerbate a

57:10

trend that I think has actually held a bunch

57:12

of our companies back. The first

57:14

thing I just wanted to show you guys was just this little

57:17

thing that it's just a pie

57:19

chart that shows, OK, how do CEOs

57:21

get paid? So

57:23

we want CEOs to go and

57:25

run really important companies. Right?

57:28

We want those companies to do great things in

57:31

the world. We want these CEOs to be deeply

57:33

motivated to go and push the boundaries of what's

57:35

possible. Right? We want all that. And

57:38

so we want to compensate them to do those things. This

57:40

is just a representation of how CEOs have

57:42

structured their pay packages. And you'd

57:45

say, wow, all of these numbers seem reasonable.

57:48

Return on capital, total shareholder return, earnings

57:50

per share. So what you

57:52

need to do then is double click into this. Right?

57:55

So this is how CEO pay packages

57:57

are made. and

58:00

this is my problem with a bunch of these

58:02

companies, all the returns,

58:04

all that shareholder return that you saw,

58:06

the return, it's all driven by share

58:08

buybacks. This is an

58:11

artificial gamesmanship of performance. This is

58:13

not companies pushing the boundaries,

58:15

you know? This is not Disney

58:17

figuring out. It's not innovation. This

58:20

is Disney creating foot faults and

58:22

falling into potholes of their own

58:24

making. But you can drive

58:26

great compensation just because you can game the

58:28

way that you are paid. This

58:31

doesn't make America great.

58:34

It doesn't create American exceptionalism. In fact,

58:36

it just creates a bunch of financial

58:40

engineering that results in

58:42

marginal companies. And David just gave an example

58:44

of one that could be considered that. So

58:48

the point is that when you have one person

58:50

that tries to buck this trend, I

58:53

just think it has a huge impact

58:56

by basically saying, hey, play the game

58:59

like everybody else. Just game

59:01

it. Just dial it in from your country

59:03

club. Make sure that you become

59:05

a member of Augusta. That's more important

59:07

to us than actually sleeping on the factory

59:09

floor. And don't take risk. I mean,

59:11

if you think about what do you do if you're Apple,

59:14

you're Google, you're Microsoft, you're sitting on tons of cash.

59:17

The safe thing to do, you see what they do. You

59:20

just buy back the shares. You

59:22

can't do M&A. You issue debt. Yes,

59:24

and then you buy back the shares. You

59:27

encumber the shareholders with

59:30

debt, and then you

59:32

artificially inflate total shareholder return and

59:34

earnings per share and the return on invested capital

59:37

because of how we can play these games in

59:39

America. So right now what we

59:42

are doing is we are not motivating CEOs

59:45

to run great companies. We're motivating

59:47

CEOs to understand financial arbitrage.

59:49

The result will be crappier

59:52

companies that diminish American

59:55

exceptionalism. That is the only outcome.
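The loop Chamath describes (issue debt, buy back shares, report higher EPS) is simple arithmetic. A toy sketch, with entirely hypothetical numbers:

```python
# Toy illustration (hypothetical numbers) of how a debt-funded buyback
# lifts EPS without any operating improvement.

def eps(net_income: float, shares: float) -> float:
    return net_income / shares

net_income = 1_000.0      # earnings, unchanged throughout
shares = 1_000.0          # shares outstanding
price = 10.0              # share price

print(eps(net_income, shares))            # 1.0 EPS before

# Borrow $2,000 at 5% interest and retire 200 shares at $10 each.
debt = 2_000.0
interest = 0.05 * debt                    # $100 of new interest expense
shares_after = shares - debt / price      # 800 shares remain

# Earnings dip by the interest cost, but the share count falls faster,
# so reported EPS rises 12.5% while the business itself is unchanged.
print(eps(net_income - interest, shares_after))   # 1.125 EPS after
```

The operations are identical before and after; only the capital structure changed, which is the sense in which an EPS target can be hit without creating anything.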

59:57

Perfectly said. And there really are two caveats there,

1:00:00

number one. We have to let M&A occur as

1:00:02

well because that's a better thing to do in some

1:00:04

cases with this excess capital

1:00:06

profits people have sitting there. And the

1:00:08

only time really to buy shares back is when you

1:00:10

think they're undervalued. You would agree. Well, to your point,

1:00:12

I actually think you're absolutely right. If you have a

1:00:15

bunch of CEOs that don't know what they're doing, which

1:00:17

is what these charts kind of show, let them buy

1:00:19

whatever they want, because they're going to screw it up.

1:00:21

And that's fine for us anyways. Yeah, it's great. So

1:00:23

we prefer a fluid marketplace. The CEO

1:00:25

that you don't want to have be

1:00:27

able to buy companies is the actually

1:00:29

motivated one to get paid

1:00:31

when things go really well and to be more

1:00:33

profitable. So that would be the

1:00:36

CEO where you would be scared. Oh my gosh,

1:00:38

more M&A for that person may be bad. Yes.

1:00:40

But more M&A for these CEOs is who cares?

1:00:42

Yeah, Microsoft starts buying a bunch of stuff. Google

1:00:44

starts buying stuff in their primes. Man, that's scary,

1:00:46

right? When Bill Gates went on a heater, he

1:00:48

bought, I think he bought PowerPoint,

1:00:52

a bunch of the Office suite was bought, not created.

1:00:54

And then Google and Facebook, man, they went on

1:00:56

heaters, YouTube, Instagram, WhatsApp. And that was the golden

1:00:58

age of M&A. All right, we got a couple

1:01:00

more things on the docket. I thought that was

1:01:04

good, Sacks. Great conversation. So just while

1:01:06

we were talking, I just looked up

1:01:08

what Mary Barra's compensation was over

1:01:10

the last several years. And

1:01:13

according to this source, she

1:01:15

was paid $167 million for

1:01:18

four years. So while Elon got zero,

1:01:20

thanks to the judge's decision, Mary

1:01:22

Barra, basically, if you add in probably the fifth year,

1:01:24

let's call it roughly $200 million of compensation

1:01:28

over the last five years. Now look at

1:01:30

the stock chart of GM. Literally,

1:01:32

it is the same

1:01:34

price today as it was five years ago.

1:01:37

It's $38 a share today. It was $38 a share five years ago.

1:01:41

No, but it's worse. It's probably worse

1:01:43

because they've issued options since then. So

1:01:45

there's dilution you have to take in.

1:01:47

So it's worse than that, actually. But

1:01:49

at best, the stock price is flat.

1:01:52

And the stock basically fluctuated in line

1:01:54

with market trends when there was an

1:01:56

asset bubble in 2021. The

1:01:58

share price went up about 50%. and

1:02:01

she got even more stock and options

1:02:03

during that time, but it was

1:02:06

a no-lose proposition, basically. She got paid regardless

1:02:08

of how the stock did. And

1:02:10

then when there was simply market volatility, she did

1:02:12

even better. One of the nice things

1:02:14

about Elon's package is that unless

1:02:16

Tesla at least doubled in value, he

1:02:18

would get nothing. And it

1:02:20

was tied to milestones around revenue and

1:02:23

profit. So he couldn't just ride market

1:02:25

volatility to getting more comp.

1:02:27

If you gave this executive the same deal

1:02:30

as Elon, she would have had to double 38. So

1:02:33

she would have had to get to $76. About

1:02:36

80, all around, 80 bucks a share. Yeah, she'd have to

1:02:38

get to 80 bucks a share before she got anything. I

1:02:41

bet you if she got zero, if she got paid $1

1:02:43

and her healthcare and she had to get to $80 a

1:02:45

share, I bet you that would be an $80 stock price.

1:02:48

Well, I don't know if she has the

1:02:50

ability to engineer that

1:02:52

outcome, but I doubt- Oh,

1:02:54

so you're saying she's unqualified. No,

1:02:57

well, the Wall Street Journal said it. The Wall Street Journal just

1:02:59

had this 10 euro- No, that's my

1:03:01

point. Where they said that she failed. They're

1:03:04

hiding the importance then. Well,

1:03:06

look, I mean, I don't want to say she's in conflict. I

1:03:09

just want to say that the stock price is the same. The

1:03:11

Wall Street Journal had an article talking about all- She swept 200

1:03:13

million. All of her transformation initiatives have failed.

1:03:15

She swept roughly 200 million over just the last five

1:03:17

years. I think she's been there 10. So

1:03:20

how is she in charge? She's probably made

1:03:22

several hundred million dollars and

1:03:24

the company hasn't created any value for shareholders. So how

1:03:27

is she still in charge? But just Jason, this is

1:03:29

the way the Fortune 500 works. It's

1:03:31

a country club, okay? Who

1:03:33

gets appointed to these companies? These companies

1:03:36

are not run by founders. They're not

1:03:38

even run by VCs or people with serious skin

1:03:40

in the game. The kind of directors

1:03:42

that this judge didn't like, okay? It's

1:03:45

run by people who play the game.

1:03:47

They basically are on other boards and

1:03:50

it's basically a back-scratching club. And

1:03:53

they choose the CEO. They choose someone who's

1:03:55

politically savvy, who's worked their way up through

1:03:57

the system, who donates to the right people.

1:04:00

who can get Joe Biden to come to a

1:04:02

factory and talk about how great they are and

1:04:04

hold an EV summit at the White House and

1:04:06

invent this fiction that

1:04:09

they were responsible for this innovation. This

1:04:11

is basically how the system works. So, crap.

1:04:13

I'm glad that women have been admitted to

1:04:16

this country club. It is still a

1:04:18

country club. It is still a back scratching club.

1:04:20

It is basically a collection of people who don't

1:04:23

create any value, but pay themselves enormous amounts of

1:04:25

money, hobnob with the right people, and work

1:04:27

their way in with the powers that be at

1:04:29

the White House, who then talk about how great

1:04:31

they are, while actually

1:04:33

accomplishing nothing. Nothing. Zero

1:04:36

point zero. I think the common through

1:04:38

line in these two conversations we've had

1:04:40

this morning is

1:04:42

about just how capitalism

1:04:44

can be perverted, if

1:04:46

you will, by a small group

1:04:48

of actors. In the

1:04:51

first example, I think what we were

1:04:53

talking about is the trial lawyers' association

1:04:55

and their ability to impact

1:04:57

and influence what is going to happen

1:04:59

around Section 230. In this example, I

1:05:02

think what it speaks to is the influence that

1:05:04

a small group of consultants can

1:05:07

have in having built a very

1:05:09

thriving business in designing these compensation

1:05:11

plans for CEOs. It reminds me of

1:05:13

a clip of

1:05:15

Buffett and Munger and

1:05:18

Nick, if you just want to play it. I

1:05:20

think they say it in very clear plain English.

1:05:22

Not to debate their opinion, just to state it,

1:05:24

but Nick, if you want to play it. Do

1:05:26

not bring in compensation consultants. We don't have a

1:05:28

human relations department. We don't have a... The

1:05:30

headquarters, as you could see, we don't have any human

1:05:32

relations department. We don't have a legal department. We don't

1:05:34

have a public relations department. We don't have an investor

1:05:37

relations department. We don't have those things, because

1:05:40

they make life way more complicated, and everybody

1:05:42

gets a vested interest in going to conferences

1:05:44

and calling in other consultants, and it

1:05:46

takes on a life of its own. Well, I

1:05:49

would rather throw a viper down

1:05:51

my shirt front than hire a

1:05:53

compensation consultant. Tell

1:06:00

me which kind of consultants you actually like, Charlie.

1:06:03

Oh, man. Statler

1:06:05

and Waldorf, you know, from the

1:06:07

Muppet Show. They remind me of Statler

1:06:10

and Waldorf. Well,

1:06:13

you mentioned... Grumpy. You mentioned

1:06:15

this is a problem in capitalism. I think

1:06:17

there's two kinds of capitalism, broadly speaking. There's

1:06:19

crony capitalism and there's risk capitalism. Risk

1:06:22

capitalism is the founder who starts with

1:06:24

nothing but an idea along

1:06:27

with the investors who are

1:06:29

willing to write a check knowing

1:06:31

that nine times out of ten, it's going to

1:06:33

be a zero. But maybe in

1:06:35

that one out of ten chance, it's going to be

1:06:37

an outsized return. That is true risk

1:06:39

capitalism. Everyone has skin in the game. They work together,

1:06:42

entrepreneur, board members to try and create

1:06:44

a great outcome and they

1:06:46

work out an arrangement where

1:06:49

everyone benefits. It's a win-win situation.

1:06:51

Okay. That is risk capitalism. That's

1:06:54

the part of our economy that drives all

1:06:56

the innovation, all the progress, all of the

1:06:58

job creation. Then you got crony capitalism.

1:07:00

You got these companies that have been around for

1:07:02

100 years. The value was

1:07:04

created by people long dead. And

1:07:06

it is now managed by both

1:07:09

directors and professional managers who work their

1:07:11

way up through they go to the

1:07:13

right business schools and they join the

1:07:15

right organizations and they donate the right

1:07:17

politicians. And they somehow

1:07:20

engineer a situation where they get in

1:07:22

control and then they pay themselves as

1:07:24

much comp as they can possibly justify

1:07:27

whether or not they create any value for the shareholder.

1:07:30

That's what we saw at GM.

1:07:32

And that's crony capitalism. And

1:07:34

while they're doing it, the president's

1:07:36

son is probably going to figure out a way to take a nice

1:07:38

big chunk out of it too. Okay. That's

1:07:41

the system that we have. And Cohen's the great. Here

1:07:44

we go. Now, which

1:07:47

of these two systems receives the

1:07:49

brunt of the criticism by the

1:07:51

mainstream media? Who is attacked

1:07:54

and who is celebrated? Okay.

1:07:57

Here we go. There's your rest. I've

1:08:00

seen a zillion attacks on Elon

1:08:02

Musk. I saw one article pointing

1:08:04

out that all of Mary Barra's transformation of

1:08:07

General Motors was a failure, created no value

1:08:09

for shareholders, and the whole thing was basically

1:08:11

a bust. Well, how's the union doing and

1:08:13

how did the union go? One article. And

1:08:15

I'm frankly shocked that the Wall Street

1:08:17

Journal even ran that article. Yeah, okay.

1:08:20

Yeah. And how did the union do?

1:08:22

And who did the union get behind and vote for? Yeah, let's

1:08:24

maybe double click on a couple items there. We'll see. All

1:08:26

right, let's see. I want to go over one more thing. I'm going to

1:08:29

go over the last few years and I just wanted to talk a

1:08:31

little bit about IPOs possibly coming back.

1:08:33

I just interviewed Alexis Ohanian, the

1:08:35

founder of Reddit, which according to Bloomberg,

1:08:37

they've been advised by potential

1:08:40

IPO investors to target a $5 billion

1:08:42

valuation. They're going to go public possibly

1:08:44

in March, which is wild. That's only a

1:08:46

couple weeks away. In peak ZIRP,

1:08:48

Reddit had raised at a $10 billion valuation. That

1:08:50

was back in 2021. They

1:08:53

were valued at $15 billion in the secondary market.

1:08:55

That's where people like previous employees,

1:08:58

angel investors, et cetera might trade their shares.

1:09:02

And so if it goes out at five, it's

1:09:04

going to be between a 50% and two-thirds haircut, and it

1:09:08

is trading at $4.5 billion. Chamath,

1:09:10

we talked about this a bunch, the

1:09:13

down-round IPOs, Instacart, I think being

1:09:15

the best example. They got out, but they've

1:09:18

been hovering at a

1:09:20

really low number for a long time and

1:09:22

people have been talking about them possibly being

1:09:24

a takeout candidate by DoorDash or Amazon or

1:09:27

Uber. So your thoughts on the Reddit IPO

1:09:29

news? I think that if I

1:09:31

had to price the IPO, I would

1:09:35

be modeling two important

1:09:37

levers in the business. The first is, what

1:09:40

is the actual attainable

1:09:42

revenue from my

1:09:44

audience? And so what

1:09:47

is the ARPU of the average Reddit

1:09:50

user? And you can

1:09:52

model it as a distribution and I think Facebook

1:09:54

has the best data because they've been publishing it

1:09:57

in their quarterly returns for a very long time now.

1:10:00

for over a decade, which is there

1:10:02

is a distribution of value of a

1:10:04

Facebook user economically, right? At

1:10:06

the upper end, you have the folks that are

1:10:08

on Facebook proper in America,

1:10:12

in a certain age band, in certain states that are probably

1:10:14

like 30, 40, 50 dollar

1:10:16

ARPUs, all the way down to, in

1:10:19

developing countries, those users are worth

1:10:22

low single digit dollars economically. And

1:10:26

I think that people will have to very much

1:10:28

understand what is the average Reddit user and

1:10:31

what is the distribution of economic value that they represent. That's

1:10:35

the first thing. And I think that that's really the thing that will

1:10:37

determine whether it's worth 5 billion or 10 billion or frankly 2 billion.
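Chamath's first lever, modeling ARPU as a distribution across user segments, can be sketched as a toy model. The segments and dollar figures below are invented for illustration (only the 400 million user total matches a number cited later in the conversation):

```python
# Hypothetical ARPU distribution across user segments, in the spirit of
# Facebook's regional ARPU disclosures. All figures are illustrative,
# not Reddit's actual mix.

segments = {
    # segment: (monthly active users, annual ARPU in dollars)
    "US & Canada":   (50e6, 20.0),
    "Europe":        (100e6, 6.0),
    "Rest of world": (250e6, 1.0),
}

# Attainable revenue is the sum of users times ARPU per segment.
revenue = sum(users * arpu for users, arpu in segments.values())
users_total = sum(users for users, _ in segments.values())

print(f"Attainable annual revenue: ${revenue / 1e9:.2f}B")   # $1.85B
print(f"Blended ARPU: ${revenue / users_total:.2f}")         # ~$4.6

# A revenue multiple then turns that into a valuation range.
for multiple in (3, 5, 8):
    print(f"{multiple}x revenue -> ${multiple * revenue / 1e9:.1f}B")
```

The point of the exercise is that the blended number hides a wide spread: a few high-ARPU users carry most of the attainable revenue, so knowing the distribution matters more than the average.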

1:10:42

And then the second key lever, it

1:10:44

will be the risk factors in

1:10:47

the IPO because where the shareholder lawsuits

1:10:50

will come from, which

1:10:52

will really dictate how the

1:10:55

hedge fund community buys this thing, is

1:10:58

going to be the potential for

1:11:00

my ad running against content that

1:11:02

is deeply offensive to me, that

1:11:05

whole construct. And

1:11:07

I think that they are going to have

1:11:09

to very carefully ring-fence that liability to

1:11:12

get this IPO to be successful, but

1:11:14

also for them to execute a scaled

1:11:18

ad revenue business. And

1:11:20

not spending enough time on Reddit,

1:11:23

I don't know how bad of a problem this

1:11:25

is. I don't think it's 4chan

1:11:27

or 8chan as an example. But

1:11:29

I also don't think it's Facebook and Instagram. And

1:11:32

so it's kind of somewhere in the middle. And

1:11:34

I think that those risks are really what's going

1:11:36

to determine its terminal valuation. You know that they

1:11:38

have always been under-monetized: 800

1:11:42

million dollars in revenue reportedly, and 400

1:11:44

million monthly active users. So two bucks

1:11:46

a user compared

1:11:48

to $20, $30, $40

1:11:50

for the prime users

1:11:52

on Facebook's

1:11:54

network. So it's totally underutilized. Part of it is it's

1:11:57

a little bit of spicy content. Part of it is

1:11:59

that that number includes all the

1:12:01

international users. Of course,

1:12:03

SACS, there is a concept that Reddit

1:12:05

has the greatest pool

1:12:08

of data for large language

1:12:10

models. And something

1:12:12

like say Quora, maybe YouTube, amongst

1:12:14

the great pools of data. So

1:12:16

you think there's a play here with that? They have talked

1:12:19

about they want to get paid for licensing and that if

1:12:21

you want to use their data for your language model, you

1:12:23

gotta get permission. Your thoughts, Sacks? Sure,

1:12:25

I mean, that's gonna be an incremental revenue source

1:12:27

for sure. It's hard to know exactly how valuable

1:12:29

that is because we're still in the early innings,

1:12:31

but I mean, they can definitely do something with

1:12:33

that data. Grok's whole competitive advantage is having exclusive

1:12:35

access to Twitter's data, which is

1:12:38

updated in real time, basically, by

1:12:40

hundreds of millions of users. So yeah,

1:12:42

look, that data is valuable. We don't know

1:12:44

how much. I guess the numbers I saw

1:12:46

were that they're doing about 800 million

1:12:48

of revenue, growing about 20% a year. The

1:12:52

$5 billion valuation seems, I think, pretty good.

1:12:54

I mean, is it down from 10 at

1:12:57

the peak in 2021? Sure,

1:12:59

but everything's down since that peak.

1:13:01

I mean, that was definitely a bubble. I mean, I

1:13:03

can tell you, we bought some shares as

1:13:05

a late stage investment, I think, in 2018 at

1:13:08

a $2 billion valuation. Okay.

1:13:11

So, you know, a two and a half X in five years. I mean, it's

1:13:13

not setting the world on fire, but it's not a bad outcome.

1:13:16

Yeah. And you know, the

1:13:18

investment bankers will know how to price

1:13:20

this to take it out and make it successful.

1:13:22

Yeah, and if it becomes a billion dollars in

1:13:24

revenue and has a 20% profit margin, 200 million,

1:13:29

you can start doing your back of the envelope math there for

1:13:31

a 25 times EBITDA, you know, 20 times price

1:13:34

earnings ratio. So it doesn't

1:13:36

seem outrageous. It does seem like

1:13:38

such a valuable and under-monetized asset.
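J Cal's back-of-the-envelope math can be written out. Revenue and growth are the figures mentioned in the conversation; the margin and the multiples are the assumptions he floats, not reported numbers:

```python
import math

# Back-of-envelope valuation: grow reported revenue, apply an assumed
# margin, then an assumed earnings multiple. Inputs are the figures
# floated in the conversation, not reported financials.

revenue_now = 800e6      # ~$800M reported revenue
growth = 0.20            # ~20% annual growth
margin = 0.20            # assumed profit margin at a $1B run rate

# At 20% growth, $800M of revenue crosses $1B in a little over a year.
years_to_1b = math.log(1e9 / revenue_now) / math.log(1 + growth)
print(f"Years to $1B revenue: {years_to_1b:.1f}")   # ~1.2

earnings = 1e9 * margin  # $200M of profit on $1B of revenue
for multiple in (20, 25):
    print(f"{multiple}x earnings -> ${earnings * multiple / 1e9:.0f}B")
# 20x -> $4B, 25x -> $5B: bracketing the reported target
```

On those assumptions the $5 billion target is roughly a 20 to 25 times multiple on near-term earnings, which is why it "doesn't seem outrageous."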

1:13:42

Do you think there's a likely acquirer here, if you were

1:13:44

to think about somebody who might want to own this? Do

1:13:46

you think it's a Microsoft for the data, Google for

1:13:49

the data? Yeah, I think there's probably people who would

1:13:51

like to own this, but the problem is that, well,

1:13:53

two problems. One is they just can't get it through.

1:13:56

We've talked about this before. The M&A window is

1:13:58

even more. closed than

1:14:00

the IPO window, I would say. And

1:14:03

the other thing is just because of

1:14:05

the raunchy content and all

1:14:07

of the brand issues that come with that,

1:14:10

it's not clear to me that let's say Microsoft

1:14:12

would want to own that headache. You know, they

1:14:14

might not want to be hauled up in front of these

1:14:16

congressional hearings that we talked about, these

1:14:18

kangaroo courts where it's very easy

1:14:20

to go on Reddit and pick out, well,

1:14:22

what about this post? What about that post?

1:14:24

Why'd you let that one through? Why'd you

1:14:27

let that one through? Well, because it's a

1:14:29

platform of user-generated content where hundreds of millions

1:14:31

of people post what, billions of items, and

1:14:33

you can have the best content moderation policy in the world.

1:14:37

I think that Reddit is the honeypot of edge

1:14:39

cases. It is the place you go when

1:14:42

you're just so disaffected that you can just

1:14:44

let loose anonymously. It's not. Right.

1:14:47

I mean, at a certain point, you have to realize that when people

1:14:50

cherry-pick those edge cases, what they're really

1:14:52

saying is a platform like this shouldn't

1:14:54

exist. Yeah. Right. Because

1:14:58

there's no way to eliminate every

1:15:00

single one. Which is basically like saying, if

1:15:02

you just take platform, you just take conversation,

1:15:04

you're saying this conversation shouldn't be allowed to

1:15:06

exist. They don't want user-generated content to exist.

1:15:08

They don't want what, remember, in your time

1:15:10

it was called unfettered conversations. Unfettered

1:15:13

conversations. They want controlled conversations.

1:15:15

Yeah. Yeah. Pretty

1:15:17

dangerous. It just reminds me

1:15:19

of, remember, Bob Iger for

1:15:21

like 10 seconds was actually considering Disney

1:15:24

buying Twitter. Can

1:15:26

you imagine if they actually bought it, what would

1:15:28

have happened? I mean,

1:15:30

it would have been chaos. Well,

1:15:32

it would have been content moderation on

1:15:34

steroids where they would

1:15:37

have massively empowered and scaled up content

1:15:39

moderation even more than what Jack Dorsey's Twitter

1:15:41

was doing. I think Jack actually,

1:15:43

he was unable to operationalize his principles, but he

1:15:45

did have principles in favor of free speech. Absolutely.

1:15:49

Whereas Disney, I don't even think

1:15:51

has those principles. So... No,

1:15:54

they would have censored everything. Yeah.

1:15:56

You know, it's paradoxically the most censored.

1:16:00

social media platform is TikTok. Like my

1:16:02

TikTok is bulldogs and sandwiches, Chamath,

1:16:04

and then Sopranos

1:16:06

clips. And every time there's Sopranos clips,

1:16:09

when somebody gets whacked, they have to blur

1:16:11

it out or the person who's uploading it takes

1:16:13

that clip and cuts out the actual person

1:16:15

being whacked. It's crazy. All right, I got to

1:16:17

give Sax his red meat. We're going

1:16:19

to now go to our war correspondent, David Sax.

1:16:22

But in all seriousness, tragically,

1:16:24

we had a terrorist attack. And

1:16:28

the response from some

1:16:31

of our Republican senators

1:16:33

was absolutely insane. They want

1:16:35

to bomb Tehran and go after Iran

1:16:37

and start World War Three. Sax, I

1:16:39

know you have some strong feelings on

1:16:41

this and you're always about diplomacy and

1:16:44

pursuing peace. What's your reaction to Lindsey

1:16:46

Graham and the neocons? Well,

1:16:49

you had several Republican senators try

1:16:51

to goad Biden into striking Iran

1:16:53

immediately. It was, you know, not

1:16:55

just Lindsey Graham is in favor

1:16:57

of every war, but it was

1:16:59

Mitch McConnell, the Senate Republican leader, John

1:17:02

Cornyn. And there are some other

1:17:04

ones who are, you know, demanding

1:17:06

immediate retaliatory strikes. It's very unfortunate

1:17:08

that we had three of

1:17:10

our troops get killed and another

1:17:12

dozen or so get injured. This

1:17:15

was at a base on

1:17:17

the border between Syria and

1:17:19

Jordan. Some of us have

1:17:21

been saying that we have no business being in Syria.

1:17:23

I mean, I tweeted 10 months ago that

1:17:25

all we were doing was putting our

1:17:28

troops in harm's way and

1:17:31

risking getting drawn into a

1:17:33

larger conflagration. And that's exactly where

1:17:35

we are right now. I mean, you

1:17:38

had to know and many commentators pointed

1:17:40

out that these bases are very exposed.

1:17:42

They're very vulnerable. They don't have good

1:17:44

enough air defense. They don't really

1:17:46

have a good answer to

1:17:49

the swarms of drones that

1:17:51

these local militias have.

1:17:53

And based on the intelligence we

1:17:55

have right now, the

1:17:57

base, Tower 22, was attacked by

1:18:00

an Iraqi militia that's operating

1:18:02

there. Why

1:18:05

did they get attacked? Well, these militias

1:18:07

want the United States out of their countries.

1:18:09

They want them out of Iraq. The

1:18:11

government of Iraq has said we want the US

1:18:13

out of Iraq. The government of Syria says we

1:18:15

want you out of Syria. We

1:18:18

are there without a congressional authorization for

1:18:22

military force. I mean, what are we doing

1:18:24

in Syria? We're just occupying

1:18:26

that country without,

1:18:29

again, without a war ever being declared

1:18:31

against the Assad regime. So

1:18:34

some of us have been saying we need to get

1:18:36

out there for some time, and if we don't, it's

1:18:38

inevitable that something like this is going to happen. And

1:18:40

sure enough, it did. And when it does happen, you

1:18:42

get this lunatic fringe who unfortunately are some of the

1:18:44

leaders of the Republican Party calling for

1:18:47

a larger war against Iran. And I think

1:18:49

Biden, to his credit, has so far

1:18:51

held back. And he has

1:18:53

not... A lot of restraint. He's shown some restraint.

1:18:56

However, they've been actively talking about this,

1:18:58

and all the reports are they're gaming

1:19:00

this out, and there is going

1:19:02

to be some sort of retaliatory strike. It

1:19:05

might be focused on these militias

1:19:07

in Syria and Iraq. It could

1:19:09

be attacking Iranian assets. We

1:19:12

don't know. If they do

1:19:14

attack Iranian assets, Iran has promised a response.

1:19:16

So I think the Biden presidency is at

1:19:18

a little bit of a crossroads here. Depending

1:19:20

on the action they choose, we could very

1:19:22

rapidly find ourselves engaged in

1:19:24

a much wider regional war on five

1:19:27

different fronts. I mean, a war with

1:19:30

Iran would involve us in

1:19:32

Iran, Iraq, Syria, Lebanon, and

1:19:34

Yemen, where we're already bombing.

1:19:37

So this could turn into a

1:19:39

huge conflagration in the Middle East.

1:19:42

I don't think Biden should be drawn into this. Yeah,

1:19:45

no good answers here. The

1:19:48

craziness to me, Chamath, is like, these people

1:19:50

want to go bomb Tehran because

1:19:53

you have

1:19:55

some militias doing these activities. I

1:19:57

mean, this would be like some terrorist

1:20:00

who's French, so we're going to just blow up Paris, and

1:20:02

a bunch of civilians are going to die. There's

1:20:05

no proportionality here, and there's no

1:20:07

direct relationship here. It's like two

1:20:09

or three steps removed. What do you think, Chamath?

1:20:12

I think the thing that we've lost

1:20:14

in this whole issue is, how

1:20:17

is it possible that a multi-deca-billion

1:20:20

dollar drone system was

1:20:24

designed by the military industrial complex

1:20:26

in a way where when

1:20:28

one of our drones is coming back and there's an

1:20:30

inbound drone, that doesn't seem like such a

1:20:32

weird edge case that we couldn't

1:20:34

handle it. Because at the root cause

1:20:36

of what happened was a pretty

1:20:39

faulty way in which we were dealing

1:20:41

with confusion about

1:20:43

what was our drone and what was the

1:20:46

enemy drone. And

1:20:48

I think that that's also worth talking about before we

1:20:50

talk about bombing in other countries is, we're

1:20:53

the most sophisticated technological country in the world.

1:20:55

Our weapons systems are the most sophisticated weapons

1:20:58

systems in the world. Am

1:21:00

I supposed to believe that if Palmer

1:21:02

Luckey and Anduril were building this system,

1:21:04

that this is what would have happened? Absolutely

1:21:06

not, yeah. That there's no beaconing system

1:21:08

on these drones that you could turn on remotely,

1:21:10

that there's no way when you see

1:21:12

the number of drones in an aerospace that are

1:21:15

yours so that you can quickly triangulate which ones

1:21:17

are not yours. All

1:21:19

of this to me, I think is also worth exploring

1:21:21

because if it's really again,

1:21:23

faulty engineering

1:21:26

because of this monopoly

1:21:29

oligopoly in certain sectors of our economy,

1:21:31

that then cause us to go and

1:21:33

make a foreign relations decision about going

1:21:35

to war. And we don't even

1:21:37

talk about this thing as a root cause, it's worth

1:21:39

talking about. A different example on the

1:21:42

same vein of this is like, it turned out that

1:21:44

in that Alaska Airlines issue, they

1:21:47

actually shipped the plane without the door plugs.

1:21:52

We are the most sophisticated country in the

1:21:54

world, guys. Our

1:21:57

most sophisticated industries are

1:22:01

not showing their best in this moment. And

1:22:03

I think that this is yet another example of

1:22:06

before we make totally separate decisions about war and

1:22:09

implicating our children's safety, we

1:22:12

can also just ask, wait, we spent billions of

1:22:14

dollars on this thing. How is it

1:22:16

possible that this, like

1:22:19

honestly, like this is exception handling. This

1:22:21

is like CS101

1:22:23

type stuff, guys. Door

1:22:26

plugs, put the plugs in the door.
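
Chamath's "CS101 exception handling" complaint is essentially an identify-friend-or-foe check. A minimal sketch of the beaconing logic he describes, in Python; the beacon IDs, the track format, and the function itself are invented purely for illustration, not a real military interface:

```python
# Hypothetical sketch of the friend-or-foe ("beaconing") check described
# above: friendly drones broadcast a known beacon ID, so any inbound track
# without a recognized ID is flagged instead of being waved through.
# All names and IDs here are made up for illustration.

FRIENDLY_BEACONS = {"US-UAV-017", "US-UAV-042"}  # assumed registry of our drones

def classify_track(beacon_id):
    """Label a radar track 'friendly' only if its beacon is on our list."""
    if beacon_id in FRIENDLY_BEACONS:
        return "friendly"
    # No beacon, or an unrecognized one: alert the operators rather than
    # assume it is one of ours coming home.
    return "unknown"

# A returning US drone and a simultaneous hostile drone are separable
# as long as the beacon registry is authoritative.
inbound = ["US-UAV-017", None]  # None models a track with no beacon at all
labels = [classify_track(t) for t in inbound]
```

The design point is the default: an unrecognized track should fail toward "unknown" and alert, never toward "assume friendly."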

1:22:29

Regulatory capture is the answer. This is

1:22:32

crazy. There's no competition and they're

1:22:34

charging costs plus, they're not innovating

1:22:36

anymore. And they need competition, right?

1:22:38

Just like SpaceX was massive competition

1:22:42

for all of the governmental agencies around,

1:22:44

we need that again here. We

1:22:46

need execution in our industrial complex.

1:22:50

And now the way to answer that turns out

1:22:52

to be a foreign policy decision

1:22:54

to go to war. And I think that

1:22:56

those two things need to be decoupled for

1:22:58

a second so that we can deescalate and say,

1:23:00

hold on a second, this thing happened, but

1:23:03

why did it happen? Meaning, of course people

1:23:05

are gonna attack us. We are

1:23:07

the bright shining beacon on a hill. People

1:23:09

should hate us and want to attack us. That's

1:23:12

just the nature of being a winner. To be

1:23:14

fair, they're not attacking us because we're the shining

1:23:16

city on a hill over here just minding our

1:23:18

own business. They're attacking us because we're... No, I

1:23:20

know, and I hate it, but I'm saying,

1:23:22

I think the minute we're there, we

1:23:25

should expect that these things work. We should expect

1:23:27

it. So there's a few

1:23:29

things happening here with our military industrial complex.

1:23:31

So the first one is that drones have

1:23:33

been a huge game changer. We've seen this

1:23:35

in the Ukraine war. The one

1:23:37

really new technological element has been

1:23:39

drones, and it has completely changed the

1:23:42

face of war. And one of the

1:23:44

things it does, it's a huge leveler because

1:23:46

these cheap drones give

1:23:48

these militias in Syria and Iraq, and it

1:23:50

gives the Houthis in Yemen a

1:23:53

capability to strike at us that they didn't

1:23:55

have before. And we saw that

1:23:57

our air defenses are just not really

1:24:00

cut out to deal with this. There was an

1:24:02

article describing how it was costing

1:24:04

us $2 million to use an

1:24:06

air defense missile to shoot down

1:24:08

a drone or a cheap rocket that

1:24:10

costs just a few thousand dollars. So

1:24:14

you have this asymmetric warfare now where

1:24:16

we simply cannot afford

1:24:18

over a sustained period to

1:24:21

shoot down all these drones. But wait,

1:24:23

sorry, can I ask a question? Yeah.

1:24:25

One of the most important partners

1:24:27

in that region is Israel. Israel

1:24:30

has what we have thought to be,

1:24:32

up until now, an impregnable

1:24:34

system, the Iron Dome, which

1:24:37

is meant to deal with all of these edge

1:24:39

cases, projectiles of all

1:24:41

sorts, shapes and sizes coming in

1:24:43

every direction. I've never heard of

1:24:46

when the Iron Dome has failed. And

1:24:48

we are sending Israel billions of dollars. Why couldn't

1:24:50

we actually just buy the Iron Dome system from

1:24:52

them and say, you know what, we're going to

1:24:54

secure our bases in Syria and in

1:24:57

all of these other places? Yeah. Well,

1:24:59

I'll tell you. So there's a military analyst named Stephen

1:25:01

Bryen, who I follow, a former undersecretary

1:25:04

of defense. And he talked about this. He

1:25:06

has been calling now for before

1:25:08

this attack happened to send two

1:25:11

Iron Dome systems to Syria, to Iraq, to

1:25:13

basically the Middle East to protect our troops.

1:25:15

He says, he said in his article, the

1:25:17

reason why the US Army hasn't done that

1:25:19

is because they didn't want to buy the

1:25:21

Israeli system. They've been favoring some homegrown system

1:25:24

that hasn't proven out. There's

1:25:27

a big problem there that we should be deploying

1:25:29

Iron Dome there. Look, I would just get out

1:25:31

of that area. I don't think we should be

1:25:33

there, but at a minimum, if we are going

1:25:35

to stay there, we have to protect our troops.

1:25:37

So yes, we should be deploying Iron Dome. But

1:25:39

the problem is that, again, just these drones are

1:25:41

a game changer and they

1:25:43

can overwhelm a system. Even

1:25:46

Iron Dome can be overwhelmed. If

1:25:49

Israel gets in a war with

1:25:51

Hezbollah, which supposedly has... One drone,

1:25:53

it's N equals one. We were

1:25:55

overwhelmed by N equals one. Well,

1:25:58

I mean, I'm saying like it's true, right?

1:26:00

We were overwhelmed when N equals one.

1:26:02

There was a report just the other

1:26:05

day that one of our ships in

1:26:07

the Red Sea, it

1:26:10

was fired on by a missile

1:26:12

from Yemen. And it

1:26:14

made it through the Aegis system and they

1:26:16

took it down with their last line of

1:26:18

defense, these close-in guns. That

1:26:21

was kind of scary because the Houthis have

1:26:23

missiles that are capable of making it through

1:26:26

our main air defense system for ships. So

1:26:29

I'm just saying, what's happening with

1:26:31

these cheap rockets and these drones is giving

1:26:34

our opponents capabilities that level the playing field

1:26:36

a little bit. But this is what I'm

1:26:38

saying. I understand that concept, but I also

1:26:40

understand that we have people on the ground

1:26:42

that work for Team America. Again,

1:26:44

I'll just take Palmer Luckey as an

1:26:46

example, who, frankly, I would bet on 1,000 times

1:26:49

over a Houthi rebel. He'll outsmart and

1:26:51

outpower these guys. Why aren't our

1:26:53

best and brightest people in a position

1:26:55

to make these things? Well, you're right.

1:26:57

The defense industry is dominated by five

1:26:59

of these prime defense contractors who

1:27:01

work on cost-plus and are basically an

1:27:03

oligopoly. They're not particularly innovative, but they

1:27:05

just keep charging more every year for

1:27:08

the same product. So we're getting less

1:27:10

for more money. And they're led by

1:27:12

leadership who are motivated in a

1:27:14

way where the returns that

1:27:16

they generate and the success and the progress

1:27:18

they make is not really coupled to progress,

1:27:21

right? It's not really coupled to building an

1:27:23

even better version of Iron Dome. It's about,

1:27:25

as you said, having

1:27:27

a job that you've earned over many

1:27:29

years of fealty and

1:27:31

then getting paid an enormous amount of money to

1:27:33

just keep it going in the

1:27:35

same direction, even if that direction means you've

1:27:38

been adrift for decades. That's the shame of

1:27:40

it. And then you see it every day. Like,

1:27:42

isn't it incredible? Lindsey Graham is saying hit

1:27:45

Iran. And I bet you Lindsey Graham is

1:27:47

getting donations from those five companies. I

1:27:49

don't know. Look, there's no question that we

1:27:52

need to shake up the military industrial complex.

1:27:54

We need to get a lot more startups

1:27:56

in there. There's a lot of VCs now

1:27:58

who are funding. defense startups.

1:28:00

So it is a big area. Anduril obviously is

1:28:02

kind of the leader of the pack, but there's

1:28:04

a bunch of others getting funded. I saw that

1:28:07

Eric Schmidt even created a drone company. So

1:28:09

this is going to be a huge area

1:28:11

of innovation. And I think

1:28:13

that because of the Ukraine war, the

1:28:16

Pentagon must now realize the

1:28:18

urgency of being able

1:28:20

to mass produce effective drones as

1:28:23

well as create effective drone

1:28:26

countermeasures. Yeah, countermeasures. Yeah, just

1:28:28

to give you a sense of this, we

1:28:32

led the Series A of a company called Saildrone

1:28:34

about seven years ago. And

1:28:36

they make drones for the seas, right.

1:28:39

And what we did was we put these massive

1:28:41

sensor arrays in these drones.

1:28:44

And because of the sensors,

1:28:46

it has perfect

1:28:48

visibility into what's going on in

1:28:51

any weather condition, right? Day or night,

1:28:53

it doesn't matter. And so these drones

1:28:55

in the Middle East, all

1:28:57

over the waterways allow us to have perfect

1:29:00

understanding of what's going on. But

1:29:02

despite that, it has taken years for us to

1:29:04

be in a position to generate

1:29:06

enough revenue. And now we're finally at that

1:29:08

scale with the Navy and whatnot. But

1:29:11

David, to your point, it is incredibly

1:29:14

hard for startups, no matter

1:29:16

how innovative we've been to break through

1:29:18

this logjam. And

1:29:20

the reason is because what we are good at is

1:29:22

not what's rewarded, we are good at engineering

1:29:25

and execution. But what is rewarded, to

1:29:27

your point is this very

1:29:29

lobbying, specific form of

1:29:31

relationship management, right? And

1:29:34

cultivating certain pockets of

1:29:36

influence. It's a very difficult game to

1:29:38

play. When we come bright-

1:29:41

eyed and bushy-tailed from California with a

1:29:43

product that we think is superior that

1:29:45

actually helps advance American exceptionalism, it

1:29:48

still doesn't always land. It takes a lot

1:29:50

longer than it needs to in some cases.

1:29:52

Yeah. So the Pentagon and the military

1:29:55

industrial complex need to be a lot more

1:29:57

permeable to this type of innovation. I

1:30:00

think that they're gonna be incentivized to do it now because

1:30:02

they have to see what's happening in

1:30:04

Ukraine, what's happening in the Middle East, and

1:30:06

they realize that the gap has closed. And

1:30:08

we had three innocent people that were killed. These

1:30:11

people didn't deserve to die. They

1:30:13

didn't deserve to die to an N of one. It's

1:30:15

not an edge case. N of one, there was one

1:30:17

drone. Come on

1:30:19

guys, we're better than that. And so what do

1:30:21

you think's gonna happen if we get into a

1:30:24

war with Iran? Every single one of our bases

1:30:26

in Syria and Iraq, and we have a lot,

1:30:28

are gonna be sitting ducks. To your point, you're

1:30:30

forecasting, you're orchestrating a game plan for them because

1:30:32

it's like, if you were going to enter a

1:30:35

war, does it take a brilliant strategist to

1:30:37

sit in a room and say, wait a minute, if they

1:30:39

can't defend against one, what happens when we send 12? What

1:30:42

happens when we send 12 to every single place?
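
The saturation math behind that rhetorical question is easy to run. A back-of-the-envelope sketch in Python, using the episode's ~$2 million-per-interceptor figure; the $2,000 drone cost, the per-base interceptor count, and the salvo size are assumed round numbers, purely illustrative:

```python
# Back-of-the-envelope on drone saturation and cost asymmetry.
# ~$2M per air-defense interceptor is the figure cited in the episode;
# "a few thousand dollars" per drone is assumed to be $2,000 here.

interceptor_cost = 2_000_000
drone_cost = 2_000

# Cost-exchange ratio: each intercept costs the defender ~1000x
# what the shot costs the attacker.
exchange_ratio = interceptor_cost // drone_cost

# Saturation: a base holding 8 ready interceptors (assumed loadout)
# against a salvo of 12 drones leaks 4 through even if every shot hits.
ready_interceptors = 8
salvo_size = 12  # "what happens when we send 12?"
leakers = max(0, salvo_size - ready_interceptors)

# What saturating one base costs each side.
attacker_spend = salvo_size * drone_cost
defender_spend = min(salvo_size, ready_interceptors) * interceptor_cost
```

Under these assumptions the attacker spends $24,000 to force $16 million of defensive expenditure and still gets four drones through, which is the asymmetry both speakers are pointing at.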

1:30:44

And to your point, Jason, this

1:30:46

is like the unnecessary escalation

1:30:48

that then happens because then we have

1:30:50

to respond with more force

1:30:52

and with more kinetic energy. It was

1:30:55

a certainty. Look, I've been talking on

1:30:57

the show about how our bases in

1:30:59

Syria and Iraq have been under attack

1:31:01

by these militias for months. I think

1:31:03

the last time we talked about it,

1:31:05

there had been something like 80 attacks.

1:31:07

And it was just a matter of time

1:31:09

before American servicemen were

1:31:12

killed, unfortunately. And so,

1:31:14

like you said, Chamath, this was predictable. What's

1:31:16

also predictable is that if we get in

1:31:18

a war with Iran, every single one of

1:31:20

our bases will be attacked. If we strike

1:31:22

on their soil, they will strike back. And

1:31:25

they have hypersonic missiles, they have precision

1:31:28

missiles. They can destroy

1:31:30

every one of these bases, unless those bases

1:31:32

have the top-of-the-line air defense, which most of

1:31:34

them don't. But if you go

1:31:36

to someone like Lindsey Graham and say,

1:31:38

listen, our troops are vulnerable, we need

1:31:40

to basically either pull out of

1:31:43

Syria and Iraq, or we need to consolidate down

1:31:45

to a few bases that have Iron Dome or the

1:31:48

best systems, these neocons will say,

1:31:50

absolutely not. We're not conceding anything. They

1:31:53

would never pick up a gun. They'd never wear a

1:31:56

uniform. Right, but they want us to strike Iran. So

1:31:59

these... strategies don't line

1:32:01

up. If you wanted

1:32:03

to attack Iran, the first thing you would

1:32:05

do is basically get all of our troops

1:32:08

out of harm's way who are currently sitting

1:32:10

ducks for Iranian retaliation. By the way, I

1:32:12

think it'd be a terrible idea, but that's what

1:32:14

you would do. So they have these strategies that

1:32:16

don't make any sense. All right, everybody.

1:32:18

It's been an amazing show for the

1:32:21

dictator chairman himself, Chamath Palihapitiya,

1:32:24

the rain man, yeah, David

1:32:26

Sacks. I am the world's greatest

1:32:28

moderator. We missed you. Sultan of Science,

1:32:30

David Friedberg couldn't make the show. And

1:32:33

we'll see. He's still in the... If anybody gets into the

1:32:36

Apple Pro Vision, Vision Pro, and

1:32:39

is somewhere out in the universe by Uranus and

1:32:41

you see him, bring him back for next week.

1:32:43

So I'm gonna go find him. We've got to

1:32:45

send out a search party to Uranus. Love you,

1:32:47

boys. Bye-bye. Bye-bye. We

1:32:50

will let your winners

1:32:52

ride. [outro

1:32:55

music]

1:33:02

[outro music] My

1:33:14

dog. We should drive with you. We should

1:33:17

drive with you. Oh

1:33:20

man. We need to

1:33:22

meet up with you. We should all just get a room and have

1:33:24

one big huge orgy because they're all just useless. It's like this

1:33:26

sexual tension that they just need to release somehow.

1:33:28

What? You're

1:33:31

a baby. You're a baby.

1:33:34

We need to get merch.

1:33:37

I'm going all in! We

1:33:45

need to get merch.
