Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
I'm David Farrier, a New Zealander accidentally
0:02
marooned in America, and I want to figure
0:04
out what makes this country tick. One
0:06
thing I've learned about America this year is
0:08
that it both loves TikTok and
0:10
also hates TikTok. TikTok
0:13
influencers, beware. Today, a House
0:15
committee voted on a bipartisan bill
0:17
that could effectively ban the
0:19
app in the United States. The fear of
0:21
the Chinese spying on America using the
0:23
popular social media app became a giant
0:26
talking point in politics. No American likes
0:28
the idea of another country potentially spying
0:30
on them, and I get it.
0:32
But then I started to think about
0:34
all the ways America likes to spy
0:36
as well. America is a spying
0:39
superpower. A huge amount of internet
0:41
traffic runs through America, and a bunch
0:43
of powerful software the world loves,
0:45
from Zoom to Gmail, is all
0:47
American and open to the American
0:49
government's prying eyes. In 2022
0:51
alone, 145,000 Facebook users and
0:54
100,000 Gmail accounts were
0:56
rifled through by US government
1:00
spy agencies. America's Foreign Intelligence Surveillance
1:02
Act has a bit called Section
1:04
702, which means the
1:06
government's able to take part in
1:08
mass surveillance of things like emails,
1:10
texts, social media DMs, and whatever
1:12
you've been up to on your
1:14
dating apps. I wanted to know
1:16
what this means for me living in America
1:19
and whether it's time for me to grab
1:21
my podcasting gear and run for the hills.
1:23
So get ready to share
1:25
your location data on all your
1:27
apps and wonder about who's looking.
1:29
Because this is the surveillance episode.
1:38
[Theme music plays]
1:50
Monica, they're watching. They are. Now, I don't...
1:52
it sounds like quite a serious,
1:54
meaty topic, but it's also fun thinking
1:57
about these things, you know? Yeah,
1:59
I mean, I have a weird
2:01
perspective on this, where,
2:04
to an extent, I don't really
2:06
care. I'm kind of with you on
2:08
a personal basis. It's hard to get fired
2:10
up, because otherwise you start thinking, you know,
2:12
I'm in America now, I'm
2:14
a suspicious New Zealander, you know, maybe
2:16
they're keeping an eye on me. They always
2:18
look at me suspiciously when I arrive
2:20
at the airport. They always wanna know what
2:23
I'm doing here. They're like, what are you doing
2:25
here? I do a podcast. What podcast? It
2:27
feels like I'm going to get kicked out
2:29
again. But when it comes down to it,
2:32
yeah, I don't really mind if they're reading
2:34
my emails. It's funny, isn't it? Before I
2:36
came here I put a whole lot of my
2:38
stuff in New Zealand into storage, in a
2:40
big storage shed, and some private stuff, my
2:42
own diaries and stuff. I'm like, if someone
2:44
cuts the bolt and steals everything from
2:47
my storage shed, is there anything in there that
2:49
I'd be like, oh no, someone's got that?
2:51
And I feel a bit the same about...
2:53
that was a bit of a weird sort
2:55
of metaphor, but I feel a bit like that about surveillance.
2:58
I don't think it's a weird metaphor.
3:00
I think that's exactly what it
3:02
is. Except the difference is, like, if Rosabel
3:04
finds your diaries,
3:06
you might not want her to
3:09
read them, you know, cause you have
3:11
secret thoughts about her and
3:13
all kinds of things. That's
3:16
true. It's almost worse having a friend
3:18
get a hold of your private things
3:20
than a complete stranger who has no
3:22
context, right? Yeah, and if Rosabel
3:24
or me were to read
3:26
your diary... everyone has a
3:28
big internal world of secrets
3:31
they're keeping somewhere. I think there's
3:33
nothing worse than that. But I
3:35
really, truly believe, I mean
3:37
this, I don't think the
3:39
government cares about our baby
3:41
secrets, or even secrets that,
3:43
for us, mean a lot,
3:45
that, like, could ruin our personal
3:47
lives. They don't care. Absolutely
3:50
not. How do you feel about
3:52
that balance of privacy, this idea of
3:54
having privacy but also protecting all
3:56
of us from whatever perceived
3:58
threats, whether internal or
4:00
external? That balance, does it matter
4:02
that much to you today? Today? I've moved
4:05
along the spectrum on this, because I
4:07
used to really not care about privacy.
4:09
Go ahead, look through everything. I
4:11
think you should look through everything of
4:13
everyone's. If you're gonna catch a
4:16
predator, do it. Or a terrorist or something.
4:18
Go ahead. But now,
4:20
and I do think this has
4:23
to do with my proximity to
4:25
public figures. I do respect privacy
4:28
more. I understand privacy more and
4:30
I do understand that people like
4:32
to exploit public figures. Yeah, and
4:35
those people are my friends, so
4:37
that is scary. So I now
4:40
care about it more. But I'm
4:42
more worried about the hacker who
4:44
would want to expose something than
4:47
the government. Because again, I don't
4:49
think the government cares about...
4:51
I don't even know, some
4:54
gossipy thing a person intercepts.
4:56
Someone malicious coming in is much more
4:58
terrifying. And I guess it comes down to
5:00
your trust of the government. Do you think
5:02
it's great? Do they have our best interests in
5:05
mind? It's a perception of them, right? Some
5:07
people see the government as being this really
5:09
scary force. Others think, gosh, they
5:11
are looking after us. Yeah. I don't
5:13
think it's either of those. I
5:15
don't think they necessarily have our
5:18
personal best interests at heart, but
5:20
they do have an overarching plan.
5:22
They are trying to maintain, or,
5:24
depending on who's in charge, democracy,
5:26
and they are trying, ultimately, to keep
5:29
America at number one. I know
5:31
that they don't care about
5:34
some gossipy shit. They just,
5:36
like, don't have time for that. Has the government, or anyone,
5:38
ever found a little secret of yours?
5:41
Has anyone ever surveilled you? Someone at
5:43
school gotten into a diary, or did
5:45
your parents ever open the wrong
5:47
drawer and find your private stuff?
5:49
I have had... the privacy violation: my
5:52
mum went through my diary one time,
5:54
but I was disgustingly boring. Which
5:56
is, like, a rainy day. There was nothing scandalous
5:58
in there. Nothing scandalous until you
6:01
were twenty-five or something. But
6:03
your thoughts were probably scandalous.
6:05
Naughty! Yeah, but those thoughts
6:07
stayed in the brain, which
6:09
is very healthy. I never wrote them down
6:11
or anything, they just ping-pong
6:13
around out there. Never got them on
6:15
paper, thank god. Okay, I'm sort of
6:17
with you on that. I don't
6:19
trust a journal. I mean, you know
6:22
Dax has these journals. He's written
6:24
in these journals every day for
6:26
ages, and they're sitting over here, and
6:28
I am baffled by his willingness
6:30
to just have them out. He
6:32
obviously trusts the people in his
6:34
home. It is like having your
6:36
brain sort of out there,
6:38
taking your brain and putting it externally.
6:40
That's a scary thing to me. And
6:43
I've tried diaries a couple times because I
6:45
think it's supposed to be good. They say it
6:47
is good for you. They say that.
6:49
And every time I do it, the
6:51
next day I just, like, rip up
6:53
the papers from the day before, like nobody
6:56
can ever read this. You get rid
6:58
of it. Get rid of the evidence before
7:00
the government gets to you. Dax could
7:02
get one of those diaries with a little
7:04
lock on it, so it locks with
7:06
a little key. One of those diaries...
7:08
but that's not the whole point. He doesn't care. And
7:10
he must be writing... who knows what's
7:12
in that thing. It probably is like, "Dear
7:14
Diary," in, I
7:18
guess, his voice. But also I'm sort
7:20
of baffled by it, that trust.
7:23
Also, like, if he knew that you read
7:25
his, would he trust you ever again
7:27
in his life? Ever. Imagine if he
7:29
left a little diary in the attic and
7:32
you got in there, and now you're rifling through
7:34
it. I don't want to say that. He's
7:38
so big, if he attacked it would be scary.
7:40
Because I know he does get... you can get fired up. I
7:42
wouldn't like to be on the receiving end. I wouldn't do
7:44
well. No, you wouldn't. What am I... I mean,
7:46
I'll go ahead and say you're
7:48
nice. Okay, this is sort of a
7:51
ding ding ding, because we just had
7:53
a diagnosed sociopath on Armchair Expert,
7:55
on Experts on Expert. She's a
7:57
doctor, a therapist, and
8:00
she's a diagnosed sociopath.
8:02
She has a book coming out, and
8:04
she's written a lot for, like, publications
8:06
and stuff. I love this. It's
8:09
fascinating. But part of how
8:11
her sociopathy presents is
8:13
she does a lot
8:15
of stalking and snooping. That's
8:17
a big part of her
8:20
compulsions. And she said if his
8:22
diary was out... She'd be in
8:24
there. A hundred percent she'd be
8:26
in there. So she's fully aware, and
8:28
she's decided to use this as a learning
8:31
experience, as if to teach others and kind of
8:33
spread awareness. She describes it as an
8:35
emotional learning disorder, and so
8:38
there are ways you can learn.
8:41
It's obviously not in the ways
8:43
that, you know, typical people do, so you
8:45
do have to know that about yourself,
8:47
and it's so interesting. She should
8:49
work in government surveillance. I think
8:51
she'd be good at it. Get
8:54
secrets. This is
8:56
an abrupt turn, and there's no way to
8:58
do this in a good way. Now, we're
9:00
recording this remote because you've got a little
9:02
cough coming on, in case you're wondering why
9:05
we don't sound the same. So, I've
9:07
got feijoas. If you don't know feijoas, if
9:09
you're going, what the hell is this, there's
9:11
a whole episode, go listen. Feijoa is a really
9:14
popular fruit in New Zealand, super common.
9:16
There's a certain time of the year they're just
9:18
falling off trees, they're everywhere, like you can
9:20
end up with too many feijoas. In America they're
9:22
hard to find. So after that episode, some sort
9:25
of produce place, Melissa's, which apparently
9:27
has its headquarters in California but they're
9:29
all over America, they sell feijoas, and
9:32
they sent me some for free. The
9:34
most amazing thing I've gotten from doing the show:
9:36
a big box of these. I couriered a few over to your
9:38
house, so I'm hoping, as I'm listening to you, that
9:40
you're gripping one right now.
9:42
I am. It was so nice of you. I'm
9:44
really excited. We talked about them a lot on
9:47
the episode. Is that what you pictured? The
9:49
skin is tougher than I
9:51
pictured. It is tough skin. It's tough. Maybe
9:53
you said that. I just didn't
9:56
internalize it. It's kind of the shape
9:58
of an egg, and it's got a unique little
10:00
thing at the top, a little gear, it has
10:02
a little hat on it. A hat on
10:04
it, it does have a hat. I
10:07
feel that it looks
10:09
like a lime in shape.
10:11
Cucumber. You're
10:14
not... you're not wrong. They're very funny
10:16
things. The associations you're making are completely right, the
10:18
skin feels like a cucumber. Now, give
10:20
it a big sniff and inhale. Oh...
10:23
What do you think of the smell?
10:25
It does smell ripe. You're right. It's not really
10:27
that strong. It's kind of... if I were to
10:29
say more, I get floral. Are
10:31
you going to get in there? You know...
10:33
I have a spoon. A spoon, and you want a
10:35
knife maybe? You could cut through it with
10:38
a spoon. We need to cut it in half. However you'd
10:40
think about cutting it in half, a spoon could do
10:42
it. Maybe. I'll give it a try. Because
10:44
it is a tough exterior, but it's also...
10:47
If you just sort of saw
10:49
through with the spoon and press, that should
10:51
do it. It might squirt a
10:53
little bit on you, because it has a
10:55
juicy center, a moist center. Really
10:58
get in there.
11:02
Oh, what a beautiful scene. Now smell
11:05
that. Give that a good sniffing, seriously, do that.
11:07
Again, it smells delicious. Okay, great.
11:09
It really does. Doesn't look good.
11:14
Whitish is the color, and
11:16
looks not vibrant. Not
11:18
vibrant, no. Looks like my pale skin.
11:21
It's like a banana with gooey
11:23
stuff in it. Yeah, I think
11:25
that's fair. Sort of a jelly-ish center,
11:27
and then sort of a firmer flesh around
11:29
the outside. It smells so good,
11:31
right? It really does. Now, it is a
11:33
little bit early. It could do with another couple
11:35
of days ripening, so I'm a little bit worried
11:37
it will be a little bit on the
11:40
sour side. But I want you to get that
11:42
spoon out and scoop out the whole
11:44
thing, almost like an egg. You want all
11:46
of it, the white, the yolk, and the
11:48
rest. You need to slide that spoon
11:50
around and really scoop it out of its green
11:52
shell. I think you're right about
11:55
the ripeness, but I'm gonna have a bite. Oh
11:58
wow. Yeah... [chewing] Oh
12:00
wow. Now look, is it
12:03
sour? Very. Okay, so
12:05
I've given you an unripe fijo
12:07
which is a bad move on
12:09
my behalf. I got too
12:11
excited. Wow. Gave
12:13
it to you too early. This is so interesting though.
12:16
Okay, mine is very
12:18
sour but wow, it has
12:20
a lot of different flavors. This is
12:22
interesting. This is not what I was...
12:24
You're going back for more. This is a good sign. Okay,
12:27
I just watched this wine show and on
12:29
the wine show, they're identifying all the flavors
12:32
and the aromas. That's what I feel
12:35
like I'm doing right now. I taste
12:37
strawberry. I taste a
12:39
bariness. Absolutely. People
12:41
talk about strawberry. They talk about
12:43
hints of pineapple. Mm-hmm.
12:46
Mm-hmm. This
12:49
is nice. It is extremely
12:51
sour. Okay, so that's
12:54
not the ideal way. That's a waste of a feijoa. Yeah,
12:57
I've left you a couple of others there. So just
12:59
sit them out on the shelf maybe next to some
13:01
bananas and that'll help ripen them all up. Is
13:03
it bad that I'm eating it not ripe? I know,
13:06
it's great. I mean I just love seeing you enjoying
13:08
New Zealand's best thing. It's
13:10
really good, David. Fantastic.
13:12
Well, yeah, I'm so glad. Okay, this
13:15
is great. I think this is the
13:17
first New Zealand thing. You like Tim Tams. You
13:19
didn't like pineapple lumps. I don't
13:21
think you like the Whittaker's chocolate. You didn't like Jaffas.
13:23
You hated those. Hold on. What
13:26
would... Jaffas,
13:28
do you remember I bought you
13:30
those round red candies? You
13:32
hated those. Oh, are
13:34
you talking about the orange chocolate? Yeah. Ah,
13:37
yeah, that was disgusting. You hated them. Yeah.
13:40
I hated those. But I know the chocolate was good.
13:42
The chocolate was very good. I saw it sitting there
13:44
a couple of months later. It's something
14:46
different, isn't it? The
13:49
chocolate was good, but it wasn't... You
13:52
made it out to be like it
13:54
was gonna have magical
13:56
powers or something. Which to me
13:58
didn't, but it was very good. No,
14:00
I like the meat pies. No,
14:03
you did like the meat pie. That's right. You've
14:05
had a meat pie, you've had a sausage roll.
14:07
I got you a lambington. Yeah. The coconut sort
14:09
of, you know, that sort of fluffy cake they
14:11
had. Yeah, that was good. That was
14:13
delicious. But I think this is
14:15
the best. Yes. This
14:18
is great. This is great news. I
14:20
actually found out something interesting. It was
14:22
in an article, actually an article written
14:24
by Whittaker's chocolate, which is the New
14:26
Zealand chocolate brand. So possibly a bit
14:28
biased. But the headline in this piece
14:30
was, why does American chocolate taste bad?
14:33
Oh, God. And
14:35
I'm going to look more into this
14:37
because I think I want to do
14:40
a chocolate episode. But apparently, there's a
14:42
certain compound in some American chocolates that's
14:44
also present in rancid butter and vomit.
14:48
No, what? Oh, what's
14:50
that? Cool. So
14:52
sometimes there's some brands I got to be careful.
14:54
There's some brands of American chocolate I've eaten and
14:56
I thought that does taste a bit like vomit
14:59
to me. And so that's
15:01
why we've got very different tastes, I
15:03
think. So you as an American have
15:05
been raised on vomit tasting chocolate. So
15:07
when you actually taste sweet, delicious chocolate,
15:09
you're like, something's wrong here. No,
15:12
it's not. OK, first of all, you
15:14
are going to have to do an episode on this because I
15:16
don't believe you. Also, on
15:19
a fact check recently, because we had Rebel Wilson
15:21
on, she brought us Tim Tams, so
15:24
good, from Australia. Yeah, she brought
15:26
you the right thing. They were
15:28
delicious. And then we did get on
15:31
the conversation of Australian chocolate. And then I
15:33
did look it up. It's because in
15:35
Europe and I guess Australia
15:37
and probably New Zealand, there's
15:40
God, I forget all the facts I read.
15:43
My brain does the same thing, believe me.
15:45
They're completely gone. But it's not
15:47
that there's something in our chocolate. There's something
15:49
in your chocolate. It's about like the milk
15:51
solids. It's something about the salt that makes
15:53
it good. Yeah, it's the milk
15:55
side of things we don't have. Well,
15:58
no, it's butyric acid. Butyric
16:00
acid is the compound that's found in a
16:02
lot of American chocolate and that's the thing
16:04
that's also present in a lot of vomit.
16:07
Stop saying that! Oh
16:09
God! I vomited
16:11
for the first time in a long time the other day. Oh,
16:14
when you had the flu? Really off topic. Yeah,
16:16
I had the flu. I've been sick. I've
16:18
been in a bad way. It's awful vomiting, isn't it? It's
16:21
horrible. It's degrading. What am I doing?
16:23
It's awful. I'm
16:25
knocking on wood. I don't want that. Okay.
16:28
Now, sorry. I had a fever.
16:30
Really happy. Okay. So
16:32
I'm going to play you this little surveillance
16:34
documentary. I'm very excited about surveillance because I
16:36
read this book which I'm about to talk
16:38
to you about, Means of Control. And
16:41
so I just find this topic so interesting. So yeah,
16:43
this is what I learned. When
16:45
I was 16 and living in a small
16:47
town called Bethlehem in New Zealand, I was
16:49
obsessed with a movie called Enemy of the
16:51
State. It was directed by
16:54
the late great Tony Scott and starred
16:56
Will Smith and Gene Hackman. A
16:58
powerful man has been murdered. Holy.
17:01
A hidden camera recorded the crime. None of
17:03
this goes beyond us. We don't need any
17:06
more problems, do we? I don't remember it
17:08
all that well 25 years
17:10
later, but it had something to do
17:12
with a rogue NSA agent who used
17:14
America's surveillance powers to try and kill
17:17
Will Smith. Now here in
17:19
2024, with all this constant talk about
17:21
TikTok spying on us, I was back
17:24
to wondering what the NSA was up
17:26
to, and the FBI, and the CIA,
17:28
the American government. I lived
17:30
in America now, and I wanted to know
17:32
who might be prying into my business. My
17:35
name's Byron Tao. I'm a reporter
17:37
in Washington DC where I cover
17:40
mainly law, courts, and some national
17:42
security. I was talking to
17:44
Byron because he's just written an amazing
17:46
book called Means of Control. It's
17:49
about how America's tech industry has joined
17:51
forces with the government to make America
17:53
more of a surveillance state than ever
17:55
before. It wasn't an easy book for him
17:57
to write. He interviewed over 300 sources over
18:00
four years, including a lot of people who
18:02
didn't want to talk. Yeah,
18:05
well, the most difficult part is
18:07
that when you're writing about government
18:09
surveillance programs or law enforcement techniques
18:11
that police agencies would like to
18:13
keep quiet, it's difficult to get
18:15
people to speak candidly. Often
18:17
you have to work on them hard to get
18:20
them to trust you. Often
18:22
you have to offer them anonymity
18:24
or confidentiality. And just in
18:26
general, it's a hard world to report on and
18:28
just takes a lot of time. But
18:30
he's done with the book. It's out, dedicated
18:32
to his mum, Barbara Ann. And
18:35
now he's stuck here with me. What
18:37
do you see as the things that are set in America
18:39
and makes it interesting to you when it comes to how
18:41
we're being surveilled? First, the United States,
18:43
at least in the past 20 or 30
18:45
years, has been the place where a lot
18:47
of these technologies emerged. And so, of
18:50
course, US government entities are kind
18:52
of at the forefront of figuring
18:54
out clever ways to use them
18:56
and to exploit them. And
18:58
there's just a robust technology sector in
19:01
the United States, and that technology sector
19:03
sometimes works with the government. The
19:05
second thing I would say is
19:07
actually in America, there's a deep
19:09
skepticism about government power and government
19:11
surveillance, which makes this topic controversial
19:13
in ways that I don't actually
19:15
think it's that controversial in other
19:17
places. At one point I interviewed
19:20
someone who was from the UK originally, and
19:22
he kind of rolled his eyes at the
19:24
notion that anyone would be concerned about the
19:27
government getting location data from his phone or
19:29
tracking him all the time on CCTV cameras. He
19:31
said that's just kind of how it is in the
19:33
UK. And I think
19:35
that Americans don't have that attitude
19:38
because a lot of Americans don't
19:40
trust their government. There's a long
19:42
history of civil liberties and civil
19:44
libertarianism in the United States, and
19:46
there's a lingering distrust about government
19:48
power. A lingering distrust of
19:50
government power. It has me thinking about enemy
19:52
of the State again, and a whole bunch
19:55
of American movies I grew up with where
19:57
America was the bad guy. I
20:05
suppose Americans not trusting the government is
20:07
pretty clear when you look at things
20:09
like the Capitol riots, or just anything
20:11
in politics at the moment. The election
20:13
year so far has been a shit
20:15
show of distrust. But back
20:17
to surveillance. Something
20:20
I find sort of interesting as an outsider being
20:22
here is that there's so much talk and
20:24
paranoia in America about the likes
20:26
of TikTok and how China is
20:28
surveilling Americans. And yet, as
20:30
we see, America is doing similar things
20:33
and using similar tricks with its own
20:35
people. I find that kind of fascinating.
20:37
It is really interesting, right? Because there's
20:39
obviously not a moral equivalence you can draw
20:42
between the People's Republic of China and
20:44
the United States, and both governments
20:46
can be criticized, but China is engaged
20:48
in activities that the US
20:50
calls genocide in Xinjiang. But
20:52
if you're just talking about the
20:55
tools and the techniques, using,
20:57
say, covert social media accounts to
20:59
push out content, or surveilling
21:01
large parts of the globe using
21:03
data, that is something that the
21:05
United States government does and has
21:07
done. The United States government builds
21:10
apps, or has contractors build apps,
21:12
that collect data from people around
21:14
the world for intelligence purposes, and
21:16
that's sort of the same thing
21:18
that US national security officials are
21:20
worried about with TikTok. So
21:22
these information networks, the internet,
21:24
apps, and advertising, all of them can
21:27
be exploited by governments, whether it's the
21:29
United States government or the Chinese government.
21:31
And I think we're suddenly seeing
21:33
the US government become aware that though
21:36
the US had the technology
21:38
lead for twenty or thirty years,
21:40
the future probably belongs to other
21:42
countries, and there will be
21:44
a robust Chinese tech sector, there will
21:46
be a robust Indian tech sector,
21:48
there will be a robust African
21:50
tech sector, and in the future those
21:53
countries will exploit data, potentially data about
21:55
Americans, in the way that our government
21:57
has been exploiting it for the past
22:00
decades. What Byron
22:02
became particularly fascinated by is how
22:04
the American government uses consumer data
22:06
that's been hoovered up by various
22:08
apps and programs. When people
22:10
click I agree when they sign up to a new
22:13
thing on the internet, they're not generally reading the small
22:15
print. And a lot of the
22:17
time that small print is saying your data is
22:19
up for grabs to the highest bidder. So
22:22
while generally spy agencies aren't allowed
22:24
to spy domestically, there are
22:26
loopholes, like buying data that
22:28
already exists on you. The
22:30
phenomenon I'm describing of data being available
22:32
for sale, I think that is a
22:34
form of bulk surveillance. And in the
22:36
United States in general, we
22:39
as a society have been very skeptical
22:41
about bulk surveillance. We're okay saying we
22:43
think this person is a criminal, I'm
22:45
going to go before a judge, I'm
22:47
going to get a warrant, I'm going
22:49
to look at their email accounts or
22:51
their text messages. I think most people
22:53
think that's reasonable. What they don't think is
22:55
reasonable is that everybody's data gets vacuumed
22:57
up into a giant database. And that
22:59
is technically not really allowed under
23:02
American law if you're targeting Americans.
23:04
The loophole is that if you
23:06
buy the data on the open
23:08
market, suddenly those constitutional protections that
23:10
Americans are used to, they go
23:12
away. And so I think Americans
23:14
generally speaking have been skeptical about
23:16
bulk surveillance programs, about mass surveillance.
23:18
They've been much more okay with
23:20
targeted surveillance, with limited surveillance, with
23:23
a judge supervising surveillance. And so
23:25
I think that the amount of
23:27
data that's sloshing around is starting
23:29
to challenge those traditional notions of
23:31
Americans' privacy and Americans' limited government.
23:33
Now it's possible that the social
23:35
bargain has simply changed and that
23:37
we all essentially have accepted this
23:39
world with both the upsides of
23:41
convenience and free digital apps, as
23:43
well as the downside of persistent
23:46
tracking by corporations and governments. This
23:48
also comes down to this
23:50
big concept of lawfulness versus
23:52
how ethical it is, right? Yeah,
23:54
I think so, right? Because in some
23:56
of these programs that I write about, the
23:58
government is buying large amounts of, say,
24:01
location data on the movement of phones and
24:03
cars. I mean it's technically been deemed legal
24:05
but if you look at the way the
24:07
data was collected, if you look at the
24:10
justifications the government has given for doing this,
24:12
essentially they're saying well consumers have
24:14
opted into this don't worry but
24:17
really consumers are not really being
24:19
told exactly how the data is
24:21
collected, they're not being told where
24:23
it's going, so it's not really
24:25
truly meaningful consent if you want
24:27
to talk about having a consumer
24:29
truly ethically opt
24:31
into these programs because nobody who's a
24:34
potential target for surveillance by the Department
24:36
of Homeland Security or local law enforcement
24:38
is ever really going to voluntarily opt
24:40
into a surveillance program right? So you
24:43
are relying on some level of public
24:45
ignorance to track people this way. There's
24:48
a whole industry of data brokers
24:50
in America and their big customer
24:52
is the American government. Of
24:55
course, none of this is new. Back in 2019, a
24:58
Harvard professor termed it surveillance
25:00
capitalism. One senior national
25:02
security official told Byron that we're
25:04
backing ourselves into a surveillance state
25:06
and that data collection tilts the
25:09
power away from the individual towards
25:11
the government. In some of the
25:13
ways we're being surveilled, well there's
25:15
things I've never really thought of before
25:17
like when you realize something as seemingly
25:19
innocent as the tires on your car
25:21
mean you can be followed. So whenever
25:25
you start your car and you flip
25:27
through the little screen that shows the
25:29
tire pressure, the way your car
25:31
knows the tire pressure is because there's a
25:33
little sensor, a wireless sensor embedded in the
25:36
tire that's constantly communicating to the central
25:38
computer of the car and saying you know,
25:40
hey I'm tire number 3573 and my tire
25:42
pressure is 42. All is good. But of course,
25:47
that transmission is actually just going out
25:49
into the clear. And so some very
25:51
clever government intelligence agencies have figured out
25:54
ways to build little sensors that can
25:56
detect and read these tire pressure readings.
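The tracking idea Byron describes can be sketched in a few lines: because each (made-up here) TPMS frame carries a fixed sensor ID broadcast in the clear, anyone logging IDs at a few roadside points can recognize the same car passing again. This is a hypothetical illustration only; the frame format, field names, and IDs below are invented and don't reflect any real TPMS protocol.

```python
# Hypothetical sketch of unencrypted tire-sensor tracking.
# A TPMS sensor repeats "I'm sensor X, my pressure is Y" in the clear;
# a receiver that logs sensor IDs by location can re-identify a vehicle.
from collections import defaultdict
from datetime import datetime


def parse_broadcast(frame: dict) -> tuple:
    """Pull the identifying fields out of a (made-up) TPMS frame."""
    return frame["sensor_id"], frame["pressure_psi"]


class RoadsideLogger:
    """Logs which sensor IDs were observed at which sites."""

    def __init__(self):
        # sensor_id -> list of (site, timestamp) sightings
        self.sightings = defaultdict(list)

    def observe(self, site: str, frame: dict, when: datetime) -> None:
        sensor_id, _pressure = parse_broadcast(frame)
        self.sightings[sensor_id].append((site, when))

    def seen_at_multiple_sites(self, sensor_id: str) -> bool:
        """True if the same tire (hence car) appeared at different places."""
        sites = {site for site, _ in self.sightings[sensor_id]}
        return len(sites) > 1
```

The point of the sketch is that no "hacking" is involved: the re-identification falls out of a stable identifier being transmitted without encryption, which is exactly the surveillance vector described above.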
25:58
Now, I don't think this is like a
26:01
true mass surveillance technology. I don't
26:03
think anyone's out there tracking whole fleets
26:05
through their tires. But it shows that there's
26:07
a vector for surveillance in all
26:09
of these ordinary consumer technologies that
26:11
we use, and that none of
26:13
them are really built for privacy.
26:15
They're not built to protect us
26:17
against tracking. They're sort of
26:19
carelessly designed in many ways, and
26:22
the government is very clever. All
26:24
these agencies have missions, and
26:26
they have technologists, and they find
26:28
very clever ways to deliver
26:30
on these capabilities. And so there are many
26:32
vectors for surveillance that go unconsidered by
26:34
the average person. Byron also did
26:37
a deep dive into how dating apps
26:39
can be used in surveillance, how our
26:41
horniness can be weaponized against us.
26:43
The scene around dating apps has been
26:45
X-rayed as well, and Byron sort
26:48
of sets out in his book how
26:50
Grindr can be used.
26:52
Dating apps are the ultimate tool for giving
26:54
out your location, right? And we do that
26:56
willingly. So it's not even that these
26:59
apps are making the location available for
27:01
sale per se. It's that they're just plugged into
27:03
this very complicated advertising ecosystem. And because people,
27:05
when they load up dating apps on their
27:07
phone, they want to date people near them, right?
27:10
They want to know, are you two miles
27:12
away or fifty miles away? Often we want
27:14
to date the person that's two miles away,
27:17
so we enable our location settings. We turn
27:19
it on. But when you do that, you're
27:21
sharing all this information with tens of thousands
27:23
of parties. There are these advertising networks, and some
27:26
of them are just sitting there vacuuming it
27:28
all up. And some of those people are
27:30
selling it to who knows right?
27:32
government agencies, private investigators, dark web
27:34
criminals. We don't know where the
27:36
data goes once it leaves are
27:38
phone and in terms of dating
27:40
apps, the really generate a very
27:42
rich repository of information because people
27:44
swipe when they're bored, they swipe
27:46
when they're lonely, and they turned
27:48
on a rich amount of data
27:50
that they're sharing with the app
27:52
and the advertisers. And so of
27:54
course, governments and data brokers have
27:56
become interested in getting location data,
27:58
and dating apps generate a very rich
28:01
stream of that. I'm increasingly aware of how
28:03
much data is being collected on me all
28:05
the time from various apps and companies. I
28:07
went into one of those Amazon stores when
28:09
I was at a baseball game. An army
28:11
of cameras trained on me as I walked
28:14
around picking things out. There was no check
28:16
out. When I left, the store knew
28:18
what I'd gotten, and I was charged for
28:20
it. Look, if you're American, this
28:22
sort of shit is standard, but to me,
28:25
it's all pretty novel still. And
28:29
when you throw one of those Vision
28:29
Pro things on your face, that's a
28:31
whole bunch of cameras, including infrared cameras,
28:34
collecting your raw biological data. No
28:36
doubt at some point it'll be used to
28:38
train AI about everything to do with
28:40
you. But you can't help but
28:42
wonder who else might end up
28:44
prying? I think it's like
28:46
previous generations of technology, like the car tires,
28:49
like the advertising networks. We're creating these
28:51
systems and we really don't know where
28:53
to start to think about how they're going
28:55
to be exploited in the future, right? People
28:58
are putting large amounts of data
29:00
into these large language models, which I don't
29:02
even think their creators fully understand, how they
29:04
work and what they do. And so
29:07
I do think in the future there's going
29:09
to be incidents that we don't even conceive
29:11
of, because we don't quite understand these systems
29:13
or build them for privacy and
29:16
security. And to make sure that
29:18
the dignity of all these individuals
29:20
using them is going to be
29:22
protected. Instead, like Americans, we
29:24
rush headlong into embracing new technologies
29:26
without really thinking through that there
29:28
are downsides, there are costs, there
29:30
are potential concerns about the way
29:32
these things will work. And so,
29:34
I think we're just at the
29:36
very, very beginning of understanding how
29:40
they will betray us, and vice
29:47
versa. We'll be right back after a
29:49
word from our sponsors. Support
29:53
for Flightless Bird comes from
29:55
Factor. Now, I have recently
29:57
started getting Factor meals
29:59
delivered to me for two main
30:01
reasons: I'm a terrible cook, and I
30:03
also want to start eating way more
30:06
healthily. And this is where Factor kicks in:
30:08
you can eat stress free this spring
30:10
with Factor's delicious ready-to-eat meals.
30:13
Every fresh, never frozen meal is chef
30:15
crafted, dietitian approved, and ready to eat
30:17
in just two minutes. You can choose
30:20
from a weekly menu of thirty
30:22
five options, including popular ones like Calorie
30:24
Smart, Keto, and Protein Plus, plus Vegan
30:27
and Veggie. Also, you can discover more
30:29
than sixty add-ons every week, like
30:31
breakfast on-the-go items, snacks, and my
30:33
favorite one, beverages: very healthy little wellness
30:35
shots. So what are you waiting for?
30:37
Get started today and fuel up for
30:39
your springtime goals with no-fuss,
30:42
no-muss meals that eliminate the hassle
30:44
of prepping, cooking, or cleaning up. You
30:46
simply heat and savor the good
30:48
stuff, and you can tailor it to your
30:50
schedule. So get to
30:53
factormeals.com/bird50 and use
30:55
the code BIRD50 to
30:57
get fifty percent off your
30:59
first box plus twenty percent
31:01
off your next box. That's
31:03
code BIRD50 at
31:05
factormeals.com/bird50 to get fifty
31:07
percent off your first box
31:09
plus twenty percent off your
31:11
next box while your subscription
31:13
is active. This
31:16
episode of Flightless Bird is brought
31:18
to you by Booking.com. Booking dot yeah. Booking
31:20
dot com offers possibilities across
31:22
the US for all the travelers
31:24
you want to be, from cozy
31:26
vacation homes to expansive resorts. With so
31:28
many choices across the US, you
31:30
can book whoever you want
31:32
to be. Unlock family time
31:34
by booking a spacious vacation rental
31:36
for the whole family, or lean into
31:38
relaxation by booking a beachside
31:40
resort. With Booking.com there are so
31:42
many possibilities, I'm starting to think
31:44
about future Flightless Bird trips. Using Booking.com,
31:46
there's a whole lot of places I
31:49
need to go that I haven't been
31:51
yet. Listeners, I love all the suggestions
31:53
you're sending in of where I should
31:55
go, and Booking.com is helping me figure
31:57
out those trips, including one to Montana.
31:59
A lot of fun things coming up in
32:01
the future, as long as a bear doesn't hurt my body.
32:03
I'm no spring chicken. Booking.com:
32:06
see your ideal hotel or vacation home
32:08
no matter where you go in
32:10
the US. Book whoever you want
32:12
to be on Booking.com. Booking dot
32:15
yeah. It's
32:22
a lot to take in, but it's
32:24
pretty meaty. I just find it really
32:26
interesting when you think of all these
32:28
day-to-day things. Where it's a
32:30
very American thing is that because America
32:32
loves money, there's advertising everywhere, and
32:34
so all this data is being Hoovered
32:36
up to be used commercially, and then
32:38
the American government can jump onto that
32:40
data and legally track its citizens and
32:42
see what we're up to. Yeah.
32:45
I understand the fear around
32:47
it, but I also feel
32:49
a little bit like, we
32:51
want the technology. We want
32:53
our cars to be able to
32:56
tell us about the tire
32:58
pressure. I don't want
33:00
a flat every other day.
33:02
We like that, we like
33:04
dating apps. We like the
33:06
convenience, so to me that
33:08
is a trade-off. I
33:10
think Byron described it as the social
33:12
contract changing over time, and the idea
33:14
that it has changed over time. We're
33:17
happy to put all our photos out
33:19
onto the internet now. We used to
33:21
be paranoid about people having our photos.
33:23
Then social media came along and
33:25
suddenly we were uploading thousands of photos,
33:28
and I feel it's the same with
33:30
the tire pressures and the cameras. It
33:32
makes life so convenient, we're OK to let
33:34
that privacy stuff drop behind a little
33:36
bit. I also like what he
33:38
said and I think it's really true.
33:41
A lot of other countries, they just
33:43
don't have this debate around
33:45
privacy. They're like, well, yeah, this is
33:47
how government works. I think because we
33:49
are a country of liberty, that's such
33:52
a big tenet that it creates this
33:54
big conversation. It's something I hadn't
33:56
really thought about, how America is less
33:58
trusting of its government than
34:00
the people in New Zealand or
34:02
the UK are. It seems obvious once he
34:04
stated it, but it's something I hadn't really
34:06
clocked. I quite like that suspicious aspect.
34:08
Cynical, take your pick. Sorry.
34:11
It is, very. Yes. But also,
34:13
I mean, when you were in New Zealand,
34:15
were you thinking about any of this? Because
34:17
it's happening there. Absolutely, just not as loud
34:19
as this, not a bunch of people being
34:22
like, this is an outrage, in New Zealand. I
34:24
mean, we have a sort of a secret
34:26
spy base in New Zealand, and New
34:28
Zealand is part of the so-called Five
34:31
Eyes network, which is five different countries that
34:33
have joined together with America to surveil and
34:35
swap information about each other. So New Zealanders
34:37
are very much a part of this, and it
34:40
runs through this book. I think what Byron's done
34:42
with Means of Control is he just outlines
34:44
things I never would have thought of before
34:47
around how, in our day-to-day life,
34:49
that information is being hoovered
34:51
up and used in various ways. And the
34:54
obvious thing is, for us fairly ordinary members of
34:56
the public, who cares? But you know,
34:58
you think of journalists trying to do their
35:01
job, or people trying to, you know,
35:03
speak truth to power, suddenly they are
35:05
being surveilled. It's more of an issue
35:08
for them in trying to do their work.
35:10
Yes, it definitely can be. I mean,
35:12
I struggle with this topic a little
35:14
bit, with what I'm gonna say and saying
35:17
it, because I wonder if other
35:19
people have this same thought. So
35:21
for me, my thought is, well, again, this book,
35:23
it was just stuff I haven't really given
35:25
a great deal of thought to before. There's a
35:27
part of this conversation that feels
35:29
conspiracy theory adjacent: the government
35:31
is tracking your tire pressure.
35:33
And I know that's true,
35:36
with tech, with the cloud,
35:38
with all of it, we're
35:40
available to them. You know?
35:42
It's very aligned with paranoid thinking.
35:44
Paranoia. And I think
35:46
there's a level, there's a level of looking into
35:49
it sensibly and being like, this is a thing that's
35:51
happening, but then this is the kind of content
35:53
that gets mentally unwell people really worried about being
35:55
tracked at every moment and having chips installed
35:58
in their brains and all of this as well. Yeah,
36:00
it's very adjacent, and for me it dips in
36:02
and out of it. And so I keep
36:04
it at a little bit of an arm's
36:07
length. Something I've literally
36:09
sort of thought about just now, as far as
36:11
where America fits in, obviously something like 9-11. We've
36:14
talked about this so much, on
36:29
planes. And I wonder what
36:32
is the consensus? I wonder
36:34
if we did a survey
36:36
of the American populace. Are
36:38
we happier that we have
36:40
those precautions post 9-11 or are we not?
36:44
I mean, I am. I
36:46
like knowing that we're doing
36:48
everything we can to prevent
36:50
a disaster. Oh, completely. Did
36:53
I tell you when in New Zealand I got this
36:55
part-time job at the airport one summer when I was
36:57
studying and I had to go on to planes and
36:59
check that no one had hidden weapons on them to
37:01
be used by terrorists. Did I tell you about this?
37:04
That was your part-time job.
37:06
Pretty loose, right? It is. Yeah,
37:09
I think they had strikes at the main
37:11
Auckland New Zealand airport and so they were
37:13
down on staff. And so one summer they
37:15
did this massive hire and so I was
37:17
brought in. I head out to the airport
37:19
at 3 in the morning and the planes
37:21
would land and I would go
37:23
on with my team of people that had been
37:25
trained over sort of a week and we'd look
37:27
under the seats in the seat pockets up above
37:30
in the luggage storage to make sure no one
37:32
did like hate the knife under the seat or
37:34
something to be used by their colleagues who was
37:36
like on the next flight. So it's very dramatic.
37:39
Very dramatic. Wow. Did you
37:41
ever find one? Did anyone ever find one? Nothing. I
37:43
actually found some vomit under a seat one time.
37:46
Perfect. No, I found nothing exciting. It was the
37:48
least exciting job I've ever had in my life.
37:50
Although it was fun going out on the tarmac
37:52
and a little buggy to like get to the
37:54
planes. That was really fun. That
37:57
was fun. Yeah. Anyway.
37:59
Alright, back to Byron for a little bit more
38:01
about something called the Foreign Intelligence Surveillance Act, which
38:03
is just a very American thing. So this is
38:06
the last little bit of the chat.
38:09
I'd forgotten to talk to Byron about
38:12
earlier was that Foreign Intelligence Surveillance Act.
38:14
We probably should have started there as
38:16
it's the thing that allows the government
38:18
to get potential access to our horny
38:21
dating app data, email inboxes and
38:23
whatever means we're texting to our friends. It's
38:26
called the Foreign Intelligence Surveillance Act
38:28
section 702. That's a
38:31
real mouthful. But what it essentially allows
38:33
is the American government to go to
38:35
tech companies like Google, like Meta, like
38:37
Twitter and get private information about people
38:40
that it deems to be lawful intelligence
38:42
targets. And it does that entirely in
38:44
secret right because we can't tell spies
38:46
and terrorists that their data is being
38:48
vacuumed up by the US government. Now
38:50
I should say all governments do a
38:52
form of this. European governments do it,
38:54
Asian governments do it, African governments do
38:56
it. They tap telecom networks, they spy,
38:59
they compel companies to turn over
39:01
information for national security purposes. But
39:03
what makes America different is that
39:06
so much of the world's telecommunications
39:08
traffic passes through the United States.
39:10
A lot of it passes through switches
39:12
here in the DC area that makes
39:14
it vulnerable to interception by the US
39:17
government. And second, that a lot of
39:19
the global tech brands are American companies
39:21
and have data in the United States.
39:23
So what makes America special, what makes
39:25
it a spying superpower in many ways
39:28
is the fact that so many global
39:30
tech companies are based here and the
39:32
American government can just walk into Google's
39:34
headquarters or walk into Meta's headquarters with
39:36
a list of all the people that
39:39
it wants to target for surveillance and
39:41
get a ton of information about those
39:43
people. And it does it all in
39:45
secret. America is, in Byron's
39:48
words, a spying superpower.
39:51
And look, all power to America and catching the
39:53
bad guys. But what about the good guys that
39:55
get caught up in this? Look, I've seen Enemy
39:58
of the State. I've read about it. Edward
40:00
Snowden. And as I talked to Byron, I
40:02
wondered how paranoid he is about doing the
40:04
kind of work he does here. I realise
40:07
we're talking on Zoom right now. Something
40:09
made in America, with data centres in
40:11
America, could easily be intercepted. I
40:14
wonder if stuff like that plays over in
40:16
his mind. It certainly does
40:18
because, you know, as a reporter, I'm
40:20
talking to a lot of people, I'm
40:22
trying to offer them confidentiality. In some
40:24
cases, people are breaking NDAs, and I'm
40:26
trying to protect them, right? I don't
40:28
want consequences visited on them. So I
40:31
do try to be very, very security
40:33
conscious about where I store data, how
40:35
I store data, what platforms I use.
40:37
It doesn't mean I give up using
40:39
social media. You know, I'm wearing a
40:41
fitness ring that tracks how well I
40:43
sleep. There are benefits to these technologies.
40:45
We don't all have to go live
40:47
in the woods, but we should really
40:49
use them mindfully. We should use them
40:52
aware of the downside risk. As you
40:54
say, I'm talking to you on Zoom.
40:56
Zoom has a potential to be intercepted.
40:58
So I'm, you know, I'm not saying
41:00
anything tremendously sensitive. We're having a conversation
41:02
for public consumption. If I were talking
41:04
to someone who was kind of a
41:06
whistleblower, a government insider, maybe I wouldn't
41:08
use Zoom. Maybe I would use Signal
41:10
or something. So I certainly, as a
41:13
reporter, try to be aware of this
41:15
stuff. And I think increasingly, as more
41:17
and more information becomes digitized, as so
41:19
many more professions are going to entirely
41:21
live on computers, that I think more
41:23
and more people are going to have
41:25
to think about this stuff in their
41:27
day-to-day lives. Before I let
41:29
Byron get back to his life writing about
41:32
surveillance, I wanted to know what other tips
41:34
he had when it came to protecting data
41:36
from prying eyes. It depends
41:38
what you're worried about protecting, right? So
41:40
I do think if you're talking about
41:42
shady data brokers collecting your location, you
41:44
certainly can disable a lot of the
41:46
location settings in a lot of your
41:48
commercially available apps. You don't need to
41:50
tell the weather app where you are.
41:52
You could just type in the city
41:54
where you are. You could just let
41:57
it know your location when you open
41:59
it, not 24 hours. So I would
42:01
tell people to primarily be conscious of what
42:03
information they share with these apps that they
42:05
put on their phone. There
42:08
are certainly very good
42:10
privacy protecting technologies, especially
42:12
communications technologies, WhatsApp, Signal,
42:14
Apple's iMessage. These are all
42:16
encrypted. They will all protect
42:18
the contents of your communications.
42:21
Some of them are stronger encryption than others.
42:23
Some of them collect less data than others.
42:25
But in general, the ordinary person using these
42:27
services will get a pretty decent level of
42:30
privacy. It's not perfect. You could always
42:32
hack people and steal their data. But
42:35
it's harder for snoopers, governments, telecoms
42:37
to see messages transiting on that
42:39
platform. And finally, I would really
42:42
recommend that people rethink their relationship
42:44
to paying for digital services because
42:47
when apps can't rely on people to
42:49
pay $0.99 or $1.99 for an app
42:53
or a few bucks a month for
42:55
a service, that's really when they do
42:57
turn to monetizing their user base because
42:59
it's not free to make an app. It
43:01
does cost money. You've got to hire coders
43:03
to code it. Rent server space, you have
43:06
to hire lawyers to draw up a privacy
43:08
policy. You have to pay employees. So technology
43:10
is never really free. And so
43:12
when people are unwilling to spend even a
43:14
few dollars on an app or a service,
43:16
then that's when these companies start to turn
43:18
around and say, okay, well, they want our
43:20
app and they don't want to pay for
43:22
it. Why don't we just monetize them? Why
43:24
don't we sell their data? Yeah, amen to
43:26
all of that. And yeah,
43:29
do all of that. And obviously also occasionally
43:31
let some air out of your tires, right? That's
43:34
right. If you really are concerned
43:37
about surveillance, yes, you probably want to find
43:39
a mechanic to rip those sensors out of
43:41
your car tires. Look, to be
43:43
honest, I couldn't really care less about
43:45
who's tracking my information. I won't be
43:47
deflating my tires anytime soon. Then
43:50
again, I just rewatched Minority Report
43:52
and it reminded me what can go wrong
43:55
when our data is harvested and used against
43:57
us. I also think the whole TikTok
43:59
panic has been really interesting to watch
44:01
and shows how differently the likes of
44:03
China thinks about this stuff to how
44:05
America thinks about it. As Byron notes
44:07
in the closing of his book, China
44:09
wants its citizens to know they're being
44:11
watched. In America, the
44:14
success lies in its secrecy.
44:18
By the time this goes to air, the
44:20
whole TikTok thing may have changed entirely, but
44:22
I do think it's kind of fascinating how
44:24
there's no great proof right now
44:26
that China is using TikTok
44:28
to influence the masses in
44:30
America or to surveil us. But it turned
44:33
into such a giant talking point, such
44:35
a big thing it's almost talked about as if
44:37
it's definite fact. I'm not on TikTok,
44:40
so I know so little about this. I know
44:42
the gist. You're lurking on TikTok
44:44
somewhere watching this. I'm lurking. No,
44:46
I'm not. I'm not. Thank God. I'm
44:49
so glad I've avoided that rabbit
44:52
hole. I'm with you. I think
44:54
it's the last social network where I think I just
44:56
haven't gotten on. It's like I've aged out and I'm
44:58
aware of it and that's okay. I agree.
45:00
I agree. I think it's
45:03
okay. But this has been talked about
45:05
since the beginning, since TikTok first blew
45:07
up. Oh, China, Chinese app, they have
45:09
all our data. They're tracking all of
45:11
it. And then
45:13
what made it turn so quickly
45:15
in the past month? There's
45:18
just been hearings about it. It's just been
45:20
this big push by America. It's sort of
45:22
a political talking point, just a crackdown. So
45:24
there was no impetus. No
45:27
reveal or anything. No big data's come out
45:29
and you see quotes from various American
45:32
intelligence agencies and they're not saying this
45:34
is a definite. They're like, oh, China
45:36
could be doing this or this is
45:38
a possibility. It's nothing concrete. But yeah,
45:41
it's been reported on so widely. I
45:43
just think it's interesting that the paranoia that China's
45:45
spying on us is much more worrying,
45:48
I think, to your typical
45:50
American than the idea that your location
45:52
data is being hoovered up by
45:54
private data brokers and then sold elsewhere.
45:56
It's so easy to like fear
45:58
this outside force, as opposed to
46:00
powers within. Yeah, but there's a reason
46:02
for that, right? Oh, you don't want
46:05
other countries prying on our data. Yeah,
46:07
they have a much bigger incentive
46:09
to use it for, to quote,
46:11
take us down than
46:14
the American government does. So
46:16
are there examples of journalists
46:20
who talk about being surveilled and
46:22
then stifled? Yeah, no, absolutely. It mainly
46:24
comes to, you know, protecting sources. And
46:27
I was just revisiting the Edward Snowden
46:29
story. And the idea of data
46:31
being used against American citizens is this
46:33
whole other topic to go down. But
46:35
yeah, Means of Control, the book, it
46:37
talks about it so competently. And Byron,
46:39
you know, he talked to 300 sources
46:42
for this. It's just
46:44
one of those books that's so well researched and so
46:46
level headed. I just loved reading it. So I'm like,
46:48
I've got to talk to this guy and, you
46:50
know, we've got to talk a little bit about surveillance because
46:52
it's just a fascinating space. And I think when I was
46:55
at that baseball game, looking at what
46:57
felt like 60 cameras staring at
46:59
me in the Amazon store as I picked
47:01
up my chocolate bar and my drink, I
47:04
was so stunned that this technology existed. There
47:06
was no counter. It just knew what I
47:09
had. And I walked out and it's got
47:11
my card. Yeah, there are Whole
47:13
Foods that do that as well. That's
47:15
right. Yeah. Which is so bonkers.
47:17
I mean, you know, when you're going
47:19
into that store or you know, when
47:21
you're going into that Whole Foods, that's
47:24
what's happening. So I think there's personal
47:26
responsibility. Here's sort of what he said.
47:28
He's like, if you're talking about something
47:30
extremely sensitive, maybe know how to do
47:33
that in a way. And
47:35
I yeah, I'm kind of like have
47:37
some personal responsibility. 100%. In general, you
47:40
know, like if you're worried about
47:43
being surveilled, probably don't shop
47:45
in that Amazon store. You
47:47
know, there's other things you can do. There's
47:50
other ways to get those, you know, that
47:52
aren't as convenient. But again, that's the trade
47:54
off. And I agree. And it's like what
47:56
he was saying. There's a reason that some apps,
47:58
it's good to pay for them, because then
48:00
they don't have to find other ways to make
48:02
money out of you as a product. So there's
48:05
a reason. That was interesting. So it's just like
48:07
little things like that that I think are interesting
48:09
to think about. Also I've heard
48:11
this come out of a lot of people's mouths that
48:13
they like their algorithm. Oh,
48:15
they're like, oh, this curates it really well for me.
48:18
Yeah, like everything I'm getting is what I
48:20
want. I'm going to
48:22
get ads anyway, might as well get
48:24
the ads of things that I might
48:26
actually use or enjoy or I don't
48:28
know. I find that takes
48:30
sort of interesting as well. The algorithm is
48:32
not working well for me at the moment.
48:35
Do you know what the algorithm is giving
48:37
me all over Facebook and Instagram? Vomit. It's
48:39
giving me, I wish it was vomit. It's
48:42
giving me birds, which I like, but
48:45
they are AI birds with
48:47
giant testicles. That's
48:49
what I'm getting. All I'm getting is a... David.
48:52
No, it's messed up. What are you talking about?
48:55
I'm being served up some kind
48:57
of hellish AI art space and
49:00
now all Facebook is serving me
49:02
and it's my fault for being on Facebook
49:04
in the first place. It's just birds with
49:06
massive nuts. Wait,
49:09
to buy a painting or
49:11
I'm so confused? No,
49:13
just like the caption will be,
49:15
here's some beautiful nature and
49:18
then I'm going to send it to
49:20
you. Don't send it to me.
49:22
I don't want that. I'm going to send
49:24
it to you. Because it's hard to describe
49:26
the stuff that I'm being served up. What
49:29
are you being served up? Some good stuff? I
49:31
mean, I get like a lot of... This
49:33
might be... Oh my gosh. Is this
49:35
workplace harassment, this picture? It's all for evidence. This
49:37
is horrific. What in the fuck? I
49:40
also don't get it. I don't get why. Is this
49:42
an ad? No, it's just a lot of Facebook is being
49:44
taken over by bad AI art. It's
49:53
kind of built to go viral because
49:55
it's so weird. There was a big
49:57
phenomenon of Jesus made out of shrimp.
50:00
That was a big thing. What the
50:02
fuck? Oh my god. Yeah, a lot
50:05
of Facebook has been taken over by
50:07
this. You're on regular Facebook? Yeah, it's
50:09
just a whole lot of birds. It's
50:11
big. I think maybe you need to get off
50:13
of that I don't
50:15
have that on Instagram. I just have
50:18
ads of clothes. Actually, I bought
50:20
this. Great outfit, looking
50:22
great. Thank you. That was
50:25
a targeted ad. I see
50:27
it's working for you I wish I
50:29
was getting great wardrobe advice not big
50:31
massive nuts. Yeah, you need a new suit. He
50:34
says you needed a new suit. You should get some
50:36
targeted suit ads. I need
50:39
to fix my algorithm. Obviously,
50:41
I actually blame you for this. I think
50:43
it's a problem with what you're doing. Yeah,
50:46
let's not get into what I'm actually searching
50:48
for, my search terms. Don't
50:50
make this a me problem, Monica. All
50:53
right, well I would say I've become
50:55
slightly more American because I have
50:59
been thinking about American surveillance. So at least
51:01
I've got to be another 2% American. Yeah,
51:03
I mean, I think we're
51:06
all being surveilled on this globe
51:08
in this global economy and it's
51:10
only gonna get more. Mm-hmm. And
51:12
so gosh, I guess it's
51:14
up to everyone individually to
51:17
embrace it or reject it I
51:20
am tired. I'm just gonna embrace it
51:24
It's so much effort to tap out of society in
51:26
any way as well. It's kind of part of
51:28
the shit. Let's just hope it doesn't get turned
51:30
against us in some way Yeah,
51:33
we'll be alright. We'll be okay. And if it all goes bad,
51:35
we'll just like head to the hills Where are the hills? Where
51:37
do you go to the hills in LA and New Zealand? Just
51:39
like, if things go bad, you'll go to the hills and be
51:42
like, in the Kaimai Ranges. Where do you go
51:44
in LA up the Griffith Observatory or something?
51:46
No, I think we go down. I think we
51:48
like go into our bunkers. Ding ding. We go
51:50
down. Yeah, of course you go into the bunker
51:53
Yeah, which I'm kind of excited for
51:55
that. There are some beautiful bunkers. All
51:59
right, I'm glad you enjoyed the feature.
52:02
Your mom is healing. This is great. You're more
52:04
American. Merry Christmas.