It's a Musk vs Albanese showdown


Released Thursday, 25th April 2024
Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

ABC Listen, podcasts,

0:02

radio, news, music

0:04

and more. Breaking

0:11

news. Elon Musk is arguing with somebody on

0:13

the internet. But this time

0:15

he's arguing with the Australian

0:17

government. Yes, this week on Download This

0:19

Show, the Australian Federal Court has

0:21

ordered Elon Musk's X, formerly

0:23

Twitter, to hide posts containing videos of a

0:26

stabbing in a Sydney church last week. But

0:29

it's not necessarily playing out the way either side

0:31

expected it to. Also on the show

0:33

this week, what if your social media account could talk to

0:35

you? Imagine talking Twitter. All of that and much more coming

0:37

up. This is your guide

0:39

to the week in media, technology and culture.

0:42

My name is Marc Fennell. Welcome to Download

0:44

This Show. Yes,

0:54

indeed, it is a brand new episode of Download This Show. Very

0:57

big welcome to our guests this week, Tech Reporter with Capital

0:59

Brief, Daniel Van Boom. Welcome to the show. Thank you for having

1:01

me back. Oh, you know, we do it under sufferance. Don't

1:03

get it. It's lovely to have you

1:05

back on the show. And ABC National Tech Reporter and good friend

1:08

of the pod. Just a phrase I've wanted to say. It's not

1:10

even one I proclaim. Ange Lavoipierre,

1:12

welcome back. Thank you so much. It does feel good

1:14

to say. It feels good to hear. Excellent. All

1:17

right. Not 100% sure whether we're all aware

1:19

of this, but it turns out Australia's at war with Elon Musk. And

1:21

how did that happen? It's pretty hard

1:23

to miss. And boy, has it

1:25

escalated. Look, Elon Musk has

1:27

been at war with Australia in various

1:29

ways at varying volumes, quietly, loudly, for

1:31

quite some time. But

1:35

most recently, this is because

1:37

X would not

1:39

take down the videos from

1:41

the Wakeley stabbing. The eSafety

1:44

Commissioner has the power to tell

1:47

platforms to take down content. They

1:49

told a whole bunch of platforms to take down that video because it

1:51

really was everywhere. The

1:54

thing is, almost everyone complied. Meta was

1:56

a little bit slow. They had to sort

1:58

of push Meta a little bit harder,

2:00

but Meta ultimately cooperated, to hear the

2:02

eSafety Commissioner tell the story.

2:05

Whereas X has made

2:08

a really big point

2:10

of not complying in the

2:12

process, making an enemy of

2:15

somehow everyone in

2:17

Australian politics, galvanising

2:19

support for all kinds of measures

2:22

to "rein in big tech", as

2:24

that's the rhetoric coming out of Canberra at the moment.

2:27

I guess further annoying, I don't know

2:29

if eSafety Commissioners get annoyed, but whatever

2:31

feelings an eSafety Commissioner has, they're having

2:33

them at the moment and have since

2:36

taken further legal action to try

2:38

and get X to comply with

2:40

this takedown order. At

2:42

the time of recording, there is a bit of

2:44

a legal action underway, so we're going to talk

2:46

a little bit top line about this. Out of

2:49

curiosity, Elon Musk has dubbed what's going on here

2:51

as censorship. Is he

2:53

right, Daniel? I'm not sure, because there's

2:55

not really an agreement about

2:57

what is actually being spoken about. The

2:59

eSafety Commissioner has said that they have

3:01

requested that X take down the actual

3:05

stabbing video, and Elon Musk and Twitter

3:07

are claiming that they were asked to

3:09

not only take down the stabbing video,

3:11

but all commentary around the stabbing video.

3:14

Whether or not it's censorship, I think

3:17

relies a lot upon which

3:19

of those people are lying. I have my suspicions about

3:21

which one of them is telling the truth, but yes,

3:23

it very much depends, which is one of the things

3:25

we will find out in the coming days. I think

3:27

there's also some subtlety around, for example, if there was

3:29

a post with the video

3:35

in it and commentary attached to

3:38

that, and then the takedown order applies

3:40

to the post, and then if

3:42

you were so minded, you may

3:45

characterize that as they're telling us to take

3:47

the post down as well. Sure, that's part

3:49

of the deal, but I wonder how

3:51

much it

3:53

might be rounded up or rounded down based

3:55

on those kinds of finer points. It's quite an achievement

4:00

to achieve bipartisan support in piling

4:02

on Musk. Why

4:05

do tech companies care about what the safety

4:07

commissioner in a country all the way on

4:09

the other side of the world says? Like

4:11

why do they care? They can fine you

4:13

quite a bit of money. Well, this is

4:15

another thing we're going to find out. The

4:17

eSafety Commissioner, this is probably the biggest test.

4:19

It's a relatively new commission. It's been around

4:22

since 2017. In the past, tech companies, like

4:24

mostly Meta, and Twitter before it was X, have

4:26

actually been pretty good about like complying, at

4:28

least on the surface with their like request

4:30

to take down like child sex abuse material

4:32

or hateful content, things along those lines. This

4:34

was the first time one of the big

4:36

tech companies has really like taken a stand

4:38

against them. And we don't actually know really

4:41

what their powers are. One of the things they

4:43

say they can do though is fine X slash

4:46

Twitter, you know, up to I think it's about

4:48

$700,000 daily while the content is still up. So

4:50

whether or not that will stand, we will find

4:52

out. But eventually, even for Elon Musk, that kind

4:54

of racks up. I think this

4:57

was very quiet. This didn't get

4:59

anywhere near the same amount of

5:01

coverage as the current spat is

5:03

getting. But Elon Musk was actually

5:05

already suing the eSafety or in

5:07

a legal fight with the eSafety

5:09

Commissioner over a takedown notice that

5:11

they didn't want to, that the

5:13

company didn't want to obey. It

5:15

was to do with some, you

5:18

know, alleged hateful content about a

5:20

transgender Australian man who had been

5:22

targeted, you know, misgendered and I

5:24

guess, yeah, awful things said about

5:26

them on that platform that eSafety

5:28

Commissioner issued a takedown notice, Musk

5:31

and X wouldn't go with it. And so

5:33

there was already, they were already kind of

5:35

suing or they'd filed the action and it

5:37

had been received that, you know, it hasn't

5:39

been heard or anything yet. But, you

5:41

know, these fights do exist. Elon

5:44

Musk is also making a real virtue at the

5:46

moment of a fight that he's

5:48

having in Brazil around free speech. If

5:50

you look at his, if you look

5:53

at his feed, I was very tickled

5:55

to see him effectively fundraising earlier this

5:57

week saying, hey, if you want

6:00

free speech, turns out these fights are really

6:02

expensive. Why don't you buy a premium membership

6:04

in order to support this? So I think

6:07

for Elon Musk, in answer to your

6:09

question, why do they care? I think he

6:11

cares because I mean, he's characterized

6:13

himself as a free speech

6:15

absolutist. This is kind of, which is,

6:17

you know, he's absolutely been more than

6:19

happy to kick certain journalists off X

6:21

at various points. So you could certainly argue

6:24

that he is not that, but- He's

6:26

absolutely interested in his own free speech. That much is

6:28

clear. Yeah, he is interested in Elon Musk.

6:30

Elon Musk is interested in Elon Musk's free speech.

6:32

It's sort of about positioning and ideology for him,

6:34

I have to think, but why

6:36

they care more generally, I think

6:39

is to do with whether or not

6:41

Australia can be influential in, you know, what

6:43

happens in Australia does matter globally because

6:47

tech regulation is a very moveable space at the

6:49

moment. There are all kinds of changes happening in

6:51

key markets. And so to that extent, it matters

6:53

as well. We could delve into the psyche of

6:55

Elon Musk for a bit, which I feel like

6:57

has not been done enough. It's more or less

6:59

a good 30% of the show these days. Yeah,

7:01

yeah. Over the weekend, it was announced that he

7:03

canceled a trip to India, a Tesla related trip

7:05

to India, I believe, because, you know, Tesla is

7:07

kind of on fire at the moment. Like

7:09

he doesn't really have time to be meeting- Figuratively,

7:11

not literally. Figuratively, you know, they've cut a bunch

7:13

of staff and they're not meeting their numbers, which

7:16

is kind of standard. You know, he's over

7:18

there going hardcore mode or whatever he's saying is,

7:20

and not- he doesn't have

7:22

time to meet the Indian Prime Minister. But here he

7:24

is picking fights with, like, the Australian

7:26

eSafety Commissioner over, like, quote unquote, free

7:28

speech, which the Brazil thing is

7:30

pretty much a very similar

7:33

thing. Like, that all happened because Brazil's

7:35

justice on the Supreme Court was essentially

7:37

saying that the court was getting death threats

7:40

and misinformation and that X needed

7:42

to ban some posts. And Elon Musk decided

7:44

to, like, have a row about

7:46

that. And so he's got, I guess,

7:48

the ultimate case of the Twitter brain, because,

7:50

yeah, instead of, like, doing his job, which

7:53

is his jobs, which are, you know, pretty

7:55

important, feels like he's just arguing with people

7:57

on the Internet. For fun. There is

7:59

one kink in this story that's probably worth

8:01

pointing out, which is over the weekend when they

8:03

did ask for the takedown of the content, they

8:05

kind of did take it down, just not for

8:08

people outside of Australia. And that is one of

8:10

the points of contention here, isn't it, Ange? Right.

8:12

So this is geo-blocking. And so

8:15

there was a measure of cooperation,

8:17

we should say, but the

8:20

eSafety Commissioner in, you

8:22

know, filing for an injunction this week

8:24

has been arguing that is not good

8:26

enough, that there should be an actual,

8:28

there should be, needs to be an

8:30

actual takedown. So just to play devil's

8:32

advocate for a second here, why should

8:35

the eSafety Commissioner of Australia's jurisdiction extend

8:37

beyond Australia? What's her

8:39

argument there? Well, I mean, it's

8:41

a really interesting point, right? Because I

8:43

think, I think

8:45

the rhetoric you're hearing out of Canberra,

8:48

you know, Peter Dutton over the weekend

8:50

saying they can run their own

8:52

race overseas, but the Australian law

8:54

applies in Australia. So it is

8:57

a good question. The argument

8:59

you could advance is whether

9:01

geo-blocking is sufficient in

9:03

that there are ways around that. You

9:06

know, a lot of people use VPNs.

9:08

I'm wondering if the people that they're

9:10

especially worried about seeing extremist content online

9:13

are exactly the people who are going

9:15

to be getting around geo-blocking.

9:18

You also have to distinguish the legal argument from

9:20

like the moral argument. And I guess like legally,

9:22

we are going to find out and I think

9:24

it is a perfectly fair devil's advocate. Like do

9:27

they have the authority to ask for that? But

9:29

then like on the moral side, like would you

9:31

ever, you know, in the Christchurch livestream,

9:33

would you ever say just

9:36

geo-block that from New Zealand and it's okay to

9:38

be seen everywhere else? No, it's an interesting point.

9:40

I just don't know the answer. I'm not, I'm

9:42

as curious about it as anybody. And I do think there

9:45

are interesting nuances to it. At time

9:47

of recording, this is in front of the court. So it'll be

9:49

very interesting to see where this goes. Yeah.

9:51

And I think it could have much

9:54

grander consequences than Elon Musk

9:56

ever intended, or

9:59

perhaps even imagined. The

10:01

norm for the longest time

10:03

has been that tech companies

10:05

have co-designed the rules, certainly

10:08

in Australia and in other jurisdictions as

10:10

well. They co-designed the rules. At

10:14

the moment, most of their commitments

10:16

for what happens online

10:19

and the safety around

10:21

that, they're voluntary. This

10:24

is potentially a real turning

10:26

point in that we might,

10:28

and Peter Dutton and the coalition has

10:30

expressed support for moving towards a

10:33

mandatory model. We're seeing that change

10:35

happen in the EU. We're seeing

10:37

the beginnings of it in the

10:40

US. I wonder if maybe

10:42

this is Australia's moment

10:44

as well, which goes far beyond

10:46

X. This isn't a fight that

10:48

Elon Musk has picked, but will

10:51

have consequences potentially for every platform.

10:53

Download This Show is what you're listening to.

10:55

It is your guide to the week in

10:58

media, technology and culture. My guests this week,

11:00

Ange Lavoipierre, ABC national tech reporter, and Daniel

11:02

Van Boom from Capital Brief. Now, if you

11:04

scroll through the internet and you see Dr.

11:06

Karl's face and he's flogging you vitamin

11:09

gummies or miracle pills, Dr. Karl, amongst

11:11

a whole host of other celebrities have

11:14

a message for you and that is,

11:16

it's not me, it's a scam. Dan,

11:18

Tell me a little bit about what's

11:20

happening here. Yeah, so Dr. Karl is

11:23

one of many celebrities, Australian and otherwise,

11:25

who are being used to spruik various

11:27

products along the lines of miscellaneous health

11:29

pills, investing advice into various cryptocurrencies and

11:32

Kochie's death for some reason. Have you

11:34

seen all those ads about Kochie being,

11:37

Richard Wilkins being arrested, Grant

11:39

Denyer being arrested, Kochie in financial ruin. They're everywhere.

11:41

They're horrific. Some of them are like, we're trying

11:43

to scam you for money and some of them

11:45

are just, we want to watch the world burn,

11:48

I feel. Yeah. So why are there so

11:50

many of these at the moment? Yeah, I think,

11:52

I mean, that makes a great point in that

11:54

I think you do need to, like, separate out

11:56

the different categories, because we now have this, like, magic

11:58

machine called AI that can make whatever

12:00

you want if you are willing to put the right

12:02

amount of time and effort into it, and you don't

12:04

necessarily need that much skill, and there are all kinds

12:06

of services out there. There's the content

12:09

farm, so it's really just looking for

12:11

clicks, selling cheap ads, and that can just

12:13

be built

12:15

cheaply. And

12:17

then there's the "we're trying

12:20

to sell you something": vitamin gummies,

12:22

crypto, whatever. Slightly different goals, still

12:24

commercial. And then of

12:26

course you've got the misinformation,

12:28

the kind of more ideologically

12:30

flavoured variety of AI-generated

12:33

fakery and disinformation online. You

12:35

know, this one very much falls into

12:37

that second category.

12:39

The reason it's everywhere is because it's

12:41

very, very easy. To give you a sense of it, Marc,

12:43

there was a really interesting piece last

12:45

week. There are some researchers in the US

12:47

called NewsGuard, and one,

12:49

I assume we've

12:51

mentioned them on the show

12:54

before, but I could

12:56

possibly be wrong about that, and

12:58

one of their researchers, Jack Brewster,

13:00

went about setting up a propaganda site,

13:02

like a perpetual propaganda

13:04

machine, just like that, online, and it turned

13:06

out it cost him one hundred

13:08

and five dollars to do

13:10

so, and it just keeps churning. Yeah, and

13:12

it's, and it's endless, like, that will just keep

13:12

going. And so what he'd done is, he

13:14

didn't do it himself. He found someone, I think

13:16

on Fiverr, which is a platform where

13:18

you can find people to do these kinds of

13:20

things, sort of odd jobs, like, "I'll do

13:23

your design," and phrases like that. A logo design?

13:25

Your logo, yeah. Precisely, or, you know,

13:27

an email signature, that kind of thing. So

13:31

he paid a hundred and five dollars and

13:33

got this set up, and the way that

13:36

it worked is it would, you know,

13:38

scrape news sites and you

13:40

just sort of instruct it what kind of

13:42

ideological slant you want it to have, and

13:45

it's then churning out posts and

13:47

content in whatever flavour

13:49

you want. This might be a somewhat, or potentially, dumb

13:51

question, but these scams, the

13:53

phenomenon is certainly not unique to Australia,

13:56

it happens all over the world, and

13:58

usually, you know, at the bottom of websites,

14:00

they're badly photoshopped or done using

14:02

AI. Do they work?

14:04

Yeah, they sure do. Um, so the, the

14:06

throwback to this though, is that, um, this

14:09

is not technically a new thing, I think

14:11

it's a new way to scam like the

14:13

same group of people, unfortunately,

14:15

but like people who are not digitally native,

14:17

for instance, have been falling for like a

14:20

surprising amount of people still fall for the

14:22

Nigerian king uncle, like, email thing. He

14:24

promised me he'd send me money. Yeah. Yeah. Yeah. And

14:26

I'm going to use that to buy the top. That's

14:28

why I gave him my PayPal. Yeah. Yeah. The things

14:31

that, you know, digitally native people would instinctively

14:33

and immediately be able to identify as a scam.

14:35

Like even the Dr. Karl videos, if you see

14:37

them, if you, if you're like, I hate to,

14:39

I hate to make generalizations about age, but if

14:42

you're a millennial, you're probably going to spot it

14:44

like immediately. Um, but the same

14:46

group of kind of older people who have

14:48

been falling for the same kind of scams,

14:50

uh, it's a kind of a new way

14:53

to get them, which is sad. Yeah. I think,

14:55

I don't know, I would make a distinction

14:57

there as well, because I used to sort

14:59

of feel the same way, like we've always

15:01

had content farms. We've always had like, like,

15:03

well, not always, but, but far too long.

15:05

We have had, you know, people trying to

15:07

flog us weird crypto or NFTs online and

15:09

you know, also misinformation and foreign interference of

15:11

that nature. We've always had it. So what

15:13

is the big deal? I

15:15

think that $105 price tag really, it

15:18

tells you something, right? This is a scale

15:20

that we have not previously encountered and

15:22

it's a price point that we have

15:25

not previously encountered. And the sheer volume

15:27

makes it a different proposition.

15:29

I also think that sure we

15:31

can spot it now, but

15:34

we are also on a trajectory in terms of

15:36

the sophistication, you know, six months ago, a year

15:38

ago, you couldn't have had that Dr. Karl video

15:42

now you do, and that

15:44

is only going to increase

15:46

certainly faster than the laws to

15:48

rein it in or the measures

15:50

can keep up. I would, like, make a

15:52

distinction between like misinformation farms and like active

15:55

scams, which are being advertised on Facebook and

15:57

like various other sites. A subsequent point

16:00

to add to that is that these

16:02

scams have always existed and surprisingly always

16:04

worked, but they shouldn't be

16:06

allowed to be advertised on Facebook. You

16:08

don't actually need new laws about AI

16:10

deepfakes to restrict these ads because I'm

16:12

pretty sure they definitely

16:14

run afoul of existing consumer law. Beyond

16:17

increasing media literacy, which I think is

16:20

just an understandable first response to this

16:22

kind of content, are there

16:24

strategies that have been put in place anywhere in

16:26

the world to actually tackle these things, something that

16:28

works at the scale at which they're being pumped

16:30

out? So something really interesting that

16:33

Google has been doing, and this was in

16:35

their March update, is they've started de-indexing AI

16:38

generated sites. It

16:41

was hundreds, I think, so

16:43

it's not at all at the level,

16:46

but what it was was a bit of a

16:48

shift in policy. And for certain people, particularly people

16:50

in marketing, Marc, I had the misfortune of stumbling

16:52

into a marketing corner

16:54

of LinkedIn, and that was where

16:56

I discovered people actually really angry about

16:58

this being like, oh, it's punishing legitimate

17:01

content sort of thing. But yeah, Google

17:03

was de-indexing all these sites. And you

17:05

can imagine that if they're willing to

17:08

do it on some level, that list

17:10

of sites that are de-indexed may grow.

17:12

You also see, for example, NewsGuard, those

17:14

researchers that I mentioned earlier, they have

17:17

a Chrome extension and they rate various

17:19

sites. So tens of thousands, I think,

17:21

many, many, many news sites or

17:23

air quotes news sites around

17:25

the world. And

17:29

so it's like the

17:31

labeling thing versus getting rid of it altogether.

17:33

And I think that's kind of where we're

17:35

going to land on this. I'm not

17:37

sure if I'm aware of any country that's

17:39

had a successful digital literacy drive to curb

17:42

the success of these kind of scams. But

17:44

I would say, to piggyback on

17:46

your point, I do think this is

17:48

legitimately an area where the social media

17:50

companies are responsible. Like obviously some of

17:53

these scams are being pushed out on

17:55

like miscellaneous source websites. But they're also

17:57

on Facebook. And Facebook says that they've...

18:00

remove like 650,000 of them. But

18:02

I feel like if AI is sophisticated

18:04

enough to replicate Dr. Karl, then you

18:06

can get AI sophisticated enough to identify

18:08

those replications. The thing that kills

18:10

me is they're not even replicating him. They're

18:12

just taking existing photos and doing what

18:15

would appear to be some very mediocre

18:17

Photoshop work. Exactly. Well, they did

18:19

make a video as well. And that was surprisingly,

18:21

I don't know. Oh, wow. In that case, I

18:23

figured it out. Yeah, so there's a video and there's his voice

18:26

and he sounds like Dr. Karl

18:30

and the ABC sort of tried to get

18:32

them to take it down. And it's, yeah,

18:34

I mean, it is

18:36

at a level, it wouldn't have been possible technologically before.

18:38

So we are moving at a rate of knots here. The

18:41

other thing I would add is that, as

18:43

you mentioned before, this is certainly gonna get worse. OpenAI,

18:46

the company that makes ChatGPT, they've

18:49

said that they actually have already created or are

18:51

very well on the way to creating an AI

18:53

model specifically for speech replication. But for obvious reasons,

18:55

they have not released it to the public yet.

18:57

They were gonna hold off until after the US

19:00

election. Yeah, yeah, exactly. It's much harder to create

19:02

a realistic visual representation of someone talking. If

19:04

you take out that factor and just rely on

19:06

the audio, I feel like you'll be able to

19:08

do much more damage. On your point before,

19:11

there is something instinctive. There's an instinctive

19:13

logic that if we can create it,

19:15

we can detect it. But the point

19:17

I would make is that that has

19:19

absolutely not been proven to be the

19:21

case with AI. We are

19:23

so much better at making the

19:25

tools that create this content than

19:27

we are at detecting it. We

19:29

still don't really have good detection.

19:31

Watermarking, yeah, it's in the works,

19:33

but in the works is the operative

19:35

part of that sentence, it's the important part. We

19:38

don't know how to catch this stuff properly yet in

19:40

a clean way. Yeah, I know Adobe's been working on

19:42

some processes as well. I do love

19:44

that ChatGPT have decided to hold off for the

19:47

US election, but just in time for somebody else's election

19:49

somewhere else. Also the most easily

19:51

faked voices on the planet, the US

19:53

president and Donald Trump. All

19:55

right, Download This Show, it is what you're listening to. It

19:57

is your guide to the week in media, technology and culture. And

20:00

over the weekend a brand new social media

20:02

platform and they just keep making them, it

20:05

launched, it is called AirChat. It has

20:07

been pitched as something like a combination

20:09

between Twitter, yes I'm persisting with calling

20:11

it Twitter not X, and Clubhouse, Dan

20:15

you've tried it. Just for somebody that's never experienced

20:17

it before, just describe the experience. Well here's

20:19

my pitch for those people. You know how social

20:21

media is great and you definitely need more? Well

20:24

have I got the new one for you.

20:26

So basically it works like Twitter except to post

20:28

a tweet, or an AirChat, you have

20:30

to record yourself a voice memo and then

20:32

it transcribes that and posts it both as

20:34

a text and when you scroll down it,

20:36

it reads your voice aloud. So I actually

20:38

tried it this morning, other than the fact

20:40

that there's only like 12 people on it.

20:42

It was actually- The perfect number if you

20:44

ask me. Yeah, it was actually not as

20:47

horrible as it sounded. It's actually founded

20:49

by the founder of Tinder and

20:51

a very famous Silicon Valley investor. So the pedigree is

20:53

there I guess. It was a lot better than I

20:55

thought it would be but I still just keep going

20:58

back to the thing of like who needs

21:00

more social media? Okay, so the reason I

21:02

wanna talk about this is because I've

21:04

often wondered if people

21:07

had to verbalize the things

21:09

that they tweet angrily, would

21:11

they still write half the things they tweet?

21:13

That's why I thought it was worth talking

21:16

about. I was interested in how it

21:18

changes or has the potential or maybe not

21:20

at all to change us. You're

21:22

bang on, right? Maybe that's why people won't

21:25

use it. Do you know what I mean?

21:27

There's something there. There's this extra user barrier.

21:29

Like kind of the experience of being online

21:31

and it's a sort of like seamless slipping

21:33

between locations and not having

21:36

to use your voice. It's

21:38

like, it's entirely cerebral. And so this gets

21:40

in the way of that. That cuts across

21:42

the appeal. The other thing that cuts across

21:44

the appeal. You've gotta wanna go to the party.

21:46

There've gotta be people that you wanna see at

21:48

the party in order for you to show up.

21:51

And at the moment, as far as

21:53

I can tell, there's a lot of

21:55

like tech types on the site, which

21:57

checks out, they're the early adopters. But

22:00

they're kind of the people like, look, the

22:02

quiet part of the sentence whenever someone launches

22:04

a new social media platform at the moment

22:06

and some, you know, sometimes we do just

22:09

go ahead and say it out loud is

22:11

like, Oh, maybe this will be what Twitter

22:13

once was. Like maybe this will

22:15

be the nicer version of Twitter, but the

22:18

eternal hunt for the nicer version of Twitter. Right,

22:20

right. But it sounds like the

22:23

people that you would be wanting to avoid on

22:26

Twitter are the first people who

22:28

have started on AirChat. Me

22:30

you mean? Yeah. Yeah.

22:33

Well, in fairness, we made you do it. On that, like the

22:35

eternal hunt for a good Twitter, like I was saying before, in

22:37

regards to like, who wants more social media. So like we've been

22:39

through a few rounds of that, like blue sky. What was the

22:42

other one? The other nice Twitter. Uh,

22:44

Threads, and another like Bluesky and

22:46

one more... Mastodon!

22:49

Mastodon, all these things that

22:51

are definitely part of the popular conversation

22:53

that everyone knows about. Yeah, exactly.

22:55

Totally. And if you haven't heard of those things before, don't

22:57

worry, you will never need to hear of them again. Yeah. The

23:00

basic premise was that they were like Twitter, but

23:02

not bad. Um, and everyone was just about it

23:04

for a week and then just gave up. Um,

23:06

I think the path to success for AirChat

23:08

would be like teenager adoption. Maybe just people who

23:10

have not been completely disenfranchised by, uh, Twitter

23:14

and meta and such. That's

23:16

an interesting point, right? Because obviously there are online

23:18

services that are built on voice, right? I mean,

23:20

Twitch, for example, and you know, not exclusively

23:23

voice, but it's not like we don't do

23:25

voice online. It does happen, Discord, exactly.

23:28

But at the same time, there's a stat that I always come

23:30

back to. A couple of years ago, I worked for Beyond Blue,

23:32

the mental health organization. One of the things they said is that one

23:35

of the most taken up of their services was

23:37

chat. A lot of young people,

23:40

and I don't have a number to put on it. A

23:42

lot of young people preferred to have a conversation with a

23:44

mental health professional via chat

23:46

rather than verbalizing. And

23:49

so I'm not saying that that necessarily carries over

23:51

into all online behavior, but I do think there's

23:53

a lot of online activities now that we are

23:55

so used to doing non-verbally.

23:58

Right? And I do. I question

24:00

whether there is that much of a desire to

24:02

do talking Twitter, which is, by the way, what

24:04

I'm going to call this thing from now on. It's

24:08

honestly a better name and probably more than a

24:10

shit. Maybe in a weird way that you actually

24:12

have to use your voice as like a novel

24:14

selling point. Remember using

24:16

your voice. You can do it again. I don't

24:18

know. And you do hear

24:20

employers, I guess, complain a lot

24:22

about, I mean, just sort of

24:24

conversation that Gen

24:27

Z are scared

24:29

to pick up the phone in the workplace.

24:31

Like they are just that much more shy. And

24:33

yeah, because that's a generation and millennials have

24:36

this to a lesser extent, but it's still

24:38

present. We've just been socialized in a completely

24:40

different way. I just love it. We're less likely

24:42

to be taken in, according to Dan, by the

24:44

way, this whole like generational warfare thing, a hundred percent

24:46

putting it on you, we're less likely to be

24:49

taken in by the Dr Carl Scamm, but we're

24:51

terrified of picking up the phone and calling people.

24:53

Hey, man. You can't have it all.

24:55

You can't have it all. Have you

24:57

guys got any friends? Like I now have friends who

24:59

exclusively send me voice memos, like they don't text me.

25:02

Like I feel like a few of my friends have

25:04

done the like, I'm sick of this like texting business

25:06

like I'm just going to send you a voice memo. The

25:08

younger they are, the more likely they are to do it. And

25:12

I was resistant and now I kind

25:14

of love it. I'm saying there's

25:16

something there. I've got one friend that does it

25:18

and I think she mostly does it because she

25:21

can't be bothered. Also,

25:24

she tends to send me all the,

25:26

I guess there's two reasons to do it. One, which

25:28

is you realize that Siri is just not that good

25:30

at transcribing you. And secondly

25:32

is maybe she only does it when she's

25:34

slightly drunk. All right, so let's

25:36

look ahead as has been established. We have a

25:38

whole bunch of different apps that are now vying

25:41

to be the next Twitter. You mentioned Vasodon, Blue

25:43

Sky, threads, which being backed by

25:45

Meta, the company behind Instagram and Facebook, you

25:47

would have to assume is the natural successor.

25:50

But it hasn't really taken off, I don't think. And

25:52

the fact that that hasn't really taken off, or has

25:54

it? Everyone's looking around. I disagree. I

25:56

think it's probably, I mean, it's probably the front runner right

25:58

now. It is like the alternative. Someone

26:01

characterised it to me the other day as

26:04

gay Twitter, which I quite enjoyed.

26:07

I think it's just like that's where all the queers have gone.

26:09

I guess I've got a lot of queer

26:11

networks and so it's kind of

26:14

like, well, less hate speech. Well, I think that's

26:16

interesting, right? Because that kind of says something,

26:18

right? Because if you've already found your tribes,

26:21

right, that kind of suggests it's working better,

26:23

right? I mean, one of the issues, I

26:25

think, with Twitter is that it was so

26:27

exposed, right, that you would often encounter people

26:29

that you vehemently disagree with. And to

26:31

some extent, that's good for civil society, but

26:33

then to another extent, it's a toxic trash fire.

26:36

Yeah, I think there's like a Goldilocks zone,

26:38

right? Because one of the other challenges

26:40

for new platforms is that when you

26:42

are starting from scratch, you don't have

26:44

all that infrastructure, like people you've been

26:46

following for years, people you know online,

26:48

you don't want like that bin fire on

26:52

Twitter, now X, anymore. But the good

26:52

thing about coming across to Threads, I

26:54

think what made it attractive is that

26:56

it suggested people who you were following

26:58

on Instagram. So you had a little

27:00

bit of scaffolding there. You

27:03

weren't starting from zero, but nor did

27:05

you have every weird bot

27:07

that's followed you or every person you regret

27:09

following in the last 10 years on Twitter. The

27:11

difficulty for things like Airchat is that you

27:14

don't have that existing network to port over. But

27:16

the good thing is, because it's not just like

27:18

a different version of Twitter, I can imagine in

27:20

like two or three years, there being all

27:22

these novel and kind of goofy ways the

27:24

platform is being used that we wouldn't predict

27:26

now. But I could also say

27:29

it becoming like Clubhouse, i.e. dead

27:31

in two weeks. I suppose the telling

27:33

thing will be whether someone's Threads account

27:35

starts to get bigger than their Instagram

27:37

account, because at that point, Threads will

27:39

become its own thing. You will have

27:41

your own identity on Threads that is unique, not just

27:43

the people you inherited from Instagram. And I think that

27:45

will be an interesting turning point for lots of users.

27:48

And we will wait and find out next time. Huge

27:51

thank you to our guest this week, Angela Vauquier,

27:53

ABC National Tech reporter, pleasure as always. Thanks for

27:55

having me. And Daniel Van Boom from Capital Brief.

27:57

And me as a podcast. Certified

28:00

enemy of baby boomers. My

28:03

name is Mark Fennell. Thank you for listening to

28:05

another episode of Download This Show. Don't forget you

28:07

can listen to all your favourite ABC podcasts on

28:09

ABC Listen, and I'll catch you. Bye

28:12

bye. You've

28:27

been listening to an ABC

28:30

podcast. Discover more great ABC

28:32

podcasts, live radio and exclusives

28:34

on the ABC Listen app.
