Sarah Hanson-Young on the debate around free speech on social media

Released Saturday, 27th April 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

Ryan Reynolds here for Mint Mobile. With the

0:02

price of just about everything going up during

0:04

inflation, we thought we'd bring our prices down.

0:07

So to help us, we brought in a reverse

0:10

auctioneer, which is apparently a thing. Mint

0:12

Mobile Unlimited, premium wireless. How did you get

0:14

30, 30? How did you get 30, 30? How

0:16

did you get 30, 30? How did you get 30, 40? How did you get

0:18

20, 20? How did you get 15, 15, 15, 15? Just 15 bucks a month. Sold!

0:23

$45 up front for three months plus taxes and fees. Promo rate

0:25

for new customers for limited time. Unlimited more than 40 gigabytes

0:27

per month slows. How

0:30

can a people-first approach to higher

0:33

education transformation improve success? An EY

0:35

report suggests that taking emotional and

0:37

psychological factors into account is just

0:40

as important as the technology. Six

0:43

factors drive this human-centered approach.

0:45

Leadership. Inspiration. Care.

0:47

Empowerment. Investment. And collaboration. Get these

0:49

rights and they can more than

0:52

double an organization's chance of transformation

0:54

success. Learn more about

0:56

people-first transformation at theguardian.com/transforming higher

0:58

education. This message was paid

1:01

for by EY. Every

1:13

time parliaments have acted on trying

1:15

to regulate, particularly in the technology

1:17

space, it quickly gets out

1:19

of date because how fast the world

1:22

transitions. And that's not an

1:24

excuse for not doing anything. Hi,

1:27

I'm Karen Middleton, Guardian Australia's political

1:29

editor, coming to you from the

1:31

lands of the Ngunnawal and Ngambri

1:33

peoples. Today on the

1:35

Australian Politics Podcast, I'm speaking with

1:37

Greens Senator for South Australia Sarah

1:39

Hanson-Young. This week

1:42

it's the government versus Elon Musk, owner

1:44

of the social media platform X. Free

1:47

speech, social responsibility and a

1:49

legal showdown over violent content

1:51

online. There are demands

1:54

for better regulation, but can Australia be

1:56

the world's internet police? A

1:58

series of high-profile attacks has renewed

2:01

the focus on violence against women

2:03

with rallies around the country, and

2:05

the makers of MasterChef are

2:07

claiming green credentials for using biomethane

2:09

gas in a new

2:11

season. Greenwashing, or just the energy

2:13

transition? Let's see what the Senator

2:15

makes of it all. Senator

2:19

Hanson-Young, thanks for joining us

2:21

on the Australian Politics Podcast. It's

2:23

great to be here, Karen. Thank you.

2:26

There's a lot of things going on

2:28

this week that sit in your bailiwick, politically

2:30

speaking. Can we start with

2:33

social media and the big fight we're

2:35

having with Elon Musk? We've seen

2:37

him resisting the demands the Australian government

2:40

has been making particularly about taking down

2:42

a violent video of an incident that

2:44

occurred in Sydney a week or so

2:47

ago. He's pushing back and

2:49

saying, you know, you can't

2:51

police the entire world's internet. And we

2:53

saw Peter Dutton make a similar point in

2:55

the last few days. Does he have a

2:58

point? Are we trying to sort of hold

3:00

back the tide? Are we trying to

3:02

police the whole world and is that

3:04

just unrealistic? Well,

3:06

it's been quite an interesting debate

3:08

to see explode over the last week, I

3:11

think. People

3:13

are rightly frustrated and angry

3:15

that a big tech tycoon

3:17

like Elon Musk can act as

3:19

if he can pick and choose,

3:21

can call out Australian ministers

3:23

and an independent regulator by name, and

3:26

effectively give the Australian government

3:28

the middle finger. I mean I

3:30

think this is probably more about

3:32

ego for him than much

3:34

else. However, it has

3:36

created and brought to the fore

3:39

a conversation, a conversation we

3:41

really need to have, because for far

3:43

too long big tech platforms, particularly

3:45

the social media sites, have

3:48

grown and grown and grown. They have

3:50

millions of users now, with

3:53

very little to no regulation, and

3:55

that's the key point and the

3:57

tech companies right around the world

4:00

know this. They know governments

4:02

are increasingly frustrated. There have

4:04

been changes elsewhere around the world.

4:06

Australia has implemented some here. The

4:09

EU has recently introduced a

4:11

digital act that starts

4:14

to regulate social media companies.

4:16

So the big tech tycoons

4:18

know that their social

4:20

license is wearing thin with the community and

4:23

governments are being forced to act. What we've

4:25

seen this week is just one tiny part,

4:27

but it misses the broader problem I think

4:29

and the bigger problem. That is what

4:32

is the business model of these companies because

4:35

that's what they're actually into. That's what

4:37

they're worried about protecting. That's

4:40

what they don't want government playing with.

4:43

To be perfectly blunt, it's that

4:45

business model that is actually driving

4:47

the biggest risk and

4:49

this kind of viral nature

4:51

of these horrible posts. So

4:54

are you suggesting that companies like X,

4:57

which is what Elon Musk is the

4:59

head of or other platforms, are using

5:01

the free speech argument to mask what

5:04

is just a profit-making enterprise? Absolutely.

5:07

The free speech argument

5:10

is a bit of a red herring, frankly.

5:12

What they don't want is governments regulating

5:15

how they use algorithms, how

5:17

they use individual users' data.

5:19

These companies are effectively

5:22

advertising companies. It's not

5:24

that we need to regulate

5:26

individual conversations between users. It's

5:28

the data from the users that the company

5:30

sells, that Facebook or

5:32

Twitter or Google sells, to

5:35

make extraordinary profits, billions and

5:37

billions of dollars, and

5:39

then the amplification of

5:41

particular views through those algorithms.

5:44

Free speech and freedom of

5:46

expression is a really important principle.

5:50

You and I can have a conversation and express our

5:52

views to each other. They may

5:54

be different. That's not what we're

5:56

talking about when we're talking about regulating

5:58

social media. We're saying the things

6:00

that are amplified, the views

6:03

that are amplified and cause harm.

6:05

It's that amplification that

6:07

needs to be regulated. How they're

6:09

driven to go viral, is

6:12

that the algorithm question? Is

6:14

that the first step then, that

6:16

we need to force them

6:18

to make public how their algorithms work. Yes,

6:21

I mean, the extraordinary thing about

6:23

this is the real driver of their

6:25

business model is kept secret. There

6:27

is no transparency about how it's used,

6:29

who it's used against, and to what end,

6:31

whether or not there's

6:34

protection around the use of children's

6:36

data, and algorithms being used to

6:38

push content in front of children or minors.

6:40

I know the Coalition have been

6:43

talking about, you know, enforcing

6:45

rules so that minors are not using

6:47

social media. I think I'm wary

6:49

of that, unconvinced, and alongside experts

6:51

who say that's not really the

6:54

answer. If it's about protecting children,

6:56

what would be safer is

6:58

to restrict social media companies

7:00

using the data of minors

7:02

against the interests of

7:04

minors, by forcing them into

7:06

these rabbit holes and algorithmic

7:08

patterns. So isn't it the

7:11

case that we've been doing this piecemeal,

7:13

that we have been looking at particular

7:15

aspects of these issues, as

7:17

you say, one cohort or, you

7:20

know, one piece of legislation and not

7:22

taking a whole-of-government or

7:24

whole-of-nation view? Yes,

7:26

I think this is a big part

7:28

of the problem, and the fact that

7:30

the eSafety Commissioner and the

7:33

take-down rules, that is actually a

7:35

very narrow path and it's not the

7:37

key driver of the issue. The key driver is this

7:39

business model, it's the amplification, the use

7:41

of an individual's data. And that

7:44

speaks to the broader sphere

7:46

of media regulation that I'm interested

7:48

in. And when I listen to

7:50

members of the community over the last

7:52

week as this debate has raged in

7:54

relation to Elon Musk, and

7:56

rightly so, a lot of people are

7:59

saying, well, if we're talking about misinformation,

8:01

where are the rules and regulations around old

8:03

media as well? And I think we've

8:05

got a real problem here. We're in 2024 and

8:08

our media regulation rules for both old and

8:10

new and the new paradigm are

8:13

just not up to scratch and not fit

8:15

for purpose. We actually need an overview

8:17

and an overhaul of media regulation

8:20

across the board because the

8:22

platforms by which we access news,

8:24

whether it is the Guardian, whether

8:26

it is the ABC, whether it

8:28

is the Murdoch press is increasingly

8:30

moving to an online platform and

8:33

yet it's not regulated in any

8:35

of the same way. So

8:37

hold that for a moment before we come

8:39

to the idea of news media regulation. Can

8:41

we just stick with the platforms for a

8:43

second? To the point

8:45

you're making about regulation, it's connected.

8:48

I went back and looked at

8:50

the Senate Economics Committee's report from late

8:53

last year on the influence

8:55

of international platforms. Now, you weren't a

8:57

member of the committee. Your colleague, David Shoebridge, was. And

8:59

it was chaired by Liberal Andrew Bragg. But

9:02

it looked at everything across the

9:04

board and how these giant platforms

9:06

are influencing our community, everything from

9:08

the child safety point that you

9:10

make to artificial intelligence, the

9:12

impact that it's having across the board, the algorithms,

9:14

the way they charge for their apps, all of

9:16

it. And one of the

9:18

key takeaways from that report was they

9:20

said no one's really in charge of

9:23

regulating these platforms in a holistic sense.

9:25

They pointed to other industries

9:27

like airlines and banks and even

9:29

telecommunications companies and they all have regulatory

9:32

codes that these platforms don't. The

9:35

government members on that committee were sort of

9:37

playing down the findings, saying, look, we've got

9:39

inquiries, we've got the ACCC, we've got enough

9:41

measures in place we don't need to do

9:43

much. But in light of what we have

9:45

seen in these past couple of weeks, what

9:47

do you make of a recommendation that there

9:50

needs to be some overarching

9:53

regulatory body targeting

9:55

the platforms and all of the aspects

9:57

of what they do, coordinating, if that's the word,

10:00

with, say, the eSafety Commissioner, who as you

10:02

say just has a narrow field of take-down tasks

10:04

and is very busy with those. What do you

10:06

think about that? Is that a good

10:09

idea and should the government have been doing

10:11

that already? I think,

10:13

right across the

10:15

globe, this is happening. Governments have

10:17

been caught flat-footed. And what's

10:20

happened is these platforms, you've got to remember

10:22

that when these platforms first came along,

10:24

Facebook and the like,

10:26

they saw themselves as disruptive, you know,

10:28

they were start-ups and they were

10:30

disruptors, and now they're effectively monopolies. We're

10:33

now in 2024, in a

10:35

space where these platforms are

10:37

effectively critical infrastructure, right?

10:39

For the modern world and in any other

10:42

case of critical infrastructure, governments

10:44

regulate. They oversee,

10:46

they protect, they make sure that

10:49

the public interest is being looked

10:51

after. And we've got this extraordinary

10:53

critical infrastructure that is not being

10:55

governed. And whether it's in

10:57

a democratic country like Australia or elsewhere,

11:00

So I do think

11:02

calls for overarching regulation

11:04

and a regulator is where things

11:07

will probably need to head, but

11:09

governments are so far behind the eight

11:11

ball, and we all have reason to be

11:14

concerned. Listening to and reading comments from

11:16

Peter Dutton in relation to this

11:18

issue around Elon Musk now,

11:21

it's kind of backtracking, is it not, from

11:23

being all guns blazing earlier in the week.

11:25

Now he's saying, oh, well, you know, it's

11:27

all too hard. For too long,

11:29

governments have said it's all too hard. And

11:31

What's going on is the

11:33

community is saying, now they're

11:33

experiencing the risks online,

11:35

they're saying that this is unfair. They're seeing

11:38

the

11:40

extraordinary amounts of money that these platforms make,

11:42

that even the tech bosses make: billions

11:44

and billions of dollars, trillions of dollars,

11:47

flooding into their

11:49

coffers at Facebook and Twitter and elsewhere.

11:52

And it's not being governed.

11:54

Yet this is critical public

11:56

infrastructure, and we need a proper approach. I'm interested in what

12:01

we've seen: talk of, you know,

12:03

maybe the international community should come

12:05

together, that there should be a

12:07

wholesale endeavour involving

12:10

major countries to try and

12:12

get these tech companies to toe

12:14

the line. But would we strike a

12:16

problem ultimately with something like the constitutional

12:18

free speech protections in the United States

12:21

that end up being a roadblock

12:23

to action because everyone points to free speech

12:25

and says we can't rein them in?

12:27

Well, I guess that comes back

12:29

to my point, that this

12:31

isn't about restricting freedom of expression.

12:34

It's about limiting the harm

12:36

that the business models of

12:38

these companies drive. And sadly,

12:41

the shortest path to money that

12:44

they've found is hate, violence,

12:47

extremism, conspiracy, lies.

12:50

That's why

12:52

the algorithms push them, because

12:55

it's the shortest path to

12:57

clicks. And the amplification of

13:00

these extreme views,

13:02

I think people need to understand, comes

13:04

at the cost, and the diminishing, of

13:07

other views. It's not a level playing field.

13:09

It's a pay-to-play atmosphere

13:11

we're operating in here. If you put

13:14

money behind your post, it's more

13:16

likely to get up to the top of

13:18

the ranks. If you generate outrage,

13:20

if it's that kind of extremism you

13:22

feed into the algorithm, it goes up

13:24

the ranks. Other views, legitimate

13:26

views perhaps,

13:29

information out there that would be

13:31

needed in terms of, you know,

13:33

the safety of a local community or an

13:35

individual, gets pushed down. So it's

13:37

not an equal playing field. That's

13:39

not about freedom of speech. This

13:41

is about making sure there are

13:43

guardrails around how these platforms

13:46

monetize people and play with people's

13:48

views, people's desire for connection.

13:51

Well, we talk a lot about

13:53

corporate social responsibility, but realistically,

13:55

how do you legislate for a corporate

13:58

social conscience? Yeah,

14:01

and I think when you're dealing with individuals like

14:03

Elon Musk, I think you can understand

14:05

why people go, oh my gosh, this

14:07

bloke has no care, no concern,

14:09

obviously no social conscience. But

14:12

the EU is doing it. They've just brought in

14:14

laws over the last 12 months in relation

14:16

to putting some restrictions around and it

14:19

goes right to the heart of the

14:21

algorithm, making things more transparent, having

14:23

restrictions around how data of miners

14:26

is used, that you can't

14:28

use algorithms to target particular groups

14:30

of people. Young people is

14:32

one of them, but it means you can't have

14:34

targeted algorithms on race, on

14:37

religious views, a bunch of things so

14:39

that people are not just targeted, but then

14:41

targeted and isolated from the views of others.

14:44

The EU is doing it and Australia needs

14:46

to get on board this global push and

14:48

we're not doing this alone. So let's talk

14:51

more about the media regulation point. I know

14:53

you've got a bill that's being examined by

14:55

a Senate committee at the moment. You've spoken

14:58

this week about trying to

15:00

haul some of these people, if not Elon

15:02

Musk personally, then someone representing him and others

15:04

from other platforms before the committee. What

15:07

would your bill do and what would be

15:09

the point of bringing those people in if

15:11

you can even do that? Look,

15:13

this is about the fact that we

15:16

need fit for purpose media regulation. There

15:18

is this constant connection between the

15:20

old media and the new

15:22

media and players in between and we

15:25

just don't have the rules. I

15:27

don't think it matters whether you're a tech

15:29

billionaire like Elon Musk or

15:31

Mark Zuckerberg or a

15:33

media mogul like Rupert Murdoch or

15:36

Kerry Stokes. The public has

15:38

a right to know and trust

15:40

the news and information that's

15:42

posted by trusted sources. It is

15:44

in the public interest to make

15:47

sure that we have proper media

15:49

regulation that delivers quality public interest

15:51

news to members of the community. Are

15:54

we talking about things that are not

15:57

possible really? Maybe this goes back to the

15:59

point about how we can't police the

16:02

entire world. Is

16:04

there somewhere realistically in between what we

16:06

have now and the

16:08

ideal of a properly regulated media

16:10

industry and across both traditional

16:13

media and new media

16:15

that would deal both with the problems we see

16:18

in the immediate future

16:20

currently and the problems that might

16:22

be coming down the pipe? Honestly, when

16:24

licensing for broadcasters was brought in decades

16:26

and decades ago, broadcasters didn't

16:28

want the licensing either. There's

16:30

always been pushback. But when

16:33

you are offering information news

16:35

to the public, there needs to

16:38

be some legitimacy. There needs to

16:40

be accountability. And there

16:42

needs to be a social license. And

16:44

I think there's no silver bullet to

16:46

this. And I guess that's the point

16:49

of my regulation Royal Commission is to

16:51

say, look, we don't

16:53

have the right rules. It shouldn't be

16:55

up to politicians, frankly, to do this.

16:57

I don't think that's a good place.

16:59

But what our job is, is to

17:01

reflect the concerns in the community. And

17:03

there's certainly concerns in the community that

17:05

media regulation is not up to scratch,

17:08

that social media is a free for

17:10

all, that people, worryingly, increasingly don't

17:12

trust the news that they see, read

17:14

and hear. And that

17:16

is bad for democracy. And it

17:19

is a fundamental threat to democracy

17:21

if we don't have trusted news

17:23

sources. So how do we make sure

17:25

we protect and invest

17:27

in trusted news sources and

17:29

ensure that it's accessible to everybody? And

17:31

that's why we need an overhaul of

17:34

regulation and a Royal Commission will be

17:36

able to do that. Your

17:38

Royal Commission proposal originally, though, was just about

17:40

the Murdoch media, was it not? Like, are

17:42

you now talking about was that on reflection

17:44

too narrow and maybe not fair and now

17:46

it needs to be broader or do you

17:48

think that is still the frame? To be

17:50

fair, the terms of reference have always been

17:52

broader. But yes, you

17:55

called it the Murdoch media regulation bill. I

17:58

think upon reflection. I think

18:00

the community, while they're upset

18:03

and they distrust Murdoch Media, they're

18:05

increasingly looking at other players and going,

18:07

you know, this needs a clean up

18:10

across the board. And so,

18:12

you know, maybe it's a reframing,

18:14

but nonetheless, some type

18:16

of independent commission

18:19

to inquire, to

18:22

test and push the boundaries

18:24

to try and have that

18:26

overarching regulation. It's

18:28

well and truly overdue. And

18:30

I don't think it's good enough just to go, oh, it's

18:33

all too hard. Oh, you know, maybe we've missed the boat.

18:36

Every time government has put in place

18:38

and permits have acted on trying

18:41

to regulate, particularly in the technology

18:43

space, it quickly gets

18:45

out of date because how

18:47

fast the world transitions. And

18:51

that's not an excuse for not doing anything.

18:54

I've expressed some frustration around

18:56

the news media bargaining code

18:59

and the way that's being enforced.

19:01

So what you're talking about, I guess, with regulation is with

19:04

a royal commission is, okay, let's have it

19:06

all on the table and see what changes

19:08

we need to make, what comes to light,

19:10

what legislative responses we need to improve the

19:12

situation. But you're also arguing that the government

19:15

has some powers now that it isn't using.

19:17

Now, under the news media bargaining code, there's

19:19

a power of designation, which as I understand

19:21

it, if a platform or a company is

19:23

designated by the government, then that triggers

19:26

some obligations under the code. Firstly, can

19:28

you just explain what those obligations would

19:30

be and why you

19:32

think they haven't designated any

19:34

platforms? Yeah. So the

19:36

news media bargaining code was

19:39

one element to try and

19:41

deal with some of the need to ensure

19:43

that there is trusted news on

19:46

these platforms, but also to ensure that journalists get

19:48

paid for the work that they do. Right. So

19:50

that was making those big companies pay for every

19:52

time they put other media news on their site

19:54

from journalists, they had to pay for it. And

19:57

now at least one company is saying it doesn't

19:59

want to do that anymore. Yeah, so Meta

20:01

and Google struck deals with a number

20:03

of media companies across the country,

20:05

here in Australia. This was leading legislation

20:07

and Meta have now said they

20:09

won't renew these contracts. They're not interested. They

20:12

say it's because the users on

20:14

Facebook and Instagram aren't interested in

20:17

news, which

20:19

I have a real cynicism for. I don't believe

20:21

that for a second. But what I

20:23

am worried about is that if they are

20:26

to block news and links

20:28

and information from trusted news providers

20:31

like The Guardian, like the ABC,

20:33

like the Murdoch Press, like

20:35

the local newspaper in

20:38

a small rural town.

20:41

You call the Murdoch Press a trusted provider now?

20:43

Hang on a minute. Well,

20:46

they are in the terms

20:48

of, you know, they've got rules and regulations

20:50

that the social

20:53

media publishers don't. There

20:55

are some decent journalists at News Corp,

20:57

of course, who do good

20:59

work, but it's the business model of

21:02

outrage that I see coming from Murdoch

21:04

management that worries me the most and

21:07

their undue influence. But I

21:09

digress. I'll come back. The

21:12

purpose of designation is to say that

21:14

these companies, Meta, Google and anybody else

21:16

that would be designated, would have to

21:18

pay for the content that they carry,

21:20

news content that they carry. They pay,

21:22

you know, it's journalist work. It's

21:25

being used on the platform. Users

21:27

are accessing it, reading it, consuming

21:30

it. Therefore, as a publisher, those

21:33

social media companies should pay. The

21:36

missing point here, though, is that if

21:38

they're designated under the current law, they're

21:41

only designated for what they carry. And

21:44

Meta are threatening, like they've done in

21:46

Canada, to just not carry news at

21:48

all. To say, OK, well, we

21:50

don't want to have to pay for it. So we

21:53

just won't even we will ban the

21:55

sharing of news links. News

21:57

will be deprioritised using their

21:59

own algorithm, pushed down to

22:01

the bottom. And

22:04

I think that's a real risk. And

22:06

in the real kind of life exchange,

22:09

it would mean something like you might

22:11

be in a Facebook group

22:13

and somebody posts a conspiracy

22:15

theory about something, maybe a

22:17

conspiracy theory about what happened

22:19

with the Bondi attacks. Now

22:22

rather than being able as a

22:24

member of that group to post a news story

22:26

to debunk that myth, you won't be

22:28

able to do that if Meta

22:30

bans news sources

22:33

and news links. So therefore

22:35

it just becomes a platform

22:37

for mis and disinformation, conspiracy,

22:39

hatred, lies,

22:43

even more than it is now. And

22:46

so I think, I know the former

22:49

ACCC Commissioner Rod Sims is of

22:51

this view as well, that

22:53

along with designating their social

22:55

media platform to have to

22:57

pay for the content, the

23:00

news content, I actually

23:02

think they need to be designated to

23:04

carry news in the public

23:06

interest because otherwise it

23:08

will just be a platform of

23:11

mis and disinformation that can never

23:13

be debunked by you, me, an individual

23:15

user or an organisation. Is it legally

23:17

possible to do that? Well,

23:20

it's something that I think is worth

23:22

having a look at how you regulate

23:24

it. I know other countries around the

23:26

world are considering these moves.

23:29

Wouldn't they say, well, what is news? Who's news?

23:32

Of course, but there are already

23:35

ways that we categorise this. There

23:37

are companies that

23:40

ACMA, the media regulator, uses

23:43

to decide who these platforms

23:45

should pay for their content.

23:47

There's a number of ways that you

23:49

could say, you're a trusted news source,

23:52

Facebook should not be able to ban

23:54

the carriage of your content.

23:58

It's an interesting idea. There's a lot

24:00

more we could talk about in relation to

24:03

that, but I'm conscious there are other issues

24:05

in your portfolio that I want to ask

24:07

you about. And can we talk about the

24:09

terrible incidents of violence against women that we've

24:11

seen talked about

24:13

publicly, particularly in the last week or so?

24:16

Are we having another moment in relation to

24:18

this issue? Do we keep having moments

24:21

and then nothing happens? Like what should the

24:24

response be to these terrible

24:26

examples of particularly family violence, but not

24:28

only family violence? You mentioned the Bondi

24:31

attacks, you know, all of the victims

24:33

aside from the security guard who valiantly

24:35

tried to stop that attacker were women,

24:37

all the ones who died. Are we

24:40

having another moment? What on earth should we

24:42

be doing that we are not? This

24:45

is horrendous, isn't it, Karen? I

24:48

mean, 31 women killed violently this

24:51

year already. That's more than one

24:53

a week. Many at the

24:55

hands of their partner or their ex-partner

24:58

or someone who wished they were their

25:00

partner, it is just, I

25:03

mean, this is an epidemic. And

25:05

it's time we started talking about it, not in terms

25:07

of just violence against women. This is the murder of

25:09

women. This is the terrorising of

25:11

women in their home and

25:14

on the street. Women don't feel

25:16

safe. Well, we've

25:18

heard some, we've heard to that point,

25:20

we've heard some debate since the Bondi

25:22

Junction attack about whether terrorising is always

25:24

terrorism and should it be that and

25:27

why do we designate other forms of

25:29

violence as terrorism around certain ideologies when

25:31

sometimes there's an ideology or a particular

25:33

view of women that drives that. And

25:35

I'm not making that comment in relation

25:38

to any particular incident involving

25:40

individuals. But more broadly, is it terrorism? Is it

25:42

time that we started to change the way we

25:45

thought about it? Well, I do think we need

25:47

to change the way we think about

25:49

it and the way we talk about it because

25:51

what we're doing so far is not working. It

25:54

is the murder of women. It

25:56

is the terrorising of women and

25:58

often women and children, children in their own home,

26:01

it's not good enough just to

26:04

be wringing our hands over it. Like,

26:06

where is the fundamental

26:08

all shoulders to the

26:10

wheel approach to tackle this?

26:13

In what, in what way? I mean, we've

26:15

had many summits, we've had reports, inquiries. Yes.

26:18

And women over and over and over

26:20

again, keep calling for and asking for,

26:23

when are our frontline

26:25

services going to be funded properly? When are

26:27

we going to have a guarantee that

26:29

women won't be turned away when they ask

26:31

for help? When are

26:33

we going to ensure that women are

26:35

protected when they report and even have

26:38

an ex partner convicted of something that

26:40

they were not able to come

26:42

after them in revenge? Women

26:44

have had so many meetings, there've been so

26:47

many summits, there've been so much

26:49

heartache and yet it's still politicians and

26:51

political leaders wring their hands and

26:53

say, oh, this is so terrible. It's

26:56

an epidemic. Oh, it's so

26:58

terrible. I just, where is the

27:00

overarching prioritization that this

27:03

is something that is

27:05

rife and needs to be

27:07

tackled? And I'm very concerned to be

27:09

perfectly blunt about the

27:12

impact, particularly that social media

27:14

is having on this, that the algorithms again,

27:16

I know we've kind of just come off

27:18

that topic, but I am worried about what

27:21

is being pushed and promoted, particularly

27:23

into young men's social media feeds.

27:26

It is across the board how we tackle

27:28

this issue and yet for some reason it

27:30

gets left to the

27:32

various ministers for women rather

27:34

than the top leadership

27:37

level across all divisions.

27:39

I've taken a lot of time talking about social

27:42

media and media and we've only just touched on

27:44

women. I do want to ask

27:46

you one more question about that and then let's

27:48

quickly talk about some environmental issues, which of course

27:50

are also important to you and generally. What

27:53

about the issue before we leave the issue of women

27:55

about apprehended violence orders? You talk

27:57

about adequate protection, so that's supposed to protect

27:59

women. How well do you think they protect women?

28:01

Do they need more teeth somehow? Well,

28:03

they clearly aren't protecting women. And I think

28:05

the examples, the terrible, terrible

28:08

examples of the last few weeks are

28:10

showing that there does need to be

28:12

a real root and branch review

28:15

of the justice system for

28:17

women and the protection, the protections

28:19

in place for them. It's just not

28:22

excusable. If this was in the health system,

28:24

if one person per week was dying

28:27

in the health system because

28:29

of a lack of infrastructure,

28:31

this

28:33

kind of systemic problem, heads

28:35

would roll. Why isn't

28:38

it being dealt with in the same way? It's time

28:41

it was. I know there's rallies across the

28:43

country this weekend. I'll be speaking at the

28:45

one here in Adelaide. And

28:48

what I am hopeful and what gives me

28:51

some sense of encouragement

28:53

is that actually this

28:55

weekend's rallies are being run by young

28:57

women. There is a new generation of

28:59

women coming forward who are sick and

29:02

tired of the excuses and they just

29:04

want to be respected by their peers

29:06

and they want others to

29:09

expect that they should get respected as well.

29:11

Well, what's your message to the rally and to

29:13

the nation? That we have to have

29:15

an all of government approach to this? And that's

29:17

why the prime minister should declare

29:19

violence against women a

29:22

national emergency. Simple. Let's

29:24

get on with it. It will be interesting

29:26

to see what emerges from those weekend rallies.

29:29

Let's briefly talk at

29:31

least about environmental matters. We

29:33

saw the environment minister, Tanya

29:35

Plibersek, recently make a couple

29:37

of announcements within her portfolio,

29:39

creating two environment bodies, Environment

29:41

Information Australia being one and

29:43

an environment protection agency, but

29:45

also putting off till later

29:47

some other aspects of reform.

29:50

What do you make of that? Is it fair enough to say you

29:52

need more time to work on this? I

29:54

think most people who've been in this

29:56

space and watching it and hoping that the

29:59

Labor government were going to deliver on their promises

30:01

are pretty disappointed, frankly. What

30:03

has happened is the government

30:05

has announced that they're dumping

30:07

the commitment to fix the

30:09

rules. The rules around environmental

30:12

protection and what is assessed

30:14

and how approvals are given

30:17

is really the problem. They're not working. They

30:19

don't stop the destruction of critical

30:21

habitat, for example, even like say

30:24

koala habitat, they don't stop native

30:26

forest logging. They don't assess or

30:28

even think about the impact of

30:30

projects on the climate, what type

30:33

of pollution. They don't stop the

30:35

expansion of new coal and gas. The

30:37

current laws are failing to protect the

30:39

environment. And so rather than fixing those

30:41

laws, that's been put in

30:44

the too hard basket. It's not gonna happen

30:46

this term of government. The

30:48

government dumped that promise and

30:50

instead is moving ahead with what

30:52

they're calling a new agency, but

30:54

really it's basically a new desk

30:56

inside the department, the Environment Protection

30:58

Australia. They're gonna put a new

31:00

public servant in, they'll sit on another desk

31:03

in the department and effectively

31:06

look at compliance and monitoring. But

31:08

compliance and monitoring against what

31:11

are broken rules and ineffective rules.

31:13

So I'm pretty disappointed. I know

31:15

lots of people who were really

31:18

hoping that there was an opportunity

31:20

to protect the environment properly

31:22

and to fix these laws are upset.

31:25

This is gonna mean more logging of native forest, more

31:27

expanding of coal and gas, making

31:30

climate change worse at

31:32

a time when we fundamentally have to put

31:34

the brakes on this stuff. If we are

31:36

actually gonna have a chance of halting extinction

31:39

and stopping runaway climate change. The government says the

31:41

changes are just not ready yet. They

31:44

need to be done properly. They're not ready.

31:46

But Karen, the

31:49

mining industry has been lobbying hard. The

31:51

big business groups have been lobbying hard.

31:53

They didn't want these, they don't want these laws

31:55

fixed. And so the

31:58

environment minister has caved in

32:01

to the industry, and what

32:03

we're gonna see is more forest

32:05

destroyed, more threats to

32:08

native animals, and

32:10

more coal and gas pollution.

32:12

Just finally then, greenwashing. I

32:14

know you have an interest in

32:16

this issue, and the Senate inquiry

32:19

heard some more evidence in the past

32:21

week. You've raised concerns about the

32:23

carbon neutral labelling program. What's your

32:25

concern about that program, how it's

32:27

working, where companies pay to have

32:29

a carbon neutral label? Is

32:31

it the paying for it?

32:33

Is it how they

32:35

qualify? What else?

32:38

Well, greenwashing is becoming a bigger

32:40

and bigger problem, and governments are

32:42

having to tackle this. And

32:45

by greenwashing we mean

32:47

companies coming up with ostensibly environmental

32:49

measures that are supposed to improve

32:51

their public reputation but don't do a

32:53

lot environmentally. Yes, and

32:55

it matters because the

32:58

consumer, the public,

33:00

members of our community,

33:02

say: aren't we in an environmental crisis?

33:04

We know that we need to be

33:06

acting, and I want to be

33:08

able to make choices myself, so

33:11

that when you go

33:13

to the supermarket and you see on

33:15

the shelves a product that says it's

33:17

greener than the product next door, you

33:19

buy that one because it makes you

33:21

feel like you're helping the environment. The

33:23

problem is, many businesses

33:25

and companies are faking

33:27

it and are not being honest

33:30

about how green their products really are.

33:32

And one of

33:34

the systems they're using to

33:36

claim they're green is

33:38

a stamp from the federal

33:41

government under this Climate Active program,

33:43

which we've discovered through the Senate

33:45

inquiry just doesn't hit the mark

33:47

in terms of being environmentally credible.

33:49

And they call it carbon neutral;

33:51

that in and of itself is a

33:53

vague and unhelpful term.

33:56

We're in a crisis, the

33:58

climate crisis now where we

34:01

can't just be carbon neutral, we actually

34:03

have to stop making

34:05

pollution and actually reduce the

34:08

reliance on fossil fuels. Carbon

34:11

neutral as a tagline just

34:13

doesn't have any environmental credibility anymore.

34:16

So the bar needs to be higher,

34:18

you're saying? The bar needs to be

34:20

higher. And interestingly, through this inquiry, and

34:22

this is shocking to people to

34:24

find out, that the

34:26

ACCC, the regulator, the

34:29

government regulator who goes in

34:31

and assesses people's trademarks, whether

34:33

something is 100% organic or whether

34:35

it's fair trade, they were asked to

34:38

assess this climate active stamp

34:40

from the Federal Environment Department. And

34:43

they said the ACCC, that it wasn't

34:45

clear enough, that the

34:47

rules by which it was certified were

34:49

not clear enough, were not up to

34:52

scratch, and they refused to certify. So

34:54

it's not even a properly certified trademark

34:57

under the ACCC rules. That's

35:00

concerning in and of itself. I mean, if

35:02

two arms of government can't even back each

35:04

other in, you've got a problem. Final

35:06

question here on greenwashing, because we've kept

35:09

you chatting a long time, but

35:11

the very popular programme MasterChef is back

35:13

on the Ten network now, and their

35:15

new season is still using gas, but

35:17

they're talking about it being a cleaner

35:19

form of gas, I think biomethane,

35:21

and a number of other measures. What

35:24

do you make of the argument that it's

35:26

better than other kinds of gas that at least

35:28

it's heading in the right direction? I know some

35:30

critics have called that greenwashing, but isn't there an

35:32

argument that at least there's progress? Well,

35:36

firstly, this biomethane

35:38

isn't even available for customers.

35:40

Right. So this

35:42

sponsorship of the gas company

35:45

providing this type of gas

35:47

to MasterChef is

35:50

greenwashing because all it's doing is

35:52

promoting the gas company and their

35:54

brand and giving it a social licence.

35:57

Why do companies greenwash? Because they

35:59

want to message to customers and to

36:01

consumers that they're green when they

36:03

really aren't. You can't green up a

36:05

gas company, whether they're heading for

36:07

biomethane, say, or for hydrogen. Is

36:09

it hydrogen they're using on MasterChef?

36:11

Either way, even green, offset

36:13

hydrogen is still creating just as much

36:16

pollution. It's still just as toxic, it's

36:18

still just as bad. In other

36:20

countries where MasterChef is successful,

36:22

they've put induction cooktops

36:24

onto the show. So in the

36:27

UK and elsewhere, in Denmark,

36:29

they've got induction cooking, and

36:31

it would be much better

36:33

in Australia for MasterChef

36:35

to be promoting

36:37

induction, because that's where

36:40

Australian households are gonna have

36:42

to go. It's cheaper, it's

36:44

cleaner, and the government needs to

36:46

start helping Australian households

36:48

electrify their homes. That's where we're

36:50

gonna have to go. Biomethane,

36:52

hydrogen, even

36:54

if it's green, is just not available to customers.

36:56

So it's rubbish, it's a rubbish argument.

36:59

It's wanting to sell the name of the company,

37:01

give them a bit of a

37:03

green sheen. It's greenwashing. Alright, Sarah

37:05

Hanson-Young, so much to talk about. We've

37:08

covered a lot and there's plenty more we could

37:10

discuss, but we've gotta go. Thanks very

37:12

much for joining us. It's always a wonderful

37:14

debate here. Thank you. This

37:19

episode was produced by James Wilson.

37:21

The executive producer is Miles

37:23

Martignoni. Thanks for listening! See you

37:26

next time! I'm tired

37:39

of ads barging into your favorite

37:41

news broadcasts. Good news: ad-free

37:43

listening is available on Amazon Music for

37:45

all the music plus top podcasts included

37:48

with your prime membership. Stay up

37:50

to date on everything newsworthy by

37:52

downloading the Amazon Music app for

37:54

free or go to amazon.com/news ad

37:56

free. That's amazon.com/news ad

37:59

free. To catch up on

38:01

the latest episodes without the ads. How

38:03

can a people-first approach to

38:05

higher education transformation improve success? An

38:07

EY report suggests that taking emotional

38:10

and psychological factors into account is

38:12

just as important as the technology.

38:14

Six factors drive this

38:17

human-centered approach: leadership, inspiration, care,

38:19

empowerment, investment and collaboration. Get these

38:21

right and they can more than

38:24

double an organization's chance of transformation

38:26

success. Learn more about people-first

38:28

transformation at theguardian.com/transforming

38:31

higher education. This

38:33

message was paid for by EY.
