Genetics, Votes, and Colin Firth

Released Friday, 10th November 2023
Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:03

Maybe you've noticed, this country

0:05

has fallen into polarized, partisan,

0:07

political bickering. But where

0:09

did we get our rigid political views

0:12

in the first place? Well, obviously,

0:14

by carefully studying the data on each

0:16

issue and thoughtfully choosing our positions

0:19

accordingly. Right? Not

0:21

quite.

0:22

Most of the studies seem to indicate

0:24

that about 60% of the

0:26

difference between you and me

0:28

and anybody else in their political ideology

0:31

comes from genetic

0:33

or heritable factors.

0:34

That's right. You're a liberal or a conservative

0:37

in the same way you're a redhead or a brunette.

0:40

So wait, are all our debates, speeches,

0:42

and channels of persuasion pointless? Is

0:45

it impossible

0:45

ever to change anyone's mind? I'm

0:48

David Pogue, and this is Unsung

0:51

Science.

0:56

There are so many

0:59

amazing days on the way to your wedding day, and

1:01

Zola's here for all of them. Like

1:04

the day you find your perfect venue. The

1:06

day you almost skip to the mailbox to send your

1:08

invites. And the day you realize

1:10

making a budget isn't so scary. Zola

1:13

has everything you need to plan the wedding you want,

1:15

like a free website for your guests to RSVP

1:18

and shop your registry. And those not-so-amazing

1:21

days? Zola's here for those too. Talk

1:24

to Team Z, Zola's expert wedding advisors.

1:27

Or join the Zola community, full of other

1:29

engaged couples who know exactly what you're going

1:31

through. From getting engaged to getting

1:34

married, Zola is here for all

1:36

the days along the way. Start planning

1:38

at Zola.com. That's

1:40

Z-O-L-A dot com. Spectrum

1:43

Business is made to work the way small business

1:45

works. Made to overcome whatever comes up.

1:48

Made to connect fast and deliver faster

1:50

with the most reliable internet speeds. Made

1:53

to keep you online, even when the power is off,

1:55

with the complete connectivity and reliability

1:57

you can get from wireless internet backup.

1:59

Made to work seamlessly and securely with

2:02

advanced Wi-Fi, made to give you big

2:04

value on a small business budget,

2:07

and made to connect all aspects of your

2:09

business with fast, easy-to-use,

2:11

ultra-reliable internet, phone,

2:14

and mobile services. It doesn't matter

2:16

what your business was made to do, Spectrum

2:18

Business was made to work for you. Get 300

2:21

megabit per second internet as low as $49.99 when

2:24

bundled with wireless internet backup, advanced

2:26

Wi-Fi, and voice. Plus get Spectrum

2:29

Mobile Unlimited free for one year. Learn

2:31

how Spectrum Business can work for you at Spectrum.com.

2:35

Spectrum Business, made to work. Restrictions

2:38

apply. Service is not available in all areas.

2:49

Every country around the world has political parties

2:52

that are fundamentally either conservative

2:54

or liberal. In the US, we call

2:56

them Republicans and Democrats. Other

2:58

countries call their parties different things, but

3:01

underlying all of it, they're essentially

3:03

conservative or liberal. And

3:05

just so we're clear on what those words mean, I

3:08

sought answers from the Oracle, ChatGPT.

3:12

Version 4. Yeah, that's right.

3:14

The one that costs money. I went all

3:17

out. Here's what it said. Conservatives

3:20

tend to support existing norms and values

3:22

and resist change. They

3:24

advocate for limited government intervention. They

3:27

support lower taxes, reduced government spending,

3:29

deregulation, and a strong national defense.

3:32

Liberals generally believe in an active

3:34

role for government in addressing societal

3:37

inequalities. They tend to support progressive

3:39

reforms such as LGBTQ

3:42

plus rights, abortion access, and gun

3:44

control measures. Economically,

3:46

they favor regulations on businesses, progressive

3:49

taxation, and social welfare

3:51

programs. It's essential to recognize

3:53

that these are generalizations and that there's

3:56

a broad spectrum of beliefs within each category.

3:58

Well done,

4:01

ChatGPT. So, conservatives

4:03

care a lot more about order, things like purity,

4:05

things like sticking with people who are just

4:08

like you, and liberals

4:10

tend to score high. The main personality characteristic

4:12

is openness, and so it means open to

4:14

new ideas. It might mean openness to different

4:16

forms. There's research showing that liberals

4:18

like modern art more than conservatives. Conservatives

4:21

prefer like classic forms of art. This

4:23

is Jay Van Bavel. He's a psychology

4:25

and neuroscience professor at New York University

4:28

and co-author of The Power

4:30

of Us, a book about the psychology

4:33

of groups. Okay, so today's

4:35

question is how did we get to

4:37

be conservative or liberal? If

4:39

you talk to like an average person on the street, most people

4:42

think that they chose their politics. You

4:44

know, you turn on the presidential debate, and

4:47

you think you're going in with an open mind, and

4:49

you're going to listen to the two ideas

4:51

and maybe change who you vote for. That's kind of the way

4:54

our political system operates with that assumption. But

4:56

what the brain structure data suggests

4:59

is maybe that's not quite true. See,

5:02

this episode has the most bizarre origin

5:04

story in Unsung Science history.

5:07

I was reading an article on psychologytoday.com.

5:10

It begins like this. Peering

5:14

inside the brain with MRI scans, researchers

5:17

at the University College London found that

5:19

self-described conservative students had

5:21

a larger amygdala than liberals.

5:24

The amygdala is an almond-shaped structure

5:27

deep in the brain that is active during

5:29

states of fear and anxiety. Okay,

5:32

what? So you vote conservative because

5:34

you've got a big amygdala? I mean, if that's

5:36

true, then what are we doing? Why

5:38

are we having debates and discussions and protests

5:41

and policy conversations? If our voting

5:43

patterns are determined at birth by

5:45

the size of some brain organ, then there's no

5:48

hope of convincing anyone of anything. The

5:50

die for every election is already cast. Free

5:53

will is a lie. So

5:56

I contacted political scientist Rose McDermott,

5:59

a professor now at... Brown University who's

6:01

done a ton of studies on the differences

6:03

between these two species, conservative

6:06

and liberal. And when I say different species,

6:09

I'm serious. Wait till I

6:11

tell you about her armpit study.

6:13

Anyway, I sent her the article. I

6:16

can link you to the abstract here. Yeah,

6:19

interesting. And in the article,

6:21

there's a link to a study that it was based on.

6:24

She clicked it and noticed something spectacularly

6:27

weird. Well,

6:28

this is funny. This article, one of the authors

6:31

is Colin Firth, who I think is

6:33

the actor. What? No,

6:35

what? Yeah, no, if you click

6:38

through to the link to the original article

6:40

in Current Biology, he's the third author. So

6:44

that tells you something important.

6:47

And it's also like Current Biology

6:49

is a very respectable journal. And

6:51

it could be a different Colin Firth, but

6:53

I kind of don't think so. No, it's not because if

6:55

you click his link, the affiliation is BBC

6:58

Radio 4. Yes, it's

7:00

true. The third author of this published

7:03

study, which appeared in the journal Current

7:05

Biology, is this guy.

7:08

What I'm trying to say very

7:10

inarticulately is I like

7:13

you very much, just

7:16

as you are. How on earth does the star

7:18

of Bridget Jones's diary wind

7:20

up publishing a paper in a scientific journal?

7:23

Well, Rose has a theory. I suspect

7:25

the reason he's an author

7:26

is because he funded it, right? So like,

7:29

if you pay for something, you get to be an author.

7:32

Oh, Current Biology is a very reasonable

7:35

place, but the Colin Firth thing makes me

7:37

suspicious. As it turns out, she

7:39

is right. On December 28, 2010, Colin

7:43

Firth was the guest host of a BBC

7:45

radio show called Today. And

7:48

for his episode, he paid for Professor

7:50

Geraint Rees to scan the brains

7:53

of two politicians and 90 regular

7:55

people. And they found that there were some key

7:58

differences in the brain structure. In

8:00

particular, it's called gray matter volume

8:02

between liberals and conservatives. Apparently,

8:05

conservatives have more gray matter

8:07

volume in their amygdala. Liberals

8:10

have more of it in a different part of the brain. The

8:12

anterior cingulate cortex. Our

8:15

brains are wired a certain way that filters

8:17

and changes how we see the information

8:20

and in ways that are going to make it very hard to

8:22

persuade us, you know, with rational arguments

8:24

and facts and so forth. And so

8:27

this is something that also is related

8:29

to genetic studies. If you take identical

8:32

twins, identical twins are genetic clones.

8:35

They share 100% of the same genes with one another.

8:37

That's why they look identical. Let's

8:39

say you took two identical twins at

8:41

birth and raised one in a liberal family

8:43

and one in a conservative family. And then you, you

8:45

know, you followed up with them 20 years later to

8:48

see who they voted for. I think most

8:50

people have the assumption that the person, that little baby

8:52

raised in a liberal family would be totally liberal.

8:55

And the kid raised in a conservative family would

8:57

be totally conservative because... It's not? I

8:59

know. Most, on average,

9:01

those twins are actually going to want to vote the same way. Oh

9:04

my God. The data suggests about half of our political

9:07

preferences are genetic. So half

9:09

are shaped by the environment. A big chunk of it

9:11

is biology. Rose McDermott has studied

9:13

the genetic component of your voting tendencies too.

9:15

And by genetic, I mean that it's heritable, right?

9:18

It goes from parents to kids. But

9:20

importantly, it doesn't necessarily mean you

9:22

share the ideology your parents do. Like,

9:25

think about red hair. You can have red

9:27

hair because your grandparents have red hair or your great

9:29

grandparents have red hair. Ideology

9:31

can work that way too, right? I mean, it's

9:33

a trait that passes through generations.

9:36

Okay, so if that study is true,

9:38

then people with more gray matter volume

9:41

in their amygdalas tend to vote conservative.

9:44

People with less tend to be liberal.

9:46

Aha! But

9:47

remember, one of the golden rules of science,

9:50

correlation does not imply causation.

9:53

Just because two things happen together doesn't

9:56

mean that one causes the other. I

9:58

had a high school science teacher... who made this point with

10:00

a fun example. Every

10:03

summer, ice cream consumption goes

10:06

up. And so, do swimming pool

10:08

drownings. But obviously that doesn't

10:10

mean that eating ice cream increases

10:12

your chances of drowning. Those measurements

10:15

both go up in the summer because it's hot out,

10:17

so people go swimming and people

10:19

eat ice cream. Those two statistics are

10:22

correlated, but one factor does

10:24

not cause the other. Anyway,

10:26

the point is, are you conservative

10:29

because you've got more gray matter volume in

10:31

your amygdala? Or do

10:33

you develop more gray matter volume in your amygdala

10:36

because you're conservative? Yeah,

10:38

there is a bit of a chicken and egg problem.

10:41

So we don't fully know. There's kind of like a little bit

10:43

of a missing link there

10:45

right now in that area of science.

10:48

The Psychology Today article points out that your

10:50

amygdala is active when you're fearful.

10:53

No wonder the author says that the

10:55

conservative party is big on national

10:57

defense and magnifies our perception

10:59

of threat from foreign aggressors,

11:01

immigrants, terrorists, or invading

11:03

ideologies like communism. To a conservative,

11:06

the world really is a frightening place.

11:10

And Brown's Rose McDermott has studied this question.

11:12

My colleague, Peter Hatemi, who

11:15

I've done most of this work with, and I did

11:17

a piece really

11:20

before its time. It was about 2012 on fear. And

11:23

what we found really is that fear

11:26

makes people conservative. It's not that conservative

11:29

people are fearful.

11:31

So it's not that you're conservative

11:33

and therefore you're scared. It's that you're scared

11:35

and

11:36

that makes you

11:38

conservative.

11:39

And the reason for that makes sense, right?

11:41

That if you're scared,

11:42

one of the things you want to do is control

11:44

the environment

11:45

so you reduce

11:46

the amount of uncertainty, you reduce

11:48

the amount of things that could

11:50

hurt you.

11:51

It's like the old saying, a conservative

11:54

is a liberal who's been mugged. Right,

11:57

right, right, right, right, right. Social and political

11:59

scientists seem to uphold the theory

12:01

that a conservative person sees more

12:03

things to be feared in the world than a liberal

12:05

does. And that your amygdala's

12:08

makeup seems to correlate with your political

12:10

leanings. But Jay Van Bavel's research

12:12

doesn't back up the notion that your

12:15

amygdala size is tied to

12:17

your fearfulness. That's not

12:19

quite the same thing. The Psychology

12:21

Today article, the Oracle

12:24

of this whole thing, spoke

12:26

a lot about fear. The

12:29

article kind of said, well, the amygdala is the

12:31

fear center. And so that makes

12:33

sense because conservatives are

12:35

more fearful than liberals. Yeah, I mean, I actually

12:38

bought into that theory at one

12:40

point. But I've done a bunch of research since

12:42

and a bunch of other labs have,

12:44

and they really find that the fear conservative

12:46

link, it seems to be overstated

12:49

very significantly. Really? Yeah.

12:51

So I've changed my opinion. Or if there's a weak link,

12:53

it's very, very weak. We ran a structural

12:56

MRI study, two of them at NYU,

12:58

where I work, just like the one they ran in

13:00

London. And the variable we found that seemed

13:02

to be related to amygdala size,

13:05

first of all, was support for the status

13:08

quo and defense of the existing

13:10

system. And so conservatives tend to score higher

13:13

on support for the status quo. In fact, that's almost the

13:15

definition of conservatism, is

13:17

just conserving and sustaining the status quo.

13:20

And liberals want to challenge it more. And

13:22

so that seems to be the key variable, at least,

13:25

that we found that's correlated with amygdala volume.

13:27

We found it in two studies. And then we followed

13:29

up those people in our study, I think up

13:31

to a year later. And we found that the people who

13:33

had really low gray matter volume

13:35

density in their amygdala were the people who are more likely

13:37

to go to protests. And so these are the people

13:40

at Black Lives Matter protest, Global

13:42

Climate Change protest, Occupy

13:44

Wall Street protest. And so they're out

13:46

there challenging the system and trying to change it. And

13:49

it doesn't really seem like it maps really cleanly onto

13:51

fear. It seems like it maps more into sustaining

13:53

status quo and existing hierarchies and things like

13:56

that. In other words, like so

13:58

many other things in science and the real

14:00

world, the truth involves accepting

14:02

a bit of nuance. Your amygdala does

14:05

affect your political leanings, and your

14:07

fear level does too, but it's

14:09

not necessarily true that your amygdala

14:12

determines your general fearfulness level.

14:17

Anyway, none of that changes the startling

14:19

fact that apparently your

14:22

genes help determine your vote.

14:25

Whoa! I mean, I can imagine

14:27

a lot of people reacting

14:29

negatively to this news. It sounds like we're

14:32

being puppeteered to a certain

14:34

extent by our genetics and by

14:36

our amygdala size, but it doesn't

14:39

give us the credit for being our own thinkers

14:41

and cultivating our own independent

14:43

opinions. If that's true, then

14:46

there's no point to anything. There's

14:48

no point to every tweet and every

14:50

argument and every debate

14:52

on stage. But I always want to keep

14:54

hammering this home. This is about half of the story. The

14:57

other half of it is that we

14:59

also have these huge prefrontal cortices, which

15:02

sometimes engage in rationalization, but a lot of times

15:04

they're actually thinking. People are reading

15:06

new things, they're learning, they're exposed

15:09

to new people, and they're contemplating

15:11

it all and making decisions. You're shaped by your social

15:13

environments, the groups that you're in, the peer

15:15

groups that you have. And so that

15:18

can guide us in different directions than our predisposition.

15:21

Okay, so there's some hope

15:23

for our egos in thinking

15:25

that we can make up our own minds. Yeah, yeah,

15:27

yeah. There's some hope that we have a little bit

15:29

of rationality.

15:32

And that's not the only reassuring tidbit I

15:34

picked up from these interviews. After the break,

15:36

our guests are going to put the American partisan

15:38

mudslinging into perspective, and

15:41

I will tell you at last about

15:43

Rose McDermott's armpit study.

15:51

Sometimes you need a distraction, and

15:53

whether it be while traveling, doing dishes,

15:56

or working out, the Audible app makes

15:58

it easy to listen to your favorite audio entertainment,

16:00

anywhere, anytime. You'll discover thousands

16:03

of titles from popular favorites to exclusive

16:05

new series, guided wellness programs, theatrical

16:08

performances, comedy, and exclusive

16:10

Audible originals from top celebrities, renowned

16:12

experts, and exciting new voices in audio.

16:15

The best part? As an Audible member, you can choose

16:17

one title a month to keep from our entire catalog,

16:20

including the latest bestsellers and new releases.

16:22

New members can try Audible free for 30 days.

16:25

Visit audible.com slash WonderyPod

16:27

or text WonderyPod to 500-500 to try

16:30

Audible for free for 30 days. That's

16:32

W-O-N-D-E-R-Y-P-O-D.

16:35

Audible.com slash WonderyPod or text WonderyPod to

16:38

500-500 to try Audible for free for 30 days.

16:42

In today's business world, any edge could

16:44

be huge. And nobody offers more timely

16:46

business advice than the Harvard Business Review. Whether

16:49

it's their flagship magazine or digital content

16:51

featuring articles, videos, podcasts,

16:53

and more, you'll gain real-world insight

16:56

into the most pressing topics facing business

16:58

today. And now, for just $10

17:00

a month, you'll have unlimited access to

17:02

Harvard Business Review content and subscriptions.

17:05

Go to hbr.org slash subscriptions

17:08

and enter promo code business. That's

17:11

hbr.org slash subscriptions,

17:13

promo code business.

17:17

Welcome back.

17:18

This whole episode is dedicated to the proposition

17:21

that at least half your political tendencies

17:24

are outside your control. They

17:26

were determined by your genes. It's

17:29

not impossible for you to change your beliefs, but

17:31

Rose McDermott says it's pretty unlikely.

17:34

Does anybody ever switch

17:38

from

17:38

liberal to conservative? Oh, sure. I mean,

17:40

I'm not going

17:41

to say it's common,

17:42

but this is where I think environment really matters.

17:45

So

17:45

there's good data, for example, that having a

17:47

divorce really changes people's attitudes

17:50

about certain things. So we have to have

17:52

parts of that ideology that are receptive

17:55

to stimulus that we get from the environment. When

17:58

your child gets killed in a...

18:01

mass shooting event, when you

18:03

have cancer, when, you know, there's a big

18:05

thing that happens in your life, it

18:08

can dramatically and very rapidly

18:10

change your ideology in either direction, right?

18:13

And there's pretty good evidence to show

18:15

that. But

18:16

I mean, if we're these genetically pre-programmed

18:19

voting robots, well, half

18:21

pre-programmed, that makes us sound like we're

18:23

two totally different tribes, or

18:25

even totally different species.

18:28

Let's say I'm a liberal and you're a conservative

18:30

and we're fighting over some stimulus.

18:33

The main part of what we're fighting about

18:35

is that we're actually seeing different things. We're

18:38

hearing different things. We're experiencing

18:40

different things. But we think that

18:42

we're each seeing the same thing.

18:45

She says it's like that crazy Internet meme

18:47

from a few years back, where it was a

18:49

photo of a dress and half the

18:51

population insisted that it was a blue dress

18:54

with black stripes and half insisted

18:56

that it was gold and white.

18:57

Just because

18:58

you see it one way and you know other

19:01

people see it a different way doesn't

19:02

change how you see it. You

19:05

just think those other people are wrong. They

19:07

just really should see the dress as gold or the

19:09

dress as blue or whatever it is.

19:12

So, for example, you can do this with these

19:14

eye tracking studies we've done where you show people

19:16

pictures of, for example, a soldier

19:19

picking up a child.

19:22

And liberals will be looking

19:24

at the child who's got blood dripping

19:27

from it and not really paying attention

19:29

to anything else in the picture because you

19:31

can see with eye tracking what they're looking at.

19:34

And the conservative person will be paying attention

19:36

to the uniform, the gun

19:38

on the hip of the person who's carrying

19:41

the child. They won't even see the

19:43

child.

19:44

And then they have

19:45

a fight about whether or not

19:48

U.S. forces should do humanitarian

19:50

intervention and they think they're fighting

19:52

about humanitarian intervention.

19:55

They don't know that one

19:57

group of people only sees the bleeding child.

19:59

and the other group of people only sees the,

20:02

you know, threatening characters

20:04

with guns. And so

20:07

it's very difficult to have a conversation

20:09

and achieve a compromise when you don't know

20:12

that you don't know that you're not experiencing

20:15

the same phenomenon. I

20:17

see that going on a lot these days where

20:20

people think that they're fighting over values,

20:22

but they're really fighting over perception.

20:24

And that's where her armpit stink

20:27

study comes in.

20:28

The

20:30

armpit study was really trying to look

20:32

at how it is that people

20:34

recognize each other in mating, right?

20:37

If liberals are marrying liberals and conservatives are

20:39

marrying conservatives, how is it that

20:41

they're finding each other? We thought,

20:44

gee, we wonder if we're actually looking for somebody

20:46

who aligns with you who's similar to you

20:48

on your ideology.

20:50

And so we had all these

20:52

subjects who were extreme liberals

20:54

or extreme conservatives, and

20:56

we did all these things to make sure that

21:00

we got a kind of pure sense of their smell.

21:02

So they had to wash their hair and their

21:04

bodies in scent-free shampoo

21:07

and soap, and they

21:09

couldn't sleep in a bed with somebody else for two

21:11

days, and they had to eat food that wasn't

21:13

spicy, and they couldn't, you

21:16

know, sleep with their animals, and I couldn't believe people

21:18

would do it for 20 bucks. I wouldn't have done it before.

21:22

And then we had them wear

21:24

gauze pads under their arms so that we

21:26

could get their sweat. And then we

21:29

extracted that and had

21:30

others smell them to see whose scent they

21:32

found attractive.

21:33

It was completely predictive.

21:36

So liberals found other liberals,

21:39

the smell of other liberals, really attractive. Conservatives

21:41

found the smell of other conservatives really attractive.

21:44

I wasn't sure it was going to work because it was, you know, everybody

21:46

thought I was crazy for doing it to begin with. They just

21:48

thought it was the nuttiest thing they'd ever heard. And

21:50

so the first day I was standing with

21:53

the first vial and I opened the first vial

21:55

and I smelled it

21:56

and I couldn't smell anything. And I thought, oh, this is really

21:59

not going to work.

22:00

And then the guy who was doing the statistical

22:03

analysis was there too. And I gave it to him

22:05

and I, he's one of my coauthors and I said,

22:07

can you smell anything? And he

22:09

took a whiff, and he almost

22:11

started to throw

22:12

up. I was like, yeah, it

22:14

did work. On the last day of the study, two

22:17

subjects approached McDermott. So

22:19

there was a guy who

22:21

said, I have to tell you, one of

22:23

your samples is rancid. It

22:26

was so disgusting, so awful. I just

22:28

want to tell you. And I was like, okay. And

22:30

I took down the number and everything. And

22:33

then right after that, a woman came

22:35

in and she said, what are you going to do with

22:37

samples? And I said, I'm

22:39

going to do a molecular fractionation on it. She's

22:41

like, well,

22:43

can I take one of them home with me? I

22:46

want to sleep with it under my pillow. And

22:50

I said, why would you want to do that? And she's like, it's

22:52

the best thing I've ever smelled in my life.

22:54

It's the same one that was thought

22:56

to be rancid. It's the exact

22:57

same number as the guy who

23:00

one minute before had told me it was

23:02

rancid and I had to stop the study because

23:04

it

23:04

was so vile. Exact same

23:07

one. What it was, was that

23:09

the sample was a very conservative

23:12

male. The male who said it was rancid

23:14

was a very liberal male. And

23:16

the woman who wanted to take it home with

23:18

her was a very

23:19

conservative female. So

23:21

we think that we're smelling the same thing, but we're

23:24

actually not. There's no way to know what someone

23:26

else experiences. We were

23:28

able to show that we

23:30

could predict political ideology based

23:32

on the attraction that people found

23:35

to the different smells that they had. But

23:38

it just doesn't make any sense. Your

23:41

political leaning is a thought

23:43

process. It's in your brain. It doesn't

23:46

affect your armpit stink. Oh,

23:48

but see, your brain isn't your brain, right?

23:50

Your brain is also your body.

23:52

Right? Those things are intricately interconnected

23:56

and they're connected in a deep somatic

23:58

way.

23:59

So

24:00

information that we get from the world and

24:02

smell is very potent. Smell

24:05

can be a very powerful reminder. Perfume,

24:08

you know, a certain perfume that we associate with a person

24:10

or you know, you

24:11

go to Hawaii and you smell plumeria

24:13

and it reminds you of all the experiences you've

24:15

had in Hawaii.

24:17

You know, there's certain smells that

24:19

are very evocative. And

24:21

so we have more of that

24:24

available to us than we realize and we

24:26

get more information than we consciously process.

24:28

We undervalue those things because we privilege

24:31

our brains

24:32

and think that, you know, our bodies are

24:34

just the, you know,

24:36

stick figures that carry our brains around.

24:38

And so, yes, it is

24:41

distinct,

24:42

but it's also integrated. But we

24:44

are saying that in this case, something

24:47

about the way you perceive the world politically

24:50

is affecting the

24:52

chemicals coming out of your... Oh,

24:54

absolutely. Yeah. It's

24:56

crazy. It's affecting not just

24:59

the chemicals, but the way that we perceive other

25:01

people's chemicals. Right.

25:03

And we're not aware of that.

25:05

We think everybody else is having the same experience

25:08

that we're having.

25:09

And so, you know, if there's one

25:11

takeaway that I try to teach my

25:13

students, which never works,

25:16

is don't think that your perception is

25:19

the perception. That the way

25:21

you see the world, hear the world, smell

25:23

the world, feel the world

25:25

is the truth. It's a

25:28

truth. It's your truth, but it's

25:30

not

25:30

necessarily everybody else's truth.

25:33

And that I would hope that that kind of humility

25:36

would allow for a particular kind of mercy toward

25:38

other people that you disagree with.

25:41

Aha. A takeaway from all this, a

25:43

strategy to use, a conclusion

25:45

from all this science that we can use in the

25:47

real world. Jay Van Bavel has similar

25:50

advice based on his research indicating

25:52

that conservatives care about purity

25:55

and liberals care about harm.

26:00

If you want to convince, say, conservatives to

26:02

support climate change more, you should use

26:04

language that frames it in the type of

26:06

personality style and moral values that they care

26:08

about. So one of the studies

26:11

on this found that if you frame climate change

26:13

in terms of harm, well, that's something

26:15

that liberals care a lot about. But

26:17

it doesn't really translate to conservatives. It doesn't resonate

26:20

with them in the same way. But if you use

26:22

the language of purity, which is

26:24

something that conservatives resonate with

26:26

more, then they're more convinced

26:28

to support climate change because they don't like

26:30

things that are impure, and vice

26:32

versa. If conservatives want to change

26:35

liberals' attitudes about immigration

26:38

policy, they can frame it through the language

26:40

of harm and care in ways that will

26:42

resonate with them more. That's the type of insight

26:44

that this gives us. Wow. This

26:47

whole episode is kind of taking a hopeful

26:49

turn as it approaches the end, isn't it? What

26:51

a beautiful structure. Its writer

26:53

must be some kind of genius.

26:56

There's good news about polarization, too.

26:58

Yeah, yeah. America is divided

27:00

against itself. We all rip each other apart online,

27:03

blah, blah, blah. But according

27:05

to Rose McDermott, you've got to consider us in the spectrum

27:07

of the whole world. I

27:10

hear four sets of terms referring

27:12

to political leanings. You hear left

27:14

and right. You

27:17

hear liberal, conservative, Democrat,

27:19

Republican, blue state, red state. Are those all

27:21

equivalent? No.

27:26

So when you talk about liberal and conservative in the world sense,

27:29

it's much, much wider than Democratic and Republican

27:32

in the American sense.

27:35

And to be clear, the world's spectrum

27:37

left liberal

27:41

would be communism.

27:42

It would be like what the Soviet Union

27:44

was.

27:46

And right is conservative.

27:48

Extreme

27:51

conservative is fascism the way

27:53

that

27:54

Hitler was. So

27:57

America actually is in the middle

27:59

of those things.

28:01

It's still nowhere near as extreme

28:04

as it can be in the world spectrum.

28:06

And so those terms are often used

28:09

synonymously, but they're actually not the same.

28:11

So Republicans and Democrats in this country

28:13

are actually closer than we think?

28:16

Way closer.

28:17

Really?

28:19

We're getting more divided.

28:21

But in terms of world political

28:23

spectrum of liberal to conservative, we're

28:25

much closer than we think that we are.

28:27

So the world spectrum

28:30

is much broader.

28:31

Of course, we have a two party system

28:33

in the US, which you might think would make

28:36

it harder to compare us with other countries. But

28:38

as it turns out, there are two fundamental

28:41

views of the world that are universal,

28:44

conservative and liberal. Does

28:47

the thing about favoring

28:50

the status quo versus favoring change,

28:52

does that translate universally, even

28:55

if the parties have different names in other countries? Yeah.

28:58

So the tendency, and

29:00

it's called system justification, it was developed by

29:03

a colleague of mine, John Jost, is that some people

29:05

score high in system justification, they want to defend and support

29:07

the system. Other people really want to challenge the system.

29:10

He's measured that in almost every country in the world

29:12

at this point. And almost always, it's

29:14

correlated with how conservative you are in almost

29:16

every country. Two things seem to be different about America.

29:19

One is that we're in a two party system here.

29:21

And so there's very much a psychology of us versus

29:24

them. If I don't agree with your

29:26

party politically, I'll vote for a leader

29:28

that I might not even trust or like or respect

29:31

or who's corrupt, just to stop your

29:33

party from getting power. Whereas

29:35

let's say like, I'm actually from Canada. Canada

29:37

has like, you know, five parties, one whose

29:39

main role is just to separate its province from the

29:41

rest of the country. That's a party in

29:44

Quebec, the French part of Canada. But even

29:46

in the rest of Canada, there's at least three major parties.

29:48

And so if your party that you supported

29:50

in last election is corrupt, but you really

29:52

hate that one on the other side of the aisle, there's still

29:55

a third party that you can vote for. And

29:57

so people engage in a lot of what's called strategic voting.

30:00

And so that dynamic gets

30:03

a little bit out of this us versus them psychology

30:05

where I always have to support someone even if I don't like what

30:07

they've done. And it makes people more flexible.

30:10

And I think it's a system that maybe allows for more

30:12

accountability of bad actors and corrupt

30:14

politicians. And that's very common in Europe

30:16

and other countries that some of them have like 10 or 20

30:18

or 30 parties and they have to form coalitions

30:21

to rule. So that's the one big thing that's

30:23

different. And the second big thing that's different

30:26

in America is that 40 years

30:28

ago it wasn't polarized to the same degree. And

30:31

so people could switch parties or feel comfortable

30:33

in the other party if they didn't fully align with them

30:35

ideologically. We're at the point

30:38

where very few people do that or

30:40

feel comfortable doing that. The crazy thing

30:42

is from a political science and evolutionary

30:44

perspective, some polarization

30:47

might be a good thing.

30:49

Societies to

30:51

survive across millennia

30:54

need groups of people who

30:57

cooperate at home. They build

30:59

houses. They raise children. They

31:03

engage in all kinds of cooperative behavior.

31:06

And society also needs people

31:08

who defend those cooperators against

31:10

out groups, against animals,

31:13

against climate, against other people.

31:16

And those people often engage

31:19

in combat. And those people

31:21

are defenders of that society. And

31:24

I think of liberals and conservatives

31:27

in a kind of you can't survive as a society

31:29

unless you have both. If you get rid

31:32

of all the defenders, you're

31:34

going to be completely annihilated

31:36

by the out group. If you don't have

31:39

a self-defense, you're going to get

31:41

rolled over. You're going to get...

31:43

But by the same token, you have to have cooperators

31:45

and nobody is going to be growing the grain. Nobody

31:48

is going to be raising the children. Nobody is going to be building

31:50

the houses. Nobody is going to be staffing the hospitals.

31:53

So you can't survive without both.

31:56

Van

32:00

Bavel has concluded that things really

32:02

aren't as bad as we're led to believe.

32:06

There is this real polarization, but it's exaggerated

32:08

so much on TV, on

32:11

news channels, and on social media

32:13

in particular. You

32:15

know what happens, especially on social media. You'll

32:18

find the craziest person on the other

32:20

side and then act as if they're representative

32:22

of the whole group. And most

32:24

people actually on that party don't even agree

32:26

with those people. And so if you do

32:28

that, it creates a misperception in people's mind

32:30

of how different the other party is

32:33

from you. Yeah, TV news correspondents

32:35

are the worst. Thanks a lot, Jay.

32:41

But wait a minute. Is there

32:43

some study or research

32:46

that indicates what you just said? Yeah.

32:49

One of my favorite studies asks, if you're a Republican,

32:52

what percentage of Democrats are

32:55

lesbian, gay, or trans? And

32:59

I'll ask a Democrat a question. What percent of Republicans

33:01

make more than $200,000 a year? Most

33:05

people, if you ask Republicans that, they think like 30% to 40%

33:07

of Democrats are LGBTQ.

33:10

But it turns out it's more like 5%. And

33:12

so they have this exaggerated view that a

33:15

huge proportion of Democrats are lesbian,

33:17

gay, or trans. And if you ask Democrats

33:19

how many Republicans are rich, they say like 30%, 40% of them

33:21

make like $200,000 a year. Well,

33:24

of course, it's more like 1%, right? You

33:27

make $200,000 a year, you're in the top 1%. And

33:29

so we have these exaggerated kind of cartoon images

33:32

of the other party in our minds. And

33:34

so once you can correct those for people,

33:37

basically fact check them, and they're surprised. But

33:39

they're being fed images of

33:42

the other party as kind of a representative

33:45

of this kind of caricature of their

33:47

party. So

33:49

now you know. Your genetics and

33:51

your brain have predisposed you to

33:54

believe what you believe. You can't

33:56

help it. That first 50%

33:58

or 60% of your political views are not

34:00

based on your careful consideration of the

34:02

issues. You were born

34:04

that way. And so what

34:07

are we supposed to do with this information? Take

34:10

it home, Jay. Once

34:12

you understand that a lot of our political preferences

34:14

are biological and driven

34:16

by our brain and our traits, it

34:19

changes how we think about approaching somebody who doesn't

34:21

agree with us politically, right? Instead of just arguing

34:24

and throwing facts at them, we

34:26

are probably not going to be as convincing as we hope

34:28

to be, because that's not what's driving

34:30

their political beliefs or much of it. And so

34:33

there's some utility in trying to understand more

34:35

where somebody's coming from and listening to them rather

34:38

than just kind of like, you know, debating them. Find

34:41

ways to talk to them that are going to resonate

34:43

with where they're coming from. That might be

34:45

more persuasive.

34:46

If people could become aware

34:48

of their awareness,

34:50

and I know I'm going to sound like a meditation teacher

34:52

here for a moment, but if you can

34:54

be aware of your awareness

34:56

and aware that it's

34:58

not the truth,

35:00

like to know that it's part of the truth,

35:03

but

35:03

that it's transient, it changes,

35:06

there may be a different truth tomorrow

35:08

from yesterday, and it may be a different truth than

35:11

someone else has, and you don't

35:13

get so attached to your truth,

35:16

it gives you a lot more room for compromise

35:18

and agreement.

35:20

Because I know then that you

35:22

believe what you believe sincerely, not

35:24

because you're a bad person, but

35:26

because that's your reality.

35:28

That's where I think you can make some

35:30

progress. And remember, this is not just

35:32

a hunch. This is based on actual

35:34

research. I was just part

35:36

of this huge study, it was run out of Stanford, and

35:39

they created a tournament,

35:41

a worldwide competition to figure out what could reduce

35:43

polarization in the US and partisan animosity,

35:45

you know, this hatred for other groups, as well as

35:47

get people to support democracy. And

35:50

so they got submissions from all

35:52

these scientists around the world in different fields, 52 groups

35:55

submitted proposals to them of an intervention

35:57

that would reduce polarization. So

36:00

a little messaging thing, like five minutes of messaging.

36:03

They picked the top 25, and my

36:06

lab submitted one and then they took these 25

36:08

interventions and they

36:10

ran 32,000 Americans from all different

36:12

ages, ethnicities, genders, backgrounds

36:15

and income classes and

36:18

gave them these 25 interventions and found

36:20

out what works. And then they also followed up these

36:22

people two weeks later to see if it stuck around.

36:25

One of them was based on a Heineken beer commercial. That was actually

36:27

the best intervention if you've seen it. Have

36:29

you seen that one, David? The Heineken one? Oh,

36:31

well you got to go online and watch it. I did.

36:34

It's really wonderful. It's called Worlds Apart.

36:37

They bring in these random strangers. We

36:39

meet them individually and it's clear

36:41

that they have radically different political views.

36:45

Feminism today is man-hating.

36:47

It's actually crystal and healthy

36:50

but they're our own voice. I don't

36:52

believe that climate change exists. But

36:54

they don't meet each other before the experiment

36:56

begins. Two of these people at a time

36:59

are sent into a big sort of warehouse

37:01

and given a sheet of printed instructions. They

37:04

work together to build what looks like it's

37:06

going to be a big piece of IKEA

37:08

furniture.

37:09

You're

37:12

right, mate. A bit better than a lock. Perfect.

37:16

Oh, yeah. There you go. Mate,

37:19

definitely just built a bar. Yeah.

37:22

And indeed, they have built a bar.

37:24

They had fun and they got to know each

37:26

other a little. You got this really, um,

37:29

got a glow. Your aura is pretty

37:31

cool. At this point, they watch a

37:33

video. It is, of course, the

37:36

pre-interview where the participants had

37:38

talked about their political views.

37:40

So transgender, it is very odd.

37:43

We're not set up to

37:44

understand or see things like that. They've suddenly

37:46

figured out that they've been paired with their polar

37:49

opposites. And now they're

37:51

asked, would you like to leave or

37:54

stay and have a beer to discuss your differences?

37:57

And it was, you know, the commercial, who knows how

37:59

well they edited it, but everybody looked

38:01

at the other person they had built a relationship with, had

38:04

worked together, and it was all the elements

38:06

of what psychology calls contact theory. Working

38:09

together with somebody and doing something

38:11

together when you're at equal status actually

38:14

builds a connection with somebody and you start

38:16

to humanize them. And people were willing to

38:18

stay and have the beer. That was the end of the beer commercial.

38:20

But they showed that to people and it dramatically reduced partisan

38:23

animosity and it lasted two

38:25

weeks later. In the Stanford competition, the

38:28

one based on the Heineken ad was the grand

38:31

prize winner. Ours was the third best intervention

38:34

and we talked about how leaders from each

38:36

party supported democracy, all

38:38

these other things I told you about these exaggerated stereotypes

38:41

and caricatures we have of the other party and we

38:44

presented that and ours worked nearly

38:46

as well as a beer commercial and in my

38:48

view this is the best studies ever been run on this topic. These

38:50

things persisted for multiple weeks and they also

38:53

increased support for democracy and democratic

38:55

institutions. So I thought it was just

38:57

something really promising and

39:00

it seems like there's lots of pathways to get to

39:02

somebody in a way that opens their mind and reduces

39:04

their hostility towards you. Wow,

39:07

that is cool. I mean that is that

39:10

is a droplet of hope in this ocean

39:13

of hate. Yeah, yeah, yeah, it is. I mean

39:15

how you scale that to society, I don't know.

39:17

Yeah, but it's something, it's a start,

39:19

right?

39:28

You've just listened to another episode of Unsung

39:31

Science with David Pogue. Don't forget

39:33

that the entire library of shows along

39:35

with written transcripts await at

39:38

unsungscience.com. This

39:40

podcast is a joint venture of Simon & Schuster

39:43

and CBS

39:43

Sunday Morning and it's produced by

39:45

PRX Productions. For Simon & Schuster,

39:48

the executive producers are Richard Rohrer

39:50

and Chris Lynch. The PRX production

39:52

team is Jocelyn Gonzalez, Morgan Flannery,

39:55

Pedro Rafael Rosado and Morgan

39:57

Church. The

40:00

Unsung Science theme music, our fact

40:02

checker is Christina Rubello, and

40:04

Olivia Noble fixes the transcripts. For

40:07

more of my stuff visit davidpogue.com

40:10

or follow me on Twitter at Pogue. That's

40:13

p-o-g-u-e. We'd

40:15

love it if you'd follow

40:17

Unsung Science wherever you get your podcasts

40:20

and spread the word, will you?

40:28

Upgrade your style for less during Indochino's

40:31

Black Friday event. This limited time

40:34

sale starts in store and online November

40:36

6th. Don't miss out on the best prices of the year.

40:38

If you've been waiting to upgrade your old suit,

40:41

Indochino's Black Friday sale is the perfect

40:43

time to give yourself the gift of a personalized

40:46

look without the luxury price tag. Indochino

40:49

makes it easy to keep your wardrobe fresh

40:51

with customizable looks that fit you and

40:53

your style perfectly. From suits

40:56

and shirts to outerwear and more. And

40:58

during Indochino's limited time Black

41:00

Friday event, suits start at

41:02

just $349. Discover

41:05

countless made-for-you options with new styles

41:07

and fabrics added throughout Indochino's

41:10

Black Friday event including unbelievable

41:12

bundles like two suits starting at $749 and

41:15

five shirts at just $249. Book

41:17

your appointment today for Indochino's Black

41:19

Friday event starting in store and online

41:22

November 6th at Indochino.com.

41:25

That's I-N-D-O-C-H-I-N-O.com.

41:30

Hey there, it's Mary Harris and I host

41:33

Slate's daily news podcast, What

41:35

Next? It's a show I made because

41:37

I was grappling

41:38

with this question. Why

41:39

is the news everywhere and

41:42

I have no idea

41:42

what to pay attention to?

41:44

My daily short podcast is here to help

41:46

you make sense of things. From fleshing out new

41:48

angles to uncovering stories that have largely

41:50

gone unreported, when the news feels overwhelming,

41:53

I'm here to help you answer, What Next?

41:56

So subscribe wherever you're listening,

41:59

right now.
