Ep57 "When should new technologies enter the courtroom?"

Ep57 "When should new technologies enter the courtroom?"

Released Monday, 6th May 2024

Episode Transcript


0:05

Can you measure pedophilia in

0:07

a brain scan? Can you measure

0:09

a lie from somebody's blood pressure?

0:12

And how should a judge in court

0:14

who's not an expert in science decide

0:17

these things? What does any of this have

0:19

to do with President Ronald Reagan or

0:22

antisocial personality disorder

0:24

or how the television show CSI

0:27

has impacted courtrooms? Welcome

0:32

to Inner Cosmos with me David Eagleman.

0:35

I'm a neuroscientist and an author at

0:37

Stanford and I've spent my career at

0:39

the intersection of our brains

0:41

and our lives. Today's episode

0:44

is about an aspect of

0:47

the intersection between brains and

0:49

the legal system, and it's a tricky

0:51

one. The

0:57

question is when neuroscience

1:00

techniques are allowed

1:03

in courts: when should they be allowed?

1:05

What bars need to be passed

1:08

for a technology to be accepted?

1:11

So let's start on March thirtieth,

1:14

nineteen eighty one, when the President

1:16

of the United States, Ronald Reagan, has

1:18

been delivering a speech and

1:21

afterwards, he and his team are returning

1:23

to his limousine and he gives

1:25

two big arm waves to the

1:28

crowd and suddenly there

1:30

are gunshots ringing out and

1:33

everyone's diving, and President

1:35

Reagan is hit with a ricochet off

1:37

his limousine, and Press Secretary

1:39

James Brady falls and

1:41

Secret Service agent Tim McCarthy falls,

1:44

and a DC Police officer named Thomas

1:47

Delahanty is also wounded, and

1:49

the President arrives at the

1:51

emergency room in critical condition

1:54

and almost dies. And for

1:56

those of you who weren't alive in nineteen eighty

1:58

one, or for whom this has receded

2:00

in memory, just try to picture

2:03

the horror that this entailed. Now

2:06

you may remember that the gunman, John

2:08

Hinckley, had a deep psychosis.

2:11

He was divorced from reality,

2:13

and he believed that if he shot

2:16

the president, he would win the

2:18

love of the actress Jodie Foster.

2:21

There's a lot to say about this case, and

2:23

in episodes thirty six and thirty seven

2:26

I talked about the insanity

2:28

defense, but here I want to zoom

2:30

in on a very particular aspect.

2:32

The thing most salient to us

2:35

today was the fact that this was

2:37

the first high profile case to use

2:39

a form of brain imaging.

2:42

Hinckley's lawyers pled not

2:45

guilty by reason of insanity, and

2:48

to support their defense, they introduced

2:50

brain imaging evidence so

2:53

his defense counsel argued that

2:56

he was schizophrenic, and they

2:58

argued they could prove this by

3:00

showing CAT scans or

3:02

CT scans. CT stands

3:05

for computed tomography

3:07

or computerized axial tomography. Now,

3:09

the lawyers on both sides agreed

3:12

that CAT scans had never before

3:14

been admitted as evidence

3:16

in a courtroom. Neuroimaging

3:19

was brand new at this time, So

3:22

should the judge allow this

3:24

newfangled technology to be

3:26

accepted or not? Well,

3:29

it's not obvious. Can you really

3:31

tell if someone suffers from schizophrenia

3:33

just by looking at an anatomical

3:36

picture of the brain? It's

3:39

not obvious. So the judge decided

3:41

to dismiss the jury so that he

3:43

could hear the arguments about

3:45

whether or not the technology was

3:48

relevant and should be admitted. An

3:51

expert witness, a physician, pointed out

3:53

that Hinckley's sulci,

3:55

which are the valleys running along

3:58

the outside of the brain, these were

4:00

wider than average, and this

4:02

physician cited a paper suggesting

4:05

a connection between schizophrenia

4:08

and wider sulci.

4:10

So the assertion was, if you have

4:12

schizophrenia, you can see

4:14

that just by looking at a CAT scan

4:17

of the brain. So this doctor

4:19

said, quote the fact that

4:21

one third of schizophrenic participants

4:24

in the study had these widened

4:26

sulci, whereas in normals probably

4:28

less than one out of fifty have them. That

4:31

is a very powerful fact. But

4:33

the prosecution rebutted

4:36

this. They said, no way, it has not been

4:38

proven that a CAT scan

4:40

can aid in the diagnosis

4:42

of schizophrenia, and therefore this

4:45

evidence should not be presented

4:47

to the jury. In other words, they

4:49

argued the technology should be excluded

4:52

from the courtroom because it was not yet ready

4:54

for prime time. The judge

4:57

listened to the arguments and

4:59

he finally decided that he would not

5:01

admit the CAT scan. Then

5:04

nine days later he heard more expert testimony

5:07

and he confirmed that he would not take the

5:09

CAT scan. And then he

5:11

changed his mind and reported he would

5:13

take the CAT scan. Okay. So what

5:16

does this back and forth illustrate? It illustrates

5:19

the difficulty for a judge to

5:21

decide what makes meaningful

5:24

evidence and what does not. In

5:26

the end, Hinckley was found

5:29

not guilty by reason of insanity, although

5:31

that had little or nothing to do with the CAT

5:33

scan. But this high profile case

5:36

is just one of hundreds

5:38

where this question comes up: should

5:40

neuroimaging be allowed in

5:42

the courtroom? There's no single

5:45

answer to this question, and in part that's

5:47

because there are many different guises in

5:49

which it comes up. And so that's what we're

5:51

going to talk about today. We're going to talk

5:53

about how any technology gets

5:56

into courtrooms. So

5:58

to motivate this, imagine

6:01

that we start seeing advertisements

6:03

for a new Silicon Valley

6:05

company that has developed a new mind

6:08

reading technology. They call

6:10

this the Palo Alto three

6:12

thousand, and they strap it to your head

6:15

and they measure some brain waves and

6:17

pass that through a large language

6:19

model, and they print out in words

6:22

to a screen what you are thinking.

6:24

So you might be thinking about wanting

6:26

a hot dog with pickles, and this machine

6:29

will print to the screen, I want

6:31

a hot dog with pickles. Now

6:33

this is totally made up, but pretend it's

6:35

true: that in a few years said

6:38

company launches, and

6:40

let's say the technology looks pretty good.

6:42

It captures the gist of what

6:45

you are thinking about. Now, the question

6:47

is should this be admissible

6:50

in a court of law? Let's

6:53

imagine that someone puts it on

6:55

and states under oath that on April

6:58

twenty fifth, he was getting dinner with his family,

7:00

but suddenly the screen prints out I

7:03

committed the crime. Now, how do you

7:05

know whether to believe that or not? The

7:08

company is started by a

7:10

handful of young people who dropped out of college,

7:13

and they claim to be experts in neuroscience.

7:15

But how do you know whether it really

7:19

works? And especially in a high

7:21

stakes situation, should you accept

7:23

this in a court of law or

7:25

not? Well, some people, in order

7:27

to judge the quality of the technology,

7:29

ask, well, are they charging for this technology?

7:32

But that's not a meaningful measure.

7:34

Of course they're charging. They can't develop

7:37

new technologies for free, any more

7:39

than you would expect Apple to not

7:41

charge for their laptop. But the fact

7:44

that they're charging certainly doesn't rule

7:46

in or out anything about its efficacy.

7:49

So how do you know whether the technology

7:52

is efficacious? Can it be used in a

7:54

court of law? How do you know whether

7:57

it works and provides what the legal

7:59

system calls probative value,

8:02

which means can it do what it's supposed

8:04

to do? Can it provide sufficiently

8:06

useful evidence to prove

8:09

something in a trial? So

8:11

this is what we're going to talk about today.

8:14

Most of the time we don't realize that

8:16

new technologies always have to be assessed

8:19

by courtrooms to know whether

8:22

they should be accepted or rejected.

8:24

And some get in and then we take that as background

8:27

furniture, and others never make it. And

8:30

what we're going to see today is how and

8:32

why. So fast forward

8:34

some decades from the Hinckley trial.

8:36

Where are we now? What is

8:38

allowed in the courtroom? Well, we have

8:41

more sophisticated technologies

8:43

to image the brain. Now, for example,

8:46

we can get a picture of the

8:48

brain in an MRI

8:50

scan. Magnetic resonance imaging, or MRI,

8:53

gives you a snapshot

8:55

of what the brain of a person looks like.

8:58

You're not seeing activity there,

9:00

You're just seeing the anatomy. Think

9:02

of this as analogous to the way you would

9:04

look at someone's skeleton with an X ray.

9:06

You can't see anything moving around; what

9:08

you see is a snapshot. So

9:11

with MRI you can hope to see

9:13

abnormalities like a tumor

9:16

or evidence of a stroke, or the

9:18

consequences of a traumatic brain

9:20

injury. Now, I've been

9:22

called by many defense lawyers over

9:24

the years who say, I have a client

9:26

who's going up for trial. Can you take

9:28

a brain scan and see if you

9:30

can find something wrong with their brain,

9:33

so this can serve as a mitigating

9:35

factor. But I always tell them the same

9:37

thing. If you find something wrong with your

9:39

client's brain, that can serve

9:41

as a double-edged sword. The jury

9:44

might think, Okay, I'm convinced there's something different

9:46

about this man's brain. But this

9:48

presumably means he'll be predisposed

9:51

to committing this kind of crime again,

9:54

so we should probably lock him up for a longer

9:56

time. So a defense lawyer has

9:58

to utilize this argument

10:00

with care. In any case, what

10:03

MRI gives you is an anatomical

10:05

snapshot. And now I want to tell

10:07

you about the next level of technology called

10:10

fMRI, where the F stands

10:12

for fancy MRI. Okay, I'm kidding.

10:15

It stands for functional magnetic

10:17

resonance imaging fMRI. And

10:20

this is because it's telling you about the

10:22

function of the brain. It's

10:25

measuring blood flow to show

10:27

you where the activity in the brain just

10:30

was. This works because when brain

10:32

cells are active, they consume energy,

10:35

and the blood flow to that specific

10:37

region needs to increase so

10:39

that you can bring fresh oxygenated

10:41

blood to the area to restore

10:43

the used energy. So in fMRI,

10:46

we see where the new oxygenated

10:49

blood is going and we say, aha,

10:51

there must have just been some activity

10:53

there a few seconds ago. So

10:56

that's the difference between an anatomical

10:58

snapshot and a functional picture

11:00

of what's going on. Now, part

11:03

of the reason that you can use the static

11:05

snapshot, the MRI, in court

11:08

is because it's generally seen as

11:10

hard science. This is the guy's brain.

11:13

But when we're talking about fMRI,

11:16

what we're looking at is the activity in the brain, and

11:18

we're generally asking something about the

11:20

person's mental state, and

11:22

can that be the same kind

11:25

of hard science? On the one hand,

11:27

it's a clear question with a clear answer

11:29

if someone has a stroke or a brain tumor.

11:32

But this isn't the case if you

11:34

want to pose a question like did

11:36

this defendant intend to

11:39

kill the victim? fMRI doesn't

11:41

and can't give you clear answers

11:44

like that to questions that are useful

11:46

for the legal system. So we're going to dig

11:48

into this now. So

12:03

first let's start with the question of

12:05

whether fMRI has

12:08

been used in courts. The answer is yes,

12:10

But the technology can be used in different

12:12

ways. It doesn't always have to involve an

12:15

individual's brain, but can sometimes

12:17

be about brains in general.

12:20

So let me give you an example. There was

12:22

a murder case in Missouri where

12:24

a young man named Christopher Simmons

12:27

broke into the home of a woman named

12:29

Shirley Crook. He covered

12:32

her eyes and mouth with duct tape,

12:34

he bound her hands together, and

12:36

then he drove her to a state park and

12:39

threw her off a bridge

12:41

to her death. Now, this was a

12:43

premeditated crime, and the evidence

12:46

was overwhelming, and he admitted to the murder.

12:48

So the judge and jury handed down

12:51

a death sentence. But

12:53

there was a complication. Christopher

12:55

Simmons was only seventeen years

12:57

old at the time he committed the crime.

13:00

And so this case spun all the

13:02

way up to the United States Supreme Court,

13:05

and the question was can you

13:07

execute someone who was under

13:09

the age of eighteen when they committed

13:11

the crime? After all, the argument

13:14

goes, adolescence is characterized

13:16

by poor decision making, and young

13:18

people should have the chance to grow

13:20

up into a different life. Well,

13:23

one of the things that happened at his trial

13:26

is that the Supreme Court considered

13:28

fMRI evidence. Now this wasn't

13:30

from Simmons's brain in particular, but

13:33

from adolescents in general.

13:36

The study compared young people

13:38

and adults performing the same cognitive

13:41

tasks, and what the researchers found,

13:44

not surprisingly, is that young brains

13:46

are not doing precisely the same thing as

13:48

older brains. There are measurable

13:51

differences. A juvenile's

13:53

brain just isn't the same thing as an

13:55

adult's. So the Supreme Court

13:57

justices saw this evidence,

14:00

considered it, and presumably this is part

14:02

of what led the court to conclude

14:04

that it is unconstitutional to

14:07

execute someone for a crime committed as a

14:09

minor. Now that's an example

14:12

of fMRI making it into the

14:14

court. It's been used in this way

14:16

to compare groups of people,

14:18

juveniles versus adults in this case.

14:21

But things get a little trickier

14:24

when you're trying to say something about an

14:26

individual's brain, the brain of

14:29

the one guy standing in front of the bench.

14:32

So what can we and can we not

14:34

say with the technology? So let's zoom

14:36

in on a few examples. Many

14:39

researchers and legal minds have been asking

14:41

whether one can use brain imaging

14:43

to diagnose whether someone has antisocial

14:47

personality disorder, which is a condition

14:49

in which a person has a long term

14:52

pattern of manipulating, exploiting,

14:54

and violating other people. People

14:56

with antisocial personality

14:58

disorder, or ASPD. They'll

15:01

commit crimes, they'll flout rules,

15:04

they'll act impulsively and aggressively,

15:06

they'll lie and cheat and steal. Now,

15:10

this is a condition that is massively

15:12

overrepresented in the prison population.

15:15

But biologically it's not obvious

15:18

what it's about. There's no single gene

15:20

here, and there's not a single environmental

15:22

factor. It's a complicated combination.

15:25

And the legal system often cares to know

15:28

whether someone has ASPD

15:31

or not. And so researchers started

15:34

to wonder a long time ago, could you use

15:36

brain imaging to determine

15:38

in some clear categorical

15:40

way, does this person have

15:43

ASPD or not. So, in

15:45

one study, researchers highlighted

15:48

the brain regions that had high

15:50

probability of being anatomically

15:53

different between people with

15:55

ASPD and those without. And you can

15:57

look in the cortex, what's called the

15:59

gray matter, or below the cortex,

16:01

what's called the white matter, and you can

16:03

measure these small anatomical differences

16:06

between those with and without. So the question

16:08

arose, can you use this technology

16:11

in court as a diagnostic tool

16:13

to say that this person has

16:16

ASPD or not? Now

16:19

do you see any problems with this off the top of your

16:21

head about whether this technology

16:23

can be used? The problem is

16:25

that all the scientific results come about

16:28

from examining groups

16:30

of people, like fifty people in each

16:32

group, and the question is whether

16:35

these group differences are strong

16:37

enough to tell you about individual

16:40

differences. So this is known as the

16:42

group-to-individual problem. In

16:44

other words, you have data from

16:47

groups of people that can be distinguished

16:49

on average, but you're trying

16:51

to say something about this individual.

16:54

It would be like making an accurate

16:56

statement that men on average

16:59

are taller than women, and

17:01

then asking whether some individual,

17:04

like a tall woman, could be categorized

17:06

as a man because her height clocks

17:09

in at the average male. The legal system

17:11

is well aware of this group-to-individual

17:13

problem, and so as

17:16

technologies are introduced, the

17:18

justice system always needs to ask how

17:21

specific is this technology and how sensitive

17:24

is it? Is it good enough for individual

17:26

diagnosis? Brain imaging studies

17:29

generally just give us group average

17:31

results, and the question is whether

17:34

it tells us enough or anything

17:36

about the person who's standing in front

17:38

of the bench right now.
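The sensitivity and specificity question above can be made concrete with Bayes' rule. Here is a minimal sketch using the widened-sulci figures quoted earlier in the episode (about one in three patients versus fewer than one in fifty controls); the one-percent population base rate is an assumption added purely for illustration, not a number from the episode:

```python
# Positive predictive value: given that the marker is present,
# how likely is it that the person actually has the condition?

def positive_predictive_value(sensitivity, false_positive_rate, base_rate):
    """P(condition | marker present) via Bayes' rule."""
    true_positives = sensitivity * base_rate
    false_positives = false_positive_rate * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

# Figures from the expert's quote: one third of schizophrenic participants
# had widened sulci, versus less than one in fifty of the controls.
# The 1% prevalence is an illustrative assumption, not from the episode.
ppv = positive_predictive_value(
    sensitivity=1 / 3,
    false_positive_rate=1 / 50,
    base_rate=0.01,
)
print(f"{ppv:.0%}")  # prints 14%
```

Even with a marker seventeen times more common in patients than in controls, a positive scan by itself implies only about a one-in-seven chance of the condition once the base rate is accounted for, which is one way to see why a striking group-level statistic can still be weak evidence about an individual.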

17:41

The idea of bringing functional

17:44

brain imaging to bear on questions

17:46

of criminal behavior is an old one,

17:48

and this group-to-individual problem

17:51

is just as old. For example, there

17:53

was a study in nineteen ninety seven where

17:55

researchers imaged the brains of

17:58

normal participants and murderers,

18:01

and they found, on average, there was less

18:03

activity in the frontal lobes

18:05

in murderers. So you

18:08

look at the activity in the front of the brain

18:10

behind the forehead and you say, hey, on average,

18:12

there's less going on here in the murderer

18:15

group. But you can't use this on an

18:17

individual. You can't say, oh, this

18:19

person has less activity, so he must

18:21

have been the murderer. In other words, it

18:23

has no power in a court of

18:26

law. You still face the problem of

18:28

trying to say anything about

18:30

an individual from a group average.

18:33

And so it's for reasons like this that

18:35

brain imaging on individuals

18:37

has not gotten very far in courtrooms.
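The group-to-individual problem can be sketched with a small simulation, in the spirit of the height analogy above: two groups whose means differ reliably, yet whose individuals overlap so much that classifying any one person is close to a coin flip. All numbers here are invented for illustration; none come from the 1997 study:

```python
import random

random.seed(0)

# Two hypothetical groups: clearly different averages, heavily overlapping individuals.
group_a = [random.gauss(100, 15) for _ in range(500)]  # e.g. a "control" measure
group_b = [random.gauss(92, 15) for _ in range(500)]   # lower on average

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)
print(f"group means: {mean_a:.1f} vs {mean_b:.1f}")  # the group difference is real

# Now try to classify each individual by which side of the midpoint they fall on.
threshold = (mean_a + mean_b) / 2
correct = sum(x > threshold for x in group_a) + sum(x <= threshold for x in group_b)
accuracy = correct / (len(group_a) + len(group_b))
print(f"individual accuracy: {accuracy:.0%}")  # far closer to a coin flip than to certainty
```

The means separate cleanly, but the best single-threshold classifier gets only roughly sixty percent of individuals right with these made-up numbers: that gap is the difference between "these groups differ on average" and "this person belongs to that group."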

18:40

Let me give one more example. Another research

18:42

group used brain imaging fMRI

18:45

to see if they could identify pedophiles.

18:49

They found twenty four pedophiles

18:51

and thirty four controls, and they

18:53

showed them images of naked

18:56

men and women and boys and girls. And

18:58

what they found is that they could on average

19:02

separate the participants who were pedophiles

19:05

from the participants who were not. In

19:07

other words, the pedophilic brain shows

19:09

a subtly different signature of

19:12

brain activity than the non

19:14

pedophilic brain when shown these

19:16

pictures. It turns out that heterosexual

19:19

versus homosexual seems to be distinguishable

19:21

as well. So you might think that

19:23

sounds quite useful for the legal system,

19:26

but when scientists and legal

19:29

scholars take a closer look, it's not

19:31

as clear. The first question is

19:33

what are these brain signals actually

19:36

measuring? The assumption is

19:38

that it's measuring a state of arousal,

19:40

like sexual attraction, but what else

19:43

might be going on? Well,

19:45

the difference in brain signals could be driven

19:47

by a stress response

19:49

or an anxiety response by the

19:52

pedophilic participants who know they're

19:54

being measured. Or perhaps

19:57

what you're seeing is a measure of disgust

19:59

by the non pedophilic group who

20:01

knows the purpose of the study and doesn't

20:04

like gazing at pictures of children in

20:06

this context. Or what if the

20:08

pedophilic participants were just

20:10

slightly more likely to avert their

20:12

eyes because of shame or

20:15

not wanting to get measured. That would

20:17

cause a statistical difference

20:19

in the brain signals and could, in

20:21

theory, explain the results. So

20:24

there are lots of things that

20:26

could yield this brain imaging

20:29

result of a difference between the two groups,

20:31

beyond the hypothesis that it's

20:33

just measuring arousal, So stress,

20:36

anxiety, disgust, shame, all

20:38

these things might be what's getting

20:40

measured here. And part of why this matters

20:42

is because there are many brain imaging

20:45

measures where it turns out

20:47

it's easy to manipulate the results.

20:50

So let's say you are a pedophile

20:52

who doesn't want to be labeled as such. Can

20:54

you purposely move your eyes

20:57

whenever you see a picture of the children,

20:59

and that messes up the ability

21:02

of the scanner to measure something? If

21:04

something can be faked or messed

21:06

up, then the technology is useless.

21:09

But let's say, for argument's sake, that you have a technology

21:11

that can't be faked or manipulated,

21:14

and that allows us to move on to the second point.

21:16

Let's say you don't even care what's getting measured,

21:19

like stress or anxiety or whatever. All

21:21

you care to know is whether there is a

21:23

neural signature that can distinguish

21:25

the pedophiles from the non pedophiles, irrespective

21:28

of what is causing that signal. Well,

21:30

there's also a legal problem here, which is

21:32

that it's not illegal for a person

21:34

to be attracted to children. It is

21:36

only illegal if they act

21:38

on that. All that matters legally is

21:41

whether you have committed a crime

21:43

or not, not whether you are attracted

21:45

to children. So you can think about whatever

21:48

attracts you ostriches or

21:50

jello or whatever, as long as you

21:52

don't commit an illegal act.

21:56

So whether you're talking about ASPD

21:58

or murderers or pedophiles, you'll

22:00

see that measuring something that

22:03

matters for a court of law isn't

22:05

as straightforward as it might

22:07

have originally seemed. So

22:10

now let's return to the Palo Alto three

22:12

thousand. The question is: just

22:14

because the company claims

22:16

that it functions well, how do

22:18

you know whether or not to admit it into

22:21

the courtroom? After all, remember

22:23

what I said about the John Hinckley case,

22:25

how CAT scans were admitted into

22:27

the court to argue that he had schizophrenia.

22:30

Well, it's now known that widened

22:32

sulci in the brain have no relationship

22:35

to schizophrenia. There are other

22:37

better anatomical signatures that we have

22:40

now, like thinner cortices in the

22:42

frontal and temporal lobes and shrunken

22:44

thalami. But it turned out

22:47

that the idea of widened sulci

22:49

just didn't hold up. Now, there was nothing fraudulent

22:52

going on with the claim. It was just a new

22:54

technology at the time and they

22:56

were doing the best they could with small sample

22:59

sizes. But it turned out the

23:01

theory of widened sulci was

23:03

scientifically unsound. Remember

23:05

how I mentioned that the judge went back

23:07

and forth several times about

23:09

the issue of whether to accept Hinckley's

23:12

CAT scan into the courtroom. That's exactly

23:14

the right thing that should have happened. Not

23:17

all claims are going to be correct just because

23:19

a scientist says so. Despite

23:22

best efforts, science can often be

23:25

incorrect, and that is the

23:27

importance of the scientific method.

23:30

It's always knocking down its own walls.

23:33

So what is a court

23:36

to do about all this? Well,

23:38

let's say that someone wants to introduce

23:40

the Palo Alto three thousand into

23:43

a court case, and you are the

23:45

judge. You have expertise

23:47

in the legal system, but you don't know the details

23:50

of what's possible in neuroscience

23:53

and large language models, and you have

23:55

questions about whether this technology

23:58

should be admitted, questions about whether

24:00

it can accurately read people's

24:02

thoughts, So how do you decide

24:04

whether it should or should not be admitted.

24:21

So let's step back to nineteen twenty three.

24:23

There was a man named mister Fry who

24:26

said that he had developed a lie

24:28

detection technology and it

24:30

relied on a measure of your blood

24:32

pressure, and he wanted to introduce this

24:34

into a court case the

24:37

way you might want to get the Palo Alto three

24:39

thousand into a case. But it turned

24:41

out that mister Frye's claims were

24:43

not widely accepted by anyone

24:45

else in the scientific community, and

24:48

so on those grounds, the

24:50

court decided not to admit

24:53

it into the courtroom. What they said

24:55

was, look, we'll accept

24:58

expert testimony that comes from

25:00

well recognized science, but if there's some new

25:02

technology, it has to be

25:04

sufficiently established so that it's

25:07

gained general acceptance

25:10

in the field in which it belongs. In

25:12

other words, if other experts

25:14

in the field don't believe that

25:17

mister Frye's systolic blood

25:19

measurement is actually good at detecting

25:21

lies, then you can't admit it

25:23

as evidence in the court. And

25:25

that case set the bar for what came

25:27

to be known as the Frye standard,

25:30

which is that technologies need to

25:33

be generally accepted by other experts

25:35

in the field before they can be

25:37

admitted into the courtroom. So under

25:39

the Frye standard, the court would work

25:41

to determine whether the Palo

25:44

Alto three thousand has met

25:46

the general acceptance of the scientific

25:48

community. If science experts

25:51

around the world say, I've never

25:53

heard of this Palo Alto three thousand, I

25:55

don't think that it can actually work, then

25:57

you, as the judge, can

26:00

preclude it from admissibility. So

26:02

the court solves the problem by

26:04

deferring to the expertise of other

26:07

people in the field. But this

26:09

isn't the only way to make that decision.

26:11

The Frye standard is still the rule

26:14

in about half the states in America,

26:16

but the rest use a different

26:18

rule to decide whether evidence

26:21

should be admitted, and this is called

26:23

the Daubert standard. So in

26:25

nineteen ninety three there was a lawsuit from

26:27

a man named Jason Daubert.

26:30

He was born with severe birth

26:32

defects and his parents brought

26:34

suit against Merrell Dow, the pharmaceutical

26:37

company, and they said these severe

26:40

birth defects were caused by the

26:42

medication that the mother was on called

26:44

Bendectin. So the

26:46

pharmaceutical company said the birth

26:49

defects were not caused by this medication,

26:51

and it went to federal court and

26:53

Daubert said, look, here

26:56

are animal studies showing that

26:58

this drug is related to birth defects.

27:00

And the pharmaceutical companies expert

27:02

witnesses got up and said, look, this

27:05

is not generally accepted in the field because

27:07

these are just animal studies and there's no

27:09

conclusive evidence that

27:11

shows the link between these and humans.

27:14

So if you're the judge, how

27:16

do you know how to arbitrate this? It's

27:19

difficult. Right, here's some science from

27:21

the laboratories and here's the pharmaceutical

27:23

company saying it's not generally accepted in the

27:25

field that this causes birth defects. So

27:28

what do you do? Well, what happened

27:31

is the case was decided in favor of the pharmaceutical

27:33

company. So Daubert took it on appeal

27:36

to the Ninth Circuit and the Ninth Circuit judges

27:38

also awarded this to the pharmaceutical company.

27:41

So Daubert brought this case to the Supreme

27:43

Court, and the Supreme Court analyzed

27:45

this carefully, and what came out of this was

27:48

a new standard for when evidence

27:50

should be admissible, and that's known

27:52

as the Daubert standard, and

27:55

the Daubert standard says, look,

27:57

you accept expert testimony about,

28:00

for example, these lab rat studies if

28:02

it will help the jury to understand the evidence

28:05

better or determine the fact

28:07

in issue. In other words, it doesn't

28:09

demand general acceptance

28:12

in the community. Under the Daubert standard,

28:14

the key is just whether some piece

28:16

of evidence is relevant and reliable.

28:19

Now, the key is that the Frye standard

28:22

made the scientific community the gatekeeper,

28:25

but the Daubert standard makes the

28:28

judge the gatekeeper. The judge gets

28:30

to say from the beginning that they'll

28:32

evaluate this and ask is

28:34

this evidence relevant and reliable?

28:37

Does it pass my bar for that? So,

28:39

regarding this hypothetical Palo Alto

28:41

three thousand, the judge might ask

28:44

has the technique been tested in actual

28:47

field conditions as opposed to just in a

28:49

laboratory. Have there been any papers

28:51

on the Palo Alto three thousand that were

28:53

published in peer reviewed journals? What

28:56

is the rate of error? Do

28:58

standards exist for controlling the

29:00

operation of the machine, and so on? These

29:02

are often difficult questions. It's not always

29:04

easy for a judge to make a decision

29:07

about whether or not to accept a new

29:09

technology. But this gives a pathway

29:12

where the judge is the gatekeeper.

29:14

So let's imagine for a moment

29:16

that the Palo Alto three thousand

29:19

passes the standard for admissibility.

29:22

Is there any reason why the technology

29:25

might still be excluded from

29:27

the courtroom. There is one reason.

29:29

Let's say that you're the defense lawyer

29:31

and you say, Gosh, this thing is

29:34

so stunning that it's going to

29:36

prejudice the jury because they're going

29:38

to look at this fancy technology, and

29:41

even in the absence of really good

29:43

evidence, they'll say, Wow, this

29:45

guy seems guilty. Let's send him to

29:47

the electric chair without considering

29:49

the other points. So

29:52

to prevent that from happening, there's a special

29:54

rule called Federal Rules of Evidence

29:56

four oh three, and this just says

29:59

you should exclude evidence if

30:02

what you can learn from it is substantially

30:05

outweighed by the risk of undue

30:07

prejudice. In other words, does

30:10

it sway the jurors more

30:12

than it should. So what you'll see in

30:14

courtrooms all the time is that if a lawyer

30:16

tries to exclude a piece of evidence

30:19

from being admitted based on let's say a

30:21

Daubert objection, but the evidence

30:23

gets past that, then the lawyer

30:25

is going to take a second bite at the apple

30:28

by calling on federal rules

30:30

of Evidence four oh three, saying, look,

30:33

even if this is relevant and reliable,

30:35

it's going to have too much sway

30:38

on the jury. So why is this

30:40

an issue? Are there technologies that

30:42

have undue sway on

30:44

jurors? Is that a concern? It

30:47

is? And this brings us back

30:49

to fMRI. In a

30:51

court of law where jurors

30:54

are your neighbors and your community

30:56

and probably not experts in neuroscience,

30:59

a lot of people will be swayed

31:01

by a colorful brain image.

31:03

They're going to put a higher weight

31:06

on this than maybe they should, and possibly

31:08

at the cost of not weighing this evidence

31:11

appropriately in the context of

31:13

the whole case. And this is part

31:15

of the concern that some legal scholars

31:17

have, and this has come to be known as

31:20

the CSI effect. So

31:22

you remember the television show CSI.

31:24

This stood for Crime Scene Investigation,

31:27

and it's a television drama about a team of

31:30

forensic scientists and detectives

31:32

in Las Vegas who use cutting

31:34

edge scientific techniques to solve

31:37

murders. So they go around

31:39

each week and meticulously gather and analyze

31:42

evidence from crime scenes and each

31:44

episode features a complex

31:46

case with an intricate puzzle and the

31:48

CSI team has to solve this

31:51

to bring the criminals to justice. Well,

31:54

the idea with the real life

31:56

CSI effect is that jurors

31:59

come to expect what they've seen on

32:01

TV in terms of magical

32:03

machinery that does something

32:06

like you hit a button to enhance

32:08

the picture and then the computer enhances

32:10

it, and they see everything with clarity, where

32:12

the plot twist requires that the

32:15

investigator pull out some magical

32:17

technology that suddenly solves the crime,

32:20

or looking at the pedophile's brain with neuroimaging

32:22

and knowing whether he did the crime or not. So

32:25

jurors have come to expect this sort of

32:27

thing because you don't spend

32:29

all your time in a courtroom if you're not a lawyer,

32:32

and something like the television

32:34

show CSI is their only window into

32:36

that world. The problem

32:38

is that it often turns out to be a false

32:41

window, and when researchers do

32:43

studies on this, they generally find that

32:46

jurors see neuroimaging

32:48

as the truth of the matter asserted.

32:51

So we just spent a minute on

32:53

looking at the claim that you can measure pedophilia

32:56

and we noted that the brain signals might

32:59

represent that you're a pedophile, or

33:01

it might represent stress or anxiety,

33:03

or disgust or shame or averting the

33:05

eyes or all kinds of things. But that kind

33:08

of nuanced analysis doesn't usually

33:10

get done, and so neuroimaging

33:12

often comes to be interpreted by the

33:14

jury as the truth of the

33:17

matter asserted. This is what scholars

33:19

sometimes call the fallacy of

33:21

neurorealism, and the fallacy is

33:24

just that what you see in these

33:26

pretty false color images is the

33:28

truth. In other words, somebody thinks,

33:30

oh, you're capturing the moment

33:33

of pedophilia in its raw form

33:35

there whereas, of course, the truth is

33:37

that fMRI signals are not

33:40

direct proof of the experience

33:42

itself. As a side note, these

33:44

questions of bringing visual

33:47

evidence into the courtroom, they're not unique

33:49

to brain imaging. They've been around for a long time.

33:52

It goes back at least to X

33:54

rays. So when X rays got

33:56

introduced in the eighteen nineties, they

33:58

immediately started showing up in court

34:01

and everybody was absolutely blown away

34:03

by the idea of being able to

34:05

see inside of a body.

34:07

It's like magic. So what happened

34:09

over a century ago is people asked

34:11

this question of can we use this as evidence

34:14

in court? And the judge said

34:16

at the time, as long as it was scientifically

34:19

reliable, it could be introduced.

34:21

But the same questions about influence

34:24

on the jury came up, because there's a

34:26

real power to seeing

34:28

something. And of course what

34:31

we have currently with brain imaging

34:33

is even a deeper issue because

34:35

it touches on all our notions of

34:37

being human. For example,

34:39

I saw a cover of a Time

34:42

magazine a while ago and the title

34:44

read What Makes Us Good

34:47

or Evil? And the cover image

34:49

was a huge picture of a brain

34:51

scan, and there was a little picture

34:53

of Mahatma Gandhi with a pointer

34:56

to a part of the brain. And there was a little picture

34:58

of Adolf Hitler with a pointer to

35:01

a different part of the brain. And

35:03

in case you haven't heard my other episodes on this,

35:05

I want to make it clear there is no such thing.

35:08

You can't measure some spot in

35:10

the brain to determine whether

35:12

someone is good or evil.

35:14

And by the way, Friedrich Nietzsche

35:16

wrote about this over a century ago, the

35:19

words good and evil don't

35:22

even represent something fundamental,

35:24

but instead these words end up getting defined

35:27

by your moment in time. What is

35:30

good right now may be seen as

35:32

evil in a century. These terms

35:34

are defined by your culture.

35:37

What you think is good might be seen as sacrilege

35:40

by another group. So the idea

35:43

that you could just measure something in the brain and

35:45

say whether the person is good or evil

35:47

really makes no sense. However,

35:50

millions of people see this kind

35:52

of Time magazine cover, and this

35:55

is why legal scholars worry that

35:57

brain images could be

35:59

persuasive past the point that

36:01

they should be in a legal

36:04

argument. This is known as something having undue

36:07

influence. Brain images are influential

36:10

because they take some abstract

36:13

issue like evil intent and

36:15

seem to nail it down to

36:17

the physical. So this is why

36:20

something like Federal Rules of Evidence

36:22

four oh three plays an important

36:25

role in asking whether

36:27

something has undue influence,

36:29

whether it sways people more

36:32

than it should. Now, at the extreme,

36:34

some people say functional brain

36:37

images should never be allowed in the

36:39

courtroom because of their influence. One

36:41

solution that a colleague of mine suggested

36:44

is that you ban the visual

36:46

aspects of brain images from the courtroom,

36:49

so you just have expert witnesses come on to

36:51

the stand and tell you what they

36:53

think is going on as best they can. But

36:55

they're verbally presenting the results, not

36:57

showing them. But these are tough issues,

36:59

right because you can show a gory

37:02

photograph from a crime scene, which

37:04

can also prejudice an entire courtroom.

37:07

Or you can show a reenactment

37:09

of a murder, but if you can't show

37:11

a brain scan, that seems like

37:13

maybe a double standard. So should you

37:15

rule out all visual images

37:18

or allow everything? And

37:20

if you heard episode nineteen, I

37:22

talked about eyewitness testimony

37:25

and how massively swaying that

37:27

is to jurors. You can have all

37:29

sorts of expert scientific testimony,

37:32

but then you have the person get up on the stand

37:35

with tears and a cracking voice and say,

37:37

I don't care what they say. I know that's the

37:39

guy. And we're all moved and influenced

37:42

by that, even though eyewitness testimony

37:44

is so deeply fallible. So

37:47

this is all just to say that the question of undue

37:50

influence always has to be

37:52

asked: compared to what? Compared

37:54

to other technologies, compared

37:57

to gory photographs of the

37:59

crime scene, compared to acting

38:01

out a rape scene or a murder scene, do

38:04

those unduly sway a jury?

38:07

So I hope what you see is that these are tough issues,

38:10

perhaps tougher than you

38:12

had intuited at the beginning of the episode.

38:14

So let's wrap up. We

38:17

often think that when a new technology

38:19

comes along, like a new brain technology,

38:22

it always gives useful information,

38:24

and we might assume that courts start leveraging

38:27

it right away. But there are complexities

38:29

around this. For example, in an

38:31

earlier episode, I talked about lie detection. How

38:33

do you know when somebody is actually lying? There

38:36

are lots of technologies that try to measure

38:39

some version of this, but nothing

38:41

can simply tell you the answer because

38:43

the whole concept of a lie is

38:45

complex. Sometimes you might be

38:48

telling the truth but you're factually incorrect,

38:50

for example, because you're honestly

38:53

misremembering how something went, but you

38:55

believe your memory. Or for someone

38:57

else, they might have no associated

39:00

stress response because they just don't care

39:02

that they're lying. So when somebody comes

39:05

to the courts and says, hey, I have a

39:07

new lie detection technology,

39:09

the judge can't just say great,

39:12

bring it to the case, because the judge first

39:14

has to decide whether it should

39:16

be admitted or instead, whether

39:19

its promise will sway

39:21

the jurors more than its value.

39:24

We're all enthusiastic about the next

39:26

stages of technology and being able

39:28

to make important measures about

39:30

what's happening in the brain. But the

39:33

legal system has to be very careful

39:35

about this, whether by standards

39:37

of general acceptance in the scientific community

39:40

or by the choice of the judge as gatekeeper.

39:44

Each new technology has to be weighed

39:46

carefully for admissibility every time

39:49

before it can enter the esteemed

39:52

halls of justice.

40:00

Eagleman dot com slash podcast. For

40:02

more information and to find further

40:04

reading, send me an email at

40:06

podcast at eagleman dot com with

40:09

questions or discussion, and check

40:11

out and subscribe to Inner Cosmos

40:13

on YouTube for videos of each episode

40:16

and to leave comments. Until next

40:18

time. I'm David Eagleman, and this is

40:20

Inner Cosmos.
