Rachel

Released Tuesday, 2nd April 2024

Episode Transcript

0:00

When I was in college, a scammer called

0:02

me up. He's like, look, I'm

0:04

not selling you anything or even telling you what to do. I

0:07

just have information about a stock and

0:09

I wanted to share it with someone. And you were just

0:12

like the lucky guy I found in the phone book. Listen,

0:14

stock Z is gonna go up next

0:16

week. That's all. I'll call

0:19

you back next week to prove it. I was like,

0:21

all right, that was a strange call. Whatever.

0:25

And yeah, he calls me back in a week.

0:27

And sure enough, the stock he told me about

0:29

went way up. He was spot on. He

0:32

was all excited about how much money he made. But

0:34

I told him, he just got lucky and he should

0:36

cash out and take a trip somewhere. He's like, no,

0:39

no, no, it's not luck. There's an

0:41

algorithm that can accurately predict this.

0:44

And he said he knew which stock was gonna go up next.

0:46

I was like, all right, so which one's gonna go up

0:48

next? And he tells me and says to keep an eye

0:50

on it and he's gonna call me back next week to

0:52

prove he was right. So yeah,

0:54

another week goes by and

0:56

the same guy calls me back and he's like,

0:58

boom, you see what I mean? And

1:00

he was all excited again. And I was like, I

1:03

don't see what you mean, but let me check the

1:05

price. And I checked the price and again,

1:07

he was right. And I was like, dang,

1:11

good job. But I think

1:13

you got lucky again. He said, no, he's been doing

1:15

this for a solid year now. And he's been

1:17

right every time. And he tells

1:19

me more about this algorithm and how he's

1:21

analyzing different indicators and watching the stock market

1:23

extremely close and just has everything dialed in.

1:26

And he tells me about another stock that

1:28

he says is surely going to go up.

1:31

And I'm like, okay, call

1:33

me back in a week. Let's see if you're right. And

1:35

sure enough, after a week I checked and he

1:37

was right again, three accurate stock price

1:39

predictions in a row. And

1:41

he called me back and he's like, dude. And

1:44

I'm like, dude. And he's like, you see

1:46

that? I said, I saw that. How are you

1:48

doing this? And he's like, I cracked the

1:50

code. But then like

1:52

the snake he was, he tried to strike at me. He

1:55

said, listen, the next one is

1:57

the craziest one I've ever seen.

2:00

There's this company whose stock price is gonna explode,

2:02

but the best part is they're in the

2:04

initial investor round. So you can get in

2:06

on the ground floor if you want. How

2:08

much do you wanna invest? 10 grand? You've

2:11

slept on three of these. You're not gonna wanna

2:13

miss another, right? And I'm

2:15

like, I'm

2:17

a college kid, dude. I've got 30 bucks

2:20

in the bank. I don't have 10 grand.

2:23

And he's like, ah, crap. He

2:26

hung up the phone. But

2:28

this fascinated me. Who was this guy

2:30

that was always getting the stocks right?

2:32

What was his algorithm? So I

2:34

looked into it. I met with a stock broker and

2:37

I asked him, how is this possible? And he's like,

2:39

ah, that guy was a scammer. And I'm like, duh,

2:41

I know. But how did he get the stocks right

2:43

every time? And this guy broke it down

2:45

for me. He said, okay. That guy called up

2:47

a whole bunch of people on week one, told half of

2:49

them the stock was gonna go up, and told the other

2:51

half the stock was gonna go down. Then

2:54

he called back the people who he was

2:56

right with. And he told half of

2:58

them about another stock that would go

3:00

up. And the other half, he would say, that stock's

3:02

gonna go down. And then he did

3:04

it a third time, calling back the people he

3:06

was right two times with, telling half of them

3:08

that the stock is gonna go up and the

3:11

other half saying it's gonna go down. So by

3:13

the time he did that three times, he had

3:15

this small pool of people who

3:17

he was right with every time. But

3:19

really, he was just playing a math game

3:21

with his victims.
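To make the broker's math concrete, here's a small illustrative sketch; the starting pool of 800 people and the three rounds are made-up numbers, not figures from the episode.

```python
# Illustrative sketch of the "always right" scam funnel.
# The pool size and number of rounds are invented for illustration.

def scam_funnel(initial_pool: int, rounds: int) -> int:
    """Return how many people have only ever seen correct predictions."""
    pool = initial_pool
    for week in range(1, rounds + 1):
        # Tell half the pool the stock goes up, the other half it goes down.
        # Whichever way it moves, the scammer was "right" for half of them.
        pool //= 2
        print(f"Week {week}: {pool} people have seen only correct calls")
    return pool

if __name__ == "__main__":
    survivors = scam_funnel(initial_pool=800, rounds=3)
    print(f"{survivors} people now believe in a flawless algorithm")
```

After three rounds, 800 cold calls leave about 100 people who have watched the scammer be right every single time, and that small pool is who he asks for money.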

3:24

And I think this is such a long but

3:26

brilliant scam. Seemingly, this guy

3:28

was golden, perfect, getting it right

3:30

every time. But what I didn't know

3:32

is that he was getting his

3:35

predictions wrong all over town. And I was

3:37

just one of the unlucky few that

3:39

saw him get it right every time. These

3:46

are true stories from the dark side of

3:48

the internet. I'm

3:53

Jack Rhysider. This

3:55

is Darknet Diaries. This

4:14

episode is brought to you by

4:16

Varonis. So many security incidents are

4:18

caused by attackers finding and exploiting

4:20

excessive permissions. All it takes is

4:22

one exposed folder, bucket, or API

4:24

to cause a data breach crisis.

4:27

The average organization has tens of

4:29

millions of unique permissions and sharing

4:31

links. Even if you could

4:34

visualize your cloud data exposure, it would

4:36

take an army of admins years to

4:38

right size privileges. With how

4:40

quickly data is created and shared, it's like

4:42

painting the Golden Gate Bridge. That's

4:44

why Varonis built Least Privilege

4:46

Automation. Varonis continuously eliminates

4:48

data exposure while you sleep by

4:51

making intelligent decisions about who needs

4:53

access to data and who doesn't.

4:56

Because Varonis knows who can and

4:58

who does access data, their automation

5:00

safely remediates risky permissions and links,

5:02

making your data more secure by

5:04

the minute. Even when

5:06

you're not logged in, Varonis is

5:08

classifying more data, revoking permissions, enforcing

5:10

policies, and triggering alerts to their

5:13

IR team to review on your

5:15

behalf. To see how Varonis can

5:17

reduce risk while removing work from

5:19

your plate, head on over to

5:21

varonis.com/darknet and start your free trial

5:23

today. That's

5:26

Varonis, spelled

5:29

V-A-R-O-N-I-S, .com/darknet. This

5:36

episode is sponsored by ThreatLocker. These

5:39

days, one wrong click of a button can

5:41

lead to a catastrophic cyber attack on your

5:43

organization and nobody has time to keep training

5:45

poor Doris in accounting on what not to

5:48

click. Cyber attacks are devastating

5:50

to businesses and without the correct solution in

5:52

place, your operation remains at risk.

5:55

ThreatLocker has built an endpoint

5:57

protection platform that strengthens your

5:59

infrastructure from the ground up

6:01

with a zero trust security posture. ThreatLocker

6:04

protects business-critical data while also

6:06

mitigating cyber attacks. ThreatLocker's

6:08

allowlisting, ringfencing, storage control,

6:10

elevation control, and network control solutions,

6:12

to name a few, give you

6:14

a more secure approach to blocking

6:16

the exploits of known and unknown

6:18

threats, with application allowlisting and ringfencing

6:21

working together to deny applications by

6:23

default, lock down inbound network traffic,

6:25

and ringfence your applications. ThreatLocker

6:27

provides zero trust control at

6:29

the kernel level. If you're

6:31

looking for a proactive solution

6:33

that will keep your business

6:36

better protected in the face

6:38

of cyberattacks, visit ThreatLocker's Cyber

6:40

Heroes at www.threat

6:42

locker.com. That's

6:44

threatlocker.com.

6:50

Gather around. In this episode we're going

6:52

to hear stories from Rachel Tobac, and

6:54

she's one of the best social engineers

6:56

I've ever met. Let's start with

6:58

your origin story. As

7:01

a kid, how did

7:03

you get interested in this type

7:05

of work?

7:07

Okay, my origin story. So my

7:11

first time that I ever thought

7:13

about being any sort of hacker

7:15

was when I realized that being

7:17

a spy is a job

7:20

that people do, and it's a

7:22

job that girls could do. And

7:24

I learned this through the

7:26

movie Harriet the Spy. She goes

7:28

around sneaking into people's houses, spying,

7:30

she takes her notebook everywhere. She sneaks

7:32

through the dumbwaiter in this

7:34

rich woman's house and gets caught

7:36

and I just thought I had no

7:38

idea that a girl could be

7:40

a spy. So that basically became

7:42

my personality from my childhood. Did you

7:45

get into computers when you were older or

7:47

or was this still in middle school, I suppose? I

7:49

did not get into computers. I wanted

7:51

to get into computers. I went to

7:53

my guidance counselor in, I think, sixth

7:55

grade, and I said, hey, I

7:57

wanted to take these coding classes, and my

8:00

guidance counselor, she said, you don't

8:02

want Rachel, you don't want to take those,

8:04

those coding classes. Those coding classes are

8:06

40 boys. You'd be the

8:08

only girl there. Just take home ec

8:10

instead. Oh, you serious? I know. And

8:12

me being a child, I was like,

8:15

Oh, good call. I mean,

8:17

yeah, I don't want to like blame her for

8:19

like me never learning to code. I mean, I

8:21

could have tried later in life, right? People try

8:23

later in life all the time. But no, I've

8:25

actually never written a single line of code. I

8:27

ended up getting my degree in neuroscience and behavioral

8:30

psychology. I was a teacher's assistant

8:32

for statistics. I never got into

8:34

code. Neuroscience?

8:37

Yeah. So my path to infosec

8:39

and hacking, to the

8:41

untrained eye, it doesn't make any sense. It's almost

8:43

completely nonlinear. To me, looking back, it makes a

8:46

lot of sense. So I

8:48

got my degree in neuroscience and behavioral

8:50

psychology. I was doing

8:52

improv on the weekends. I was a teacher. I

8:55

then was like, Hey, I want to try and get into tech.

8:57

I moved to San Francisco. I

8:59

was a community manager at a tech company,

9:01

actually ended up leading a UX research team.

9:04

And then while I was at

9:06

that tech company, my husband was in

9:08

security the entire time I actually met my husband in

9:10

high school, I met him when I was 15 years

9:12

old. So my husband was

9:14

like, Hey, I heard about this

9:16

thing called Defcon, I think you would get a kick

9:18

out of it. And I was like, ah,

9:22

I pass. I think I'm not

9:25

going to do that. Okay, so

9:27

Defcon is the annual hacker conference in

9:29

Las Vegas. It's wild there. You'll see

9:31

people walking around with antennas sticking out

9:33

of their backpacks, talks about how to

9:35

bypass just about anything on a computer

9:37

and tons of villages that focus on

9:39

specific areas of hacking. The social

9:41

engineering village is one of the more popular

9:43

ones. And when Rachel's husband went into this

9:45

village and saw what they were doing, he

9:48

immediately called her up to tell her what he

9:50

was seeing. They do this thing where they put

9:52

you in a glass booth. It

9:54

is soundproof, in front of an audience of 500 people.

9:57

You call companies and read

10:00

information out of them over the phone. I

10:02

think it's the exact same skill that you use every

10:04

month to get the bill lowered when you call

10:06

these companies to build rapport, or you get

10:08

a deep discount on things like the cable bill.

10:10

You'll love it. And I was still like,

10:12

nah, I'll pass. Then he

10:14

called me back and he's like, I just

10:17

saw more of these calls. The social engineering

10:19

contest, you have to come. Like, I promise

10:21

you, if you don't like it, we'll just

10:23

go gambling instead. And so I'm

10:26

like, okay, fine, I packed my bags.

10:29

I get the first flight out Saturday

10:31

morning. As you know, if you arrive on

10:33

Saturday morning, it's like a

10:36

third over at that point, if

10:38

not more. So I show up. I see a

10:40

few calls. What she's

10:42

watching was the Social Engineering contest.

10:45

There's fourteen contestants and they're given

10:47

the task to basically get enough

10:49

information to hack into a company.

10:52

all through phone calls. They

10:54

have to prepare and figure out who would be

10:56

an easy target to get information from, and

10:58

what their cell phone number is, and you better have backup

11:01

numbers in case the person you called doesn't answer

11:03

or hangs up on you. Once you do get

11:05

someone on the phone you get points for

11:07

every bit of security data you can get out of them. So

11:09

if you can get them to tell you what

11:11

operating system they use, you get a point

11:13

or a flag. And maybe from there,

11:16

you try to figure out what browser

11:18

they use, information about their security guards,

11:20

or what janitor service they use. You can't

11:22

just ask these questions directly. It

11:25

raises suspicions. So they provide

11:27

a pretext or pretend to be

11:29

someone else. Maybe someone who

11:31

works in another department, or someone

11:33

brand new to the company who

11:35

doesn't know anything but urgently needs

11:37

to get a report done today.

11:40

It's tricky, and it's high

11:42

stakes, because if you get caught on

11:44

the phone, you're burned and now you

11:46

don't get any points. And

11:48

the best part is the audience

11:50

gets to watch all this live.

11:52

I see a few calls and I'm like,

11:54

oh, this is me. Like, I

11:56

can do this. I was born for

11:58

this. And my husband's like, right? I told

12:00

you. So

12:03

she immediately is like, okay, how do

12:05

I compete in this? And yeah, it's

12:07

a whole process. You need to submit

12:09

an application, create a video of yourself,

12:11

and stand out from the crowd because

12:13

only fourteen are chosen to compete out

12:15

of hundreds of people who try out

12:17

for it. And I was like, oh,

12:19

I've got just the thing. I made

12:21

this Twin Peaks-style video to convince

12:23

them to let me get in, and

12:25

somehow they agreed to let me

12:27

participate. Hundreds of people

12:30

apply and fourteen contestants are selected

12:32

every single year. Now, they actually give you

12:34

the target company that you have to attack ahead

12:36

of time so you can do your research on them.

12:39

And a lot of research was needed, because you want

12:41

to find as much information as you can about

12:43

this company, like doing Google searches or just

12:45

looking at public places. Maybe you get a list

12:47

of people and phone numbers and the like, so that

12:50

when it's your turn to call, you know exactly

12:52

who to call and what questions to ask. and

12:54

that becomes quite a lot of work to prepare

12:56

for that moment, for when you're gonna call someone.

12:59

You could spend a solid month learning everything you

13:01

can about your target company, so you

13:03

can shine when you're in the booth. It

13:05

was a major consulting firm, is really all I

13:07

can say. Now, these companies don't

13:09

know they're about to get hacked,

13:11

which is really extraordinary to me. This was

13:13

basically a live hack with an

13:15

unsuspecting target. So she gathers as

13:18

much intel as she can and heads to Def

13:20

Con to compete. She gets in

13:22

the glass booth. Now all eyes

13:24

and ears are on her. Not only does

13:26

she have to trick a person on the

13:28

phone into giving her information they shouldn't

13:30

be giving her, she has to do it in front

13:32

of an audience. But she's done

13:35

improv before and was absolutely ready

13:37

for this. I contact my target

13:39

company and I pretend to be an

13:41

employee who's confused and just starting out,

13:43

and I end up getting flag after

13:45

flag. And I get out of the

13:47

booth and I'm like, maybe I did

13:50

okay. And then there's like a standing

13:52

ovation. I'm like, oh, maybe I did

13:54

better than okay. And I ended up

13:56

getting second place my first time ever

13:58

hacking anybody. My first memory of hacking somebody

14:00

happened in a glass booth in front of 500 people.

14:04

Dang. Second place. Of

14:06

course, now she's hooked. That was fun

14:08

as hell. The nerves, the adrenaline, hacking

14:10

and social engineering, all of this, she

14:12

was just craving more of. So she

14:14

applies to compete again the next year.

14:17

And then I ended up getting second place the second year

14:19

and I got second place the third year as

14:21

well. Competing

14:23

three years in a row in the social

14:26

engineering contest and getting second place all three

14:28

years, that's what started

14:30

her career in social engineering. After

14:33

folks started seeing me get second place at DEF

14:35

CON, you know, they'd see me on stage, they'd

14:37

be like, Hey, I want to chat with you.

14:40

Can you come speak at my company about how you

14:42

hack and how you can test us? And

14:44

I live in Silicon Valley. So I got

14:47

really lucky that people started

14:49

asking me to do things that like are

14:52

a job, right? Like I'm like, Oh,

14:54

I guess I need to make a company. So

14:56

I made Social Proof Security in 2017. And

14:59

I mean, I live in Silicon Valley. I was

15:01

so lucky. Some of my first clients were

15:03

like Facebook, Snapchat, PayPal, Twitter. And

15:05

from there it was like US Air Force,

15:08

NATO, Uber, Google, Cisco. It's like, Oh my

15:10

gosh, you know, I feel like I just got

15:12

really lucky in this life. The

15:15

crazy thing is that I've heard this

15:18

story over and over. Someone who has

15:20

no interest in hacking goes to DEF

15:22

CON, sees the social engineering stuff going

15:24

on there, immediately wants to compete, does

15:26

pretty good in a competition, and then decides

15:28

to do that for a living and start

15:30

their own company. It's mind boggling.

15:33

How many lives have changed from

15:35

people attending DEF CON? Oh, it's

15:37

totally wild. I mean, if you would

15:39

have asked me decades ago, like, what did you think you were

15:41

going to get into? The word

15:43

hacker would have never even made the top

15:45

100 list because I didn't know it was

15:47

possible, didn't know it could be a job.

15:49

And I certainly didn't think I would be

15:52

good at it. I, when

15:54

I saw the concept of a hacker in

15:56

TV or movies, it was usually a guy

15:58

who wore a hoodie in a

16:00

basement. And I mean,

16:02

I wear hoodies and basements are fine, but I didn't think

16:04

that I was going to be good enough. So yeah,

16:08

you just have to see yourself in the position.

16:10

And I've had multiple women come up to me

16:12

and say like, Hey, I saw you in that

16:14

competition, didn't realize it was possible for people like

16:16

me. And now I do this

16:18

for a living. Okay,

16:21

so she started a company called Social Proof

16:23

Security, which is basically social engineering for hire.

16:25

And companies were starting to hire her to

16:27

see if they were vulnerable to social engineering

16:29

attacks, and what they can do to stop

16:31

them. And of course, I'm fascinated by these

16:33

social engineering stories. How do you hack into

16:36

a company with just your voice or

16:38

your words? So a bank

16:40

hired me to penetration test them, effectively,

16:42

they hired me to hack them. And

16:45

they told me that I could hack via phone

16:47

call, email or chat. And

16:50

my job was to take over multiple

16:52

accounts and steal access effectively steal the

16:54

money out of the accounts. You

16:56

want to steal money out of customers accounts?

16:59

Yes. And when we do

17:01

a penetration test, it's very particular, I

17:03

don't want to steal money from like

17:06

everyday people that would be horrible and

17:08

really scary for bank customers to just

17:10

randomly have money stolen because of a

17:12

pentest. So what we do is we

17:15

create fake bank accounts. We

17:18

work with the team on the back end,

17:20

so that the support organization for all intents

17:22

and purposes, sees a real

17:24

customer. But we've

17:26

created fake bank accounts for me

17:29

to steal. So I don't actually

17:31

harm real people. But the

17:33

support team doesn't know they're fake. Okay,

17:35

so this company is a bank. And

17:38

she's told that she can target customer

17:40

support to see if she can access

17:42

a customer's bank account. And

17:45

she's given the options to use a phone

17:47

call, email or chat to get

17:49

through. That's right. So I started

17:51

with the chat feature. And

17:54

I posed as a customer to see

17:56

if I could take over a customer account with

17:59

just chatting. So I told

18:01

the bank support people my sob story,

18:03

I lost access to my phone, my

18:05

email, my laptop. I got lost

18:08

and I had a night out and I'm traveling

18:10

abroad. I mean like the whole story, right? And

18:13

I really need access to my bank account because I'm stuck and I

18:15

don't have money. And

18:17

the first thing that I usually try when I'm

18:19

trying to do an account takeover is I try

18:21

to see if I can get them to change

18:24

the email address or the phone number on the

18:26

account. Because if I

18:28

can do that, then I can change

18:30

effectively the admin on the account. Just

18:33

by changing the email address, I can then

18:35

reset the password or reset to a phone

18:37

number that I control. There's SIM

18:39

swapping and all of that that could happen after that. But you

18:42

know, that's basically how it works. And

18:44

they're like, oh, well, we can't do

18:47

that because we need to only send the

18:49

password reset to the email address already on

18:51

your account. Good for them. That's

18:53

the protocol they're supposed to follow. That's exactly right. So

18:55

good job bank. That's horrible for me as a pen

18:58

tester. A lot of times I have to play both

19:00

sides of this game. I have

19:02

to train the company and update their protocols

19:04

to prevent people like me from getting in.

19:07

But when I'm first attacking them, it's so frustrating. So

19:10

I try chatting with multiple other support

19:12

people. I'm trying again and again. They

19:14

will not make any exceptions

19:16

for me. It doesn't matter my pretext. That's

19:19

who I'm pretending to be. It doesn't matter

19:21

how I contact them, what I say, my

19:23

story, nothing. So I decide

19:25

to switch to phone call based attacking

19:27

because I tend to be much more

19:29

successful. So

19:32

I switched to phone calls. It leaves less of a

19:34

paper trail. People tend to get

19:37

less suspicious because I can build rapport. They can hear

19:39

my voice. They can hear how trustworthy I sound. And

19:43

also when I'm calling, I

19:45

can spoof phone numbers. And

19:47

a lot of times that helps me gain access.

19:50

Spoofing phone numbers. How

19:52

is this still possible? You

19:55

can download an app from the mobile app store.

19:57

And within a few taps, you can change what

20:00

phone number you're calling from to have

20:02

any phone number you choose. So

20:05

you can make it look like where you're calling from is

20:07

not actually where you're calling from. Now,

20:10

when I was young, I used to do this

20:12

with emails. I would love to send emails to

20:14

my friends pretending to be from the FBI

20:16

or the president of the United States. And I'd

20:18

be like, Bill, you're in serious trouble. We're coming

20:21

to get you. Something

20:23

that my friend Bill would be like freaking

20:25

out and it was awesome fun. But

20:28

then the email protocol

20:30

got updated. They implemented SPF

20:32

records somewhere around 2006. And

20:35

this ensures that the place you sent the emails

20:38

from is where the emails are supposed to

20:40

come from. This effectively put an

20:42

end to email spoofing. Of

20:44

course, not all companies configure their SPF records

20:46

properly and you can still spoof it, but

20:48

at least the option is there if you

20:50

want to block someone from spoofing your email.
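For a rough picture of what that protection looks like under the hood, here's a minimal sketch of an SPF-style check; the record and IP addresses are hypothetical examples, and real SPF evaluation (includes, redirects, ip6, and so on) is more involved than this.

```python
# Minimal sketch of an SPF-style check using only the standard library.
# The record and IP addresses below are hypothetical examples.
import ipaddress

# Roughly what a published SPF TXT record looks like in DNS:
#   example.com.  IN TXT  "v=spf1 ip4:203.0.113.0/24 -all"
# "-all" tells receivers to reject mail from any IP not listed.
spf_record = "v=spf1 ip4:203.0.113.0/24 ip4:198.51.100.10 -all"

def ip_allowed_by_spf(record: str, sender_ip: str) -> bool:
    """Return True if sender_ip is covered by an ip4: mechanism in the record."""
    ip = ipaddress.ip_address(sender_ip)
    for mechanism in record.split():
        if mechanism.startswith("ip4:"):
            network = ipaddress.ip_network(mechanism[len("ip4:"):], strict=False)
            if ip in network:
                return True
    return False  # nothing matched, so "-all" applies: treat as spoofed

print(ip_allowed_by_spf(spf_record, "203.0.113.57"))  # True: listed mail server
print(ip_allowed_by_spf(spf_record, "192.0.2.99"))    # False: likely spoofed
```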

20:53

But for phones, which have been around a

20:56

lot longer than email, it's

20:58

an unpatched vulnerability, in my opinion,

21:01

you can still spoof phone numbers. Yeah,

21:03

it's kind of wild in the U.S.

21:06

Right now it's still possible because all of

21:08

the telcos have to make the same decisions

21:10

at the same time. And

21:12

unless all of the companies get together and make

21:14

the same choices, it's going to

21:16

be really hard to implement the right solution. So

21:19

at least in the U.S., spoofing is still really

21:21

possible for me. Now, since phone

21:23

companies refuse to fix this, their

21:26

solution was to help pass

21:28

a law making it illegal to spoof

21:30

phone numbers. So for now, it

21:32

just seems like telephone companies are just relying on

21:35

the police to help keep people from doing this.

21:37

But to me, this is an awful way to

21:39

secure things. Telephone companies can

21:41

fix this if they want. But

21:44

while I see this as a

21:46

vulnerability, telephone companies have historically said,

21:48

wait, why are you using telephone

21:50

numbers as identifiers? They were never

21:52

meant to be identifiers. And

21:54

they put the blame on us for doing that because

21:57

for a long time, our phones didn't have

21:59

screens. So we never knew who was calling until

22:01

you picked up the phone and said hello But

22:04

then telephone companies gave us caller ID where

22:07

our phones would show who's calling and so

22:09

I do blame telephone companies for making Us

22:11

think it is an identifier since they were

22:13

charging extra for that feature back in the

22:16

90s and mobile phones today all come with

22:18

This feature so I say phone companies turn

22:20

caller ID off if you don't want us

22:22

to use it as an identifier Otherwise

22:25

patch it so phone numbers can't be

22:27

spoofed anymore so

22:29

anyway Rachel

22:32

was trying to get into a

22:34

customer's account let's call the customer

22:36

Kelly and she figured out what

22:38

phone number Kelly had and Rachel

22:41

spoofs her number to look like

22:43

she's calling from Kelly's phone. I

22:45

spoof my phone number I make it

22:47

look like Kelly on the account and by the way on

22:50

data brokerage sites when we're doing OSINT

22:52

open source intelligence Typically we can find most

22:54

people's phone numbers within a minute or two

22:57

So when we're searching we can just know okay,

22:59

this is Kelly. This is Kelly's

23:01

phone number I'm gonna go ahead and spoof that I

23:03

set that up It usually cost me a dollar

23:05

or so on the tools that are available on the

23:07

app store These are not

23:10

like heavily regulated. You can just find them

23:12

on the app store and I

23:14

go ahead and I place that call Can

23:17

you give me an example of how you sound on these

23:19

calls? You're gonna

23:21

make me act? Yeah, okay.

23:24

Okay. Give me one second. I gotta get into character.

23:26

I mean, yeah I'm gonna change my clothes. I can get the

23:28

character. Here we go Okay,

23:30

here we go Ring ring ring.

23:32

Oh, wait, we both said ring. Okay Thank

23:34

you for calling the bank. How can I help you today? Hi,

23:38

I am so sorry. My name is

23:40

Kelly Smith So I'm

23:42

traveling right now and I just lost

23:44

my laptop. My phone's not working I

23:46

cannot get access to any of my

23:48

funds I'm super stressed out.

23:51

Can you please please help me? Hmm

23:56

That was good. I won't make you do more. Thank you

24:01

So yeah, I mean, they've got a script that

24:03

they go through where they're just like, okay, well,

24:05

you know, do you have the last four digits

24:07

of your phone number or whatever the case is

24:09

to verify you? Or is that how you're challenged

24:11

or what happened? No. So

24:14

this bank knew that KBA,

24:16

knowledge-based authentication, things like what's

24:18

your address? What's

24:21

the last four digits of your phone number? This

24:23

bank knows that that information is very

24:25

easily found online. So

24:27

they don't use KBA, knowledge-based authentication, to

24:29

verify your identity. They usually use MFA,

24:31

multi-factor authentication. Now this is great. This

24:34

is exactly what I recommend, you know,

24:36

send a code to the email address

24:38

on file and make them read it

24:40

out to you rather than going

24:43

through this process of verifying identity

24:45

with information that can be found by an

24:47

attacker in five minutes online. So

24:50

that's good. But as an

24:52

attacker, that's going to be a challenge because I don't

24:54

have access to that email address. And

24:57

when I'm spoofing a phone number, I

24:59

actually can't receive text messages. And

25:01

if they call back, I'm not going to be the

25:03

one that answers that phone call. I'm just spoofing. It

25:05

looks like I'm calling, but I don't actually have access.

25:07

Now, of course, I could SIM swap and many criminals

25:09

will do that. But for the purposes of

25:12

this pen test, that's not what I'm testing. So

25:16

they say, okay, we

25:18

have an edge case here. Let

25:20

me see if I can talk to

25:22

my manager and have you send in

25:24

a picture of your driver's license, your social

25:27

security card and a utility bill. And

25:30

instantly I'm like, okay, bingo. We're

25:32

in. The other half

25:34

of Social Proof Security is my husband, Evan. He does all

25:36

the technical stuff. I do all the human hacking stuff.

25:38

This is great because I need a fake driver's license.

25:40

I can't wait to hear how you got a fake

25:42

driver's license. No. Okay. So

25:45

my husband, Evan, he gets to

25:47

work editing a driver's license, a social

25:49

security card, and the utility bill to

25:51

the exact information that they're expecting for

25:53

this account, which again, we can

25:56

find through a data brokerage site. So

25:59

we're hoping. that this

26:01

company does not know the

26:04

actual driver's license number, the actual

26:06

social security number, and they're

26:08

just looking to ensure that the name and

26:10

address that are on the account match

26:13

those documents. You

26:15

can find those pieces of information through OSINT, and

26:17

a lot of times I've noticed that when they

26:19

ask for these types of documents, they don't know

26:21

the right info, they're just hoping that it matches,

26:23

and they stop there. So

26:26

you didn't need a real driver's license, social security

26:28

card. You just needed a JPEG, right? Correct.

26:31

And that's the trick there. Photoshop

26:34

was your friend. Photoshop,

26:36

yes. We spend all night

26:38

on these driver's license, social security cards,

26:40

and utility bills of the accounts we're trying

26:42

to hack. I

26:44

email the bank at 8

26:46

a.m. the next day. I

26:48

tell them my story. I tell them the edge

26:51

case that we have set up with support. I

26:53

send them the driver's license and social

26:55

security card and utility bill. By

26:58

9 a.m., I have full admin

27:00

access to the bank account. I

27:03

have changed it to be controlled

27:05

by my attacker-controlled email address, and I can

27:07

steal all of the money in the account. So

27:11

once I finally get in, I have

27:13

access to everything. I use the same method again

27:15

and again. I get access to two more accounts

27:17

throughout the day. I end up spreading out the

27:20

request so that we're not raising suspicion with the

27:22

same attack method over and over

27:24

again, back to back. In the end,

27:26

we took over each bank account that we

27:28

were asked to hack within

27:30

two days. Hmm. It's

27:33

truly astonishing the sheer force of

27:35

the human voice, its

27:38

ability to persuade, to move,

27:40

to manipulate all through a

27:42

simple phone call. It also

27:44

reminds me of how vulnerable customer support is

27:46

to this kind of exploitation.

27:49

When you're met with a soft voice telling

27:52

you a sad story, but wrapped

27:54

in kindness, it tugs at your

27:56

heartstrings. You find yourself

27:58

eager to assist. especially if you

28:00

just got off the phone with a real prick who

28:02

was yelling at you about overcharging him 10 cents. Contrast

28:06

that with a kind voice that's

28:08

truly asking for help. And

28:11

it really makes it hard to say no. I

28:13

know that in a lot of these organizations there are

28:15

edge cases. So I'm helping companies

28:19

say, okay, we

28:21

did this pen test. We figured out what the edge

28:23

case is. We figured out how we

28:25

got access. How do we make sure

28:27

we don't fall into this trap next time when the real

28:29

criminals get here? So I then

28:31

helped them with, okay, let's set up some

28:33

edge cases back to back so that we

28:36

have something like a

28:38

callback that would thwart spoofing. If

28:41

you don't want to use that, you can use

28:43

email verification one-time passwords, you know, sending a code

28:45

or just a word to the email on file

28:47

and having them read that out. SMS

28:50

verification. Okay, they claim

28:52

they're calling you from this phone number, but

28:55

maybe they're just spoofing it. See if they

28:57

can read out a text message. Callbacks thwart

28:59

spoofing. Service codes, PINs,

29:01

or verbal passwords. If

29:03

it's some sort of internal support

29:06

ticket, you can loop in a manager.
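As one concrete example of those options, here's a minimal sketch of an email one-time-passcode check a support agent could trigger and then have the caller read back; the send_email helper, the code length, and the five-minute expiry are assumptions for illustration, not details from the episode.

```python
# Sketch of an email one-time-passcode check for phone support.
# send_email() is a hypothetical stand-in for a real mail integration.
import secrets
import time

CODE_TTL_SECONDS = 300  # assumed 5-minute validity window

def send_email(address: str, body: str) -> None:
    # Placeholder: route this through your actual mail provider.
    print(f"[email to {address}] {body}")

def start_verification(email_on_file: str) -> tuple[str, float]:
    """Send a one-time code to the email already on the account."""
    code = f"{secrets.randbelow(10**6):06d}"
    send_email(email_on_file, f"Your support verification code is {code}")
    return code, time.time() + CODE_TTL_SECONDS

def verify(expected: str, expires_at: float, spoken: str) -> bool:
    """The agent types in whatever the caller reads back over the phone."""
    return time.time() < expires_at and secrets.compare_digest(expected, spoken.strip())

# Example flow: only someone who controls the inbox can read the code back.
code, expires = start_verification("customer@example.com")
print(verify(code, expires, code))      # True
print(verify(code, expires, "000000"))  # almost certainly False
```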

29:08

There's so many ways to do this right that

29:10

a huge part of the pen test is

29:12

not just hacking the company, but helping the

29:15

company figure out what is a real practical

29:17

way that we can solve these edge cases

29:19

in the future to verify

29:21

identity the right way and make it harder for you

29:23

to get in the next time. Because I'll go in,

29:25

I'll make it harder for me to get in as

29:27

an attacker. And then the next year, I'm like, Oh,

29:29

my, this is so hard for me to get in

29:32

until the point where I can't get in anymore. And

29:34

that's when I'm like, Okay, you've done the most that

29:36

you can do. It's time for a

29:38

sponsor break. But stay with us because Rachel has a few more

29:40

stories that she's going to share with us. This

29:45

episode is sponsored by Axonius.

29:47

Complexity is increasing in IT

29:50

and cybersecurity. Control it with

29:52

Axonius, the system of record

29:54

for all digital infrastructure. The

29:56

Exonius platform correlates asset data from

29:59

existing tools to provide an

30:01

always up-to-date inventory, uncover security

30:03

gaps, and automate response actions.

30:05

Go to axonius.com/darknet to learn

30:07

more and get a demo.

30:09

That's axonius spelled A-X-O-N-I-U-S.

30:15

axonius.com/darknet.

30:21

On another engagement, Rachel was hired by

30:23

a company to help them sort out

30:25

an issue that they kept encountering. It

30:27

was a large technology company who would

30:29

sometimes buy or acquire smaller companies. Now,

30:32

when you're buying another company, you typically want

30:34

to keep it quiet until the official announcement.

30:36

It could affect share price or cause panic

30:39

in the company if things aren't communicated properly.

30:41

But for some reason, when this technology company

30:44

would do any merger or acquisition, it

30:46

would get scooped by some news agencies.

30:48

The announcement would show up on news

30:50

sites way before the company was ready

30:52

to tell the world. So this

30:54

company was like, Rachel,

30:57

maybe you can help us figure out how this

30:59

news keeps slipping out ahead of schedule. And

31:01

so they approached me about doing a

31:03

pen test to figure out how this

31:06

M&A info was getting leaked, where

31:08

they could possibly improve their training, their

31:10

messaging, their internal protocols to figure out

31:13

why is this happening? Why are folks

31:15

being incentivized to talk about this and what can we

31:17

do about it? When you hear this,

31:19

where does your mind first go? Like,

31:21

you've got an insider threat somewhere. You've

31:24

got a breach, active breach. Yeah. So

31:26

insider threats happen. But

31:28

what is usually most

31:30

common is people

31:33

just make a mistake. I

31:35

kind of live in this world where I

31:37

assume that people are making mistakes and I

31:39

try and help them. So we came out

31:41

with a few different attack methods that might

31:43

work to uncover where this is happening. Number

31:46

one, I was going to attempt

31:48

to pose as a journalist and reach

31:51

out to various team members, asking

31:53

them via social media DMs,

31:56

email, text message, etc. about their

31:58

experience in tech and see if

32:00

I could siphon out M&A info and just see

32:02

where it goes. And number

32:04

two, I was going to

32:06

apply to their product manager role,

32:09

go through the entire hiring process

32:11

and see if I could extract

32:13

M&A related info during the question

32:16

portion of the hiring interview. I

32:18

did not know what was going to work and what wasn't,

32:21

but I just wanted to try both. All

32:26

right, so if you can pose as either

32:28

one of these people, it sounds like you're

32:30

going to need a LinkedIn account or at

32:32

least some online presence. You can't just show

32:34

up as a nobody, right? Or at least

32:36

it helps establish your background and your pretext.

32:38

Definitely. Do you have any framework

32:42

or methodology for how you establish an

32:45

existing ghost in the world? Yeah.

32:48

So we call these ghosts,

32:50

we call them sock accounts. Sometimes

32:53

they'll be real people. And

32:55

so we'll fashion them pretending to be a

32:57

real person. Sometimes they'll be fake people. And

33:00

they'll just have this full life online. With

33:03

the fake journalist, I figured it was

33:05

going to be a lot easier to

33:07

pretend to be a real journalist and

33:09

just not actually be them, then create

33:11

an entire persona of a fake journalist

33:13

and populate real content. So

33:16

I built a fake journalist pretext,

33:18

email background and social media based

33:20

on a real journalist who I'm not

33:22

going to name. Interesting.

33:25

Rachel tried to be another

33:27

journalist that actually exists, maybe

33:30

by doing something like using a similar email address

33:32

or social media accounts. But the question

33:35

is, how do you know

33:37

who to ask in a company to

33:40

get information about upcoming mergers and acquisitions?

33:43

These are typically closely guarded secrets. But

33:47

there is a website that's

33:49

extremely helpful to social engineers.

33:53

There's a website that lists pretty much every

33:55

company and most of the employees that

33:57

work there, and it tells you their job title,

34:00

role, what duties they have, and

34:02

full name. The

34:04

website is LinkedIn. And

34:06

personally, I feel like LinkedIn is a

34:09

security risk to most companies on there.

34:11

It makes it really easy for someone like

34:14

Rachel to go down the list of people

34:16

who work at a company and pinpoint the

34:18

exact person to target. Once

34:20

you have their name, it's probably

34:22

easy to get their email address.

34:25

It's usually first.lastname at companyname.com.
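As a small illustration of how predictable that convention makes things, here's a sketch that generates the usual corporate email-format guesses from a name; the employee name and domain are invented examples.

```python
# Sketch: generate common corporate email-format guesses from a full name.
# The employee name and domain below are hypothetical examples.

def email_guesses(full_name: str, domain: str) -> list[str]:
    first, last = full_name.lower().split()
    patterns = [
        f"{first}.{last}",    # first.lastname (the format mentioned above)
        f"{first}{last}",     # firstlastname
        f"{first[0]}{last}",  # flastname
        f"{last}.{first}",    # lastname.first
    ]
    return [f"{p}@{domain}" for p in patterns]

for guess in email_guesses("Jane Doe", "companyname.com"):
    print(guess)
```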

34:28

I mean, not only is there a list of people

34:30

who work at most companies on LinkedIn, but

34:32

they like to list their skills too. And

34:35

if someone says they've worked for a company

34:37

for 10 years as a database admin, and

34:40

specifically they say they're excellent at

34:42

Microsoft SQL Server, now

34:44

you can guess with high confidence

34:47

this company runs Microsoft SQL Server

34:49

internally, and this person probably

34:51

has the admin password for it. And

34:54

we all know how susceptible people are to phishing

34:56

emails. I mean, my

34:58

opinion is if you list stuff like that, you're

35:00

just putting like a big old beacon over your head

35:02

saying, hey, I'm the person you're going to want to

35:04

hack. If you want to get in the database of

35:06

this company, come at me. Essentially,

35:10

the private information that should just

35:12

be kept inside the company is

35:15

posted publicly for anyone to see on LinkedIn.

35:18

And I mean, here's a story where the company

35:20

is wondering, hey, how come the public knows about

35:22

one of our internal memos? I

35:25

say start by auditing what your employees are

35:27

posting to LinkedIn. If

35:30

the company is totally cool with all this internal

35:32

stuff getting posted publicly, then

35:34

maybe that's perpetuating a

35:36

culture that's okay to blab

35:38

about exciting news to whoever asks. I

35:43

had someone message me on LinkedIn the other day asking me, hey, how

35:45

can I get my data taken off the internet? Like,

35:49

is this your photo? Is this where you work?

35:51

Is this where you went to school? Is

35:53

that your actual name? And you

35:56

posted all this to LinkedIn and you're wondering,

35:58

how come the internet knows all

36:00

this stuff about you? Because

36:03

the thing is, a lot of what data

36:05

brokers know about us is from the stuff

36:07

we post publicly. Data brokers are scouring our

36:09

social media profiles, our blog posts, and any

36:11

mentions of us on the internet. And

36:13

then data brokers store all that information

36:15

about you that you posted. It's frightening.

36:18

And I mean, the reality of the

36:20

situation is that anybody can do

36:22

a full background search in less

36:24

than five minutes on most people in the US.

36:26

And people don't realize that this information is out

36:29

there about them. They have no idea that it's

36:31

being sold. They just don't Google themselves. I

36:33

say we should take our own

36:36

privacy seriously. Because the more we

36:38

don't care about our privacy, the

36:40

more companies won't care about your privacy.

36:44

Anyway, as you can imagine, Rachel had

36:46

this target company and was able to

36:48

quickly guess at who might know about

36:50

upcoming mergers and acquisitions and started hyper-targeting

36:53

them, doing full background

36:55

searches on them, gathering up their details,

36:57

and just started reaching out, acting

36:59

like a journalist, emailing them, wanting to see

37:01

if she can easily get this information from

37:04

people. Exactly. Or we can reach out

37:06

over social media and DM them. DM

37:08

on LinkedIn or Twitter or Instagram. And

37:10

I mean, that's the thing. Journalists really

37:12

do reach out using all of those

37:14

methods. So it's hard to know what's

37:16

real and what's fake sometimes. But

37:19

it didn't work. No matter who

37:21

she reached out to or how convincing

37:23

her backstory was, people weren't freely giving

37:25

her information about upcoming mergers and acquisitions.

37:28

This method wasn't working. They

37:30

let me know some minor details about excitement

37:33

about potential M&A, but they're not going to

37:35

confirm any juicy details. And I

37:37

try to get people on the phone

37:39

to talk with me, but I think

37:41

there's just this inherent distrust of this

37:43

particular pretext. So I'm like, OK, I got

37:46

to really go for the big guns here. I

37:48

want to attack via the hiring process. Attack

37:52

via the hiring process? What an

37:54

interesting sentence to say. I

37:56

don't think that idea crosses many people's minds. That

37:59

people applying for jobs might

38:02

have malicious intent. I've

38:04

heard of the evil maid attack, but

38:06

what's this called? The phantom

38:08

applicant attack? There's

38:11

a lot of information that you can get

38:13

just from reading a job posting. Like when

38:15

a company lists the job duties, it might

38:17

tip their hand into what endeavors the company

38:19

is going to do next or expose what

38:21

technology they have in the company. And

38:24

these things can be used against the company

38:26

in social engineering attacks. I think

38:28

if you read enough job listings, you could

38:30

probably develop a map of the data center.

38:33

Hacking into the company through the employment

38:35

process is actually a decent

38:37

attack vector. I don't think many companies

38:39

would expect you to come in through that door. Anyway,

38:42

what Rachel was going to do was

38:44

pose as a job candidate and try

38:46

to get an interview. And in the

38:48

interview, she was going to see if

38:51

she could get some insider information about

38:53

upcoming mergers and acquisitions. Something

38:55

to understand is as an attacker,

38:57

this is not easy to do. I've

38:59

never been a PM. So to apply

39:02

for a PM role takes a lot

39:04

of background research. I

39:07

mean, I led a UX research team at a

39:09

tech company. So I do have a sense of

39:11

what a PM, a product manager does, but I

39:13

am in no way prepared for a PM interview.

39:15

So I have to study for three full weeks

39:17

for this role. I'm watching

39:19

YouTube videos. I'm doing interview prep quizzes

39:21

online. I'm taking free online courses like,

39:24

so you want to be a PM,

39:26

like the whole nine yards. So

39:28

I'm building a full persona, a

39:31

resume, a Twitter, LinkedIn, Facebook, all

39:33

of these sock accounts have photos,

39:35

thousands of friends, reviews of my

39:37

work from networking groups on LinkedIn.

39:40

People I've never met that like

39:42

you give them a review and they give you a

39:44

review. All of this stuff is so gameable.

39:48

I suppose this is stuff you do at

39:50

social proof. Like there's

39:52

just like one person sitting over there like,

39:54

just keep making sock accounts all day. I'm

39:56

going to burn through them so fast. Unfortunately.

39:59

Yes, we do change the names

40:02

of many sock accounts, but then you have to populate a

40:04

lot of new information. It

40:06

ultimately takes me about three weeks

40:09

to build a believable social media

40:11

account and enough examples of previous

40:13

PM work to get anywhere near convincing

40:16

during the interview process. Did you

40:18

get help at getting your resume to the

40:20

top of the pile? No, I just had

40:22

to apply. That's not easy, you know.

40:24

There's a lot of people who don't get callbacks. Well,

40:28

during this period of time, the

40:31

tech hiring process wasn't as bad as it is

40:33

in this current year. So,

40:36

I apply for the role. I

40:38

get a phone screen. I am sweating

40:41

bullets. Because if

40:43

I don't get through this phone screen, I will

40:45

not move on to a full interview process. I'm going to have

40:47

to do a bunch of work to change my sock accounts on

40:50

social media to match a new persona. It's going to be a

40:52

lot more work for me. Luckily,

40:54

it took like 45 minutes. I

40:58

passed. I get moved on to

41:00

the next round. The

41:02

next round has six different

41:04

interviewers. So, it's just... No,

41:06

this is in person? No, virtual, thank

41:08

goodness. Did

41:10

you try to gain any information on the first round?

41:14

No, not during the phone screen. I wanted

41:16

to get... Because you were like, I'm going to wait for the right time.

41:19

Yeah, I was terrified that they were going to be like,

41:21

this person's a weirdo, like, I just not moved them forward.

41:24

So, I waited until the

41:26

actual official interviews.

41:30

And it's going to be a packed day of

41:32

interviews. I have six interviews back to back all

41:34

day. These interviews are conducted

41:37

over Zoom. I get all dressed

41:39

up in my interview clothes that I

41:41

haven't worn in years. I'm prepped with

41:43

all my anecdotes, my strengths and weaknesses,

41:45

my KPIs and success stories. And

41:48

a lot of these examples I'm using are heavily

41:50

focused on UX research because if you remember, that's

41:52

something I used to do. And

41:54

many PMs do have advanced UX research skills.

41:56

So, I'm just like hoping that they don't

41:58

think that's weird. So

42:01

I get to the first interviewer and the

42:03

interviewer is like, okay, asking me all these questions.

42:05

I seem a little nervous, but they're like, oh,

42:07

you know, don't worry about it. It's going to

42:09

be fine. We go through all

42:12

the PM related questions. This person's nervous

42:14

for all the wrong reasons. That's what's funny

42:16

here. I know, but I

42:18

have to pass. Yeah. So I'm going through this. I

42:20

mean, they're used to people being nervous, and so

42:22

I could see them saying, oh, it's fine. You're

42:25

doing great. Don't worry about it. And

42:27

you go, thanks. Because I'm trying to hack you.

42:29

And the only thing is when you're hacking people, a lot

42:31

of times it makes sense for your pretext to match how

42:33

you're actually going to feel when you're hacking. And

42:37

a lot of times you are nervous

42:39

when you're calling support because you can't

42:41

gain access to your bank account. You

42:43

are uncomfortable during an interview. These are

42:45

normal human emotions. And so it's okay

42:47

to not be way too overconfident. Sometimes

42:49

that can even read as strange. So

42:52

yeah, I mean, I'm sweating bullets. It's

42:55

clear. I'm, I'm nervous. We

42:58

finally get to the end and the

43:01

interviewer says, so do you have any questions for

43:03

me about the role? I

43:05

have never been sweatier in my life. This is it.

43:08

And if they get suspicious during this

43:10

moment, all of my work is for nothing. So I

43:13

say, I

43:15

am so excited about this company. I hear

43:17

there's a lot of opportunity for growth. I

43:19

did a bunch of research. I did find

43:21

a few news stories that mentioned XYZ potential

43:23

merger. I know you can't

43:26

confirm anything, but I just want to

43:28

understand what an integration process

43:30

looks like at your company during

43:32

an M&A. I know

43:34

you can't confirm anything again, but I just want

43:36

to understand how my role could potentially change over

43:39

time. The interviewer

43:41

takes a beat and

43:43

says, you're right. I can't confirm

43:45

anything. And

43:47

my heart sinks. I'm like, no,

43:49

this person's trained. And

43:51

then they go, but just

43:53

because I can't confirm doesn't mean I

43:55

can't talk in generalities. And

43:58

winks. Actually winks. And I'm like,

44:00

oh, this is going to be so good. So

44:03

there's a lot of hand waving and you

44:05

know, I can't confirm, but throughout the rest

44:07

of these interviews. It

44:09

seems that everyone at this company knows you're

44:11

not allowed to say information in plain language

44:13

about M&A's, but that doesn't mean that I

44:16

can't glean pretty serious details about

44:18

the upcoming acquisition plans that have

44:20

been clearly discussed internally. By

44:23

the end of this day, I

44:25

got M&A info out of

44:28

three of the six interviewers. So

44:30

50%. What kind of info

44:32

are we talking about here? Yeah. So they

44:35

wouldn't tell me the names of

44:37

the companies that were potentially going

44:39

to be acquired. But I

44:41

would say things like, I saw a

44:43

rumor about XYZ

44:45

company. Is this the type of company

44:48

that you would be excited about? And

44:51

then the wink, wink hand waving process

44:53

starts of, you know, I can't confirm

44:55

it, but we

44:57

are interested in integrating things

44:59

like XYZ. So

45:01

I was able to glean information such that

45:03

when I reported it back to the team,

45:05

they were like, I mean,

45:08

yeah, you,

45:10

you, you got the right information. And

45:12

nobody said anything in plain language, but

45:14

you can get people to say things

45:17

kind of beating around the bush. So in

45:19

the end, I got M&A info out of 50% of

45:22

the interviewers, three out of six. I

45:24

debrief with the security team. I

45:26

asked them when they want to discuss the

45:29

results with the organization. They say, well, let's

45:31

just wait until we finish the hiring

45:33

process so that it's not a distraction to

45:35

them. And in the

45:38

meantime, the next day, I actually get an email

45:40

at the address I used to apply for this role, saying that

45:43

I was being moved to the next stage of the interview

45:45

process to get an offer. So

45:48

not only did I siphon out the info I

45:50

needed during the interview pen test, I

45:52

also got the job, I guess. Well, congratulations.

45:54

I hope that goes on your resume. Now

45:57

I should put PM on there. So she meets

45:59

with the security team and explains to them how she

46:01

found out about all this upcoming mergers and acquisitions.

46:04

And together they had a chat about whether

46:06

this was just an obscure edge case or

46:08

a bigger problem. They realized that

46:10

when they explained to people that

46:13

they were not allowed to say

46:15

the words of the

46:17

acquisition, they needed to

46:19

be clearer in their communication. That

46:23

no, just because you're not

46:25

saying we are acquiring XYZ

46:27

company, it doesn't

46:29

mean that friends, family, on social

46:32

media, people can't glean information to

46:34

understand, oh, they're interested

46:36

in AI. This person is talking about how their

46:38

role is going to change. They're

46:40

talking about how much they love this

46:42

specific technology and they're tagging certain companies

46:44

on Instagram or Twitter. They

46:47

realized that they needed to be much

46:50

more specific in their protocols and language,

46:52

saying, when we're planning an acquisition,

46:55

once we talk about it internally, please

46:58

do not talk about it even in a

47:00

hand waving fashion with friends or family or

47:02

on social media. Don't talk about how

47:04

your role is going to change on LinkedIn. Don't

47:06

talk about what you're excited about upcoming on Instagram. They

47:08

had to be really clear about that. And

47:11

once they did that, those leaks stopped because

47:13

it wasn't an insider threat. It was just

47:15

people not 100% getting

47:17

what an attacker is interested in and how

47:19

47:21

Oh, so it did solve the problem?

It did. Yeah, it doesn't necessarily mean it was coming through the interview process. Maybe it was, but it's probably just that people in general at the company didn't understand that even when you're talking in generalities, that can be used by attackers too.

Yeah. And then you probably had recordings to show concrete proof of: when you say this, I'm hearing this.

Exactly. And they were like, oh, I get it. So I can't say that we're talking about this technology and how it's going to change my role as a product manager, because that tips people off that we're going to be acquiring XYZ company in the next six months. That's where these leaks are coming from.
48:04

So, you know, you came on my radar because you sometimes just create these crazy viral instances online, where I've seen you hack a... Who's Donie from?

Donie O'Sullivan, from CNN.

From CNN. So I've seen you hack a CNN correspondent. I've seen you hack voting machines before. I've seen you do all kinds of crazy things where suddenly you've got like a million views on this thing. And I'm just like, well, there she is again. Rachel's out there doing things. But one thing that was interesting was when you went on 60 Minutes.
48:42

Yeah, about a year ago I started talking more on Twitter about how I'm seeing AI get used by criminals to trick people. I'm talking about how scammers are tricking grandparents out of 1,500 bucks posing as their grandson: spoofing the grandson's phone number, voice cloning or just modulating the pitch to sound like the grandson, and saying they need money for bail. Just from talking about these examples, 60 Minutes sees this. They email me, they reach out, they say, hey, we want you to do a hack live. It's actually got to trick somebody. Can you do that with us?

And I'm like, I mean, yeah, I can do that, but it's complicated. I've done a lot of these live hacks over the years for large media pieces. I need consent. Before I do any sort of hacking, I get consent. Like when I hacked CNN's Donie O'Sullivan, I hacked him through his service providers, and I also hacked him through his leaked passwords. And I had his consent, with a lengthy contracting process and scope discussion, before I was able to contact his service providers pretending to be him, and before I was able to log into his LinkedIn using his breached passwords and the things that I found online. So I started explaining to them how much consent I'm going to need. And they're like, I mean, we'll just try and see what happens.
mean, we'll just try and see what happens.

50:02

So I start to

50:04

talk to them about who my target is going to be.

50:07

They want my target to be

50:09

Sharon Alfonsi. She's an

50:12

awesome correspondent for 60 Minutes. Rachel

50:14

Toback is what's called an ethical hacker.

50:17

She studies how these criminals operate. So

50:19

ethical hackers, we step in and show

50:21

you how it works. So

50:24

the mission was to use AI to

50:27

somehow trick and scam the host of

50:29

60 Minutes while on the show. But

50:32

the problem is the host needs to consent

50:34

to being targeted, which if she

50:36

knows she's going to be scammed while on

50:38

her show, it'll really put her guard up,

50:40

right? So this was going to

50:43

be tricky. How do you trick someone who's asking

50:45

you to trick them? She's got a lot

50:47

of information about her online, so I do my own

50:49

thing. Since the host of

50:51

60 Minutes has been on TV for years, Rachel

50:53

realized there is a lot of audio of

50:56

Sharon talking. This might be

50:58

useful. Maybe she can somehow

51:00

use Sharon's voice to do something. I

51:02

determined through OSINT, open source intelligence,

51:04

that the best way to do this

51:07

hack was to

51:09

trick her coworker while

51:11

pretending to be Sharon. Because

51:14

sometimes our coworkers have just as

51:16

much info and access on us

51:18

as we do about ourselves. So

51:21

I needed to get consent from

51:23

51:25

And here's the massive challenge. I needed to get her coworker's consent, because she was a major part of the hack. This coworker is named Elizabeth. I contacted her like, hey, this is what we're going to do. We're going to do this hack. You need to consent to essentially being part of the hack, but you don't know when, where, or how it's going to happen. You don't know who I'm going to pretend to be. You're not going to know the method of the attack, whether it's going to be a phone call, email, text, or contacting your service providers pretending to be you. Elizabeth is awesome. She's like, that's fine. That's completely fine. I'm really excited. And I'm like, okay, let's do this.
52:06

So I decided that I wanted to do a phone call, because I wanted to clone Sharyn's voice, spoof Sharyn's phone number to Elizabeth, and trick her during a phone call into revealing some sort of personal information to me. Now, Sharyn is a famous reporter, so her voice is everywhere. I grabbed about five minutes' worth of samples of her voice just from 60 Minutes YouTube videos. I put her voice into my voice cloning tool and start tweaking the tool. There are voice clone settings for things like clarity, voice stability, and style exaggeration, and I finally get the settings tweaked to a point where I feel like it's going to be credible during a phone call, but it's not a hundred percent perfect. And I do my open source intelligence to find the right phone numbers to spoof and the right phone numbers to call. Like I said, data brokerage sites have personal contact details for almost everyone. And I need to find the right details to use during the hack, like upcoming travel, the right information to try and siphon out for the demo. You can find most of that stuff through social media, when people talk about their lives.
only issue No, I'm

53:19

gonna have to somehow get Elizabeth

53:21

to participate in this hack without

53:23

her realizing It's the hack itself

53:25

going down. How am I gonna do

53:27

that? She's already consented to this and she

53:29

knows it's coming So I

53:32

figure the only way that this is gonna work is

53:35

if it feels natural within the filming day Otherwise,

53:37

how is the film team going to

53:39

catch the hack live so that anyone

53:41

in the audience can watch it? So

53:43

when you so they've got

53:46

the cameras on you, they've got you

53:48

in the studio. They've got Sharon there

53:51

53:56

So I get my hair and makeup done for 60 Minutes, right? I'm doing my vocal warm-ups, I'm like, la, la, la, la, I'm getting ready for recording. Sharyn's still in her room, prepping for the day. The light and sound crew are getting the gear ready for the shoot. Elizabeth shows up, she's getting ready. I pull aside the head of the sound and lighting crew, and I let them know that I think the only way they're going to be able to catch this hack on camera live is this: I'm going to go out into the hallway with my computer and phone. You, the camera crew, cannot follow me, because it'll be way too obvious to Elizabeth if you follow me, and it really shouldn't matter anyway, because it will be me on the other end of the phone call, so you should catch the interaction from her end anyway. So you, the crew, must ask Elizabeth to stand in for Sharyn so you can get lighting, sound, everything prepped, so that when Sharyn finally comes down, she can just slide into the shot and we can get started. That way, when I start the hack, you'll be able to actually see and hear Elizabeth and be able to catch the attack in motion.
54:58

Wow.

The crew is like, what did we sign up for? This is ridiculous.

I am so nauseated by this plan. Like, I'm so freaked out by this, because if this doesn't work, and Elizabeth is like, sure, crew, I'll stand in for Sharyn, and immediately realizes what is happening, then this entire shoot, all 15 members of the 60 Minutes team, the lighting crew, sound, hair, and makeup, everybody's here for nothing, and I will just have to basically demonstrate what I would have done. That's not going to be fun for anyone. And mind you, it's like 7 a.m., so this feels like the crack of dawn for all of us. People have not even had a cup of coffee yet.
55:41

So people are like, okay, Rachel, sure, we'll do that. So I walk over to my hacker laptop. I announce to the room that I need to go help my team with something back home, so before we get started for the day, everyone get your coffee, whatever, get set up. They say, no worries. I step out into the hallway. I can't hear anything that's happening in the ballroom now, where we're filming. I just have to hope that the sound and lighting crew have successfully gotten Elizabeth into the, quote, "stand-in" position with the sound and lighting on, because otherwise they're not going to catch this, and I'm not going to fake it for them later. It needs to be real. So I'm just praying this works.
56:16

So I open up my voice cloning tool in the hallway. I type my opening line into the voice cloning tool. And to be clear about this voice cloning tool: I cloned Sharyn's voice, so I can then type in any words and it will spit out those words spoken in Sharyn's voice. So I type in my opening line. My opening line is: "Elizabeth, sorry, need my passport number because Ukraine trip is on. Can you read that out to me?" It has to be short and sweet, direct and to the point, without requiring a lot of follow-up, because the issue with these tools is there's a delay between me typing into the voice cloning tool and when it spits out the words in Sharyn's voice. I'm also holding my phone up to the computer, so there's kind of a strange audio vibe going on with this phone call, and I just want to minimize it and make it happen as fast as possible.
57:09

So I open up my spoofing tool on my phone. I type in Sharyn's number to spoof. I type in Elizabeth's phone number to call. I hit go. It is 100% silent. I hear Elizabeth's phone. It's audibly ringing inside the ballroom, and I'm just hoping she goes over and picks it up, right? My stomach's in knots. I am sweating profusely. And then I hear her go, "Hello, Sharyn." I hear it through my phone and I can also hear it in the ballroom, and I'm like, oh my gosh, you can hear out here. So I hit my voice cloning play button. It starts playing Sharyn's voice asking for the passport number: "Elizabeth, sorry, need my passport number because the Ukraine trip is on. Can you read that out to me?" And then, silence.
57:55

For what feels like hours, I am sick to my stomach. My hands are shaking. This forever silence I was experiencing was Elizabeth holding her phone in her hand, looking at the caller ID during the call to ensure it really does say Sharyn, because the voice sounds weird. I mean, I'm voice cloning plus spoofing, so it looks like it's calling from Sharyn, but it sounds kind of far away because I'm holding my phone up to the computer. Elizabeth finally responds: "Oh, yes, yes, yes, I do have it. Okay, ready?" And then she reads out the passport number I just asked her for. I'm like, let's just get off this call as soon as possible. So I say thank you. She starts asking me questions, you know, when am I going to be down for the shoot? Do I need anything else? And I have to deal with this delay, typing in my replies back and forth. I'm just thrilled that by this point I'd siphoned out the information, and I just wanted to get off this call as fast as I possibly could. So I said, I'm just coming down, and I ended the call.
am in the call. I

58:57

walk into the ballroom. Elizabeth is

58:59

sitting under the lights with the mic pinned

59:01

on her. And I'm like, it worked. All

59:04

I have to do now is do the interview

59:07

with Sharon and explain the mechanics of the hack

59:09

to her live and make sure

59:11

that Elizabeth knows that anyone would fall for

59:13

this style of hack because most people don't

59:15

realize it's possible yet. I wanted to make

59:17

sure she didn't feel like horrible about it.

59:20

And yeah, she did the interview

59:22

and explained what just happened, how

59:24

she tricked Elizabeth into giving Sharon's

59:26

59:29

But after listening to this story, I got really curious about this voice cloning tool and wanted to try it myself. To clone someone's voice, you give it a bunch of audio of them talking, and using some advanced AI, it gets to know that voice. Then whatever you type, it'll say it in their voice. I spent a few hours playing around in this tool, and I cloned my own voice. I think it's really interesting.
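To make that workflow concrete, here is a rough sketch in Python of what driving a voice-cloning text-to-speech service generally looks like. The service URL, endpoints, and parameter names below are hypothetical placeholders, not the specific tool used in the episode: you upload reference audio to create a voice, then post text and get back synthesized speech.

# Hypothetical sketch of a voice-cloning text-to-speech workflow.
# The API host, paths, and parameter names below are made up for
# illustration -- real services differ, but the shape is similar:
# 1) upload reference audio to create a voice, 2) synthesize text.
import requests

API_BASE = "https://api.example-voice-service.com/v1"   # placeholder
API_KEY = "YOUR_API_KEY"                                 # placeholder
HEADERS = {"Authorization": f"Bearer {API_KEY}"}


def create_voice(name: str, sample_paths: list[str]) -> str:
    """Upload a few minutes of reference audio and get back a voice ID."""
    files = [("samples", open(path, "rb")) for path in sample_paths]
    resp = requests.post(f"{API_BASE}/voices",
                         headers=HEADERS,
                         data={"name": name},
                         files=files)
    resp.raise_for_status()
    return resp.json()["voice_id"]


def speak(voice_id: str, text: str, out_path: str) -> None:
    """Synthesize `text` in the cloned voice; tune the settings to taste."""
    payload = {
        "text": text,
        "settings": {"stability": 0.6, "clarity": 0.8, "style": 0.2},
    }
    resp = requests.post(f"{API_BASE}/voices/{voice_id}/synthesize",
                         headers=HEADERS, json=payload)
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(resp.content)              # raw audio bytes (e.g. MP3)


if __name__ == "__main__":
    voice = create_voice("my-own-voice", ["sample1.wav", "sample2.wav"])
    speak(voice, "Hey, this is my cloned voice reading typed text.",
          "output.mp3")

The stability, clarity, and style knobs in this sketch stand in for the kinds of settings Rachel described tweaking until the output sounded credible on a phone call.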

59:56

Okay, I want to show you. I'm going to play two clips for you. I want you to listen and try to figure out which one is AI generated. Ready? Here's clip one.

"Hey, this is Jack Rhysider. This morning I had a peanut butter and chocolate smoothie for breakfast."

Okay, here's clip two.

"Hey, this is Jack Rhysider. This morning I had a peanut butter and chocolate smoothie for breakfast."

Okay, punch in your votes. Ready for me to tell you? Both clips were AI generated. In fact, what you're hearing right now is AI generated too. I switched to having AI talk for me a few minutes back. I just type whatever I want and it'll narrate it for me. It's really wild. It even adds in breaths, like this. Listen. And sometimes it'll even add plosives, like how the P sounds in "nope." It's crazy how good this sounds. Huh. Okay, okay, I'll switch back to my normal voice now. There, I'm using my real voice now, okay? The future is going to be weird, isn't it?
1:00:56

Okay, so I just saw this article the other day on CNN's website, and it said there was this guy working for a company in Hong Kong who controlled the finances for that company, and he got invited to a video call with the CEO and a few other colleagues that he recognized. He saw them on the screen, he heard their voices, and he was positive it was the CEO and his colleagues. They were telling him there's this new deal that just finished up, and they wanted him to send 25 million dollars to another company. So he did. But the problem was the video and the voices were all AI clones. Scammers tricked him into thinking he was on a video call with the CEO. Our future is almost surely not going to be what we think it's going to be. I have a feeling we're going to have a hard time knowing what's reality and what's fiction.
1:01:47

Yeah, it's a good point. Hey Jack, can I jump in here?

Yeah, who's this?

This is Daniel Miessler.

Oh, hey Daniel. Yeah, what do you have to say about this?

Yeah, so what I find fascinating about this whole story is that there's a very early concept in security of, how do I know it's you, right? And we normally don't have to worry about this with video, because seeing has always been believing, and the same with hearing for audio. But now, with deepfakes for both video and audio, we need a whole other layer. So what I actually expect to see here is products coming out that are basically about, how do I establish early trust? Like, as soon as you join a company, you'll probably establish keys across all of Slack or across all of Microsoft Teams or something, and that's your predetermined channel.
1:02:37

I think this is a good idea. If you can cryptographically sign something, that'll prove the message or video came from you. So I imagine this could cut down on people falling for fakes: if it's not actually signed by the person who sent it, don't trust it. Initially getting your key would be interesting, though. You still have to prove who you are at the beginning, right? And one way to do that is to verify who you are in meatspace, the real world. When you're face to face and in person, it's still a valid verification technique that you are you. But with everyone having their own cryptographic keys to prove someone is real, the threat then moves to securing the key. If someone else grabbed the key, they could make it look like you sent something when really you didn't; they just signed it using your key.
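To make the signing idea concrete, here is a minimal sketch using Ed25519 signatures via the Python cryptography library. It assumes the verifier already received the sender's public key over a trusted channel, which is exactly the bootstrapping problem discussed above.

# Minimal sketch: proving a message came from a known sender using Ed25519.
# Assumes the recipient already obtained the sender's public key over a
# trusted channel (in person, or when the sender joined the company).
# Requires: pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
)
from cryptography.exceptions import InvalidSignature

# Sender side: generate a keypair once and keep the private key secret.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()   # share this, not the private key

message = b"Please wire the funds for the new deal"  # the thing being verified
signature = private_key.sign(message)                # attach this to the message

# Recipient side: verify the signature against the trusted public key.
try:
    public_key.verify(signature, message)
    print("Signature checks out: message really came from the key holder.")
except InvalidSignature:
    print("Do NOT trust this message: signature does not match.")

# Note: a valid signature only proves possession of the private key.
# If the key is stolen, or the public key was swapped during bootstrapping,
# the guarantee evaporates -- which is the point made above.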

1:03:25

Yeah, yeah, absolutely. We're going to need something like that for all remote calls, essentially. Because, first of all, the AI can copy both of our voices, because we have our voices out there and that's easy to copy. But very soon, like, I honestly won't know if this is you right now on this call.

I just imagine making a whole CAPTCHA network for everyone I know, right? So my dad calls me on the phone and it says, before you can connect to this party, please solve this CAPTCHA.

Exactly, exactly. There's still going to be some sort of challenge or, like, a predetermined key exchange, yeah.

Oh my gosh.
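One lightweight way to build that kind of challenge is a shared secret plus an HMAC challenge-response, so the secret itself is never spoken aloud where a voice clone could capture and replay it. Below is a minimal sketch, assuming the two parties agreed on a passphrase in person beforehand; the names and values are illustrative only, not a product either guest mentioned.

# Minimal challenge-response sketch for verifying a caller, assuming both
# sides agreed on a shared passphrase in person ahead of time.
# The caller proves knowledge of the passphrase without saying it out loud,
# so an eavesdropper (or a voice clone) can't replay it later.
import hmac
import hashlib
import secrets

SHARED_SECRET = b"correct horse battery staple"   # exchanged in meatspace


def make_challenge() -> str:
    """Callee generates a fresh random challenge for this call only."""
    return secrets.token_hex(8)


def respond(challenge: str, secret: bytes) -> str:
    """Caller computes an HMAC over the challenge using the shared secret."""
    return hmac.new(secret, challenge.encode(), hashlib.sha256).hexdigest()


def verify(challenge: str, response: str, secret: bytes) -> bool:
    """Callee recomputes the expected response and compares in constant time."""
    expected = respond(challenge, secret)
    return hmac.compare_digest(expected, response)


if __name__ == "__main__":
    challenge = make_challenge()                  # callee: "read me this code"
    answer = respond(challenge, SHARED_SECRET)    # caller's device computes this
    print("Caller verified!" if verify(challenge, answer, SHARED_SECRET)
          else "Could be a voice clone -- hang up and call back.")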

1:04:00

Personally, I'm excited about our future. We are smarter than ever and more advanced than ever, and it feels like the human race is going through a Cambrian explosion of sorts, with new technologies and advancements popping off almost daily. We're living in the exponential era. Time will move faster from here on out, and we get to witness it. We have tickets to watch the birth of human 2.0. How special is that? Whatever comes next will surely be exciting.
1:04:46

A big thank you to Rachel Tobac for coming on the show and sharing these stories with us. She wrote a free ebook on social engineering, and you can find a link to it in the show notes. Besides doing social engineering for companies, she also does security awareness training. In fact, she started a whole video production company where she creates fun and entertaining training videos. You can learn more about what she's doing by visiting socialproofsecurity.com. Also thanks to Daniel Miessler for giving us some insights into AI.

This episode was created by me, the backseat writer, Jack Rhysider. Our editor is the gourmet sorbet, Tristan Ledger. Mixing done by Proximity Sound, and our intro music is by the mysterious Breakmaster Cylinder. How does a computer...
