Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements may have changed.
1:11
CyberWork and InfoSec would like to introduce you
1:13
to our new Cybersecurity Beginner
1:15
Immersive Boot Camps . They're designed
1:17
to help you gain and enhance your expertise
1:19
in the cybersecurity field . Join our
1:21
live interactive virtual classes led
1:23
by InfoSec's highly skilled instructors , who
1:25
will guide you through the material and provide real-time
1:28
support . And , as part of InfoSec's
1:30
Immersives training , each student will have
1:32
access to career coaching aimed
1:34
at helping them start or switch to the cybersecurity
1:37
field . You heard that right . We aren't here
1:39
to just teach you the concept of what a security
1:41
professional does . We want to prepare you
1:43
to enter the job market with a competitive
1:45
edge in six months' time . Now
1:48
I've told you about InfoSec certification boot camps
1:50
, and if you're trying to hit your next career target and
1:52
need a certification to do it , that's still your best
1:54
bet . But if you're an entry-level cybersecurity
1:57
professional or want to be , or you're
1:59
switching your career and want to experience a career
2:01
transformation , InfoSec's
2:08
Immersive Boot Camps are designed to make you job-ready in six months
2:10
. To learn more , go to infosecinstitute.com/cyberwork , all one
2:12
word , C-Y-B-E-R-W-O-R-K , and learn more about this exciting
2:15
new way to immerse yourself in learning with InfoSec
2:17
. Now let's begin the show . Today
2:20
on CyberWork . I'm very excited to welcome
2:23
Debbie Reynolds , the Data Diva herself ,
2:25
to discuss data privacy . Now Debbie
2:27
developed a love of learning about data privacy
2:29
ever since working in library science , and
2:31
she took it through to legal technologies and now
2:33
runs her own data privacy consultancy
2:36
and hosts a long-running podcast , the
2:38
Data Diva Talks Privacy Podcast
2:40
. We talk about data privacy in all its
2:42
complex , nerdy and sometimes frustrating permutations
2:45
how GDPR helped bring Debbie
2:47
to even greater attention , how AI
2:49
has added even more layers of complexity
2:51
to this puzzle . And Debbie gives some great
2:53
advice for listeners ready to dip their toes into
2:55
the waters of a data privacy practitioner
2:58
career . That's all today on CyberWork
3:00
. Hello
3:06
and welcome to this week's episode of the CyberWork
3:08
podcast . My guests are a cross section
3:10
of cybersecurity industry thought leaders , and
3:12
our goal is to help you learn about cybersecurity
3:14
trends and how those trends affect the work
3:16
of infosec professionals , and
3:18
leave you with some tips and advice for breaking in or
3:21
moving up the ladder in the cybersecurity industry
3:23
. My guest today I'm very excited about this
3:25
is Debbie Reynolds . She's known as the
3:27
Data Diva . She's a leading expert in data
3:30
privacy and emerging technology . With
3:32
over 20 years in ad tech , fintech
3:34
, legal tech and AI , she delves
3:36
into smart cities , IoT and data
3:38
privacy . She's a sought-after
3:40
keynote speaker , and Debbie has addressed
3:42
companies like Coca-Cola , PayPal and Uber
3:45
. Her insights appear in the New York
3:47
Times and Wired , and
3:49
she is also the host of the number one podcast
3:52
, the Data Diva Talks Privacy Podcast
3:54
, which I love . I've listened to about half a dozen episodes
3:56
already . I'd highly recommend it . She's recognized
3:59
globally as a top data privacy
4:01
expert as well . Her leadership roles include
4:03
the US Department of Commerce's IoT
4:05
Advisory Board membership and the chair
4:08
of the IEEE Committee on
4:10
Cybersecurity for Human Centricity
4:12
, which is influencing the future of data privacy
4:14
and emerging technology . So no
4:16
guesses in terms of what we're gonna be talking about today . It's gonna be
4:19
data privacy all the way , but I'm looking forward
4:21
to learning more about Debbie and her journey
4:23
and how she got here . So , Debbie , thank you so much for
4:25
joining me and welcome to CyberWork .
4:27
Thank you so much for having me on the show . I'm excited
4:29
to chat with you today .
4:39
Oh , my pleasure . Yeah , like I said , I've been listening to your podcast for a while now and I'm
4:41
even , as someone who's not completely steeped in data privacy myself , it seems like I get at least
4:44
one really cool insight in every episode . So , yeah , I really appreciate it . So
4:46
yeah , debbie , to help our listeners get to know
4:48
you a little better where did you first get
4:50
interested in , like computers and technology
4:52
and security and privacy
4:54
Was there an initial spark ? Was there a moment
4:57
when you said this is what I want to do with my life
4:59
, or what got you excited about it ?
5:01
Yeah , well , it is quite
5:03
a circuitous journey . Uh , it wasn't
5:05
a straight line , I would say . Um
5:08
, I was actually a philosophy major in college
5:10
. I thought I was gonna actually become
5:12
a lawyer , uh , and actually
5:15
my mother was diagnosed with cancer
5:17
when I was in my senior
5:19
year of college and so I
5:22
decided I needed to do something
5:24
where I could spend
5:26
more time with her . So I
5:28
started doing , you know
5:30
, back then it was called desktop publishing , like
5:32
now . It's like graphic design and different
5:34
things like that . And
5:36
I had a friend that had another friend
5:38
that was the
5:41
head of a university library and they wanted
5:43
to create databases
5:46
of books and so they asked me to
5:48
help out and I did and I fell
5:50
in love with data . So that
5:53
was kind of the beginning of my data journey . So
5:55
this was back in the day when
5:57
libraries had card catalogs
5:59
and they were trying to create
6:02
databases of books and you know
6:04
, you see how rapidly and differently you
6:06
know library science is now compared to how it
6:08
was back then . But then
6:10
, during that time also , I
6:12
read a book called The Right to Privacy and
6:15
Caroline Kennedy
6:17
was a co-author of that book and the
6:19
book shocked me . Actually
6:21
it was a book my mother had and
6:23
she was very interested in it
6:26
and her interest made me interested
6:28
and
6:36
it was all about what's private and what isn't in the US . And I was , you know
6:38
, very shocked because I think in the US we think , you know , we're the land of the free , the
6:40
home of the brave , but we don't know that privacy is not like
6:42
a constitutional right . And
6:44
that book talks about those legal , you
6:47
know , areas , those gray areas
6:49
where people's privacy , you know , may be taken
6:51
advantage of . So that was around the time of
6:53
just , you know , like the early internet
6:56
, right , 1995 . And
6:58
so as I worked more and
7:00
started getting more in depth into technology
7:03
and more stuff started getting into digital
7:05
systems , I started to see , you
7:08
know , the problems with putting
7:10
people's personal data in these systems
7:12
or how the data was shared and different things like
7:14
that . And so I
7:16
, as I was doing this work , I was working with a lot
7:18
of multinational companies that were doing
7:21
data moves and data
7:23
transfers around the world , and in
7:25
order to do that you have to know , like , the laws
7:27
for different jurisdictions
7:29
, and back then that wasn't even a job
7:31
, right , it was just something you just had to know . But
7:34
as
7:37
the European Union started to
7:39
re-examine their
7:41
data protection regulations
7:43
, I started getting calls from companies
7:46
that knew me and they started asking me hey
7:48
, can you come talk to me about privacy ? And
7:50
so one of the first big companies that asked me
7:52
to come talk with them was McDonald's Corporation
7:55
. So I went to their corporate
7:57
legal department and talked to all their legal
7:59
folks around the world around
8:01
privacy , and this is before the
8:03
big change , when the General Data
8:05
Protection Regulation came into play in
8:08
Europe . And I was like , hey , even though
8:10
this is Europe , this is going to be a huge , big deal and
8:12
this is why it is and this is how it's going to change
8:15
. You know your work , and it actually
8:17
did . And so , like nobody
8:19
in the US talked about that around that time
8:21
, right , this is before it came out . And
8:24
then I just kept talking about , hey , this is important
8:26
. You know , I felt like I was like Paul
8:29
Revere , like , oh , the British are coming
8:31
, the British are coming . You know , you need
8:33
to pay attention . And eventually , by
8:35
the time the regulation
8:37
went into full effect , I got a call from PBS
8:40
and they asked me to be on TV to talk about
8:42
the General Data Protection Regulation
8:44
. It's funny because people still
8:47
contact me about that interview , so I made
8:49
some predictions that actually came true . It's pretty
8:51
funny , but yeah , so that's when
8:53
I decided . Well , maybe I should just do privacy
8:56
.
8:56
Yeah , absolutely . I
8:59
love that . So it sounds
9:01
like the sort of the roots
9:03
of the Data Diva experience came
9:05
from GDPR happening and a lot of people
9:07
not in Europe or companies
9:09
that were partly in Europe and partly
9:12
in the US were like help us understand
9:14
what the heck's going on here . Is that accurate
9:16
?
9:17
Yeah , absolutely Absolutely . So
9:19
it's fortuitous . You know , it's
9:22
so funny because the
9:24
GDPR went into full effect on
9:27
May 25th of 2018 . That's when
9:29
I was on PBS . But two
9:31
years before that , the law was passed and
9:33
I thought , you know , the
9:36
day it passed it became a law
9:38
, quote-unquote , but not enforceable . I
9:40
thought , okay , I'm gonna wake up and today
9:43
everyone's gonna care about data protection and
9:45
like there was nothing on the news , there was
9:47
nothing , like nobody cared , right
9:49
. So I thought , oh god . So I thought
9:51
, well , I just need to talk about this a lot
9:53
. There weren't even things
9:55
written about it , right ? Uh , when
9:57
I was telling people about it , people were like , oh , can you give
9:59
me a summary ? Like , there is no summary , you
10:02
know so I started doing like writing a lot
10:05
in that time period about it , just so
10:07
that there'll be some information that people can
10:09
look back on .
10:10
Yeah , yeah yeah , yeah , I
10:12
, I remember that that era . It's
10:14
funny . Yeah , laws don't really become laws
10:17
until they become enforceable , and I know that on
10:20
your show you talk about you know , if
10:22
you give your guests one wish , or
10:24
you know , king for a day , queen for a day , whatever
10:26
, in terms of making changes , and I over here
10:28
talk about having a magic gavel and stuff . So we'll
10:30
talk later about what kind of enforceable
10:34
things that we want to get
10:36
into regarding data privacy . But to
10:38
start with here I want to talk a little more about
10:40
your career . So I like to
10:42
look around . I
10:44
guess maybe this is an invasion of privacy , but I look at
10:46
your LinkedIn profile , your experiences , to get a
10:48
sense of what your history is
10:50
like and it gives me a sense of
10:53
you know , your employment story and
10:55
your cyber history and stuff like that
10:57
. So I mean , you've , you know , uh
11:00
, done a lot of stuff
11:02
. You told us a little bit
11:04
about that in terms of library science and so forth
11:06
, but you spent a good portion of your earlier career
11:08
working in law and specifically legal technology
11:11
. Uh , and I've had one other guest on who's
11:13
a law cyber person on the show in the past
11:15
and I think it's not
11:17
a common path , but it's an interesting one . So
11:20
could you talk about how
11:22
a pivot like this from
11:24
the legal tech area I guess that was probably privacy related
11:26
too but like how did that sort of turn
11:29
into this complete world of data privacy
11:31
?
11:32
Yeah , well , I think people misunderstand
11:35
that part of my history . I've
11:37
always been a data person and I've always
11:39
done consulting around data
11:41
projects , but I have had situations
11:44
where I've worked
11:47
with companies that were involved in
11:49
legal stuff . But that
11:51
was not the whole
11:54
enchilada of what I worked on
11:56
, right ? So you know I was still working
11:58
in , like , ad tech
12:00
. You know people just call me up . I've
12:03
been very fortunate that people have called me up for all
12:05
types of wacky things that they want to do with
12:09
data , and so I've been able to
12:11
. You know , for me , I think , if people
12:13
see , oh well , she's done some stuff in
12:15
legal tech , but I was never only
12:17
in legal tech . Like I was in all
12:19
these other types of tech spaces . But
12:22
for me , to me , that's the
12:24
reason why my work
12:26
is so unique , because I traverse
12:29
all these different industries . I think people
12:31
just didn't know that I knew all . You know people
12:33
in business intelligence
12:35
, people in ad tech , people in you
12:37
know , pharma , all this other type of stuff
12:39
. Just because you
12:47
know , when you think of legal , a lot of legal issues
12:49
around data are around litigation , and
12:56
so companies have bigger data problems than litigation , so I work with them
12:58
on all those different things . So before litigation , you know people need to have better governance
13:00
of everything you know within their corporation
13:02
and a lot of my talking
13:04
with people over the years about privacy
13:07
is about how much bigger it
13:09
is than any one industry .
13:11
Yeah , yeah . Now I
13:13
want to sort of break apart how
13:15
you acquired this sort of tool belt
13:17
of privacy , because I know it sounds like you , you
13:20
know you learned as as needed on the job and
13:22
so forth , but for a lot of our , our , our listeners
13:24
, they're trying to figure out you know where their
13:26
opportunity is , is going to come from . And
13:28
one of our past guests , Chris Stevens , is
13:31
our InfoSec skills author
13:33
for privacy and sort
13:35
of privacy certifications . You
13:38
know the seven of them or whatever . And I'm
13:40
wondering if you can talk a little bit
13:42
about the combination of
13:45
how you sort of like came to learn , like
13:47
the different privacy regulation
13:51
structures around the world , the governance
13:53
. How did you sort of put all that together
13:55
? I mean , I know it was kind of on the job and
13:57
as you were going , but like , well , what are the different
13:59
pieces of this toolbox that sort of add
14:02
up to the whole ?
14:03
Yeah , actually it wasn't on the job . Privacy
14:06
is my own personal interest . So
14:09
, these are things that I was interested in
14:11
back then . As I said , when I read that book
14:13
in 1995 , I just
14:15
kept seeing , just in general
14:18
, how more data was being created
14:20
, more data was being collected , seeing
14:23
kind of the gaps , and that's what I talk a lot
14:25
about on my show . So I talk about the gaps in
14:27
privacy , the gaps in regulation
14:29
. You know how data plays within
14:32
data spaces , and so
14:34
what happened is my personal
14:36
interests converged with my professional
14:38
life . So you
14:40
know , like I see things , like people say , well
14:43
, let's put chips in pets
14:45
. And I thought , oh , this is bad , because
14:48
people are going to put chips in people at some
14:50
point . Right . And so for me
14:52
, I didn't ever think
14:54
that my personal interest in
14:56
privacy would turn into a business . But
14:59
it is because people will start saying
15:02
, well , we want to do this . Well , you know
15:04
. Like an example , let's say someone said
15:06
, ok , we have some data in France
15:08
and we want to transfer it via
15:11
FTP over to the US , and I'm
15:13
like , well , you can't do that , yeah
15:16
. And
15:18
they're like , well , why
15:21
? I'm like , because , you know , this type of data can't be transferred because of , you know
15:23
these blocking standards . That
15:26
was not a job requirement or anything
15:28
. That was what I brought my knowledge
15:30
that I had , just because that's something that
15:32
interests me . So for me
15:34
it's just been many years of reading , researching
15:38
, because I guess I'm you know , I'm
15:40
personally interested in privacy
15:42
. You know , I want you know what are people doing
15:44
with my data . So I think my , you
15:46
know , I think my motivation for
15:49
being interested in privacy is , is
15:51
personal . Yeah , Right
15:53
, I'm like well what , what you know , what are you doing
15:55
? And so I , when
15:57
I had a chance to get involved whether
16:00
it was , you know , this IoT Advisory
16:02
Board , you know I raised my
16:04
hand . I said , hey , I want to , you know change
16:07
. You know how can I use my skills
16:09
to change the
16:11
way that things are happening in the world ? And
16:13
so that's what I decided . I'm like well , I
16:16
can go in any direction . Right , I
16:19
can pivot my career and
16:22
move into any type of data space , because I've
16:24
built data systems for over 30 years .
16:27
Can you talk about building data
16:29
systems Like what does that process
16:31
look like on sort of a practical day-to-day
16:33
level ?
16:34
Yeah , so I guess my early career
16:37
in library science involved
16:39
literally creating the
16:41
technology that's capturing particular
16:43
data types , whether that be text
16:46
or , you know , barcodes or different
16:48
things that we're using around
16:50
that . Eventually
16:52
technology got to
16:55
a place where people wanted more descriptive information
16:57
around data that was put
16:59
in . Because you know , when you think of like
17:01
libraries and catalogs
17:04
, it basically would tell you you know here's
17:06
a book and here's where it is and here's
17:08
the title you know . Eventually people want to
17:10
know more , right ?
17:11
So what's the
17:12
description of the book , what's in the
17:14
book ? You know what books
17:16
you know cross-reference the same information
17:18
, so I think you know my background
17:21
in that really helped me
17:23
, especially as other companies
17:25
were coming to the challenge where
17:27
they couldn't manage
17:30
the data that they had manually
17:33
, right ? There was just not enough paper ,
17:35
there were not enough
17:37
people right . As we see more
17:39
digitization , especially
17:41
with the commercial internet , where people were
17:46
using , you know , Microsoft Office
17:49
, different things like that . People were authoring
17:49
things more rapidly . There's
17:51
more data in different forms . It
17:54
was just difficult to be able to try
17:56
to manage it all . I think if
17:58
it were not for the internet , we'd still be in a very
18:01
paper world . Especially
18:03
, we're seeing this escalation again
18:05
with artificial intelligence , where
18:08
there's going to be even more data created
18:10
.
18:12
It's even harder to source where it's
18:14
coming from , with AI , I imagine . Yeah , exactly
18:16
so .
18:16
It's creating more data challenges
18:18
that we're having just because of the volume
18:21
and the
18:23
speed at which this data
18:25
is being collected . So for me it was like
18:27
, okay , so we don't have the right tool , let's
18:29
build it . Let's build a new tool
18:31
or let's see , you know , is another
18:34
industry using a
18:36
tool in a different way that we can learn
18:38
from . So , you know , it's just
18:40
it's kind of a race
18:43
to try to keep up with the
18:45
demands of what people really want
18:47
and what those technologies end
18:49
up being used for .
18:51
Yeah , now , like I said
18:53
, I really enjoy the Data Diva
18:55
Talks Privacy podcast here and
18:57
I encourage our listeners to go
18:59
check it out and check out some episodes . It
19:03
obviously comes from a place of real passion for
19:05
you . Like you said here , you
19:07
got into data privacy because you
19:09
have this very vested interest in understanding
19:11
what our data is being used for
19:13
and you know each episode
19:16
, it feels like you're asking the guests
19:18
what is the thing you know , what
19:20
is the data privacy issue
19:23
that you're most worried about right now ? So I'll turn
19:25
the question back to you , debbie what
19:27
is the number one data privacy issue
19:30
for you at the moment ? What's the thing that , if
19:32
not , keeps you up at night , keeps you kind of thinking about
19:34
it at the moment ?
19:37
Yeah , well , let's see , my big issue
19:39
. I guess I'm gonna blend it with AI . I guess
19:42
, uh and this is a concern I've had
19:44
for many years and so I've seen it
19:46
play out now uh , probably
19:48
worse than I even imagined . So I
19:51
feel like people are abdicating their human
19:53
judgment to technology , where
19:56
we're saying , oh , we don't need humans
19:58
, we'll just have robots that you know
20:00
fold clothes and you know pick
20:02
up eggs and stuff like that .
20:05
We're going to save so much money this way ?
20:06
Yeah , so it's like , yeah , but
20:08
you're trying to use technology to make bad
20:11
decisions and then you're trying to abdicate
20:13
your responsibility as a
20:15
human for that right . So we're seeing
20:17
a lot of bias in hiring
20:19
algorithms . We're seeing a lot of problems
20:22
with , you know , these tools spitting
20:24
out information that's not correct , things
20:26
that are harmful to people . Um
20:28
, you know , we're seeing just over-collection
20:30
of data , just too much data . There
20:33
was an exposé
20:35
in the New York Times recently
20:38
about cars collecting data . People
20:40
don't even know your car
20:43
is collecting data that's going to these secret
20:45
data brokers to be sold to
20:47
insurance companies without their knowledge . And one
20:49
of my friends who was a data expert
20:52
didn't know this . His insurance went up 30%
20:54
and he was like
20:56
you know , I didn't get any tickets , I didn't do anything
20:59
. You know there was nothing in my
21:01
quote unquote driving
21:03
record that was wrong . They were like well , you
21:05
know , you drove through this neighborhood
21:08
five times , or you
21:10
, you know , we think you took these
21:12
trips , these number
21:14
of trips in your car , or , um
21:16
, you
21:19
did like a fast stop
21:21
, a hard brake .
21:23
Uh , you know they're tracking
21:26
your actual driving style
21:28
and they're like you're too dangerous . Wow . Exactly
21:31
So , I mean , to me that's problematic
21:33
.
21:34
Right , right I'm like
21:37
maybe I did a hard stop because
21:39
a kid ran into my
21:41
pathway and I didn't want to hit the kid , right ?
21:43
. A
21:47
lot of context , a lot of possibilities
21:49
for context here . That's the problem I have with people using
21:51
technology . To me
21:53
AI should be more of a low-level , low-risk
21:56
, low-stakes tool as a
21:58
helper to people . It shouldn't be making
22:00
decisions about people .
22:07
I agree completely . Yeah , no , there's a lot of space for it , especially in terms of crunching numbers
22:09
or , you know , gathering , you know data from large sets and stuff like
22:11
that it's perfect for . But yeah
22:13
, doing things like that and you
22:16
know , also sort of replacing sort of human
22:19
judgment , you know , and allowing sort of decision-making
22:21
, is pretty wild . Now can you
22:23
you know , obviously you're a person
22:26
, at the forefront . Can you point to
22:28
anyone else who is currently working
22:30
or talking in this space
22:32
, who is saying especially interesting
22:35
things about the confluence of AI and privacy right
22:37
now , that you'd like to shout out ?
22:39
Yeah , wow , there are so many different
22:41
people . One person I think that
22:43
you would love to chat with : his
22:45
name is Stephen Lawton , so
22:48
he's a writer . He's
22:50
a technologist in cybersecurity
22:52
. He traverses a lot of different
22:54
areas . He does a lot of writing for publications
22:58
like Dark Reading and Bloomberg . He's
23:02
a great person to talk to because he knows
23:04
so much about computers and computing
23:06
. He talks about the cyber
23:09
risk and cyber insurance , so
23:12
he runs the
23:14
gamut , I think , around technology . He
23:16
has a deep , deep knowledge of just
23:19
tech and cyber in general , and
23:21
so he's a . He's a great fan
23:23
and he's been on the show as well .
23:25
He's amazing . Well , um
23:27
, let's talk about the show here . Like I
23:29
said , I've been uh excited
23:31
about about it for a while now and you've got
23:33
what almost 200 episodes on there , something
23:35
like that , or over 200 ?
23:36
Yeah almost 200 . Amazing
23:39
200 in about I don't
23:41
know 10 weeks or something .
23:42
So when did you start it ? In 2018
23:45
?
23:47
2019 , I think , or
23:50
2020 ?
23:51
No , it's been 2020 , I think . Wow , okay
23:53
, that's an incredible pace
23:55
of work there . So , um , well
23:58
, let's tell our guests , uh
24:00
specifically about the type of people that
24:02
you speak to , and I think
24:05
, because you know you look at a list of
24:07
175 , whatever uh
24:09
past guests and you get a little overwhelmed what
24:11
is there like one or two episodes that you think
24:14
our listeners should start with ? That really gives like
24:16
a good idea of like what makes the show great
24:18
yeah , oh , wow , that's such a
24:20
great question .
24:21
Uh , first of all , we're really happy
24:23
and proud that people really love the podcast
24:26
. I think we have last time I checked we've had
24:28
over 170,000
24:31
downloads of those
24:33
podcasts . Uh , we have listeners
24:35
in over 112 countries
24:37
. Um , the guests
24:40
run the gamut , um , so they're
24:42
not . I think the reason why people really like
24:44
the podcast is that it's not just privacy
24:46
people . Uh , because privacy
24:49
is such a horizontal issue
24:51
that impacts like almost any type of
24:53
company or any type of profession
24:56
. I'm able to bring on people
24:58
from all different areas
25:00
. So I may have lawyers
25:02
. I have cyber people . I have people in , like
25:04
you know , biometrics identity
25:08
. You know
25:10
, just , you know anyone in a data space
25:12
that wants to talk with me about privacy
25:14
. You know we've had , you know , people like
25:16
Cameron Kerry , who's
25:18
part of the Brookings Institution , who's probably
25:20
one of the biggest you know
25:23
people in the US on privacy
25:25
. We had Johnny Ryan , who's an advocate
25:27
in Ireland . He works
25:29
on a lot of those real-time bidding ad
25:32
tech cases . He's been on . I
25:36
mean , we've had a ton of VIPs . Probably
25:38
one of the coolest
25:40
episodes that I've done recently is
25:43
with a guy named Jesse
25:46
Taylor . So Jesse Taylor
25:48
, he's actually the inventor of
25:50
the App Store , really
25:52
, wow , yeah , yeah . So he
25:54
has a story in the show where
25:57
he had a meeting
25:59
with Steve Jobs . He introduced the
26:01
App Store concept to
26:03
him , uh , and then
26:05
Apple bought it , and so that's how the App
26:07
Store started , with Apple , uh
26:09
. So his episode is amazing because
26:12
he talks about his trajectory
26:15
in tech , you know , inventing
26:17
the App Store , and now he's working on
26:19
identity , right ? Okay , trying
26:21
to solve a lot of identity
26:23
problems . So he has like a really cool
26:26
technology and a really cool way
26:28
that he thinks about stuff . So for
26:30
geeky tech people who are really
26:32
interested in that , you would really love that .
26:34
So it was one of my favorites . Oh , it's
26:37
awesome and I think also people love hearing
26:39
the firsts of things
26:41
like that , or or where things started . I
26:43
think one of my my past guests was , uh
26:45
, the first person to hack an iPhone . I
26:48
think it was like the first iPhone that
26:50
came out and they were part of the team that , uh
26:52
, hacked the first iPhone . So everyone wants to
26:54
hear that story for sure . But so
26:57
I want to move into sort of the work
26:59
portion of our show , CyberWork , here , because
27:02
you're giving us
27:04
lots of awesome knowledge here about data privacy
27:06
and I wanted to ask you
27:08
specifically about the role of
27:10
, say , a data privacy officer . You
27:14
know you learned your skills
27:16
over a long period of time . You were
27:18
doing other work , you were doing it
27:20
as kind of like a thing that
27:22
excited you and you gathered the skill
27:24
set over , you know , over time
27:26
. But for people who are just trying to get into this
27:29
space now , do you have any recommendations
27:31
for common educational
27:33
requirements , qualifications , degrees
27:35
or certifications or extracurricular learning that
27:38
would sort of get you up to speed
27:40
to do this type of work ? Like , where would you start
27:42
?
27:43
Yeah , that's a great question . First
27:46
of all , I would say start with self-learning
27:48
. So before you spend any money on it
27:50
you know , look into it , look
27:52
into what's being written , follow people
27:54
who are talking about things that you're interested
27:57
in . You know ,
27:59
back in the olden times
28:01
you had to go to the library , but now you have the
28:03
Internet , so go on the Internet . You
28:06
know , I tell people who are interested in privacy stuff
28:08
to put , like , a Google Alert
28:10
for yourself for these things you're
28:12
interested in , so that you can get like a reading
28:15
list every day without having to go out and search
28:17
for things to see . Like , is this
28:19
something I'm interested in ? Like , do I really want
28:21
to go into this area ? Um
28:23
, also , I think , especially for people who
28:25
are young in their career , who aren't established
28:27
and they are not known , I recommend
28:30
that they try to
28:32
maybe get some type of certifications
28:34
, maybe a variety of them
28:36
. So , for privacy people
28:38
, you know , I tell people anyone
28:41
who's in a data job can
28:43
add privacy to their toolkit
28:45
. Right , so , you know , get a
28:47
certification and read a book . You know
28:49
, privacy
28:51
is something that all companies
28:54
have to deal with at some
28:56
level . So having just a little bit
28:58
of knowledge may give you that leg up where
29:00
you're invited to be on a
29:03
team within your company or you
29:05
can volunteer. If there's no one in
29:07
your company doing it, say, hey, I'm really interested in learning about
29:09
this . You'd be surprised how companies may
29:12
invest in you. And you know, I
29:14
don't have certifications,
29:16
because I think people
29:19
kind of know me already,
29:22
what I am and
29:24
what I can do. But for
29:26
people who are not as well known
29:28
, I think those certifications help because
29:30
it really demonstrates to
29:32
a future employer that you have
29:34
taken the time to be
29:36
able to get those things. Then also, you
29:38
know, as much as I love
29:40
privacy, because there's so much AI
29:43
out there , I highly recommend people
29:45
start to learn stuff about AI too . So
29:48
whatever that may be. I tell people, if
29:51
you read one article a day,
29:53
let's say you spend 10 minutes a day for 30
29:55
days to learn something new about
29:57
AI , you'll probably know more than anybody
29:59
around you . Right
30:01
, right , we're at the beginning of the
30:03
beginning around what AI
30:05
will be in the workplace . So anything that you
30:07
can do to read on your own , you
30:10
know, there are a couple of different
30:12
places that have free
30:14
certification classes that you could
30:16
take in AI . You know just the basics
30:19
and , again , having the
30:21
basics will make you better than most people
30:23
. Most people don't , you know , unless they're data
30:25
geeky people like me, just
30:27
doing it for fun. Yeah, right
30:29
, you'll probably know more than other people . That'll
30:32
give you a huge leg up and
30:34
it helps you differentiate yourself, I
30:36
think, too. Right? You say, hey, I'm
30:39
a cyber person , but then I also have
30:41
this , you know , interest in privacy and
30:43
those things can come together . You
30:45
can probably be, maybe, the translator
30:47
between those legal and technical people
30:50
within organizations . So I highly
30:52
recommend people kind of diversify a bit
30:54
. You know, I
30:57
would not recommend people go into
30:59
debt to be able to do that. There
31:01
are so many free resources out
31:03
there but just a little bit . You know , even
31:05
if you took one , even if you
31:07
took one little class
31:10
or something , you know that'll be better
31:12
than nothing . Or
31:15
if you read one article a day, that'd be better than nothing.
31:17
So I'll say, you know, I highly recommend self-learning. You know
31:20
, I'm a self-taught person. I
31:22
did not go to school for
31:24
technology . My first computer , you
31:26
know , back in the day , when computers came with books , you
31:28
know I would read the whole book or I would
31:30
, you know , learn as much as I could
31:33
. And that's how I ended up in technology
31:35
, because I was interested
31:37
and I could prove
31:39
that I was capable of doing
31:41
things. And so my
31:43
journey, a lot of it, is around self-learning
31:45
.
31:46
Yeah, that's awesome advice
31:48
. I want to ask
31:51
a little bit . You mentioned already about differentiating
31:54
yourself from other people , especially at entry
31:57
level positions, sort of floating
31:59
you to the top of the resume pile and so forth
32:01
and, like you said, I think having
32:04
these kinds of like niche skills
32:06
, certain aspects of privacy or
32:08
things in addition to your cyber skills
32:11
, is probably a really great way to do that. Now you
32:13
run Debbie Reynolds Consulting . You're
32:15
a , you're a consultant for companies . Is
32:17
consulting something
32:20
that someone new
32:22
can do to get their
32:24
foot in the door? Like, if you wanted to
32:27
volunteer your time , say , for some local
32:29
organization that needs to know
32:31
you know what their privacy requirements
32:34
are and stuff like that . Is that something that companies
32:37
or you know places in general would
32:39
be willing to entrust to
32:41
, like a newcomer who's trying to get their feet wet ?
32:44
Yeah , I highly recommend . That's one of the
32:46
top things . When people contact
32:48
me, like, how do I get my foot in the door?
32:51
Or some people
32:53
say , you know , I went to this school
32:55
, I got a degree , I've maybe got a
32:57
certification , but no one will hire me because
32:59
I don't have any experience , right , and I tell
33:02
them, like, do you have family members that
33:04
have businesses? Do you have like a local
33:06
business that you can go to and say , hey
33:08
, you know, here's this privacy
33:10
thing, it's a big deal. Maybe I volunteer
33:12
or you do it like for just a nominal
33:14
fee , not a ton . You know , maybe
33:22
just help them with their website privacy policy . All that is
33:24
experience. You can put that on your resume, right? Absolutely. I was the
33:26
data privacy officer for blah , blah , blah , whatever
33:28
, whatever that is , and they don't need to know
33:30
that you didn't get paid for it , right , it's
33:33
still experience , right , that you can put
33:35
it on your resume and it's something
33:37
that you can do at your own pace , right , because a
33:39
lot of companies , especially the smaller companies
33:41
, they don't know what to do and they'll be happy
33:44
to talk with you because a lot of them
33:46
think , well , I need a lawyer to do this
33:48
. You actually don't, right, for most of
33:50
these. I'm not a lawyer. A lot
33:52
of people who do this work are not lawyers
33:54
. A lot of us are tech people , data
34:02
people who have a background in governance and know what you should and shouldn't do
34:02
with data, and that's really all that privacy is: what you can or can't do
34:04
with data, or the things you need
34:08
to look at . So I
34:10
highly recommend people do that and that's
34:12
been something that's been very beneficial
34:15
to people who've actually taken that
34:17
track . So you know , even
34:19
let's say, for instance, you have
34:21
a job, maybe you're a cyber person.
34:23
You know, see if
34:25
there are projects in the company
34:27
around privacy that you can say
34:29
can I be a part of this
34:31
project ? Can I help do this ? You
34:34
know, you may be able to. You
34:37
know, I've had people say, well, you know, can
34:39
I be your
34:41
data privacy officer? Or
34:43
maybe you just
34:45
get a new title or something, yeah, that
34:48
says privacy.
34:49
Negotiate extra responsibilities and yeah
34:51
.
34:51
Yeah , exactly , so there are ways to be
34:53
able to get in . I would say , you
34:56
know , to me it's a wide path
34:58
, so I think you know . I
35:01
think in the future it's going to become even more
35:03
important , especially as things
35:05
with AI come about . Yeah
35:07
, because a lot of lawyers, you know,
35:09
they're not
35:11
experts in tech or they're not experts
35:14
in data . So having people who are experts
35:16
in data and also know the privacy
35:19
part , I think really
35:21
can elevate someone's career .
35:23
Yeah , that's really good advice too . A
35:26
portion of our listenership is also people who are
35:28
trying to transition into
35:30
cyber-related roles
35:32
, maybe after having another career , maybe
35:34
in their 30s or their 40s , and I think that's
35:36
a you know , if you're already at a company like
35:38
a really good starting place is to
35:40
you know , on your next job evaluation
35:42
, ask for more responsibility , say
35:45
, is anyone doing data privacy for this
35:47
company ? Can I do that in addition to what
35:49
I do , or can I get a few things taken off
35:51
my plate in order to concentrate
35:53
more on this ? You know , I think there's a lot of
35:55
talk about professional development and companies
35:57
want you to be , you know , constantly
35:59
growing . I think that's such a great piece of
36:01
advice in terms of that .
36:03
And one thing I will add that you should really
36:05
know about kind of the job market
36:08
in
36:10
privacy right now. So,
36:12
when I was interested
36:14
in privacy in the US , there were like hardly
36:16
any lawyers doing it . It was just all
36:19
data people like me , right , Right . But
36:21
over the years , as the regulations
36:23
came about , there are more lawyers now in
36:25
privacy . Some people think of it as like a
36:27
lawyer-only job and it isn't
36:29
so . There's a lot of people like me who are in
36:32
privacy , but then also , you
36:35
know, employers
36:40
now are looking for
36:42
more of the people who understand
36:45
privacy , who understand
36:47
data. Right?
36:49
Because at some point, you
36:52
know , unless you're a super big company , it's always
36:54
changing . Once you know what the regulation
36:56
is , you know you need to
36:58
figure out how to change your operation
37:01
, how you work , and so what
37:03
we're seeing is more emphasis
37:06
on companies trying
37:08
to recruit more data people in
37:11
privacy . So people who understand
37:13
the data part of
37:16
how companies work
37:18
because they have to . Companies are struggling
37:20
really with operational change, not with
37:23
the regulation itself.
37:25
So, to me, it's a very good area to
37:27
put yourself in, and a good way to differentiate
37:30
yourself .
37:31
Yeah, I love that. Yeah, that reminded
37:34
me of something I was going
37:36
to ask. This is kind of a
37:38
sideways question here.
37:45
But you know, like I said, in learning about, you know, possible new career tracks, if you're
37:47
just , you know , fresh out of school or not even out
37:49
of school, or didn't go to school, you
37:52
know , or whatever you're thinking
37:54
like data privacy , this is it , this is my career
37:57
track , and you're all excited , can you talk
37:59
about ? Is there a certain aspect of the
38:01
job that is you
38:03
know that people should be warned of ? You know , in the sense
38:05
that, like, this part of it is
38:08
way more boring than you're expecting , or this part
38:10
is , you know , makes you want to pull
38:12
your hair out because you know people are not going to
38:14
, you know , take your
38:16
advice , or whatever . Are there certain sort
38:18
of undersides of data privacy that
38:20
, as long as you know them coming in,
38:22
you've got to be ready for? Oh
38:25
, that's a good question .
38:26
So underside I will say if you don't
38:28
like reading, this is not the
38:30
job for you. Yeah, there's lots of reading.
38:32
You're constantly researching, I imagine, right?
38:34
Constantly, constantly. Like, I
38:37
do several hours of research every
38:39
day. Wow. That's
38:41
just the only way you can keep up
38:43
. The
38:46
European Union has the AI
38:48
Act that came out . That's like
38:50
500 pages , even
38:52
though it's in Europe . It's going to be like
38:55
the GDPR was , where
38:57
it's going to be very influential for
38:59
different jurisdictions . If you learn that
39:01
, then in the future , when
39:03
more laws come out , you'll start
39:06
to see shades of that that
39:08
regulation there . So
39:10
reading that, understanding that,
39:13
will help you navigate what that future
39:15
is and that's what companies really want
39:17
. I would say
39:19
I don't know
39:21
, maybe I'm just nerdy . To me , I think that's
39:24
probably the only downside . It's
39:26
just a lot of reading and there's a lot more
39:28
stuff
39:32
in the press around privacy
39:34
. I remember when I
39:36
first started
39:38
, over a decade ago , maybe like
39:41
15 years ago , I put out a Google
39:43
alert for privacy and there was nothing
39:45
like not one thing
39:47
for years , like I want to say like
39:49
almost like six or seven years , it was
39:51
like no articles . And then now
39:53
, like you know , my
39:56
Google Alert may have 10 or
39:58
12 every day. Yeah, right?
40:00
So it's changed a lot, and so
40:02
I think for me being
40:04
able to look you know people call
40:06
me a futurist because I'm good at predicting
40:08
what's going to happen next . But you know I'm
40:10
always looking at the technology because
40:12
the technology.
40:15
You know, law
40:17
follows technology.
40:19
You can't lead with law; law is
40:21
backward-looking. Yeah
40:23
. So if you're thinking about the new thing
40:25
, like you know , like when the Vision Pro headset
40:28
came out , oh wow , well , what could be
40:30
the privacy issues with that ? Right , and
40:32
so it may not be on your desk
40:34
today , but if you're thinking about those things
40:36
ahead , when it comes up , then
40:39
you're already in a good position to
40:41
be an advisor or a trusted person
40:43
in that area .
40:44
Yeah , this is a little
40:46
off script here , but you
40:49
mentioned that you do about five hours of research a day.
40:51
Could you tell us a little bit, like, what
40:53
is your sort of
40:55
regimen?
40:57
Where do you start each
40:59
day when you're trying to do research ? It
41:02
sounds like you're also the Google
41:04
Alert diva here as well, but what are some
41:06
of the other go-to sort of news sources
41:08
that you check every day? Or
41:10
is it, you know, I jump from here to here
41:12
, to here to here each day . Do you have like a routine
41:15
at all ?
41:15
Yeah , I have an app that I use
41:17
called Flipboard , and
41:19
Flipboard allows you to collect articles
41:23
and stuff, and you can
41:25
make it a magazine . So I have a research magazine
41:28
and every time I see something that I
41:30
think is of interest , I have like a research
41:32
file that I save it to and then every
41:35
night for a couple hours I go through
41:37
my research file and I just read
41:39
, you know , just read through stuff , and
41:42
so, of course, there are bigger things.
41:44
Another thing I do, for these
41:46
bigger laws: sometimes I use
41:48
an app called Speechify where
41:51
I have it read it to me so I can be
41:53
like washing dishes or doing
41:55
something . So , I'm not reading , I'm just
41:57
listening . You know doing different
41:59
things , just taking it in and stuff like that
42:01
. So those are two things that helped me a lot
42:03
, and so I do videos every week . I've
42:06
actually been doing those for over five years , so I do
42:08
one video a week that I release . And
42:10
so, because
42:12
what I found is that it was too
42:14
hard to start and stop research,
42:17
that's why I always
42:19
continue to do it.
42:21
And then when people call me, they're like, oh, do you need
42:23
to look at your notes? Like, no, no. Really?
42:25
yeah
42:27
, for those of us with ADHD, you know, I have ADHD
42:29
and I know other people do as well
42:32
but like , sometimes the prospect of like , okay
42:34
, learn all the privacy now feels like
42:36
you know that old like game show thing where
42:38
you're in like a wind tunnel and there's dollars flying everywhere
42:40
and you're just constantly trying to grab. So I think
42:43
that's really good advice, especially
42:45
the Flipboard thing and sort
42:47
of making the magazines. Because, again, I think, if
42:49
you're like , well , I'm grabbing
42:51
this, I'm grabbing this, but when am I going to get to it? But if
42:53
you sort of like slap it all together and say , okay
42:55
, two hours a night , I'm just going to go through
42:57
this, stuff that
43:00
you know you're going to read, I think that's
43:02
a really good way to kind of like get your
43:04
head around what
43:06
all this is , because there's , you know , there's a million
43:09
options and there's a billion ways
43:11
to do it , right and wrong and so forth . So , yeah
43:14
, that's great advice . Now I want
43:16
to ask one more thing
43:18
regarding, like, careers and stuff.
43:21
Are there particular skills gaps among
43:23
people trying to get hired in data privacy positions
43:25
you see on a regular
43:27
basis that you
43:29
know you wish were
43:32
more common , like are there things
43:34
that you know people are like I'm
43:36
a data privacy person ? You're like well , why aren't you doing
43:38
this ? Or I wish you did more of this . I wish you
43:40
had more of this skill . Is there anything like that that
43:43
people should be aware of
43:45
?
43:47
That's a great question , I would
43:49
say , and you probably wouldn't
43:51
think this was
43:53
typical . But you know
43:55
, the gap that I see the most is
43:58
technology people who don't
44:00
know how to talk in
44:02
ways that anybody can understand
44:04
, and legal people who also have the same problem
44:07
. Yeah , so you know
44:09
, we have , like legal people who want to do alphabet
44:11
soup in acronyms
44:13
and shorthand and stuff , and then
44:15
the tech people have their own acronyms
44:17
and shorthand and so the problem
44:19
is in a privacy
44:22
role , you have to be able to
44:24
communicate across all
44:26
levels of the organization and with anybody
44:28
. So if you're saying some acronym
44:31
soup thing, you know, people are not going to
44:33
pay attention to you, they are not going
44:35
to listen to what you have to say. So
44:37
to me , that gap really is understanding how to
44:39
be a great communicator,
44:42
right? So, think about the person that
44:44
you're talking to. Like
44:47
, let's say , you're being asked to brief the CEO of a company about a particular issue
44:49
and this happened to me before where
44:52
I talk with maybe the legal people like , hey
44:54
, you all need to do this , and then I talk to the
44:56
technical people, and they're like, well, I
44:58
need you to explain it to the CEO . So I
45:00
know that the CEO doesn't understand the legal
45:03
, he doesn't understand tech stuff
45:05
in the same way , so I have to be able to communicate
45:07
that to him differently . Or even people
45:10
when I'm doing training , like for a whole organization
45:12
, like I've done trainings for , like
45:14
you know, Coca-Cola, Johnson &
45:16
Johnson, like all types of companies, right
45:18
, and so you have to make it easy
45:21
enough for anyone , regardless
45:24
of where they are in the organization , to be able to understand
45:26
what you're trying to say , and so that's a
45:28
skill , just in general , that
45:30
I think is going to be highly relevant
45:32
in the future , not just for privacy
45:35
, but all jobs . You know , ai
45:37
is very complex , right ? So
45:39
if you can break down
45:42
what AI systems are doing in an
45:44
easy way , you can elevate
45:46
yourself in your organization , because
45:48
very few people have
45:51
honed those skills . It's
45:53
so funny because even when
45:57
I was asked to talk on PBS about
46:00
privacy , they don't tell you what they're going to ask you . They just
46:02
put you on TV and they start asking questions . But
46:05
I had to think in my head . I kind of stopped
46:07
for a second . I thought , wait a minute , I have to be able to explain
46:09
this to someone's grandmother , you
46:12
know , or someone who's 10
46:14
, you know so I can't explain
46:16
it in a way that only a legal person or a technical
46:19
person would understand, so I had to break it down as easily
46:21
, as simply as possible , and so that's
46:23
a skill that I think will help anybody
46:25
in any type of career . So if you're a good
46:27
communicator, and you can communicate
46:32
not in legalese or
46:34
technical jargon, you'll
46:34
go far in any career that you
46:36
want to go into .
46:37
I completely agree . Now
46:40
you mentioned , obviously , the sort
46:42
of alphabet soup and the sort of talking past
46:44
each other between you know , legal and tech and so
46:46
forth. Is it ever
46:48
a case where , like , the privacy person
46:50
is asking the company to make difficult
46:53
changes and I imagine there's gotta be some element
46:56
of like persuasion involved right as well
46:58
where you're saying , like look , I know you
47:00
really don't want to put in this extra money,
47:02
or lose this data that we were otherwise
47:05
going to harvest this way , that way and the other way or
47:07
whatever . So I imagine you're also
47:09
having to sort of like patiently
47:11
explain , like no , we don't get to do that . Is
47:13
that ? Is that the case as well , or is that someone else's
47:15
job ?
47:16
Yeah, absolutely right. So I guess it's
47:19
the art of saying no, in a way. You're
47:29
like, oh, you can't do this or you shouldn't do this, right? But, you know, when you're an advisor
47:31
(I say consultant, but, you know, advisor, interchangeably), it may
47:33
be frustrating. You say, hey, here's the law, this is what you're
47:35
doing, this is what we recommend that you do. But
47:37
you know, for companies, even though
47:39
privacy is important , it's not the only
47:41
consideration , and so companies have
47:43
to make their own choices . But
47:45
you want to make sure that they're making an educated
47:48
decision , right ? So you
47:50
don't want them to make a decision because they don't
47:53
know something . You want to be able to say , hey
47:55
, here's the lay of the land , this
47:57
is what I think, and here
47:59
will be my advice for what you do . But
48:01
then you have to decide as an organization
48:04
. You know what is your culture , what is
48:06
your standard , what are you going to do ? You know , maybe
48:09
you're somebody like Facebook , you're
48:11
like , well , we don't care , because we're going to do
48:13
what we want . We're going to pay a billion dollars
48:15
and that's fine .
48:16
You know, that's their choice.
48:18
Right.
48:18
You may not like that , but you
48:20
know , and I feel like sometimes I
48:22
see on LinkedIn people like slamming
48:25
people who work for certain companies . I
48:32
can't believe your company does this. It's like, you know, you just have one job, right? You're
48:34
not the CEO . You don't get to pick and choose what companies do . All
48:36
you can do is tell them like this is what
48:39
is going on , this is what our obligations
48:41
are , this is what I think you should do
48:43
. This is how you know and this is where
48:45
, to me , the cyber data people come in
48:47
. How do you do it ? Yes
48:50
, so it's one thing to say okay
48:52
, you need to comply with this law . Okay , Well , how
49:00
do you do that? Yeah, I don't know. Like, how do I change the way that we operate
49:02
to be able to get in line with
49:03
this? So that's the gap.
49:05
You have to sort of show the entire pathway there, then? Yeah, exactly
49:07
, like you know , it's easier to say
49:09
, okay , this law came out and you have to comply
49:11
with it , but
49:17
then how do companies change ? And so that's the problem that they're
49:19
having . That's why they need data people , because they need
49:21
to know operationally , on a you know
49:23
, hands-on basis , what
49:25
do you need to do to change your behavior
49:27
in that organization .
49:29
Yeah, awesome. And I
49:31
think that's probably another one of those sort of underside
49:33
things worth noting is like you can make recommendations
49:35
to your company, and you also
49:38
have to understand that they might
49:40
not take all of your advice , right .
49:43
Yeah , absolutely . It's part of the advisory
49:47
that you do .
49:48
Yeah , I can only tell you what you need to
49:50
do . I can't put it into your hand and make you
49:52
do it , that's right
49:54
. So this has been an amazing conversation
49:57
. We're bumping up against an hour here . I would love to
49:59
have you back on , if you're available , to talk Internet
50:01
of Things and especially AI and sort
50:03
of drill into those topics further. But I really enjoyed
50:05
getting your insights on sort of
50:08
career related questions
50:10
around data privacy. But before I let you go, Debbie, can I
50:12
ask what's
50:14
the best piece of career advice you ever received
50:16
, whether it was from a mentor or a teacher or a colleague
50:19
or just something you learned along the way ?
50:22
Oh , wow , I think probably the best career
50:25
advice I received
50:27
came from my parents. So my
50:30
parents were very into education
50:32
, they were very much into learning, and they never
50:34
put limits on what we
50:36
could do or what we could learn, right
50:38
. And so I would say don't put
50:40
limits on yourself. You know, I
50:43
learned not
50:45
to put myself in a box because
50:48
I don't fit in a box , right . You know
50:50
, I have a lot of different interests and so I'm
50:52
lucky that I have a company
50:54
where I can exercise all the things
50:56
that I'm interested in , even
50:58
though they may not even seem related . Right
51:01
. Like, one of my early jobs,
51:03
when I said I was doing desktop publishing
51:05
, like I do a lot of graphics . You know I have
51:08
a media company as well , so I do a lot of graphic
51:10
design for that. I
51:12
mean , that's just because that's something that I've done
51:14
forever and that's kind of a fun artsy
51:16
thing. You know, it gets
51:19
me out of the privacy
51:21
wonky world, and
51:23
I can kind of do more creative stuff . But
51:25
I would say for people , don't put yourself in a
51:27
box, and you'd be surprised how the
51:32
skills that you have in
51:32
different areas may come together at
51:34
some point .
51:35
Yeah , fabulous advice , thank you , so
51:37
I'm gonna let you go here . But one last question
51:39
. If our listeners want to learn more about you, Debbie Reynolds
51:41
, the Data Diva , the Data Diva podcast
51:44
or the other 150 things you got going
51:46
on , where should they look for you online ?
51:49
Yeah , well , people can always connect with me on
51:51
LinkedIn . You just type in Data
51:53
Diva Debbie Reynolds and connect
51:55
with me , happy to . Or
51:57
they can look at my website, debbiereynoldsconsulting.com
52:00
. I have all my videos and newsletters
52:03
and events and everything on that , the
52:05
website .
52:07
Yeah, and don't forget to check out The Data Diva
52:09
Talks Privacy podcast. I
52:11
got mine on my regular podcatcher and
52:13
highly recommend it. So, well, this
52:15
has been great, Debbie. Thank you so much for joining me today. I
52:17
really enjoyed learning from you .
52:19
Thank you so much . I love the show , I love
52:21
your flow , very good flow
52:23
of the show .
52:24
I really appreciate that . Thank you , and
52:27
to everyone out there, I'd like to thank
52:29
everyone who's watching and listening and writing in to
52:31
the podcast with feedback , as usual
52:33
. If you have any topics you'd like us to cover or guests you'd
52:35
like to see on the show , drop them in the comments . We've
52:38
been adjusting our
52:40
content accordingly, so you are getting heard.
52:42
Before
52:45
I go, I don't want to forget to have you check infosecinstitute.com/free,
52:47
where you can get a whole bunch of free and
52:49
exclusive stuff for CyberWorks listeners . This
52:52
includes our security awareness training series , work
52:54
Bites , a smartly scripted and hilariously
52:56
acted set of videos in which a very strange
52:58
office staffed by a pirate , a zombie , an alien
53:00
, a fairy princess , a
53:06
vampire and others navigate their way through age-old struggles of yore , whether it's not
53:08
clicking on the treasure map someone just emailed you, making sure your non-nocturnal vampiric
53:10
accounting work at the hotel is VPN secured or
53:13
realizing that even if you have a face as recognizable
53:15
as the office's terrifying IT guy Boneslicer
53:18
, he still can't buzz you in without your key card
53:20
. So go to the site , check
53:24
out the trailer. I love it. Infosecinstitute.com/free is still the best
53:26
place to go for your free cybersecurity talent
53:28
development ebook . You'll find our in-depth
53:30
training plans and strategies for the 12 most common
53:32
security roles , including SOC analyst
53:34
, penetration tester , cloud security engineer
53:37
, information risk analyst , privacy
53:39
manager, secure coder, ICS
53:41
professional and more . One more
53:43
time, that is infosecinstitute.com/free,
53:45
and yes, the link is always in the description
53:48
below . One last time . Thank you so
53:50
much to Debbie Reynolds and thank you all for watching
53:52
and listening. Until next week, this is Chris Sienko
53:54
signing off , saying happy learning .