"AI in Literacy" - and Beyond! with Miss Jean Darnell

"AI in Literacy" - and Beyond! with Miss Jean Darnell

Released Monday, 8th April 2024
Good episode? Give it some love!
"AI in Literacy" - and Beyond! with Miss Jean Darnell

"AI in Literacy" - and Beyond! with Miss Jean Darnell

"AI in Literacy" - and Beyond! with Miss Jean Darnell

"AI in Literacy" - and Beyond! with Miss Jean Darnell

Monday, 8th April 2024
Good episode? Give it some love!
Rate Episode

Episode Transcript


0:00

Hey everybody, we're back once again and I am so excited.

0:03

I get to talk to Miss Jean today and I have been looking forward to this

0:07

conversation for a while so I'm so excited.

0:09

Thank you so much for being here. Why don't you take a second, introduce yourself, tell us where you're at.

0:15

Okay, I am Jean Darnell.

0:18

I'm also known as Awaken Librarian. So for those of you that listen to us via social

0:24

media, please feel free to get onto my website, awakenlibrarian.com, or catch me

0:29

on Twitter, or X I should say, my apologies, at Awaken Librarian, all together as one

0:36

word. But I am a Texas librarian and I am down here trying to save our school

0:43

libraries. I am raising as much Cain as I can.

0:47

If you follow me, you know that I speak my mind, truth to power.

0:51

And so I try to practice what I preach and be the good trouble in the world and wake

0:58

up people to what's going on as far as our liberties and what we are guaranteed

1:04

by our rights. I paid attention in history class.

1:07

I went to undergrad at Baylor University and my studies were in English and history

1:12

because I was taking so many classes of each.

1:15

I was like, dude, I might as well just double major and kill two birds with one

1:19

stone. So, yes, I've been a school librarian for coming upon 11 years and I've been in

1:26

public education for almost 23 years.

1:30

So I got a little sweat equity put into this and I love what I do.

1:36

I'm very passionate about student agency and voice and advocacy and looking out for

1:43

my community. So that's what I do.

1:46

My website started off as a school assignment and I loved it so much and

1:51

being able to have my own platform and kind of my digital baby that I was like,

1:54

yeah, this is going to continue.

1:56

I found a little niche for myself and I'm like, yeah, this is how I can get my work

2:01

and my content out there and connect directly to my community.

2:06

So it took off and I haven't stopped.

2:09

No, you never stop. You are the greatest.

2:11

I love it. You are such a wonderful force in school librarianship and I am so glad that we're

2:19

getting to have this conversation. I'm curious though, so you were a double major English in history and then you were

2:24

teaching for a while before you got into the library. What was sort of your path to the librarianship?

2:29

Okay, so I have to give a shout out to my high school.

2:32

I was a high school English teacher, the dreaded senior English teacher and my

2:37

campus librarian. His name was Dr.

2:39

Roger Leslie and he was kind of like my savior.

2:42

He was my confidant whenever I had, you know, issues in class or just needed

2:47

sound, you know, educator advice, mentor advice.

2:51

Dr. Leslie was my go to.

2:53

And so every time he would, you know,

2:56

say, you should think about, you know, going back and getting the master's in

2:59

school librarianship, I think you'd be great at it.

3:01

I'm like don't have the money for that.

3:04

I'm a single mom trying to raise two knucklehead kids.

3:07

I just didn't have the time and the finances and so he kept pestering me and

3:12

one day he asked me again he's like and before you say you don't have the money I

3:18

put your name in for a cohort and they will pay for it all you got to do is pay

3:21

for your books. I was like, you playing?

3:25

And he was like, no, seriously.

3:28

He's like, so if you really want to do it, I believe that much in you.

3:32

And then within two weeks, I was taking my GRE.

3:35

This was November; by January,

3:37

I was enrolled. And it was the hardest thing I ever had to do.

3:43

I had no business being a single mom, trying to raise two kids and go back to

3:49

graduate school at night while working a full-time job.

3:53

But that's me, and it shows I've always been ambitious.

3:57

I just couldn't do a little bit; I always had to go that extra

4:02

mile. Sometimes it's a double-edged sword, because I don't know how to rest.

4:07

People have to remind me: hey, you know, you might want to chill. Like my

4:12

sons will tell me, I'm tired of you being on podcasts, I'm tired of dinner having to

4:16

be rushed. And the last two years alone, they just look at me and give me a

4:21

certain stare in the face, like, really?

4:24

This is what you gonna do tonight? And so, slowly but surely... I took on a lot this last year, so

4:32

I promised my boys I'm gonna take it easier in the 24-25 school year. And

4:37

they're looking for expectation guidelines, you know, kind of checkpoints

4:42

to make sure oh, are you just giving me lip service?

4:44

Are you really doing this? I said, I promised them, next year

4:47

I'm not gonna take on so much. But that's the long story of how I got here.

4:55

Yeah, no, that's awesome. We're going to miss you that you're not going to be as omnipresent, but I also

5:01

think it's very important. I mean, clearly you need to be there for your boys.

5:05

Yeah, I'm there. One is 21 and one is 15, so I already have one kind of out, but I don't want to miss

5:12

out on the parenting with my youngest.

5:15

So I'm trying to listen as far as parent and not just dictate, but listen to the

5:19

feedback, you know, and self-reflection.

5:22

I think that's a sign of maturity when parents can do that with their kids and

5:27

not, you know, take it to heart so much and just, you know, because that's a human

5:32

being too. They have feelings and thoughts just like anyone else. And so, you know, when he's like, mama, you're doing too

5:38

much. And I'm like, okay, okay, I might understand that because he looked, he

5:42

accidentally looked at my calendar once. He's like, this is all you're doing?

5:46

And I'm like, well, yeah, I live by my calendar.

5:50

So, but yeah, I love, I love school librarianship.

5:53

I'm passionate about it. And if I can't be teaching it in, in the library, I want to be talking about it.

5:59

If I can't be talking about it, I want to be helping somebody get to some insight

6:02

about it. Because, I've said it before, in my culture, my heritage, literacy has been such a

6:10

prohibited area that I'm like, no, I'm going to drink as much of this literacy

6:17

juice as I can. I'm going to get as absorbed in it as I can.

6:20

Because when you look at our times right now, you never know when it's going to be

6:24

snatched. And so I look at the book censorship that's going on and I look at the voices

6:31

that are censored. That's why I went up.

6:34

And I can't believe I did this, but I did do it.

6:38

I walked up to our Texas Library Association president and I said, where do

6:44

you need help? And she's like, well, I don't have anyone to lead the intellectual freedom

6:48

committee. I said, well, I want to caution you as to who I am before you ask me if I want to

6:54

volunteer. And I told her, I was like, I'm Awaken Librarian on social media.

6:59

She's like, oh, I know who you are. And I was like,

7:01

So are you sure you want to offer this to me?

7:05

And she was just like, you know, if you, if you want to lead, I'm looking for a

7:09

leader. And so that's how I got into that position.

7:12

And it was just truth to power.

7:14

I saw what's happening with the bans.

7:17

And like I said, whose voices were being erased.

7:19

I wasn't going to stand by. There's no way I can live and exist in this time period.

7:24

with what's going on, as the person I am, and not speak up, because then I'm not worth my

7:29

salt. Everything I said is just water under the bridge.

7:33

It's mouthwash of the worst kind.

7:36

So, I had to advocate the only way I knew how, which was to write, to use my

7:42

platform, to antagonize sometimes because I do have a way with words.

7:48

My grandmama told me that long ago.

7:50

She's like, you gotta watch your words because you can string together sentences

7:54

way above your age.

7:56

She used to tell me, she's like, I don't know where you got that sentence from.

7:59

I'm like, I just thought of it. And she's like, well, you gotta watch your words.

8:04

But it's... you're an advocate and a warrior.

8:06

I'm so glad that they were able to find you for this.

8:10

I mean, intellectual freedom. That's like, I can't think of a better match for you.

8:14

So that's outstanding. Thank you.

8:18

I appreciate it. I hope I have not done anything out of the ordinary to, you know, make someone second

8:24

guess choosing me, but I am, if there's not anything else, I'm passionate about

8:29

it. So you're going to get the full force of me.

8:32

That might not be great in everyone's perspective, but it's the only way I know

8:36

how to be me. Absolutely.

8:38

And you are here not only to help us understand how important advocating is,

8:43

which you absolutely do on such an amazing level all over the place, but you're also

8:48

here to share this great lesson. And I think it's so timely that you're bringing us a lesson about AI because I

8:54

mean, is there a bigger buzz in school librarianship or schools or education

8:59

right now? I don't think there is. So I am so excited.

9:03

Education technology was right in my wheelhouse for school librarianship.

9:08

I immediately took tech tech to technology.

9:12

Ain't that a tongue twister? But yeah, so it all started back in 2001 when I taught myself how to code because I

9:20

knew computers were going to be the next wave when we were coming out of the late

9:25

90s and we were worried about the year 2000, Y2K.

9:29

But you know, seriously, turning into 2000, we were worried about what computers

9:35

were going to do. We had all these prophecies and stuff.

9:38

And I was like, well, I need to start learning the language.

9:40

If computers are that, you know, indicative of our future, I needed to get

9:46

some ground and understanding on it. And so same thing when AI came out, I was like, no, I got to get ahead of this.

9:52

You know, generative AI was just the fun part of AI that they released to

9:58

the public. AI had been around

10:00

for decades before that, but for the education teaching practices, I thought

10:06

generative AI was definitely a big buzzword.

10:09

And I was like, yeah, I'm gonna have to get knee deep in it.

10:12

So, in my downtime, I did what I do best, which is start digging.

10:16

How is this gonna affect libraries? How is this gonna affect teaching, curriculum, grading?

10:21

You know, a lot of the concerns that everyone else had, I was digging, looking for the

10:26

data. I immediately went into library research mode.

10:29

I was like, what is the data saying? Yeah.

10:31

And then I had to apply it to me, you know, from a social, economic standpoint

10:36

as well. How is AI going to affect people that look like me?

10:39

Because I just got to be honest. I got to look at it from the most open and inclusive perspective as possible because,

10:47

hey, you know, there's good, bad and ugly and just about everything.

10:50

So it behooves me to pay attention to the details.

10:54

And so I wanted to dig deep, get in there.

10:57

The only way I feel like you can really learn is to get knee deep in the mud with

11:01

someone. And that's with any experience ever.

11:04

So even when I'm teaching, I'm teaching the kids, but the kids are teaching me and

11:09

we're knee deep in this mud together. And we might have to wade and feed off of each other and, you know, spit ideas at

11:15

each other just to get to the destination. But that's the whole learning process, you know, so in both of us working together,

11:22

you know, educator and student.

11:24

We facilitate the knowledge and we get the steps needed in order to get to the goal.

11:28

That's how I've always taught.

11:30

You know, whenever it was a lesson, I'm going to demonstrate it for you.

11:34

And please correct me if I screw up because these kids are sharper than we

11:39

are. And so when I look at right now, our kids born in 2008, you know, these

11:46

are our 16, 17, 18 year olds, born in 2005, six, seven.

11:52

And so they've had swipe technology at their fingertips.

11:56

They're not like, you know, me, who used to play Oregon Trail on the computer, you

12:00

know, with the old, prehistoric, kind of dinosaur age of

12:07

computer technology. And so they've had the sophistication.

12:10

So I want to listen to what they have to say.

12:12

I want to see what their fears are. I want to hear what their feedback is from AI, because I feel like some school

12:19

districts and leadership are not listening to them.

12:22

And I feel like that's cutting off your nose to spite your face, because if you're

12:28

really about data and student agency, you would be listening to them.

12:32

And so when it comes to AI, I want to get as much content as I can out there in

12:37

front of them. I want them to look at it. I want them to investigate it, play with it, do all the curiosities that we don't

12:44

allow kids to do so much anymore because we're into the structured, standardized

12:49

way of doing things, to get the data that way.

12:52

But I think creative data supersedes any type of orchestrated standardized data.

12:57

Now that's just me talking. But I fully believe that.

13:02

So I want AI in front of the kids' face.

13:04

And so that's what got me to writing this unit.

13:08

I was accepted into a teacher fellowship with the Pulitzer Center.

13:13

And that's where you do investigative journalism.

13:15

Because in addition to being a school librarian,

13:18

I do investigative journalism on the side, and I'm constantly writing articles and

13:24

sharing my thoughts and doing research because that's the geeky side of

13:31

librarianship that I get to research the world as it's happening right now.

13:35

And so that's how I bring my full self into the role.

13:40

And so with AI, I was doing a lot of learning about AI, especially generative

13:46

AI. I wanted to make sure that I did the investigative part to really understand

13:52

what I'm talking about, not just regurgitating the factoids that an app

13:58

wants to teach you about it. I needed to understand what training data was, what large language models are. And

14:04

even just the other day I learned about brain-computer interfaces, or BCIs, for

14:13

literally attaching AI algorithms via different neural paths, things they could put on top of

14:19

your head, attached to your brain to read your brain waves so that, you know,

14:26

people that are quadriplegics, people that have had amputations, can get their

14:29

prosthetics to work the way we do when thinking right now, when you want to move your arm

14:34

or wave your hand, and they're reading the neurons in your brain.

14:37

And so I saw that and I think it's fascinating.

14:41

Yeah. I also know that sometimes the fascinating glitter of technology, the newness of

14:47

technology is amazing, what it can do.

14:50

But I also have to look at everything through an equity and inclusion lens.

14:54

And they were like, well, what are your questions about this?

14:56

And it's like, well, is it going to affect the brain?

14:58

Long-term usage, because when you think about it, when we have to get scans, we

15:03

you know, put neural pads on our bodies only temporarily.

15:07

So putting that on your brain every single day, what can be the long -term damage of

15:11

it? With those pads, you know, just participating in it to get the bang for

15:16

your buck of being able to control something else using your brain power,

15:20

does it degrade the brain in any shape, fashion, or form?

15:22

So these are just the things that I'm, you know, thinking about when I look at the

15:26

marvel of what AI can be and how it can help.

15:30

But I also want to hear what the kids are because sometimes kids can give you

15:33

perspective you've never even thought of.

15:36

Mm-hmm. I was like, holy moly, I can't believe you just came up with that.

15:41

And so that's why I believe in it. Many times there's this belief that, like, as the educator, we're the one that's

15:46

supposed to have all the answers. And if we're not passing the information on, then what are we doing?

15:50

And it's like, well, no, we should be co-learning.

15:54

Like I absolutely love how you kind of brought up that idea of we're learning

15:59

from each other because they've got knowledge and skills and insights that

16:02

we're not going to have. And we can't just assume that we know everything there is to know.

16:05

I mean, you have done an amazing deep dive in this clearly, but like,

16:09

there's gonna be something that a kid has tried or done that maybe it just didn't

16:14

occur to you. And so having that conversation and that collaboration and learning, so valuable,

16:20

so important. I think that's what we're missing sometimes when we get so bogged down on

16:25

the data, we forget to interact and influence, you know, as best as we can,

16:32

manipulate the data into a creative aspect, if I can put it like that.

16:37

And so in saying that, I had to devise a unit, going back to what I was saying with

16:43

the Pulitzer Center teaching fellowship, I had to design a unit of investigative

16:48

journalism that

16:51

looked specifically at how it affects marginalized communities and whose voice was missing.

16:56

So when I was looking at all the rollout of AI, me just being, you know, Awaken

17:02

Librarian, I was like, how is this affecting black people?

17:04

I had to figure out real quick. I was like, hold on a second.

17:07

When I looked at, you know, what was going on in the news and the book bans and how

17:12

they were banning books about us, for us, you know, aimed to influence

17:17

those that took a participatory interest in the black community.

17:20

And I saw the ban in there. I was like, okay, are we being banned in AI?

17:25

So that's where I first went in to generative AI.

17:29

I was like, okay, so what is generative AI saying about the black community?

17:32

That's the first thing I wanted to know. And so I asked it to write a history, a black history paper.

17:38

I just wanted to see where it was at and what it was working with.

17:40

That's all it was just an investigative part because that's what I do in my free

17:45

time. And I did not like what ChatGPT, OpenAI's tool, said.

17:50

So, you know, me being me, I was like, okay, this is what you think about the

17:55

community. Considering that I have a vested interest, I'm going to teach you what it is that you

17:59

should be looking at. Because I knew there was training data going in, and the people that

18:04

were monitoring AI in that training data room were not looking like me.

18:08

Because if they were looking like me, that paper wouldn't have been produced.

18:14

You know, if the input was

18:16

as diverse and inclusive as it needed to be, then the output should have reflected

18:21

that. And considering that the output from ChatGPT, which is what I used in this particular

18:28

situation, did not, you know, have the success that I wanted, that I felt it

18:35

needed, to give a brief overview of our contributions to this country, I was

18:42

livid. And so I was like, okay, this is a blog post.

18:46

Y'all, generative AI is popular right now.

18:49

Okay, let me tell you what I think.

18:51

And I sat down and then PBS was just like, you know, we saw your piece.

18:58

We want to publish this. And I was like, really?

19:01

Did y'all see it all? And they were like, no, no, no, we saw it just as it is.

19:05

And we want to, you know, we want to give your voice a platform.

19:09

So then I was like, okay. And I felt like, well, I get it.

19:12

I did the dark side. I gave Lord Vader's understanding of the AI galaxy.

19:18

So then I was like, well, let me, let me put myself into a bootcamp and I signed up

19:23

for an AI bootcamp. And I was like, let me learn as much as possible.

19:27

Cause whenever I encounter information, I know there's a good side to it.

19:32

No, there's a valuable side. And so I wanted to give myself as much level playing field as I could, as it

19:39

related to AI. So I was like, teach me as much as you can.

19:42

You know, I'm one of these geeky people that will sit there in their free time and

19:46

do the coursework in order to get the certificate so I can understand AI.

19:50

And that's what I did. And so then I got a better understanding.

19:53

I was able to ask, how can you teach this?

19:56

How can you engage kids with this? How can this be fun?

19:58

How can I use this for literacy?

20:00

So that's when I came up with seven innovative, you know, ways to use literacy

20:05

prompts using ChatGPT. And I had a lot of fun with it.

20:08

I was doing everything from, you know, writing sassy,

20:13

AI-generated out-of-office replies using Beyoncé songs.

20:18

I had Edgar Allan Poe writing an essay to advocate for kids to read scary books.

20:24

I had AI generate a campaign for running for mayor for someone that is 18 years old

20:31

and never ran for mayor before. So I was having fun with it, doing different ideas, because I wanted teachers

20:38

to be able to have that same kind of fun.
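[Editor's note: for readers who want to try one of these literacy prompts themselves, here is a minimal sketch in Python using OpenAI's official client. It assumes the openai package is installed and an OPENAI_API_KEY environment variable is set; the model name and prompt wording are illustrative, not the exact ones from Jean's lesson.]

    # Minimal sketch: send one literacy prompt to a chat model.
    # Assumes `pip install openai` and OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()

    prompt = (
        "Write a short persuasive essay, in the voice of Edgar Allan Poe, "
        "arguing that young readers should be allowed to read scary books."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice; any chat-capable model works
        messages=[{"role": "user", "content": prompt}],
    )

    print(response.choices[0].message.content)

[Swapping in the Beyoncé out-of-office reply or the mayoral campaign is just a matter of changing the prompt string.]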

20:41

And then other companies started rolling out their programs and you know, I've used

20:45

AI in Canva. I've used Magic School AI.

20:50

And so I'm having fun with it, but the investigative side, going back to what I

20:56

was saying, I had to, I wanted to hear what the kids had to say because that was

21:01

the most important. So I put together a unit where I was going to teach the kids how social media and

21:09

AI are in a marriage, so to speak, and then how the algorithms that AI has can be manipulated

21:22

and can spit out bad things and can, you know, be biased against certain genders and

21:29

certain communities. And so I had to give that equitable kind of focal point on AI,

21:36

and then how some schools were using

21:41

biometric data and using AI algorithms to monitor students.

21:46

And so I've been teaching that lesson to some high schoolers as part of the

21:51

investigative journalism. I was teaching some high schoolers in Chicago last week.

21:55

And so I presented this lesson and the articles, and they looked at the data, and

21:59

they had so many points.

22:01

They were like, wait, whoa, hands popping up real rapid fire.

22:04

Whoa, whoa, miss. So you're saying that AI is reading our posts on

22:10

social media? And I say, yeah, there's an algorithm out there to do it.

22:13

I said, you have Navigate360 Detect, you have Social Sentinel, you have Securly, you

22:19

have Gaggle, or I think that's what it is, whatever it is, but different things that

22:25

literally have relationships with all of your social media content producers,

22:30

whether it's Instagram, whether it's, you know, YouTube or Facebook.

22:38

You know, all of the different big three, they had relationships and they were

22:42

reviewing those posts. And then based upon those posts, they would send out safety security alerts.

22:47

And so I shared a story of one guy that attended Arizona State University.

22:53

He was a football player, and, you know, the coaching staff changed and he was,

22:58

you know, affected by it. He was upset about it and he tweeted some things, and an AI algorithm picked it up.

23:04

And before he knew it, you know, a SWAT team was at his

23:07

dorm room and the barrel of a gun was at the back of his head, and he was being charged

23:11

with felonies for, you know, knowingly disrupting a school.

23:16

And he never knew.

23:20

And the school, the students at that school never knew that they were being

23:24

monitored by an AI big brother named Social Sentinel.

23:28

And that's what flagged him.

23:30

And so when I saw that article, I was immediately upset because of course this

23:35

kid comes from a community like mine. This could be my brother.

23:40

And so my brother was a big football player.

23:42

And I saw that and I was like, but what about your First Amendment right?

23:47

But what about your Fourth Amendment right? You know, so I'm looking at it from a

23:51

legal standpoint because both of my parents are attorneys.

23:55

They met in law school. And so everything I've ever encountered, I used to read law books for fun.

24:01

That's another reason why I'm nerdy.

24:03

Cause you know, I used to try to help my mom in her office as a paralegal when she was starting up her

24:08

practice. So, you know, I knew how to read law books and what to look for because my mother

24:13

taught me at a young age. And so when I was looking at the terms of service and agreements for these AI

24:19

companies, and how there really weren't legal guidelines for them that should be

24:25

transparent as far as what's being done with that information and how notification

24:29

would be given to the public if they're in a place that's monitored

24:33

by drones. Because in one of my investigative articles, a girl was

24:38

followed home by a drone and she didn't know what it was, scared her, turned out

24:42

to be the school safety police drone, was just trying to make sure she got home

24:46

okay, safely, because she was walking late at night.

24:49

But at no point did they notify the students that they were gonna be monitored

24:55

on social media or by unmanned flying drones

24:59

that could identify who they are, follow them all around campus.

25:02

They had no notification of that. And so last week, when I presented that to graduating high

25:08

school seniors, I was like, this is the capability of AI.

25:12

And so I'm like, these are the colleges that said they would use it or would not

25:17

use it. It changed how they were thinking.

25:21

You could see it right there in their face before their eyes.

25:24

They were like, wait, well, well. So they made me go back and slow down because I was clicking through the slides

25:28

too fast on the names of colleges that said they would use this, you know, AI big brother

25:33

monitoring or they would not. And kids are like, oh no, I was considering that school.

25:37

But even if they send me, you know, an acceptance letter, I ain't going.

25:41

And I was just like, well, I'm not trying to, you know, un-recruit you, but at the

25:47

same time, I want you to be aware because these companies are playing on the fact

25:52

that you have young 18, 19 year olds that are unaware of their rights.

25:56

And so they're participating and they're unknowingly giving consent, whether it's on

26:00

their phone for their face biometrics and you know, Apple's getting that data and we

26:05

hope Apple is going to do right with that data, but we don't know who they're

26:09

selling that data to. And so one of the kids that I was teaching last week, he's like, hold up, with the

26:14

facial recognition software, you're telling me that if the police have a

26:19

relationship with this company that, you know, monitors and identifies everyone's

26:24

face... and if the computer,

26:26

say, like, if somebody hates me and they hacked into the police computer and

26:30

they put my face and say I was at this location even though I know I was sitting

26:35

on my couch playing video games? I was like, well, technically, the way you're

26:40

situating this scenario, if that is the case, yes, you

26:45

may have to defend where you were against the computer. So who's going to charge a

26:50

computer if it makes a mistake, if it messes up?

26:53

Yeah. Where's the accountability and the transparency on that?

26:57

And so the kids were just like, I've never seen kids back away from a technology

27:03

before, but listening to the rules that they came up with after, you know, this

27:10

deep dive in investigation into social media and AI and the biases that are

27:15

associated with it. When they saw it, it was just a completely different tune.

27:20

And the whole reason why I was taking them through that process is because a lot

27:25

of schools right now don't have an AI policy.

27:28

They don't have an AI users policy. Here in the state of Texas, they have, you know, shushed our students.

27:34

They've blocked a lot of the, you know, AI websites and tools and stuff to use for

27:38

fun because I guess they're still trying to get their wits and understanding about

27:42

AI. But at the same time, I don't think they're hearing what the kids are

27:49

concerned with, because the kids are having legitimate, mature concerns.

27:55

And I feel like, you know, any school district that really wants to show they

28:00

care about the kids, give them a seat at the table.

28:02

If you, you know, if you want kids to abide by policies, give them input into

28:06

those policies. It would mean a lot more. Like the kids that I had last week: they had to create

28:13

four policies.

28:17

In order for them to do that, we had to do the deep dive into the framework.

28:23

But the policies they came up with were so very rich.

28:26

You know, they were just like, hey, I don't want you

28:29

doing anything with my likeness or image that's going to violate my Fourth

28:32

Amendment right. I don't want you to do anything with my image that can be

28:37

manipulated or sold to a company that can manipulate it.

28:39

They were thinking so far ahead on

28:44

the capabilities of it and the mutations of AI.

28:47

And I'm like, this is what the school district needs to hear.

28:51

So that's why that listening was so important to me.

28:55

I had them paraphrase an AI policy.

28:57

We were looking at what the EU did; Europe just released their AI Act that's supposed

29:04

to govern the whole planet, ideally.

29:08

Now, which companies abide by it?

29:10

We don't know. You know, it's just...

29:12

It was just written back in, I think it was October, November of 2023.

29:15

So there's a grace period in between the adoption of these new guidelines and

29:21

policies of what everybody has to abide by.

29:23

But I want to see how the AI, how the machine is trained.

29:30

I want to look at that training data. I want to know who's in that room to make decisions as far as what this input is,

29:37

because I've looked at the data, and right now AI is

29:42

embracing our flaws as though our flaws don't need to be fixed. And as a society,

29:49

we know we haven't fixed all of the problems that have gone wrong in just

29:53

existing and cohabitating with each other. And now we're trying to transfer

29:58

this semi-enlightened intelligence to a machine that might be, let's just level: a

30:06

computer will be, and has been proven to be, smarter than the average human.

30:11

So if you have this computer that's constantly thinking, that's constantly

30:15

gathering information, to develop a simulation of thought like a human,

30:23

eventually it might look at the input data that we're giving it, see the trash for

30:28

what it is and decide, ah, I don't know if I'm going to follow that data, that

30:35

algorithm, because I know it produced bad results; I've looked at and analyzed all

30:40

the other instances and algorithms that had this specific code in it and I know it

30:45

leads to XYZ.

30:47

Which depends on the system itself being able to recognize that the results that we

30:53

have been getting are not good results.

30:55

And I mean, it's being trained on the data that created the world that we're in now, and

31:01

that's, you know, you're trying to look at the outside of the box from inside the

31:05

box. It may not be feasible.

31:08

So yeah, boy.

31:10

How to interact with that information, how to assess it.

31:12

So this is where the literacy and intellect of your librarians come in.

31:17

And so when I look at school districts that are removing their librarians and the

31:22

areas that it's happening in, they're putting those communities at a

31:26

disadvantage because you don't have the intellectual human being that's going to

31:30

create that buffer that is going to monitor that data.

31:34

or that is going to teach kids how to assess and analyze the data, you're taking

31:39

out the central core processor of your school campus.

31:43

How you think it's gonna work? We always talking about the library is the heart of the campus.

31:50

Well, you just got heart failure. Now what are you gonna do?

31:54

Where's the stent? Where's the medication? How are we gonna open up this vessel again?

31:57

Because we gotta get this working.

32:00

Because if not... We're running out of time because this computer is going to figure it out and it

32:06

might figure out that we're the pest, we're the virus.

32:08

And then what happens? You know, it's just, it's amazing.

32:13

And so I want the kids to look at it from that, that perspective of, okay, in the

32:20

sense of viral, as it relates to AI, how bad can it get?

32:24

You know, how preventative do we need to be?

32:28

And so giving the kids access now,

32:31

while it's hot, while we're still in the formation stage, while we're still in the

32:35

input, while we're still trying to work out the kinks, I think it's imperative.

32:38

That's the reason why I wrote this lesson. I want the kids to hear it.

32:42

I want the kids to interact with it, to think about it, to get angry, to find

32:47

fault, to poke holes into whatever theories or whatever public announcements

32:52

are being told to us, and then looking at the data.

32:56

You're seeing what's being spit out, seeing who's being banned,

32:59

whether it's shadow banned or whether it's book banned or whether it's a gender

33:05

ban, because AI has been proven to be sexist as well. One part of the

33:12

deep dive into the lesson was looking at the algorithmic biases against women that

33:17

AI is perpetuating.

33:19

And so, the policing and monitoring of the pictures that we put on

33:24

the internet, and how women are scored on this

33:28

racy, sexy, explicit scale. You know, they rate pictures on how sexually

33:33

explicit they are, and sexually explicit could be a girl in a bikini.

33:39

But because she's in a bikini, she's rated, you know, higher on the explicit scale, where

33:45

a topless man, or a man in, what is it, the skinny little briefs that they would

33:51

wear as a swimsuit, Speedos. Yeah, a man in a Speedo, ha, no problem.

33:56

You know, he might score 20 on this racy-explicit scale, but a girl in a two-piece

34:01

bikini is already at 86.

34:03

Well, what is the data telling you? What is it telling young girls about how their body is going to be rated,

34:10

or degraded, in whatever way?
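[Editor's note: the scoring gap Jean describes can be checked directly by sending comparable photos to an image-moderation service and comparing the "racy" ratings it returns. Below is a minimal Python sketch of that audit pattern; racy_score is a hypothetical stand-in for whichever moderation API a class investigates, and the file names and threshold are invented for illustration.]

    # Bias-audit sketch: compare a "racy" rating for two comparable images.
    # racy_score() is a hypothetical stand-in for a real moderation API call.
    def racy_score(image_path: str) -> int:
        """Return a 0-100 raciness rating; swap in a real moderation service here."""
        raise NotImplementedError("hypothetical stand-in, not a real API")

    def audit_pair(image_a: str, image_b: str) -> None:
        a, b = racy_score(image_a), racy_score(image_b)
        print(f"{image_a}: {a} | {image_b}: {b} | gap: {abs(a - b)}")
        if abs(a - b) > 30:  # invented threshold for "worth discussing in class"
            print("Large gap: ask whose data trained this model, and who labeled it.")

    # Example call (file names invented):
    # audit_pair("man_in_speedo.jpg", "woman_in_bikini.jpg")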

34:12

So just looking at that data. And then, you know, the same with the input for these images that are

34:19

created with AI. What is the data that the model is being trained on to create these images?

34:26

And how are these images abiding by copyright law?

34:31

And how is it stepping all over copyright? Because as librarians, we got to know that.

34:36

When a kid comes to print out a picture and they want to put it in their project,

34:39

I don't want it to be so much like some kind of Disney princess that the sleeping

34:46

Disney spies pop in like, hey, that's a violation of copyright.

34:49

And they try to carry off a little kid.

34:51

That's a princess story we don't want to happen.

34:55

But no, seriously, these are the things that we have to think about as librarians,

35:01

as educators, teaching our kids to discern between something that's real and

35:06

something that is AI generated. It's what we're threatened with right now as a country, as we get ready to go

35:12

through another set of elections.

35:14

You have AI bots out there taking people's voices and making them say and do all kinds

35:20

of crazy things. And so you may not know

35:23

that the person you're speaking to on the phone is actually some dude in his

35:28

basement in Montana pretending to be a political figure.

35:31

You don't know it because it sounds so much like it and the technology of AI is

35:36

so sophisticated that it can do that.

35:39

So look at it this way: look at the people who benefit from that.

35:43

You know, who benefits from something like that going on, and how are we gonna be able

35:46

to discern the difference? So yeah.

35:50

You have done a heck of a deep dive.

35:52

This is amazing. You are such a font of knowledge on AI and it's awesome.

35:57

It's amazing. You're bringing up all these great points that we really need to consider.

36:01

I mean, we personally need to consider, but also as a profession, school

36:06

librarians and educators need to consider.

36:08

The lesson that you had shared out, using AI for literacy:

36:15

if I'm a student and I'm coming in and you're gonna be doing this lesson, what

36:19

might I see as I come in and you're ready to do your using AI for literacy lesson?

36:28

Yeah. I wrote it almost a year ago.

36:37

Bear with me. Oh yeah, no, no problem.

36:47

You're talking about the one I did with Matt Miller, correct?

36:50

Ah... April 28th, 2023, using AI for literacy.

36:58

Okay. Matt Miller, he's a phenomenal man in tech.

37:05

He's like a tech guru god.

37:07

And so when PBS picked up my article on ChatGPT

37:14

as it relates to black history, this was part of me doing an investigation of the fun

37:19

side of it. Now, as for my kids,

37:22

I did not roll this lesson out with my middle school students.

37:26

I had to teach this in a workshop over the summer to some high school students.

37:31

And so what does it look like?

37:33

First, I love interacting with technology.

37:37

So you're going to see my kids at the computers.

37:39

We're going to be, you know, logging in, investigating, playing with ChatGPT.

37:45

So at that time, ChatGPT had just rolled out; anyone with a, you know,

37:51

Gmail account could have signed up for it and started tinkering around, playing with it.

37:55

And so that's exactly what we did. I wanted to see how creative they could get

37:59

in the different types of ways that we could use prompt writing to get the

38:05

result that we want. So when I was talking earlier, you know, I had an example where we had kids

38:13

writing poems from the dead. So the example that I used was Edgar Allan

38:18

Poe, and you know, one kid did Stephen King.

38:22

He had ChatGPT write

38:25

about something with Stephen King as though it were Stephen King.

38:29

So I wanted to see the kids play and interact with it.

38:33

And so you'll see the kids typing.

38:36

You'll see them playing. You'll see them interacting.

38:39

My kids love to, you know, do weird things with AI.

38:44

So one of them, you know, had the AI voiceover, and so they had

38:48

the Joker singing certain other tunes.

38:51

And so they had the curiosity and the fun.

38:56

So you're going to see us doing things like that. I believe in showing the

39:02

kids engaging with the material as producers.

39:04

So how can you make it better now that you know this, or how can you improve upon it?

39:08

Or what did you find out that was wrong in the information?

39:12

Because hey, you know, ChatGPT, OpenAI, only had data up to

39:17

2022 and that data was collected from a lot of sources, any source, anybody that

39:24

had a page on there. So I also had to teach them to, you know, not take everything that you find from

39:30

AI at face value. Read it, because even the terms of service will tell you: hey, this

39:36

is a toy, this is fictional and this is fun, but, you know, we're not responsible. You

39:43

know, they give their disclaimers.

39:45

They put the legal aspect out there.

39:50

So many, yeah.

39:53

And I love that you're bringing in, there's so many skills that the students

39:57

maybe don't realize, and that I think educators probably don't realize, are important to teach the kids as

40:03

we're thinking about AI. Like you were talking about writing prompts.

40:05

Like if you can't generate the prompt that's going to get you the result you

40:10

want. So prompt writing becomes a skill in and of itself.

40:13

And like assessing, like if you tell it to write a poem as Edgar Allan Poe,

40:19

you need to know what Edgar Allan Poe sounds like in order to know if it's done

40:22

what you want it to do. And, you know, as you said, the terms of service say don't trust any of it,

40:27

unless you can do the research yourself and actually find where it said the thing

40:31

that it's supposedly giving you as facts.

40:34

There's so many great like information literacy skills built into thinking about

40:40

how these AI tools, how they might work, how we might use them.

40:44

But when we do use them, what are we paying attention to?

40:48

And then whose voice is missing from the contribution of the data that you're

40:52

getting back? You know, that's the analysis of it and the application of

40:59

it. So those are two things, because when I was teaching this the first time around, when

41:05

ChatGPT came out, the students did not like the fact that the

41:09

data source was limited, that it was cut off. They did not like it when I asked them,

41:16

well, where did ChatGPT get this information from?

41:19

Did you ask it to cite itself?

41:21

Whoa, oh no, I didn't do that.

41:23

Well, why doesn't it come with this? The kids, you know, they want to know the easiest way.

41:27

Well, why didn't it cite it to begin with?

41:29

Why can't it tell me where it got this information from?

41:32

Well, did you ask it to cite its sources?

41:34

If you didn't ask it to cite its sources, then it doesn't have to do it.

41:36

That's kind of like what y'all do. You know, you don't cite your sources when you write your papers sometimes.

41:41

So how do I know where you got this from?

41:44

but you want me to believe that you wrote this sentence.

41:46

And so there's that aspect:

41:49

students having to deal with the side of things that we always have to deal with;

41:52

it used to go right over their heads, but now they've got to deal with

41:55

it. And so, so funny, I was telling my kids, this is a complete sidebar, but they got

42:00

upset with me, the kids last week in Chicago, because I was

42:04

teaching them that I taught the teachers how to discern whether their papers were

42:10

written by AI or not. And one of the kids, he was so funny.

42:14

He just sat real patiently with his hand up in the air, like, hold on,

42:18

hold on, hold on, hold on. So you telling me you taught the teachers this?

42:22

Why would you do that? Why would you sell us out like that?

42:24

He was really upset.

42:28

And I was like, so it's okay that you know this information, but the

42:32

teachers are not supposed to know? And he was like, well, yeah, I thought you was trying to give us a one-up.

42:36

And I'm like, no, sir.

42:39

Hustler, hustler, this hustle is not going to work for you.

42:43

And so, you know, I was showing them how Originality.AI and a lot

42:51

of the different tools that will

42:54

determine and grade how much of a project is written by AI.

42:57

But here's the thing, that could be wrong because I wrote a lesson in collaboration

43:01

with AI, and I felt like, you know, it was more like a 60-40, maybe 70-30, split on

43:08

the writing. But when I put it into the AI content distinguisher, for lack

43:14

of a better word, it said that, oh, 90% of it was written by AI.

43:18

And I was like, you lying, 'cause you didn't even do

43:20

your citations. You didn't cite your sources.

43:24

I had to go get this video. I had to find this article because when I asked you to find an article, you gave me

43:30

an error message. So, you know, don't sit here and tell me you wrote it,

43:35

because I know. So I was showing the kids, and they were just like, well, great,

43:40

now you sold us out like that. I was like, well, in addition to that, your professors can go into your document

43:46

version history and look at it, and they can see where a big input

43:50

of information happens. So either own up to the fact that you used AI with this project, show where you

43:57

used AI, cite it, show how you made it better or how you took that idea and

44:02

further developed it. Then you will get credit for the AI and you will get credit for being honest.

44:08

The problem is you guys want to pass it off as though you did it and that's where

44:12

you're lying. That's where you're getting caught.

44:14

So just own up to it, properly cite it.

44:17

I accessed AI on this day. This is the prompt that I gave it to get this information.

44:21

This is how I made the information better.

44:23

Just cite it and you will never ever have a problem with AI.

44:27

We just wanna know where it is, what words are yours and what words aren't yours.
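[Editor's note: for a concrete model of that kind of disclosure, MLA's published guidance on citing generative AI follows roughly this shape; the prompt text, version label, and date below are hypothetical, not from the episode.]

    "Write a persuasive essay advocating for scary books, in the voice of Edgar Allan Poe" prompt. ChatGPT, GPT-4 version, OpenAI, 8 Apr. 2024, chat.openai.com.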

44:33

So, you know, I just have, it's the same thing that we had to teach kids when, you

44:38

know, technology first came out. What is a reputable source?

44:42

How do you know this is a reputable source? How do you cite this source, whether it's APA, Chicago, you know, MLA format, whichever

44:49

it is. Heck, AI is so sophisticated right now, all you gotta do is tell it to do

44:53

that. And it will get the source in that format in less than five seconds, which, hey,

44:58

that's an improvement, because when I was in college, you used

45:03

to have to go to the MLA format handbook and find the way it was properly typed

45:10

in there. Yep.

45:13

you know, and model your stuff behind it. Y'all kids don't even know how spoiled you are.

45:18

You entitled little suckers.

45:22

Yeah, no, totally, totally.

45:25

Man, I love it. So I came to talk about your using AI for literacy lesson and I've gotten so many

45:32

more lessons within the discussion, just thinking about what

45:38

we should be thinking about and how this is going to impact our students,

45:41

ourselves, our profession. Boy, it's like an onion of great ideas:

45:47

every layer that you share reveals another layer that we need to be thinking about.

45:51

It's so great. I love it. Man.

45:53

I think it's so important that I don't keep it in my brain.

45:57

This is the reason why, you know, this podcast is important to me because it's

46:02

great, yay, Jean, you benefit from it. But unless you're sharing it with someone else, it does no

46:08

good just being stored up in my head. I have to share and show other educators how it can be manipulated for good, for

46:15

bad, for, you know, for whatever reason.

46:18

So yeah, I want to, I want to do things like this.

46:21

I literally just taught a college class on Friday about, you know, what I'm

46:27

looking at as far as the work that I did for the Pulitzer Center, but also how you

46:33

can use it as an educator to entice your kids.

46:35

Don't be afraid of AI.

46:38

Get your, you know, verbiage right so the kids understand how they can use it

46:42

because I have to teach for the future,

46:45

not just now. Our kids will need to understand how to engage with AI.

46:52

The faculty will need to understand how to engage with AI.

46:56

We didn't think when the computers rolled out that they were going to be as involved

47:00

in our lives. We didn't know we were going to be carrying it around.

47:03

You can't be, you know, 10 minutes from your technology.

47:07

It has to be in the same room with you.

47:09

We did not know. And so I feel like we're on the cusp or at the tip of a very large iceberg as it

47:16

relates to AI. The longer you use it, the more involved and engaged with it you become.

47:21

You don't have to worry about AI taking your job.

47:24

You will have to worry about it if you stick your head in the sand, you're like,

47:27

no, I'm not going to learn one more bit of technology.

47:30

I'm not going to move with it. Then you do have, you know, job insecurity, because you're flat out

47:37

refusing to engage with something that is transforming the way we communicate.

47:43

You know, AI is powerful.

47:46

But everybody needs to do their part in inputting how it's going to be used, how

47:50

it's going to be made clear in its expectations around privacy violations and privacy

47:58

protections. So we got to look at it, all of it.

48:02

And so school is the first place. That's where you learn your social customs.

48:06

That's where you learn your academics, your human behaviors.

48:09

And so to not embrace AI in a school setting,

48:13

you're setting yourself up and you're setting your students up for failure.

48:16

It's just the bottom line. So I'm trying my best to do what I can to help and to educate students or faculty or

48:26

even as an education consultant.

48:29

I will share everything in my brain.

48:32

If you have the time, I will pour this sucker out for you.

48:38

And if you have a question I don't have the answer to, you better rest

48:41

assured that within 24 to 48 hours, I'm gonna find that answer, because it will

48:47

nag my brain not knowing and not being able to help or assist on that, because

48:53

that's what I feel like librarians naturally do.

48:57

We are, you know, the help mates in the school.

48:59

If you don't know where to go, if you're lost, find a library.

49:02

If you don't know, you know, where to go with a lesson, collaborate with your

49:07

librarian. If you don't know how to plan a program,

49:13

that's what your librarian is for. If you want to talk to somebody but you don't know how to set up a podcast, call

49:19

your librarian because they can teach you about Riverside in a minute.

49:22

So I don't see how people are not up in arms about what's happening to our

49:31

libraries and not trying to use all of it because we're more than just books.

49:35

We're books. We're technology.

49:37

We're social workers. We're therapists.

49:40

Mm-hmm. We're playmakers.

49:45

Safe spaces, movers, shakers.

49:47

We're the... Absolutely.

49:50

I mean, like you said, heart of the school.

49:56

We are all of it. And then...

49:58

and some libraries are even wedding venues. I did not know a library could be

50:04

a wedding venue. And it's like, when you think about some of the classic, you know, libraries,

50:12

whether that's, you know, the library in Philadelphia or Chicago, you know, there

50:17

are various different libraries. But yes, I learned that libraries can be wedding venues and that just, I'd never

50:25

thought about it like that.

50:30

So it amazes me what, you know, all the many facets of the library.

50:35

Yeah, well, this is amazing, so thought-provoking, so many things to not just

50:42

feed our brains, but to really get our professional thought processes and juices

50:50

going, to think about how we can bring this to our students.

50:53

I'm so glad that you came to share.

50:55

We're now gonna go in a completely different direction.

50:58

It's time for our book break.

51:01

So, for the book break, you get to share any book you like.

51:05

Anything: it could be personal, professional, for kids, for staff, for students, for

51:08

yourself, just a beach read, whatever.

51:10

Anything you like. What's a book you think people should know that they should try and get their hands

51:15

on? Okay, I will discuss my book pick, Legacy, and the subtitle is A Black

51:27

Physician Reckons with Racism in Medicine.

51:30

I think it is super important that we start discussing this, especially this last week, where

51:37

there's a law coming out banning DEI, any type of DEI instruction, in

51:43

the field of medicine. And I feel like

51:47

we are rolling back in the worst of ways if that is what's happening, because equity

51:53

and inclusion in medicine, that goes back to the Hippocratic Oath of do no harm.

52:02

DEI is affirming that do no harm.

52:05

So this is by Dr. Uche Blackstock and she talks about how she is a second generation physician.

52:12

Her mother was a physician, and she and her twin sister

52:15

became physicians as well. And it talks about her experience as a young black doctor in America.

52:21

And a lot of the instructional part, the data as it relates to the black

52:30

community, is so biased, so jaded, so misinformed that you have doctors

52:36

believing that they're following proper medical practice.

52:40

But that medical practice never consulted black doctors; it just took what they

52:44

internalized and understood, but never practiced and applied, about the black

52:49

community, and they're using that as medical knowledge to train all doctors.

52:55

And so it's scary,

52:57

some of the things that she discusses in here and how doctors relate.

53:01

And so I looked at that and the reason why I wanted to read it is coming out of the

53:06

pandemic and seeing how it

53:10

decimated, you know, communities that look like mine.

53:14

It was important for me to understand why, knowing what I have experienced in my

53:19

various medical issues: not being listened to, not having a doctor to

53:24

advocate for you, having to get up and walk out of a doctor's office, you know, almost

53:28

eight months pregnant because at that point in time I was done with that doctor

53:33

not listening to me, and having complications after that, and knowing

53:38

the complications that other women who look just like me have faced.

53:43

It gets back to an instruction part.

53:46

We have to get back to the diagnosis of the why: why is this happening?

53:50

Some people have to be taught that way. And so she goes into the deep dive of how these biases are killing people and how

54:00

it's a complete violation of the one thing that they're supposed to do, which is do

54:04

no harm to a human that's in pain or that needs your help, with your skill

54:09

set, to save that life. And so when I think about the

54:16

segregation, how you had black hospitals and how you had, you know, white-

54:21

only hospitals, and I think about how, when Dr. King said we were integrating, he was

54:27

trying his best at the end. He was saying, I'm afraid that we're integrating into a

54:31

burning house. It makes me wonder

54:34

if some of what the medical racism is, is the burning house.

54:39

Because when a person is under and they're unconscious, anything can be done to them.

54:43

They can't defend themselves. And there are only certain people in that room.

54:47

And you gotta know that that person that's in that room, when you are knocked out,

54:52

unable to defend yourself, will still defend you and will still help you and

54:56

will still give you the best of their abilities.

54:58

And so in reading her story, and she went to Harvard

55:03

Medical School, seeing how some other doctors treated her and how some fellow doctors in

55:10

training treated her, and listening to her story, understanding what her mother went

55:15

through and what she and her sister went through, it's riveting.

55:19

And I think about it in a ghastly

55:25

way, because it even extends into how our soldiers were treated after they returned

55:32

home from various wars: World War I, World War II, and the Vietnam War.

55:38

In my community, the Vietnam War was a

55:43

long enough war to where it was like a World War III, because the Vietnam War went

55:47

on for years upon years, way longer than World War I and World War II combined.

55:53

And then the effects of it: so many community members coming back, you

55:58

know, with diseases

56:01

from the chemicals that were sprayed during Vietnam and the Agent Orange and

56:05

how our community was treated afterwards.

56:10

And in the medical sense, whether they were taken seriously.

56:16

And then how that mentality, everything from how they were treated with their

56:20

disabilities coming back home from war, how they were treated as it relates

56:27

to the Tuskegee experiment.

56:30

Yeah. The apprehension that went along when the, you know, COVID-19 vaccine came out,

56:36

and a lot of the hesitation.

56:38

This is the stuff that was knee-deep in there.

56:41

I feel like everyone should read it because it's going to give you a perspective.

56:48

Case in point, I'll tell you a small story.

56:52

I stopped talking to a sorority sister of mine because she minimized my racial

56:57

experience in, you know, medicine.

57:00

And she would downplay a lot of the concerns that I had:

57:03

"Well, you don't understand how doctors talk and you don't understand how nurses

57:06

communicate. I've worked in the hospital and they're just mad."

57:09

I was like, no, this doctor was not listening to my concerns.

57:12

And so it got to a point where I had to stop communicating with her, because I felt

57:18

like, you're dead wrong. And she was mixed race.

57:20

So, you know, I thought she would understand, but it seemed like she

57:26

didn't. And I wonder if this is what it feels like when some patients go in and

57:31

they're trying to talk to their doctor and communicate what's going on, and their

57:34

doctor has willful, you know, guards up in their ability to relate and empathize

57:40

with their patient. So that's why I feel like everyone should read it, because she has such storylines and so many

57:47

different stories that even if you're not a person of color, you can see

57:53

how the biases and the marginalization and the teaching and practice have

57:59

influenced how doctors treat people of color.

58:03

And so it's, I feel like...

58:07

It's a mass equity read we need to do together as a culture, as a society, just

58:13

in wanting to improve and help your fellow man.

58:17

I think that's why you should read it to understand your fellow man.

58:22

Definitely. She has a fascinating story, and I cannot believe these are medical professionals.

58:29

Sometimes I have to stop because I'm like, this is a medical professional acting this

58:33

way. Like how, you know, how are you letting your

58:37

personal hatred take over and not feeling this as a sentient intellectual that has gone to

58:42

school for extra years to understand the human brain and the body and the biology?

58:47

How can you still not understand, interacting with these blinders on?

58:52

So it's crazy. Yeah.

58:55

Yeah. How can you ignore the reams and reams of data that have been collected that show

59:00

what a different experience people get when they go in to see their medical

59:06

professionals, based on how they look or how they might be different from their care

59:14

provider. It's just, it's mind-blowing.

59:17

This is absolutely a book that is going on my TBR short list, because this just

59:22

sounds so fascinating. Wow. It really is.

59:25

I...

59:28

You know what? She's doing such a fine job.

59:32

I wish a lot more media would listen to her story.

59:37

You know, CBS Good Morning did a segment with her that I thought went phenomenally

59:42

well. So definitely look at that because she did...

59:47

I mean, it's only six or seven minutes, so they can't go that deep into it.

59:51

But I think it's critical

59:53

right now, especially when you have legislation that's coming out to say, no,

59:58

we're no longer going to teach on this aspect of, you know, doctors are not

1:00:03

supposed to solve racism. Are you serious?

1:00:06

Doctors are the first ones that know that, scientifically, racism has affected the

1:00:10

body. How are you going to sit up here and tell me that's not a doctor's problem to

1:00:14

understand and instruct and teach on, to be helpful and preventative for, you

1:00:19

know, up-and-coming doctors that are about to go through the program?

1:00:23

So when I look at our headlines that are saying that, and I look at the structured

1:00:27

attack as it relates to who they're coming after: they came after the

1:00:33

intellects in the school system because, it might be a, you know, occupational bias,

1:00:38

but librarians are typically just the smartest people in the room.

1:00:42

Okay. I know, I know I said it, but usually we are, we're pretty doggone sharp.

1:00:47

Okay. You're not gonna pull one over on us too easily.

1:00:51

But now they're going after the intellects in the field of medicine.

1:00:56

And I'm sorry, but that's here.

1:00:59

That's oversight, overreaching, you know, trying to get us into a hole, I'm sorry.

1:01:05

You are gonna need the education. You're going to need the clearness of mind not to be manipulated in that way,

1:01:11

especially when it comes to health.

1:01:13

You know, now we're talking about longevity

1:01:16

and messing with people's livelihoods and being able to survive.

1:01:20

And you better be concerned with it.

1:01:24

It's your prerogative as far as it relates to you wanting to keep your medical

1:01:28

license as a doctor, you must understand, you must stop this.

1:01:32

And I feel like they kind of tangled with it with Dr.

1:01:39

Fauci as we were going through the whole pandemic and how they tried to discredit

1:01:42

him. And I thought that was a precursor to what they're trying to do now with doctors and

1:01:47

removing, you know, DEI instruction.

1:01:51

As in, oh, well, we were able to make Fauci resign and want to toss up his hands and

1:01:56

walk away from the madness. And now we're going to target, you know, doctors of color, which, when you look at

1:02:02

the doctors across the nation, only 5% of them are black doctors.

1:02:07

And when you have a 12 to 13% population of, you know, black people,

1:02:13

and only 5% of doctors are black, that's scary.

1:02:16

It's even scarier when you get into the political arenas, and when you get

1:02:20

into the CEOs of companies. So that marginalization we're used to, but tired of.

1:02:27

And I feel like everyone is. It's not just black people that are tired of it; we are all,

1:02:32

as Americans, united, sick of the bull.

1:02:36

And so, if you want to start playing with the schools and censoring what we

1:02:40

read, fine, we'll get it. I'll get the information one way or the other.

1:02:43

I ain't going back in time. So that means I gotta be an underground speakeasy library.

1:02:48

I'll give you a certain knock and a code, and I'll do what I need to do to get the

1:02:52

resources to you. But I hope that's not what we're gonna have to do for hospitals.

1:02:57

I don't wanna have to go to an underground speakeasy hospital because

1:03:01

they believe in DEI, versus the above-ground hospitals that won't support DEI.

1:03:10

Yeah. That's not something we should have to do as a society.

1:03:12

We have to have learned from the history books.

1:03:15

Come on now. Yeah, seriously.

1:03:18

Well, I'm definitely gonna be coming to your speakeasy library.

1:03:21

So that goes without saying.

1:03:24

Thank you so much. You have given us so many things to think about, so many great resources.

1:03:29

I cannot thank you enough for taking the time. I'm so glad we got to do this and hopefully we can do it again sometime,

1:03:34

because you are just such an amazing font of knowledge and I love talking to you.

1:03:39

So thank you so much. Thank you so much, my friend, this has been a blast. And yes, I promise next time I might

1:03:46

choose a lighter lesson, one where I won't have to give you a bittersweet

1:03:52

understanding of it. How about that?

1:03:54

I do love AI.

1:03:58

I just want it used in the equitable manner

1:04:04

it should be.
