Addressing the Risks of AI with Innovative Solutions Encrypting Communication

Released Monday, 13th May 2024

Episode Transcript

0:00

Well, Ari, it's great to finally get you on the podcast.

0:03

You know, I think that we've been planning this since, like last fall.

0:07

I know.

0:08

It's insane how having a kid can just like completely alter everything.

0:15

You know like every day is a different day. It's like pick her up from daycare.

0:19

It's like, oh, you're really sick today.

0:21

I guess the next week is now taken up for me.

0:26

You know, taking care of you.

Yeah, I, uh, I don't.

0:30

I don't have a kid yet. In fact, my wife and I are probably soon on the way there. But I certainly have a lot of friends with kids, and I know that it introduces an entirely new set of both challenges and opportunities and new experiences.

0:50

In fact, my co-founder just had his third kid a couple of months ago, so we're sitting on Zooms all day with a little baby on his lap a lot of times.

1:02

Yeah, yeah, that's how it works out.

1:04

It's fascinating, right?

1:08

It's almost like you get to experience a new phase of your life, like,

1:13

what you were like, potentially, you know, but from the outside. Because, like, how you remember being a kid is totally different from how, you know,

1:22

your parents experienced it, right? Um, does being a parent,

1:27

do you find it gives you sort of a new respect or new understanding of your own parents?

1:37

Yeah, I would say so, for sure.

1:40

Now I can understand their struggles a little bit more, and I guess the flip side as well.

1:52

So I never really had a good relationship with my dad, and I always gave him the benefit of the doubt, like, well, maybe things change when you have a kid, or whatever it might be. And maybe I'm too early on into it, you know, but it kind of, I guess, reinforced it in the wrong way. It's like, hey, he had no excuse to be this way, right? Um, but it's awesome, it's my favorite thing in the world.

2:25

I enjoy every minute of it.

2:28

You know, my wife, she's an early childhood teacher, so she's used to kids, right?

2:37

So I was going into this thing not knowing anything, and I was so nervous, you know. And even the times where you're up every two, three hours in the middle of the night, and you're operating on zero sleep, I still didn't complain. It wasn't miserable. Yeah, it was tough, I had no sleep, but overall it was like, oh man. Me waking up at 4am, which, I'm not a morning person, right? So me waking up at 4am with this little baby, and she falls asleep on my chest, like, that's the dream.

3:22

I don't need anything else. You know what I'm saying.

3:25

Like, yeah. And sure enough, you know,

3:27

now she's too big to do that, right? Like if she tries to sleep on me, it's uncomfortable for her, you know.

How old is she now?

3:37

Uh, she is 14 months.

Oh wow, cool. Yeah, you've been in it for a little while now.

3:47

You have kind of the protocol.

3:49

Yeah, yeah. You know, it took a while, but we're in a schedule, we're in a routine now.

3:56

So and she gets it too.

3:58

You know, like today, she did not want to go to daycare, and when I showed up, she's, you know, crying a little bit, complaining, but she knew that she would go.

4:10

And then there's other days where she can't wait to get out of my arms, right? Like, get out of here, you know. But it's really a fascinating thing, in my opinion.

4:26

I mean, it is our prime directive, right? That's what we're all here to do, to procreate and to make more humans. And I think there's something truly innate, like very deep within all of us, of, oh yeah, this feels right.

Yeah, yeah. I mean, we're 14 months in and I'm already pushing my wife like, hey, let's get another one going.

4:50

You know, let's get all this sleep deprivation out of the way immediately.

4:53

Yeah, you know, people tell you about sleep deprivation with a kid and whatnot, and it was literally two entire months that I do not remember, you know. And I went to dinner with people.

5:09

I had conversations that impacted, you know, the podcast and everything, months later. And I would be talking to a friend of mine, and he'd be like, oh, don't you remember, we were at this dinner and they were gonna sign up for this thing or whatever.

5:28

Like no, I actually don't remember that taking place at all.

5:32

You know, it's crazy how your mind just blocks it out and you somehow get through it. And humans, if nothing else, are extremely adaptable.

5:47

Yeah, yeah, that's a good point. I mean, I'm not a morning person by any stretch, right? Like in college, if I was up before 10, it was a miracle, right? Yeah.

5:59

But after two months of waking up at like 4am, my body just immediately switched over to, okay, you're getting up at 4am and you're going to be fully energized. Oh, you got three hours of sleep?

6:13

Yeah, you feel great.

6:14

Yeah. You know, before, yeah.

6:17

It's like before, like three hours of sleep. It's like, hey, I'm not doing anything today.

6:21

Yeah, yeah, it's like I feel like I was drugged.

6:26

Yeah, yeah, exactly. Well, Ari, you know, I'm really interested in hearing how people get their start in this industry, and I'm wondering where your start was.

6:40

You know, what made you want to go down this IT slash security route? And, you know, the reason why I start everyone there is not just to hear everyone's story, but it's for my listeners to hear everyone's story, because everyone's coming at this thing from a different angle, from a different point of view, a different background.

7:00

You know, for instance, I had someone on that was a former opera singer and now she's an application security

7:08

expert, and I'm sitting over here like, how do those dots even, you know, connect together?

7:15

Right? So, you know, what's your story?

Yeah, I think, um, perhaps not as disparate as an opera singer, but similarly, I think, a fairly non-traditional path to being a founder of a security startup. I went to American University in DC and studied under a guy named Eric Novotny, who will become relevant later in the story, so just remember that name.

7:51

He's a cybersecurity guru, one of the advisors to US Cyber Command, DARPA, some other places.

7:56

So, yeah, so I graduated college with a degree.

8:02

I thought I wanted to be a diplomat. Um, but the more time I spent with people who worked in the government,

8:08

I was like, you know what? I don't think this is for me. Ended up coming back to Los Angeles, where I'm from, worked as a foreign policy advisor, and then, actually, I was excited to talk to you about this.

8:21

I started a podcast. This is 2015, called Millennials Don't Suck, which was kind of in that initial wave of podcasting.

8:35

So, you know, competition was definitely less intense, and we released maybe eight episodes and we were in the top 50 podcasts on iTunes and in the New York Times and got a bunch of press, and it was really, really fun.

8:54

It was just me and my buddy, interviewing millennials who we thought were doing cool stuff. At the time there was sort of a narrative about the generation, so we were kind of trying to counter that.

9:09

That was really fun. It was obviously not what I studied or what my focus was in, but I love talking to people. So I was doing that, it was going great, our audience was really growing.

9:22

My co-host, Matt, then had a massive hemorrhagic stroke and ended up in the ICU. It basically almost killed

9:32

Him he was, he was in recovery he still is recovering, as you know, to this day um, so I didn't really feel like I could continue the podcast without him.

9:41

It just kind of didn't feel like it was.

9:43

the same. Um, I didn't have the fire for that concept anymore. But, you know, I guess the consolation prize, and I'm sure you hear this a lot, is that you can only see where the path is in reverse.

10:00

But at the time, it was just pinging all around. Because I had started a podcast, just by myself, that had become successful.

10:06

People started to reach out to me and be like, hey, can you help me do the same thing?

10:10

And so I built out a production company called Curious Audio, started that in, I think, 2016, and then built it up

10:21

over the years. We produced over 100 shows, sold some shows and some concepts up to Stitcher, Spotify, those types of places.

10:33

Really just cut my teeth in the podcasting world, in the media world, and grew quite a profitable business out of that.

10:47

One of the things that got me back into this world of privacy and security was that we were hired to produce a miniseries on the harvesting of human behavioral data.

11:00

This was post the Cambridge Analytica scandal at Facebook, and there was a lot of concern and talk about what psychometric targeting and some of these things were doing already and would be doing, especially as AI advanced. I was also hired to host that show, so I got to travel around the country for six months, and I did it in person.

11:27

This was pre-COVID, so I traveled around in person interviewing everyone from policymakers to tech founders to people in politics, kind of every angle of looking at how the harvesting of human behavioral data, and the use of that data, was affecting our society, us, how we interact with each other.

11:51

And probably the most illuminating and, for me at least, destiny-defining conversation was with a woman named Judy Estrin, who was on Vint Cerf's team that sent the first email at Stanford, and kind of built many of the early internet companies.

12:17

She built the first Ethernet company. At one point she was the CTO of Cisco.

12:21

She was the first woman on the board of Disney.

12:26

So I was sitting in her living room, I guess this was late 2019, and we were talking about the asymmetry between just your everyday human and these big technology platforms, and how that was growing and growing, and it felt like sort of a Sisyphean task to try and overcome that power dynamic.

12:52

And so I asked, you know, Judy, what do we do? And she said to me, scale down for trust, interconnect for power. And it was one of those sentences, I had never built a technology company, I really had no idea what she meant at the time, but I remember just getting goosebumps, like, whoa, that feels important. And it really stuck with me.

13:17

You know, just, scale down for trust. Close relationships have always been really important to me, like I've had the same best friends my whole life, and being in integrity with myself and with the people that I trust is kind of the most important thing for me. And so, yeah, I just started thinking, okay, scale down for trust, what does that mean?

13:44

And then COVID happened. At the same time as COVID happened, my dad was diagnosed with cancer, and so I was taking care of him.

13:54

You know, it was COVID, so we were having to be super careful. And, as somebody who studied geopolitics, I was always paying very close attention to the news. Obviously the Russia disinformation hacking had been happening, and the Twitter hack, I don't know if you remember that, where a lot of celebrities were hacked with kind of this god-mode exploit.

14:25

I just started thinking, you know, scale down for trust, interconnect for power, how could that apply to some of the issues that we're facing?

14:35

Like, what if we could use the trust that we have in the physical, like the real world, and bring it into the digital world in some way?

14:44

So I called up an old college buddy of mine.

14:47

We just started talking, like, man, what if there was a way to know that you were talking to who you thought you were supposed to be talking to?

14:55

Like, what if there was a way to prevent sort of a god-mode-level attack from occurring?

15:04

You know, preventing somebody from moving laterally within a system.

15:08

And really my aha moment happened probably sometime mid-2020.

15:17

I was doing like a two-factor password recovery.

15:20

You know, I don't use password managers, because obviously they all get hacked, and I'm skeptical.

15:28

Um, so I had forgotten my password and so I was doing a two-factor.

15:31

But then, of course, my paranoid brain knows that I could be SIM-swapped, I could be email-spoofed. There's no way to know.

15:44

It's really like passwords have always just really kind of gotten under my skin.

15:49

And at the same time, I was in a group chat with my best friends that I mentioned, from childhood, and I was like, man, I trust these guys more than I trust myself to remember a password.

15:59

Like, what if you could use trust?

16:02

What if you could just use trust as a security layer or an authorization layer?

16:07

And so that was when I called my college buddy up, who's now my co-founder, and together we called up our old professor, who's a civilian advisor to Cyber Command and was an Obama appointee on cyber, kind of one of the leading minds that we have. And we started talking, and really, at that point, it was just conversations.

16:35

You know, like I mentioned, I was taking care of my dad. He was getting sicker, so it wasn't really the right time to build a company.

16:43

Then, at the same time, in 2020, there were starting to be more conversations about privacy being more of a priority, security being more of a priority.

16:57

But I think even then, fundamentally, the world wasn't quite ready.

17:05

Really, the change happened at the end of 2022, beginning of 2023, with the advent of ChatGPT and sort of the AI explosion.

17:16

And I started to see, okay, it is going to become critical to know that you are talking to the person that you think you're talking to, and to know that a piece of content can be trusted as coming from a human being. And it actually goes back,

17:36

you know, I mentioned Eric before, my old professor.

17:38

He taught us this concept in college of the red internet and the green internet. And on the red internet,

17:48

you know, this was back in 2011, 2012,

17:50

he was like, on the red internet, it's all going to be bots.

17:52

You're not going to know who's real. You're not going to know what's real.

17:55

You're not going to know who's trusted. 99.9% of the content is going to be generated by AI, and it's basically just going to be a forest of misinformation.

18:04

And on the green internet, you're going to know who's real.

18:08

You know what's real. You know who's trusted. And it kind of came back to, okay, what if we could take the trust that exists in small groups, scale down for trust, and use that as basically a layer of confirming each other's humanity, of sharing information only to the people who should see it, and then of being able to eventually broadcast information out to the wider world,

18:36

that has, you know, human provenance and a chain of custody of, okay, we know this information came from human beings. And so, Eric being the cyber expert that he is, spec'd out an architecture, and we started building. We brought on a CTO, raised our first round of funding last June, and have been building in stealth for the last, I would say, eight months.

19:13

Honestly, it's kind of fortuitous that our conversation has been delayed a few times, because we can actually now talk more publicly about what we're building.

19:24

We actually came out of stealth about a month ago and are in private beta right now.

19:30

So I would love to send you the beta when we get off this

19:34

call, Joe. And yeah, we now have basically built a tool that allows you to leverage your trusted relationships in kind of a new way, bringing in a lot of cutting-edge cryptography, on-device biometrics, public-private key sharing, into an ecosystem that essentially relies on trust to share information, know that you're talking to the right people, keep information compartmentalized, eventually API consensus across existing systems as a way of adding additional layers of protection, and then broadcasting validated, cryptographically attested information out into the world.

Yeah, it's, uh,

20:26

it's a fascinating area, right, especially with the explosion of AI recently, because it introduces a really great power and a really high risk as well.

20:45

You know, yeah, I want to use ChatGPT to write a better paper, a better email, whatever it is.

20:51

Right, I want to do that. But I'm also giving it more power to emulate me, right? Because I'm putting in text, in my own words, and I'm saying, write it better.

21:04

Well, it's able to analyze how I wrote it to begin with, you know.

Exactly. I mean, yeah.

21:09

I don't know if you saw this story a couple of weeks ago, but there was a banker in Hong Kong who got an email from his CFO, like, hey, we need to wire these people $25 million, we're doing a deal.

21:24

And, just kind of doing his diligence, he was like, okay, sure, can we get on a call to confirm?

21:32

So he gets on a call to confirm.

21:35

It's a video call, and on the call is the CFO,

21:38

some other colleagues. You know, they chat about it, they talk through the deal flow, everything looks good.

21:45

So he's like, okay, great, wires the money. Turns out the email was spoofed.

21:50

Gets on the call, everybody on the call was deepfaked.

21:54

Wow, money goes out to criminals.

21:57

So that's sort of a direct use case that our tool, creating these end-to-end encrypted, what we call pods of trusted people, is really solving for. With generative AI, whether it's voice cloning, whether it's deepfakes, a lot of the social engineering stuff that we're seeing already being so effective is just going to explode in its efficacy.

22:27

And, you know, I'm sure you're seeing it in your life, like families having sort of a passphrase or something that they say to each other as a way of, like,

22:35

Okay, this is how you know I'm the real me.

22:37

But I think we're going to very quickly move beyond some of these analog solutions, and I think that's where our product starts to fit in the equation.

22:49

You know, we're not tied to phone numbers, we're not tied to emails.

22:54

It's really the trust that you have in the real world.

22:57

And then relying on your device biometrics as, you know, the decryption of the information fits into the stack.

So,

23:13

can we walk through maybe how your solution would have prevented, you know, that attack?

23:21

Right, because you know it's like to me.

23:25

I can piece it together. Yeah, you know, um, but I think it'll make more sense coming from you.

23:31

A hundred percent. So there's a couple ways. Our system is built around this idea called a pod, right?

23:38

Basically, you know, a group of people that trust each other.

23:43

So let's say I want to start a pod.

23:45

I go into the Kibu app, I create a pod.

23:49

I'm then prompted to send out an invite to whoever I trust.

23:53

So I send you an invite, you receive an invite.

23:57

You're then prompted to take a photo or a video of yourself, proving that you are you.

24:06

I then, within the Kibu app, see that photo or video, and I say, okay, yeah, that's Joe.

24:09

I then get to vote you in and really the vote there is key, right?

24:13

The entire system is predicated on this idea of consensus and quorum.

24:21

Let's say, then we want to bring a third person in.

24:24

You could now invite someone, or I could invite another person.

24:27

Same invite flow occurs. Then we both have to vote that person in.
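
To make that invite-and-vote flow concrete, here's a minimal sketch of the consensus rule just described. The Pod class, its field names, and the everyone-must-approve policy are illustrative assumptions based on this conversation, not Kibu's actual implementation.

```python
# Sketch of the vote-in flow described above: a candidate joins a pod
# only after every existing member approves them. Names and the
# unanimous rule are illustrative assumptions, not Kibu's code.
from dataclasses import dataclass, field

@dataclass
class Pod:
    members: set[str] = field(default_factory=set)
    pending: dict[str, set[str]] = field(default_factory=dict)  # candidate -> approvals

    def invite(self, inviter: str, candidate: str) -> None:
        assert inviter in self.members, "only members can invite"
        self.pending[candidate] = set()

    def vote_in(self, voter: str, candidate: str) -> None:
        assert voter in self.members and candidate in self.pending
        self.pending[candidate].add(voter)
        if self.pending[candidate] == self.members:  # unanimous consent
            self.members.add(candidate)
            del self.pending[candidate]

pod = Pod(members={"ari"})
pod.invite("ari", "joe")
pod.vote_in("ari", "joe")       # sole member's vote admits Joe
pod.invite("joe", "carol")
pod.vote_in("ari", "carol")     # one of two votes: Carol still pending
pod.vote_in("joe", "carol")     # both members have voted: Carol joins
print(pod.members)              # {'ari', 'joe', 'carol'} (set order varies)
```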

24:31

So it's adding a layer of proof of humanity, and then consensus, into these flows. And, you know, theoretically, in that circumstance,

24:45

if you've already created a pod with someone, let's say a month ago, and then you're on this call and you're saying, okay, these are the wire instructions.

24:56

Cool, can you confirm it in Kibu? Whoever the person is that's deepfaking wouldn't have it.

25:02

You know they can't spoof, it's not attached to email, it's not attached to phone.

25:05

So even if they SIM-swap somebody, it doesn't work.

25:07

They wouldn't be able to do it, point blank.

25:13

The second way you can create a pod is if you're in person with somebody.

25:16

You actually can just NFC bump or QR code an invite.

25:21

So then that's even more secure in that you confirm that they are in fact that person in person.

25:28

There's no out-of-band evidence that that pod even exists, so the attack surface is essentially zero. And once you're in there, the entire thing is encrypted. Unlike Signal,

25:40

we encrypt at the pod level rather than the message level.

25:43

So everything inside of the pod is accessible no matter when you come in, including the history of every action, of every vote taken, every file that's uploaded.

25:56

You know we have a vault inside of a pod so you can upload files, upload photos, videos, collaborate on documents, and this kind of goes into this concept of cryptographically attesting information.

26:26

No information, no file, can leave the pod without the group voting to allow it to leave. So it goes back to this idea of consensus and quorum. But if the group does vote to allow it to leave, we basically put a cryptographic attestation on that file, that these humans created it. And each Kibu user has a public key.

26:31

That's kind of how, you know, essentially each device has a public key, and so then we can have a chain of custody of these public keys:

26:40

they created this information at this time. Then,

26:42

when the attestation occurs, a hash is created of that content, so we can actually see, okay, this file was created at this time.

26:51

These people attested it to be real.

26:53

This is what it looked like when they attested it to be real. Has it been altered, has it been changed, yes or no?
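
A rough sketch of that attest-and-verify flow, under assumed primitives — the transcript doesn't name Kibu's actual ones — using SHA-256 for the content hash and Ed25519 device keys (via the pyca/cryptography library) for signatures, with an illustrative record layout:

```python
# Sketch of the attestation described above: hash the file, collect one
# signature per approving device key, and later verify both. SHA-256,
# Ed25519, and the record layout are assumptions for illustration.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

def attest(content: bytes, device_keys):
    digest = hashlib.sha256(content).hexdigest()
    return {
        "sha256": digest,  # what the file looked like when attested
        "signatures": [    # chain of custody: one entry per device key
            (k.public_key(), k.sign(digest.encode())) for k in device_keys
        ],
    }

def verify(content: bytes, record) -> bool:
    # 1) Has it been altered? The content must still match the hash.
    if hashlib.sha256(content).hexdigest() != record["sha256"]:
        return False
    # 2) Did these humans really attest it? Check every signature.
    for public_key, signature in record["signatures"]:
        try:
            public_key.verify(signature, record["sha256"].encode())
        except InvalidSignature:
            return False
    return True

keys = [ed25519.Ed25519PrivateKey.generate() for _ in range(3)]
doc = b"minutes of the pod vote"
record = attest(doc, keys)
print(verify(doc, record))                 # True
print(verify(doc + b" tampered", record))  # False: hash no longer matches
```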

26:59

And so it's really taking cutting-edge crypto security protocols, then bringing in real-world trust, and building it inside of this trusted ecosystem where you have to be trusted or you can't even get in.

27:16

There is no PIN code or anything like that.

27:19

You need to be biometrically authenticated into the Kibu app to be able to see anything inside, and so you actually have to be the trusted person.

27:27

Oh, that is a.

27:30

I mean, that's really fascinating, because it sounds like that really would have thwarted a lot of the social engineering attacks and breaches that we've seen.

27:42

Yes. You know, like with Okta and MGM.

27:44

Right, because, you know, it's fascinating.

27:49

Right? Before, when you would hear about cyberattacks, it would typically be something pretty sophisticated, something digital, something based on technology that is getting around a security control, whatever it might be.

28:06

SolarWinds being a great example of that, for instance.

28:14

Right. And everything now is kind of pivoting, because everyone's security is so top-notch, right? All these boards kind of opened up the budget.

28:19

It was like, okay, get whatever you need, because we're not doing any better than the people that were attacked, and it's only a matter of time.

28:26

Right. Well, now it's shifting.

28:29

It's shifting to that people aspect, that human aspect, and exactly how we're having to come up with these new, innovative solutions to validate,

28:42

you know, you are who you say you are, and whatnot.

Well, and not only that, that's kind of where we really think our API layer becomes interesting.

28:50

Right, you spoke to, you know, the MGM hack, for instance.

28:53

So imagine this, basically. We call it consensus-based authorization.

28:59

So imagine if accessing a particular database, or pushing code into production, or making certain authorization decisions required consensus. And it doesn't need to be every decision, obviously.

29:15

That would be organizationally inefficient.

29:18

But these critical decisions, where, you know, millions of dollars are really at stake,

29:23

that really can make or break a lot for an organization.

29:30

Imagine if you then put just some of those critical decisions behind a layer of consensus of just the people who need to be trusted, so that that decision could happen inside of an end-to-end encrypted environment, without having to rely on out-of-band comms like email or phone, where you're communicating,

29:49

okay, should this person have access to this system?

29:52

Should this data be shared with these people?

29:56

Should this wire transfer occur to this person? Where those types of decisions are happening just among the people who are trusted, within an end-to-end encrypted environment.
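
As an illustration of that consensus-based authorization idea, here's a minimal sketch of gating one critical action behind a quorum of trusted approvers; the approver names and the 2-of-3 policy are hypothetical, not Kibu's API.

```python
# Sketch of consensus-based authorization: a critical action runs only
# after a quorum of trusted approvers votes yes. Names and the 2-of-3
# policy are hypothetical.
from typing import Callable

def gate_action(action: Callable[[], None], approvers: set[str], quorum: int):
    votes: set[str] = set()
    fired = False

    def approve(voter: str) -> bool:
        nonlocal fired
        if voter in approvers:           # votes from outside the group are ignored
            votes.add(voter)
        if not fired and len(votes) >= quorum:
            fired = True
            action()                     # consensus reached: execute exactly once
            return True
        return False

    return approve

release_wire = gate_action(
    action=lambda: print("wire released"),
    approvers={"cfo", "controller", "treasurer"},
    quorum=2,                            # e.g. 2-of-3 must sign off
)
release_wire("cfo")        # False: one vote is not consensus
release_wire("attacker")   # False: untrusted voter, ignored
release_wire("treasurer")  # True: quorum reached, wire goes out
```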

30:09

We think that, to your point, it prevents an attacker, once they get in with a credential stuffing attack or with a social engineering attack, from moving laterally within a system, like what we've seen, where once they're in, they can just do whatever they want. If you gate certain critical actions with consensus, you can actually prevent a lot of the damage that we're seeing being done.

That's really fascinating. You know, you kind of started the conversation off with your former professor, right, and his role, not really within the government, but interacting with the government and whatnot.

30:50

And, you know, with my own experience doing some consulting and contracting work for different agencies, you can see his mentality behind how he architected the solution.

31:06

Right? Like, you can tell that this is definitely something built with that kind of structure in mind.

31:15

You know of how it operates and what it can interact with and what it does and whatnot.

31:20

And you know it's something that's massively needed.

31:25

Right. And this isn't me trying to pump up or promote,

31:32

you know, your solution or anything like that, right? Like, you didn't sponsor the podcast or anything like that.

31:37

This is a critical area that has a high demand, and people are kind of just waiting around, being like, man,

31:48

When is this thing going to be solved?

31:50

You know, honestly, it's so funny you say that. I mean, obviously, like I said, I have a pretty non-traditional background when it comes to security.

31:59

I'm not like a super technical person, so I don't come at this from a super technical standpoint. But I sit truly as somebody who wakes up every day, and all I think about is, I really think the world needs what we're building, and it's my job to give it to them.

32:21

I just see this, actually, as almost like a civilizational-level issue. If trust breaks down, if we can't figure out a way to both know what we're seeing is real, know that we're talking to a real person, and also be able to protect the most critical systems in our society from constant attack,

32:43

things are just going to break.

32:51

And so I truly see this as my mission. It's just, how can I give this thing to the world?

32:54

I mean, it's honestly really gratifying that you see it as well, just like, yeah, it seems like this is really needed, and there isn't a ton of solutions like it that are trying to tackle it in this way. And so we're just trying to.

33:08

You know, we've just been building and now we're just trying to get it to market and give it to people.

33:14

Yeah, it's like the security of the blockchain, but in the human format.

33:18

You know it's not code.

33:21

I call it the human blockchain.

33:22

Yeah, yeah, I mean that makes a lot of sense.

33:26

You know it's interesting.

33:29

You know, your path into this as well, right? You studied international relations and the security aspect of that, and, you know, I also got my bachelor's degree in criminal justice with international relations as a minor.

33:45

Looking back, do you think the mentality that that program instills in people kind of prepared you for the cybersecurity world?

A hundred percent.

33:55

Oh yeah. I mean, thinking about operational security, about the need for compartmentalization, about even the need for trust, sort of mutual trust, and understanding.

34:13

I think something that's always really interested me, especially in the last five or ten years, is, and obviously I'm a huge geopolitics nerd, I would imagine you are as well, sort of the nexus of technology and geopolitics. Like, even look at what's happening in Taiwan, for instance.

34:31

Our geopolitics is basically technology-based at this point, and even in the last five years, it's becoming more cybersecurity. Everything is cyber now, you know.

34:49

Yeah.

34:50

And so, yeah, I definitely think that my degree prepared me pretty uniquely to think about this stuff, in a way of understanding that

35:05

everything really comes down to relationships in this world at the end of the day.

35:12

Like, we can put a lot of window dressing on stuff, but at the end of the day, it's about relationships, and that's kind of the underlying function of what our product is based on.

35:23

Right? Like, do you trust this person or not?

35:27

If you trust them, you should be able to transact, to communicate, to safely share, etc.

35:34

Etc., with that person, in a way that you know is real.

35:40

Like, I think part of what I'm seeing, or what I've seen, you know, is that

35:47

we're seeing a lot of the impact that the internet is having on our society. And I think, to your point about all the benefits of AI, amazing benefits.

35:57

It's going to change the world in so many ways.

35:59

But I think we're also seeing a ton of negative effects as well, whether it's social media affecting how we are interacting, how we're talking to each other, being with each other, whether it's our politics. So, the trusted relationships that we have left in the real world,

36:29

how can we bring those relationships into the digital world, and really use those relationships as a way of carving out a zone of safety that we can then operate in, and then move around digitally in a way that is a lot safer than what currently exists?

36:47

Oh, so, you know, just thinking through the design that you kind of laid out, would this be vulnerable to like a 51% attack?

37:02

I don't know if it would be, and I'm just spitballing it, right? Where, you know, if you're somehow able to compromise 51% of the people in your pod?

Yeah, right. I mean,

37:19

So let's look at a couple of answers there.

37:22

One is, within the pod, security is entirely configurable.

37:26

So whether it's decisions or access to information that is of the highest level of security, we would recommend unanimous approval.

37:37

So that's kind of the top line there.

37:41

But let's say just a simple majority is required to execute an action.

37:47

If you think about it, you would have to not only know all the members in a pod, which even Kibu doesn't know.

38:00

We just know the public keys. So you would have to somehow know that, based on surveillance or some other thing. Then you would have to, within a limited amount of time, such that the other pod members wouldn't become aware, essentially coerce at least 51% of the people in a pod to give up their biometrics and provide access, to then vote on something.

38:33

As we know, in the security world nothing is 100%.

38:37

But let's say your likelihood of compromising one person is 0.1, and there's seven people in a pod; a majority is four people, so it's 0.1 to the fourth that that occurs.

38:52

I think that's a fairly low likelihood.

38:54

You know, that something like that might happen. And certainly, I think, a better way of operating than what we're seeing currently, which is just username and password.
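
Working that estimate through: with seven members, a simple majority is four, so an attacker who can compromise any one member with probability 0.1 (an assumed figure from the conversation) needs four independent successes. A quick check:

```python
# Back-of-the-envelope math from above: a majority of a 7-person pod is
# 4, so at an assumed 0.1 chance of compromising any one member, an
# attacker needs 4 independent wins.
import math

p_one = 0.1                              # assumed per-member compromise odds
pod_size = 7
majority = math.floor(pod_size / 2) + 1  # 4 of 7
print(f"{p_one ** majority:.4f}")        # 0.0001, i.e. 1 in 10,000
```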

39:09

Oh, yeah, it's really fascinating.

39:13

It's just fascinating to see how technology is evolving in an area that seemed to be, or was thought to be, solved right.

39:24

Like, we used to think in security, oh, if you just trained your users better, you wouldn't have these issues.

39:33

But even now, right, at my day job, there was someone that had access to send money, large amounts of money, within the organization, and they got, you know, a deepfake of the CEO saying, hey, send them this money.

39:53

And, you know, luckily, the person just had an inkling of a question with it, and they waited until Monday, until they saw the CEO again, and sure enough, he's like, yeah, I never said that.

But again, that's a numbers game, right? As deepfakes become cheaper, let's say you do that five thousand times.

40:18

You know, even at one percent,

40:20

you're hitting, right? And so if you push those communications into a Kibu pod, there's a very, very low likelihood of that ever working out.

Yeah, yeah, that makes a lot of sense.

40:40

So, with the encryption part of it, I find that a little bit fascinating, because now I'm starting to dive into securing satellites with homomorphic encryption and things like that on the side, of course. Talk to me a little bit about the encryption and how it's set up.

41:04

Are there certain parts of your product that you potentially even open source?

41:09

Yeah, absolutely.

Or pull from the open source community?

41:14

I would say, in the way that the iPhone was not new.

41:23

There was nothing about the iPhone that was entirely new technology. What the iPhone did was take a lot of existing technologies and bring them together in a really easy-to-use, user-friendly ecosystem and experience.

41:38

I would say that's what we've done here. To answer your question about open source: number one, yes, we have pulled from cutting-edge open-source crypto libraries.

41:49

We also plan on open-sourcing sort of our base-level Kibu protocol as well.

41:54

We want to build an SDK to be able to have people build consensus auth directly into their systems.

42:04

We think that there's a massive opportunity there.

42:07

We really see ourselves, I think I mentioned at the beginning, as kind of building the trust ecosystem.

42:15

You know, using our Kibu consensus as a way of providing a foundation for bringing trust from the real world into the digital.

42:27

And so, yeah.

42:31

If you want to check out the more detailed specs on how we do our encryption and everything that we're using, you can go to kibu.io.

42:41

You can check out our white paper on there. There's a very detailed white paper on how we're doing everything,

42:50

which I would encourage everyone to look at.

42:52

But yeah, certainly, we believe that

43:01

the way to be secure is to be open source and to have this thing poked and prodded constantly, and to make sure that there's nothing that we're missing.

43:10

Oh yeah, that's the best way to do it.

43:14

Yeah, you know, to really get that consensus. It builds momentum through that too, because more tech people are starting to be like, oh, this project over here can be utilized this way and whatnot.

43:29

It builds momentum like that. To kind of go all the way back to the very beginning, you mentioned that you had that podcast and you saw that early success and whatnot.

43:47

And, you know, I can really relate to your experience, right, because you said that your co-host had to drop out of the podcast and whatnot,

43:51

and you couldn't really keep on going with it.

43:55

And, you know, I started this podcast actually with a co-host, and early on he kind of dropped out, and I was like, oh man, I don't know if I can hold a conversation by myself, I don't know if I can do this thing all on my own.

44:11

And, you know, luckily, or thankfully, I kept on going with it and I pushed through that discomfort, yeah, and now I get to talk to awesome people like you.

44:22

So it's really interesting where life takes you when you don't just give up at the first,

44:29

You know kind of hiccup or speed bump or whatever, and you just keep on finding a way to keep going.

44:34

Yeah, man, I appreciate that. And I would also say, you know, you're a fantastic host and interviewer, so I'm glad that you kept going.

44:42

But yeah, I had this idea almost five years ago, and it's so funny.

44:49

This is honestly one of my first public interviews about it, but it's something that's just been in my head for a really long time, of like, wow, am I dumb?

44:59

Like this feels really necessary and useful.

45:02

Like, I feel like I need to build this, I have to figure out a way to build this. And so, yeah, it's definitely been a very winding road, but I'm really proud of what we've been able to build thus far.

45:20

You know, and, to get people to believe in us and back us, and get some amazing technologists to help us build it.

45:28

And I'm really excited for where we're going.

45:32

You know, I think we already are seeing some traction in the enterprise market, with interest in our product. We're seeing a lot of positive momentum of, like, okay, yeah, people that are at the top of the security game know that this is something that will be really useful.

45:54

And so, you know, now it's our job to just continue to build and iterate and deliver, and I'm really excited about what we're doing.

Yeah, it's going to be a fascinating time, you know, also to see how potential attacks ramp up in different ways.

46:11

Right, because now we're getting,

46:14

it's constantly like an arms race, a digital arms race.

It is. And, you know, I think one of the cool things about our product is that the security holes are humans, and one of the scary things about our product is that the security holes are humans, right? And so I think we'll see both sides of that for sure.

46:38

We've built it in such a way, I think, also because quorum is necessary really to do anything, to make any major decision, that we've kind of built fail-safes into it, right? Where even if one person gets pwned, it doesn't compromise the sanctity of the system.

47:00

And if you find out somebody's been compromised, the group can just vote them out of the pod, vote them off the island, as it were.

47:10

Well, I definitely can't wait to get my hands on it.

47:13

I'm definitely looking forward to that.

47:14

Awesome man. Yeah, I'll send you an invite to the beta when we get off here.

47:19

Yeah, that'll be great. Well, Ari, you know, unfortunately we're at the end of our time here, but you know, before I let you go, how about you tell my audience where they can find you if they want to reach out?

47:30

And, you know, connect with you, and where they can find your company and learn more about it.

Kibu.io

47:40

is where you can check out the website. Like I said, our white paper's on there.

47:44

You can also sign up on our website for the private beta.

47:47

We're starting to send invitations out, like, end of this week.

47:51

We've been testing it, you know, with sort of the first layer of friends and family and team and investors and advisors, and we're now just starting to share it out a little bit wider, which has been really exciting.

48:04

Me personally, I'm on LinkedIn, Ari Andersen. Andersen with an E, so S-E-N, not S-O-N.

48:11

I'm on Twitter, Ari Andersen.

48:14

You can follow me there. And, yeah, I'm really grateful, Joe, for the opportunity to come on here.

48:21

I've had the chance to listen to a bunch of episodes, super interesting conversations.

48:27

I've learned a ton, so I'm just grateful for your time and for whoever's out there listening as well.

48:36

Awesome, well, thanks. I really appreciate the compliments, and I'm glad that you're also enjoying the content that I'm putting out and whatnot.

48:46

Absolutely.

48:47

Definitely. Well.
