(Part 2 of 2) Top AR Trends for 2024: New Interaction Modalities for AR

Released Monday, 10th June 2024

Episode Transcript

0:07

Hello, I'm Karen Quatromoni, the Director of Public Relations for Object Management Group, OMG.

0:14

Welcome to our OMG Podcast series. At OMG,

0:19

we're known for driving industry standards and building tech communities.

0:24

Today we're focusing on the Augmented Reality for Enterprise

0:29

Alliance, or AREA, which is an OMG program.

0:33

The AREA accelerates AR adoption by creating a comprehensive

0:37

ecosystem for enterprises, providers, and research institutions.

0:43

Today, Boeing's Sam Neblett will host the podcast session.

0:50

So I know you have a huge amount of experience in the gaming sphere,

0:54

and that's with your podcast and with just personal time,

0:59

and I think the gaming sphere is leading the way

1:05

for informing enterprises on how they can use AR both effectively

1:10

and efficiently in just a wide variety of settings.

1:13

You mentioned firearms training; guns are pretty popular in video games,

1:18

but I want to think about real world examples of

1:23

thinking outside of the box for industry specific applications

1:28

and potential or interesting use cases that you've noticed.

1:32

So not necessarily just picking up a gun with a

1:36

controller, I'm thinking anything new and groundbreaking that you've seen

1:42

or you feel could be on the horizon. Something like,

1:46

it might not have to be super complicated, but just an out of the box solution.

1:50

So something like voice commands for hands-free machine control in

1:54

manufacturing, hand gestures

1:56

for sterile surgery in healthcare,

2:00

or overcoming learning-curve issues that you've noticed in UI and UX

2:05

with eye tracking in AR applications that you've seen.

2:08

Is there anything very interesting in a specific industry that the

2:13

enterprise might be interested in for AR?

2:16

Yeah, I think quickly just to touch on one of the things you just mentioned there,

2:21

the eye tracking and hand gestures for UI navigation

2:26

I think is crucial. Before these technologies,

2:31

if you're using a virtual reality controller or an AR controller,

2:35

a lot of times when you're holding one of these controllers in your hand,

2:38

there's a laser that comes off of the edge of it that you can point at objects

2:43

in the UI to make menu selections and things like that.

2:46

When you drop these controllers, boom, now it's like, oh, where's my laser?

2:50

In what way can I make my next menu selection?

2:52

And I think if you've used the Apple Vision Pro, that's exactly it.

2:55

I think the best current method for solving UI and

3:00

UX problems is glancing at something and using a pinch gesture.

3:04

You can cruise through menus so fast like that. If you've never used the Apple

3:09

Vision Pro, I highly recommend you go to an Apple store and check one out.
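
A minimal sketch of that gaze-plus-pinch loop, assuming a generic runtime that reports which UI element you are looking at and where your thumb and index fingertips are each frame (all names and thresholds here are illustrative, not any specific headset API):

from dataclasses import dataclass

PINCH_THRESHOLD_M = 0.02  # thumb tip within ~2 cm of index tip counts as a pinch

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def dist(self, other: "Vec3") -> float:
        return ((self.x - other.x) ** 2 + (self.y - other.y) ** 2 + (self.z - other.z) ** 2) ** 0.5

def is_pinching(thumb_tip: Vec3, index_tip: Vec3) -> bool:
    return thumb_tip.dist(index_tip) < PINCH_THRESHOLD_M

def update(ui_elements, gaze_target_id, thumb_tip, index_tip, was_pinching):
    """Run once per frame: highlight whatever is gazed at, select on pinch-down."""
    pinching = is_pinching(thumb_tip, index_tip)
    for eid, element in ui_elements.items():
        element.highlighted = (eid == gaze_target_id)
    if pinching and not was_pinching and gaze_target_id in ui_elements:
        ui_elements[gaze_target_id].on_select()  # edge-triggered: fires once per pinch
    return pinching  # caller stores this as next frame's was_pinching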

3:14

It's a quick demo. And very quickly you'll start to understand,

3:18

I think the direction that a lot of this stuff is going. Now granted,

3:20

the Apple Vision Pro is, I guess technically a consumer device, but

3:27

I think that the intended uses of that device really

3:32

lead to where we're going in this conversation.

3:35

It's funny that you mentioned voice commands. I haven't actually seen any compelling use cases for voice commands.

3:41

Neither have I. Yeah, which I find interesting because it seems like some low hanging fruit.

3:46

And in the gaming side of things,

3:48

I see people doing some really interesting stuff with AI and voice commands by

3:53

basically connecting ChatGPT to the game that you're playing and

3:58

attaching individual personas to NPCs in the world. I

4:03

have a friend named Genis VR who makes a lot of cool stuff for VR and AR

4:07

and uses peripherals, haptics, and all that kind of stuff in her content.

4:11

And she will be playing a game like Skyrim VR and will walk up to somebody and

4:16

say, Hey, what do you think about dragons?

4:18

And the AI will connect to the NPC and create a brand new

4:23

never-before-heard line of dialogue

4:28

using that character's voice, which is really, really cool.
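
A rough sketch of that pattern: each NPC carries a persona prompt, the player's spoken line goes to a language model alongside that persona, and the reply is routed to a voice in that character's style. The llm_complete and speak_as helpers below are hypothetical stand-ins, not real APIs:

NPC_PERSONAS = {
    "town_guard": (
        "You are a gruff town guard in a fantasy world. "
        "Stay in character and answer in one or two short sentences."
    ),
}

def llm_complete(system_prompt: str, user_line: str) -> str:
    """Stub: swap in a real chat-completion call to whatever model you use."""
    return f"(in character) ...responding to: {user_line}"

def speak_as(npc_id: str, text: str) -> None:
    """Stub: swap in a text-to-speech call that uses the character's voice."""
    print(f"[{npc_id} says] {text}")

def npc_reply(npc_id: str, player_line: str) -> str:
    reply = llm_complete(NPC_PERSONAS[npc_id], player_line)
    speak_as(npc_id, reply)
    return reply

# e.g. the player walks up and asks:
npc_reply("town_guard", "Hey, what do you think about dragons?")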

4:31

But in terms of enterprise solutions and stuff,

4:34

I haven't seen anybody use anything that I found compelling,

4:37

which I find interesting. In some of our trainings,

4:41

we do have sections of the aircraft trainings where

4:46

the user is prompted to speak out loud, but it's all,

4:51

it's implied. You know what I mean? It's like now you say this and you say it out loud and then you hit okay and

4:57

move on to the next thing. But yeah,

5:01

I do find that interesting. In terms of interesting and

5:06

outside-of-the-box use cases, like I said, I'm a haptics industry professional,

5:09

and my brain kind of goes to some of the haptics use cases there.

5:13

And it's funny because there's a couple of companies that do this,

5:16

but there are some electro-stimulation

5:20

haptic suits that exist out there. One of 'em is made by a company called Teslasuit,

5:24

and another one in the gaming side of things is made by a company called

5:27

OWO. And these haptics are not fun.

5:34

They are not comfortable. It's a lot of friction to get into these devices because they are using electro

5:41

stimulation. The nodes that attach to your skin can't have clothing in between you and

5:47

it. So to wear the Teslasuit,

5:49

which I put on at CES 2019, I think. So it's been a while now,

5:54

I had to basically get naked at a conference to put this thing on.

5:58

I had my underwear on still, but otherwise my whole body was inside of this suit.

6:02

And the OWO technology is very,

6:04

very similar, except it's just a shirt and it is designed to be used in

6:07

games. However, the sensation of these,

6:11

have you used one of these, by the way, Sam? I've used the OWO suit. Yeah.

6:15

The OWO. That thing is not comfortable, is it?

6:18

No. And it's not fun to

6:23

have all your users get in it, and then you have to worry about them.

6:25

with cleanliness. We have motion tracking suits too,

6:29

so I have to take it home and wash it. So for all of those reasons, I would say it's not really good.

6:35

It's not really worth it. They're expensive,

6:38

they're uncomfortable to get into and out of, and the haptic sensation physically actually hurts. You're being shocked,

6:44

which by the way, I learned electrocuted should only be used when referring to death because it's

6:49

like execution: electrocution. So unless you died,

6:53

you didn't get electrocuted, you got shocked, which is something that took me forever to learn,

6:57

but this thing shocks you and some of the sensations are akin

7:02

to scratches and stuff like that. And I do see,

7:07

however, a compelling use case for negative reinforcement in training because

7:13

let's say we are doing the oil rig training,

7:17

snipping the wrong wire or pulling the wrong lever could mean literal death for

7:21

everybody out there. Some of these things, there's a reason why we're using immersive technologies to train some of this

7:27

stuff because it's very dangerous. It's scary to learn these things in medical

7:31

procedures. At Contact CI,

7:34

we've done some stuff with Cincinnati Children's Hospital, and we went in there

7:38

last week for a demo, and I'm looking at these little,

7:44

I dunno, fake cadaver things and stuff like that that they have.

7:46

And I'm realizing, wow, they're actually performing operations on children in here and

7:52

they need to practice these procedures because messing up a little

7:57

bit could mean something terrible happening to a child. So it's really,

8:01

really important that we learn these things and take these trainings seriously.

8:05

And if you've used the OWO vest or the Teslasuit vest,

8:08

it only takes one shock for you to realize you don't want that to happen again.

8:12

And now when I did the game for OWO,

8:15

my head was on a swivel in a way that it would not have been if I didn't

8:20

have that negative reinforcement happening.

8:22

So while I don't recommend these devices for something like gaming, I mean,

8:26

I guess if you're a streamer and you want some shock value or something,

8:29

you could buy one and use it. But at the end of the day,

8:32

I don't see a practical use case other than negative reinforcement using haptics

8:37

in a training where it could be a life-or-death situation.
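
One way that could be wired up, sketched loosely: training mistakes are scored by how serious they would be in the field and mapped to a bounded haptic pulse, with the actual suit or vest call left as a placeholder (every name and number here is illustrative, not a vendor SDK):

MISTAKE_SEVERITY = {
    "cut_wrong_wire": 1.0,       # would be catastrophic in the real scenario
    "pulled_wrong_lever": 0.7,
    "skipped_safety_check": 0.4,
}

MAX_INTENSITY = 0.5  # cap the pulse well below "painful"

def send_haptic_pulse(zone: str, intensity: float, duration_ms: int) -> None:
    """Stub: replace with the vendor SDK call for the suit or vest in use."""
    print(f"haptic pulse -> zone={zone}, intensity={intensity:.2f}, {duration_ms} ms")

def on_training_mistake(mistake: str) -> None:
    """Scale the pulse by how serious the mistake would be in the field."""
    severity = MISTAKE_SEVERITY.get(mistake, 0.3)
    send_haptic_pulse("torso", min(1.0, severity) * MAX_INTENSITY, duration_ms=150)

# e.g. the oil-rig scenario mentioned above:
on_training_mistake("cut_wrong_wire")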

8:40

That's fair. Okay. Yeah, that makes sense.

8:44

So we've talked about some benefits and challenges. You mentioned

8:48

challenges, like it hurts or it's difficult to put on one of the EMS- or TENS-based

8:55

suits, and reducing cognitive load as a

8:59

benefit and increased safety because you can practice

9:04

surgery before actually going to do the real thing like you're mentioning.

9:07

But what other challenges do you see with some of the newer

9:11

modalities? User acceptance.

9:15

Are the people familiar with controllers? Are they going to get frustrated with using hand tracking or eye tracking? Now,

9:21

do you see any accuracy limitations with hand tracking or eye

9:25

tracking, or the Contact CI glove for haptics?

9:31

Is it not quite accurate enough yet, and is that something you're working on? Security concerns?

9:35

You mentioned that might be something the EU has to worry about, collecting

9:40

PII for advertisement purposes or whatever,

9:44

aging IT infrastructure: can our IT systems in the enterprise handle this stuff?

9:49

Is it ready? Are these companies working with, say,

9:51

Microsoft Azure and AWS to really get them accepted in a

9:56

more formal way? Or is it still kind of the wild west, you think?

10:00

Yeah, I think in terms of infrastructure and stuff like that,

10:03

I think we're good to go. I think most people have computers that can handle this type of technology.

10:09

I mean, I guess you might have to acquire some hardware to pull some of this stuff off,

10:12

but it all works pretty well with existing technologies.

10:15

So I don't think we have too much to worry about there.

10:21

You touched on a lot of really good stuff there. But I think in terms of some of the challenges with accepting some of these

10:28

things, I think user acceptance actually is a pretty big one.

10:33

People, they resist change. There's a lot of people,

10:37

especially in the DOD side of things,

10:39

there's some guys who have been in those roles for decades, literal decades,

10:43

and the way they've been doing things has worked.

10:46

They're not in a big hurry to make a huge change to the infrastructure of their

10:50

training that's going to take years to fully implement and all of that stuff,

10:55

especially when they're used to seeing the results that they expect.

10:59

But other than some of the things that you kind of touched on there,

11:04

I think that scalability is a really big one.

11:07

I think that there is,

11:10

if I'm trying to sell somebody on using AR or VR for training,

11:15

I would probably lean into something like scalability.

11:19

Traditional trainers for the Air Force, for example,

11:23

are insanely expensive to build. It's basically a fake cockpit using all of the real stuff that a cockpit's made

11:29

out of. It costs a lot of money to build.

11:32

They can only exist in one place at one time,

11:35

and only one person can occupy it at a time. And also, typically,

11:39

you'll probably need a second body there as an instructor

11:42

to explain what's happening. But if you had an AR or a VR application that was designed to

11:48

train people in a virtual environment, this is a program that you could slap on a hundred headsets and send to

11:54

everybody and say, all right, everybody, here's your homework.

11:57

Spend an hour or two in this virtual cockpit.

11:59

And then when everybody comes in the next day, they all have so much more experience than they did before.

12:03

And you're able to scale this training and scale your ability to share this

12:07

information across your workforce, or across whatever, in a way that

12:12

was previously inaccessible. So I see a huge opportunity for scalability of training that

12:18

would otherwise require you being onsite, having something really expensive or large, or something that would be incredibly

12:25

dangerous to participate in. Okay. Yeah, that makes sense.

12:30

So moving on to future outlook,

12:34

you mentioned how the enterprise can expect

12:39

to use these different interaction modalities in the next five to 10

12:43

years. Are there any particular companies that you expect to have a massive impact on

12:49

the AR space? And then what are some high-level steps that companies can take to

12:55

prepare to integrate these new AR solutions? So that could be hand tracking,

12:59

start looking at gloves, start integrating them,

13:02

maybe, if they're using Unity or Unreal,

13:06

import the new packages (for Unity, for example, the XR Hands package)

13:10

and start supporting that in your input code.
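
For the input-code side of that, a loose, engine-agnostic sketch (in Python rather than engine script, with illustrative names) of routing a controller trigger and a hand-tracking pinch into the same logical select action, so adding a new modality doesn't touch gameplay or UI code:

from typing import Callable, List

class SelectAction:
    """One logical action; UI code subscribes here and never asks which device fired it."""
    def __init__(self) -> None:
        self._listeners: List[Callable[[], None]] = []

    def on_performed(self, callback: Callable[[], None]) -> None:
        self._listeners.append(callback)

    def perform(self) -> None:
        for callback in self._listeners:
            callback()

class ControllerProvider:
    """Feeds the action from a controller trigger (input value stubbed for the example)."""
    def __init__(self, action: SelectAction) -> None:
        self.action = action

    def poll(self, trigger_pressed: bool) -> None:
        if trigger_pressed:
            self.action.perform()

class HandTrackingProvider:
    """Feeds the same action from a detected pinch, so nothing downstream changes."""
    def __init__(self, action: SelectAction) -> None:
        self.action = action

    def poll(self, pinch_detected: bool) -> None:
        if pinch_detected:
            self.action.perform()

select = SelectAction()
select.on_performed(lambda: print("menu item selected"))
ControllerProvider(select).poll(trigger_pressed=True)   # legacy controller path
HandTrackingProvider(select).poll(pinch_detected=True)  # new hand-tracking path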

13:16

Which companies to watch do you expect to have a massive impact on

13:20

the AR space in five to 10 years? And then what can companies

13:25

within the enterprise do to prepare for these advancements?

13:30

Cool. I'll start with the latter portion of this.

13:32

I think if you've made it this far in this conversation and you're considering

13:37

maybe getting into some of this stuff, now is a great time to start doing

13:41

some of the stuff that you mentioned. Start taking a look at the different hand tracking solutions that exist.

13:46

Maybe get a couple of different HMDs and start playing around, like I said,

13:50

go to the Apple store, do a demo there. Experience what it feels like to have eye tracking and hand tracking working in

13:56

conjunction with each other. This is definitely the time because the technology's moving very rapidly.

14:03

I mean, they could put out

14:07

an AR headset and a VR headset once a year that would set a new standard for the

14:12

technology. So things are moving really, really quick.

14:15

And I wouldn't expect to have something that meets all of your

14:20

expectations today. And I say expectations in quote marks because

14:25

I do feel like the general public has a somewhat inflated idea

14:30

of what to expect when they use a lot of these technologies.

14:33

We are raised on incredible science fiction films and media

14:38

and stuff like that that honestly show us a lot of tech

14:42

10, 20, 30, 40 years in advance.

14:46

One of my favorite movies ever is Total Recall, starring Arnold Schwarzenegger.

14:50

And if you watch that movie, you will see so many technologies that did not exist when that movie came out

14:56

that are now commonplace in the real world.

14:58

And most people who would watch that movie today would just,

15:02

they wouldn't notice that it would just kind of go in one and out the other.

15:04

And they're like, okay, cool. Yeah, they're doing a video chat. But I remember being younger and watching people do video chats in movies and being

15:10

blown away that that might actually be something that we can do. However,

15:15

a lot of people, when they think about AR, they think about VR.

15:18

They have this expectation that when they put an HMD on,

15:20

it's literally stepping into a portal and it's going to change everything.

15:26

And when they see a little bit of friction or a little bit of lag here,

15:29

or maybe the haptics don't line up perfectly,

15:32

they kind of just want to throw it out the window. I would say

15:38

don't have unrealistic expectations of the technology of where it is today.

15:42

But I think it's safe to assume that all this technology is going to reach

15:48

utterly profound levels within our lifetimes,

15:52

the type of profound where you will be changed by 10,

15:56

20-minute experiences putting on the headset.

15:59

So I definitely see all of that happening. In terms of companies to watch,

16:04

I am not exactly sure,

16:06

to be honest. There's definitely a few that have been making waves and have been

16:11

pushing the envelope. Snap is one of 'em.

16:14

Snapchat has been using AR filters. They're basically the first company to actually get

16:20

wide adoption of AR technology, at least that I have seen.

16:24

I mean so many people using Snapchat filters to send videos and stuff like that

16:28

to their friends. They're working on a lot of AR stuff.

16:31

So I think that's a company that's really worth looking at.

16:34

Niantic does a lot of really cool stuff.

16:37

They made Pokemon Go and similar AR games. Like you said,

16:40

gaming kind of paved the way for a lot of these things.

16:43

I think that's the case here as well. Also, Meta,

16:46

there are concerns with Facebook owning a company that does a lot of this stuff,

16:51

like being a data collection company. There's some privacy concerns there.

16:56

But in terms of the R&D and the money that's being spent on pushing all this

17:00

stuff forward, I don't think there's another company that does what Meta does. They are pushing

17:05

things so, so hard. So I would pay attention to the products that Meta is putting out over the next

17:11

five to 10 years. In terms of where I see a lot of this stuff going,

17:17

I think we're going to see this incredible blend of biometric data,

17:22

immersive technology, and procedurally generated content that is going to create these

17:29

amazingly immersive experiences for people that are totally individualized and

17:33

completely custom to exactly what it is that you need to get out of that

17:37

experience. So for example, let's say you have a training program that is using all these technologies.

17:43

It is reading my biometric data in real time, my eyes,

17:48

my facial expressions, all of that stuff. It sees my pupils dilating and focusing and all of that,

17:53

and maybe there's some kind of BCI that can tell

17:58

my level of engagement. So the procedurally generated content in real time can read my biometric

18:05

data and feed me the content that's necessary to get the desired outcome out

18:09

of my biometrics. So maybe they're looking for a specific level of engagement,

18:13

or maybe they're trying to scare me or make me experience fear.

18:16

It can keep ramping up the fear levels until it gets to the point where it's

18:20

getting the response out of me that it's looking for.
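
A toy sketch of that closed loop: read an engagement estimate each tick, compare it to the target response, and nudge the scenario's intensity up or down. The biometric source here is a random stub standing in for whatever eye-tracking or BCI signal is actually available:

import random

def read_engagement() -> float:
    """Stub for an engagement/arousal estimate in [0, 1] derived from biometrics."""
    return random.uniform(0.0, 1.0)

def adjust_intensity(intensity: float, target: float, gain: float = 0.2) -> float:
    """One proportional-control step: move scenario intensity toward the target response."""
    error = target - read_engagement()
    return max(0.0, min(1.0, intensity + gain * error))

intensity = 0.3  # e.g. how scary or demanding the scenario currently is
for _ in range(10):  # one adjustment per simulated tick
    intensity = adjust_intensity(intensity, target=0.8)
print(f"scenario intensity after 10 ticks: {intensity:.2f}")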

18:22

So when you combine all these things,

18:27

these biometric data, having it be an immersive experience where you really feel like you're connected

18:32

to what's happening and it being procedural and custom to the user,

18:37

I think it's going to be insanely potent.

18:40

I think they'll be able to identify your weaknesses in a training relatively

18:44

quickly and start to address those weaknesses in real time.

18:48

And that's super, super exciting. Of course, like I said earlier,

18:51

it comes with safety concerns. We all basically are going to have to get rid of the idea of privacy and stuff

18:57

like that, I think at some point. But like I said earlier,

19:00

I think the train's left the station.

19:03

I think trying to prevent these technologies from integrating with people

19:08

is trying to fight the tide. You know what I mean? This is nature.

19:13

We are naturally connecting with technology more and more and more all the time.

19:17

As a baby, a computer boots up and you crawl towards it.

19:21

We are instinctually driven to connect with the stuff, and to me,

19:25

it's natural. So I say, just hang on.

19:29

And I'm throwing my hands up on the rollercoaster. I'm just like, all right,

19:32

let's go. Woo, because it's going to be fun and exciting, and I'm here for it.

19:37

Yeah, that's awesome. Yeah. Great. So is there

19:43

anything else that you would like to plug? I think we're good on

19:47

everything else. Yeah. Yeah. I mean,

19:49

if you find this conversation and my perspective on some of this stuff

19:54

interesting, I would highly recommend you come and check out Between Realities,

19:57

the podcast that I do with my partner, Skeeva,

20:00

we do live episodes every Friday as long as time permits,

20:04

and we don't have work obligations and travels and things like that getting in

20:07

the way. But we always have a guest on our show.

20:11

So every week we have somebody either from the gaming space,

20:14

the enterprise space, the training space. We get developers, CEOs,

20:19

YouTubers, the whole gamut,

20:21

basically anybody who cares about this technology as much as we do,

20:25

regardless of what they're doing, could be somebody that we would have on to the show.

20:28

And we really do like to kind of peel away layers,

20:31

talk about the nitty gritty of all this stuff,

20:34

focus on some of the individuals who are really making a difference in the space

20:38

and giving a voice to some people who are like us,

20:41

who are just trying to get involved and come and be a part of it. Of course,

20:45

if you're interested in some of the haptic stuff that I mentioned,

20:47

definitely check out Contact CI. We are doing a lot right now, both currently available and behind the scenes, to

20:54

improve some of the haptic fidelity and stuff like that that you were mentioning

20:57

earlier, which by the way, I will say fidelity of hand tracking and haptics has a long way to go,

21:03

but as I mentioned earlier, it exists in a state that is

21:08

enough to kind of bridge the gap from your hands to your brain to allow you to

21:13

have a more immersive experience than you would if the haptics didn't exist.

21:17

A demo I like to do often is having a gloved hand and an ungloved hand,

21:21

and you reach out and interact with some buttons and switches with a gloved hand

21:25

and then reach out and do it with your bare hand. And the difference is night

21:28

and day. One of them feels real,

21:30

however you want to define real and the other one doesn't.

21:33

So I really do think that there is a huge value for haptics right now,

21:37

even if it isn't entirely lifelike, and I do expect it to get to that point.

21:42

So yeah, Between Realities and Contact CI are definitely my two primary things

21:47

that if you wanted to follow up with me to reach out. And also,

21:51

I'll say that I do a lot of traveling and I go to a lot of conferences and

21:54

events. So if you ever want to meet up, I'll be,

21:57

I don't know when this comes out, but I'll be at AWE in June in Long Beach,

22:02

and I go to basically all of the AR, VR, and tech-focused conferences.

22:06

So feel free to reach out to me anytime. Send me a DM on Twitter or LinkedIn and we can connect.

22:13

What's your Twitter handle, for people? My Twitter handle is AlexVR, but there are some underscores in it.

22:18

It's like Alex underscore VR, but I think if you type AlexVR,

22:22

you'll probably get to me. Okay. Awesome. Well,

22:25

thank you so much for your time and all of your input and expertise, Alex,

22:30

and for anyone else who might be interested, just like Alex said,

22:35

check out his podcast and Contact CI,

22:39

we'll get your email listed wherever we can host this. But yeah,

22:44

thank you again and take care. It's been great having you.

22:47

Thanks, Sam. Looking forward to our next chat. Talk to you soon.
