Episode Transcript
0:00
This is the Amp Hour Podcast,
0:03
released July 9th, 2023. Episode 638,
0:11
Building AR Headsets with Aidan Cullen. Welcome
0:30
to the Amp Hour. I'm Chris Gammell, Contextual Electronics.
0:33
I'm Aidan Cullen. I'm interested in both
0:35
the hardware and software aspects of embedded
0:37
systems. Well, then you are in the right place,
0:39
Aidan. Thanks for joining. Thanks
0:41
for having me on the show. This is my first
0:43
time on a podcast. I'm interested to see how
0:45
it goes. Oh, yeah. Well, you're doing great
0:47
so far. I've seen you on
0:49
video. I think first and foremost, I think I was
0:51
watching your Hackaday Supercon
0:54
talk. I actually didn't get to see it live.
0:56
They started publishing all the talks and I was like,
0:58
wow, this is definitely something that I want
1:00
to talk to Aidan about this stuff. So you were basically...
1:02
So the title on your talk was... Actually, I don't
1:04
know the title on your talk, but you were basically talking about an
1:06
AR system
1:07
that you were building for yourself
1:10
and for fun, right? Yes. So
1:12
I started getting into this project after previously
1:14
being interested in software
1:16
for AR. So all of these sort of SLAM
1:19
and computer vision aspects of the problem.
1:21
And at that point, I was
1:23
in middle school and high school. There was really no hope
1:26
of doing any sort of custom hardware. But
1:28
then when I got to college, I decided
1:30
to make an effort to learn as much about microelectronics
1:34
design as I could and move as close
1:36
as I could to this
1:37
idea of all day AR devices.
1:41
So I just started prototyping and playing with hardware
1:43
and the project has gone through quite a
1:46
few iterations since then. It's
1:48
been consuming your life, huh? Yes,
1:51
it has. As good projects tend to do.
1:53
Mm-hmm. Yeah. So that's great. And
1:55
you are on... What rev would you say you're on
1:57
now? So the... current
2:00
board that I'm working with, we can talk
2:02
a bit about this is essentially
2:06
the fourth major, I
2:08
would say major redesign in terms
2:11
of, you know, starting from a blank slate
2:13
and doing a complete electrical
2:15
and mechanical design. This particular
2:17
board I've now done a second revision of. My previous
2:20
boards, some of them, you know, just
2:23
I produced one revision and then moved on to a completely
2:25
new design idea, but it's been
2:28
about a four year project so far with about
2:30
four main design ideas.
2:33
Wow. Yeah. And I guess we should state as well.
2:35
So like, you know, this isn't, you're
2:37
not a long time veteran. You actually are
2:40
a young pup in the world of electronics, though,
2:42
making things that I'm pretty sure, in my 20
2:44
years of electronics, I still could not do. So you
2:47
just graduated. Congrats on that. You just graduated
2:49
college, right? Yeah, thank you. It's,
2:51
it's definitely been exciting to try
2:53
to get into the world of, you
2:56
know, boards that are more similar to what you'd find
2:58
in a phone than what you find on the Arduino.
3:00
And I
3:02
suppose I found the augmented reality
3:04
idea to be a good motivation for learning
3:07
that, you know, because everyone is designing
3:09
phones nowadays, you know, so it
3:11
doesn't seem as exciting.
3:14
It's everyone? Is this people at your university
3:17
or just generally back out in the world? Well,
3:19
in the, you know, in the technology industry,
3:21
right? I see every company has
3:23
their phone design and they can do that. So
3:26
if you're trying to learn something new, at least for me,
3:28
it's a lot more exciting to
3:30
aim for something that no one really has figured
3:32
out yet.
3:33
Right. So that's why: novel
3:35
areas of inventiveness and
3:37
where can you do something that's kind of out there?
3:40
Yeah, that's, that's a good point.
3:41
What about your exposure to AR?
3:44
Had you tried it previously? Like what got you actually interested
3:47
in that area?
3:48
I originally became sort
3:51
of involved in the software aspects of it
3:53
from my interest in robotics. So
3:55
in robot motion planning and navigating
3:58
robots, you have
4:00
very similar, often, SLAM
4:02
and computer vision pipelines, you know
4:04
tracking things using cameras
4:07
You know interacting with objects in a three-dimensional
4:10
space modeling the world like that. So
4:13
those computer vision You
4:15
know, techniques led me
4:17
to get this... well, back in,
4:20
I think, 2015-ish,
4:23
2015, 2016, Nvidia released their
4:25
Jetson TK1 developer board. So
4:28
I got one of those and was playing with tons of computer
4:30
vision stuff on it because you know, it was a great
4:32
Platform, you know that you could put on a
4:35
robot and that then of course people
4:37
did put in AR headsets
4:39
like Magic Leap, with the subsequent
4:41
Tegra SoCs. Oh, I didn't know that.
4:43
I didn't know that was what was in there.
4:45
Yeah, so Magic Leap, their Lightpack
4:48
thing that you wear on your waist, at
4:50
least it had in the first version. Haven't
4:52
taken a look at, like, the new second
4:55
iteration of it, but it had an Nvidia Tegra SoC.
4:58
So yeah, really those computer vision
5:00
aspects got me into it. I haven't
5:02
really gone around trying a ton
5:05
of, you
5:05
know, existing AR
5:08
hardware. That really wasn't my goal
5:10
in it, you know. My goal was to, you
5:12
know, start and learn how
5:15
I think that this sort of hardware has
5:17
been developed. There are lots of people who have, you
5:19
know, tried every headset on the market.
5:21
That's not me quite yet. I
5:25
feel that actually sometimes it's better to not
5:27
have done that upfront too because then it's not like how
5:29
can I be so radically different from these other things?
5:32
I mean, sure, somebody could kind of get a
5:34
feel for what's good, what's not, but
5:36
sometimes then you feel like you can't repeat the
5:38
stuff other people are doing as well, which could
5:40
be limiting. Right. I, I
5:43
think in some regards it's helpful
5:45
to have you know, a clean slate
5:47
to start from and not Not
5:50
compare yourself to a design that you saw
5:52
previously and then in some regards It's also
5:55
helpful to be very aware of what already
5:57
exists so you don't duplicate
5:59
work
5:59
and so that you make progress
6:02
ahead that other people find useful.
6:05
Yeah.
6:06
I've heard SLAM before. What
6:08
does SLAM stand for again? That's simultaneous
6:11
localization and mapping, which
6:13
is this sort of computer vision topic
6:16
of taking one camera, multiple cameras,
6:18
perhaps other sensors like IMUs,
6:21
and tracking the location of this
6:24
camera unit in space. So
6:26
mapping the world around you as you
6:28
track your motion through the world.
6:31
So that's really the technique that you find
6:33
used on a lot of these headsets to locate
6:37
the user in space so that the graphics
6:39
can be rendered according
6:41
to their surroundings. In
6:44
a lot of cases, that may not even be
6:46
necessary if you don't need to have objects
6:49
anchored to the real world, but
6:51
that was kind of the bridge between robotics
6:54
and AR
6:55
that I found.
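(For illustration, here's roughly the shape of that idea as a toy sketch: a predict step that dead-reckons the pose from IMU/odometry, and a correct step that pulls it back toward mapped landmarks. Every name and type below is hypothetical; real SLAM systems use EKFs, factor graphs, or bundle adjustment and are far more involved.)

```c
#include <math.h>
#include <stddef.h>

/* Hypothetical types: a 2D pose and one observed landmark. */
typedef struct { float x, y, heading; } pose_t;
typedef struct {
    float map_x, map_y;   /* where the map says the landmark is        */
    float obs_x, obs_y;   /* where this frame's observation places it  */
} landmark_t;

/* Predict: dead-reckon the pose from odometry/IMU over dt seconds. */
static void predict(pose_t *p, float speed, float yaw_rate, float dt)
{
    p->heading += yaw_rate * dt;
    p->x += speed * cosf(p->heading) * dt;
    p->y += speed * sinf(p->heading) * dt;
}

/* Correct: nudge the pose toward agreement with observed landmarks. */
static void correct(pose_t *p, const landmark_t *lm, size_t n, float gain)
{
    for (size_t i = 0; i < n; i++) {
        p->x += gain * (lm[i].map_x - lm[i].obs_x);
        p->y += gain * (lm[i].map_y - lm[i].obs_y);
    }
}
```

"Localization" is this loop running on the pose; "mapping" is adding newly seen landmarks to the map as you go.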
6:56
Got it. Yeah, that makes a lot of sense. Yeah.
6:58
Okay. So I was a Glasshole, and
7:01
those things were not world-locked. Those were locked
7:03
to just your field of vision, basically. So
7:05
as your head moved, the whole screen moved. But
7:07
you're saying that on certain things like Magic Leap, where you're
7:09
trying to have a little animated clown
7:12
dancing on a shelf in front of you, you have to first detect
7:14
the shelf, visualize
7:17
a flat surface, and then re-render this thing
7:19
to be dancing on that flat surface. That kind of idea?
7:22
Yeah, exactly. So there's
7:24
essentially an entire spectrum of
7:27
design choices you could make in this
7:29
field that is currently called augmented
7:32
reality. And I think people are going to have
7:34
to find better names for
7:36
things, specific devices
7:39
in the future, because there is a huge difference
7:41
between something like Magic Leap or
7:43
the goals behind HoloLens, these
7:45
large devices, and Apple
7:48
Vision Pro now, where the goal
7:50
is to be able to produce this 3D
7:52
immersive experience.
7:54
There's a huge difference between that and
7:56
something like the Vuzix devices or
7:59
the prototypes I've... been doing where
8:01
the goal is more to get a really low
8:03
power, lightweight device with simpler
8:06
graphics,
8:07
perhaps just 2D graphics. What
8:09
is the Vuzix you mentioned? I've never heard of that one. Vuzix
8:12
is a company that they've been doing
8:15
quite a few sort of smart glasses
8:17
devices for enterprise applications.
8:20
Most of them run Android. So if
8:22
you look at teardowns of them, they're essentially
8:24
more or less an Android phone stuffed
8:27
into a pair of glasses. They
8:29
do applications like surgery,
8:32
maintenance and such where
8:34
you want to have your hands free and view
8:37
reference information or
8:39
telepresence video conferencing things. Got
8:42
it. Yeah, I can imagine that's the
8:44
one that I always think of actually is that maintenance application
8:46
where it's like they're looking inside a turbine. There's
8:49
some kind of like QR code that locks
8:51
you to a frame of reference and then it's like doing
8:54
overlay arrows and being like, oh, you're going to change part 74
8:56
now and here's what's highlighted
8:58
it overlays that sort of thing.
9:00
Right. Yeah. So
9:02
there, Vuzix is going for those sorts of applications. Got
9:05
it.
9:06
Okay. Cool. Cool.
9:09
That's good to know. Okay. So
9:11
that's
9:11
kind of the space that we're in. And then you had mentioned
9:13
that you are not playing in the super high end kind
9:16
of, so
9:16
you are maybe locked to a
9:19
thing in space, but you're not necessarily
9:22
trying to make it immersive in that you're not
9:24
like a, so I guess Vision Pro is probably a good
9:26
one to talk about, right? That is like a basically
9:30
a, I think of actually like a VR headset where they just happen to repass
9:33
through the outside cameras. That kind
9:35
of feels like what that is.
9:37
Right. So my
9:39
designs that I've been doing are monocular
9:42
displays. So display in one eye, you
9:44
know, intended for two dimensional content. And
9:47
of course, you know, you still have the potential with a
9:49
monocular display to anchor the content
9:52
to the real world, have it change as the user
9:54
rotates their head.
9:55
But yes, it's a totally
9:58
separate and opposite.
10:00
design goal
10:02
from what you have with Vision Pro, where
10:05
the goal is to have this sort of large,
10:07
very powerful, immersive
10:09
headset and yes, then
10:11
you just happen to have a pass through. What
10:13
I find interesting about this low power
10:16
mindset is that it's
10:18
much easier to imagine how you
10:20
could get to something that someone would actually
10:22
wear all day, you know, if you pursue
10:25
low power designs,
10:27
you know, applications like the Vision
10:29
Pro and these immersive VR oriented
10:31
devices, you're certainly not going
10:34
to be wearing them
10:35
for more than a couple of hours. But if
10:37
you are just powering one display
10:39
rather than two, you have, you
10:42
know, limited compute on board, but
10:44
perhaps you can still provide a decent two
10:46
dimensional display, then you can
10:48
start to imagine,
10:50
you know, how to get to something that you could
10:52
actually comfortably wear. So that's
10:54
the direction I've been going. Got
10:56
it. So more akin to the Google Glass of the world's
10:59
though, look through the contextual information
11:01
that you might be able to have.
11:03
Right. I think in my view, the shortcoming
11:06
of Google Glass and a lot of the Vuzix
11:08
designs, as well, there were many, there were many,
11:10
right. One of them that
11:13
I was particularly interested in was
11:15
display resolution. Most of the devices
11:18
you find that Vuzix is making in
11:20
Google Glass had relatively low display
11:22
resolutions around 640 by 480 pixels, sometimes even
11:26
lower than that. And I wanted
11:28
to try to get a high resolution display,
11:30
you know, at least something like 1280 by 720.
11:33
Or now I have a full
11:35
HD display in my latest design.
11:38
I wanted to try to get this sort
11:40
of high resolution display in a
11:42
lightweight, low power device. Because
11:45
my goal for the use cases I would like
11:48
to see is to have, you know, phone
11:50
style applications on this
11:53
wearable display in a lightweight
11:55
device. So you still have a comparable
11:58
number of pixels to your phone, and you could fit
11:59
that sort of rich content that you expect.
12:03
And are you still thinking about, would
12:05
it still also have like inside out cameras,
12:08
like looking out to view the world
12:10
and do that SLAM type of stuff, or is it more
12:12
just locked to the face right now?
12:14
Essentially the challenge with
12:16
SLAM and cameras in general
12:19
with my designs is just power
12:21
consumption, because you don't want to spend
12:24
too much power on processing
12:26
for these sorts of computer vision tasks.
12:28
So the question ends up being, what
12:31
is worth
12:32
spending power on? What
12:34
would improve the usability of
12:36
this device and what wouldn't? So
12:39
these
12:40
current designs I have,
12:42
my latest prototype supports one camera.
12:45
So the question is, you could point it at
12:47
the user's eye and detect blinks,
12:49
potentially track the direction that
12:51
they're looking for input, or
12:54
you could point it at the world. If
12:56
you pointed at the world, the processing
12:58
on board this device is a microcontroller
13:01
style system. It's not necessarily
13:03
the best thing in the world for doing something complex
13:06
like SLAM.
13:08
So there's sort of that trade off to deal with.
13:10
You could feasibly do
13:12
at least simple optical flow
13:14
stuff for tracking translation
13:17
of the user in space with
13:19
this camera system I have. I
13:21
haven't gotten into the software of doing that yet.
13:23
I mean, the SOC I use
13:26
has a DSP as well as
13:28
a Cortex-M33. So you
13:30
could program the DSP to offload
13:32
some of that vision stuff.
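(As one concrete, entirely illustrative sketch of the "simple optical flow stuff" a Cortex-M33 or DSP could plausibly handle: estimate a single global shift between two downscaled grayscale frames by brute-force sum-of-absolute-differences block matching. The frame size and search radius here are made up.)

```c
#include <stdint.h>
#include <stdlib.h>

#define W 160   /* downscaled frame width  (made up) */
#define H 120   /* downscaled frame height (made up) */
#define R 4     /* search radius in pixels           */

/* Estimate one global (dx, dy) shift between two grayscale frames by
 * minimizing the sum of absolute differences over a small window. */
static void estimate_shift(const uint8_t prev[H][W],
                           const uint8_t cur[H][W],
                           int *best_dx, int *best_dy)
{
    uint32_t best = UINT32_MAX;

    for (int dy = -R; dy <= R; dy++) {
        for (int dx = -R; dx <= R; dx++) {
            uint32_t sad = 0;
            for (int y = R; y < H - R; y++)
                for (int x = R; x < W - R; x++)
                    sad += (uint32_t)abs((int)cur[y][x] -
                                         (int)prev[y + dy][x + dx]);
            if (sad < best) {
                best = sad;
                *best_dx = dx;
                *best_dy = dy;
            }
        }
    }
}
```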
13:34
Okay, that's great. That's great. Okay,
13:36
that's actually a good segue. So let's go back to the previous
13:38
version you talked about in the talk. We'll obviously link in
13:40
the talk, the Hackaday article,
13:43
that stuff, so people can watch kind of the rich
13:45
detail view and stuff like that. But can
13:47
we just talk about the system
13:50
level design of that one first, and then we'll
13:52
work our way towards the most
13:54
recent prototype. Right, so
13:57
that one used...
13:59
Well, at that time it was a bit newer, but
14:02
Allwinner had released their RISC-V
14:05
chip, which they called D1. And
14:09
seeing that and being interested in RISC-V, I
14:11
was immediately interested in doing
14:14
a design with it. Now, that
14:16
particular
14:17
system on chip, they
14:19
include quite a few
14:21
nice video processing features. It
14:23
has H.264 and
14:25
perhaps also H.265 decoding and encoding
14:29
on the chip. It has
14:31
MIPI-DSI outputs. It
14:34
also has LVDS outputs, which
14:36
were important for the display I was using
14:38
at that point. So I wanted to do a design
14:41
with this Allwinner chip because it seemed
14:43
well suited to my goals of having
14:45
good display capabilities. I could integrate it nicely
14:48
with the Sony display I was using
14:50
and
14:50
fit it in this small
14:52
form factor. They took that same
14:56
chip that's the
14:58
die in that D1 package. They
15:00
also package it with integrated DDR2 in
15:02
that D1S package,
15:05
which is also called F133 because
15:08
they have duplicate names for some
15:10
reason. So yeah,
15:13
I used that package with the integrated DDR2,
15:17
which is, of course, nice because it's easy
15:21
to work with and low cost. The
15:24
system-in-package concept is very
15:26
helpful for doing miniaturized designs
15:28
like this.
15:30
And honestly, if you look at modern phones, you might
15:32
have brought up phones too, but those are all super
15:35
stacked up. You look at an iPhone, they're
15:37
stacked chips on chips on chips. And
15:39
then finally they touch down to the PCB
15:41
where they need to talk to another subsystem basically.
15:45
Right.
15:45
So previously, even before that design,
15:48
I was interested in system-in-package.
15:50
There's this company in Texas, Octavo
15:53
Systems, they did sort of BeagleBone
15:55
on
15:58
a chip designs of sorts. It's an
16:00
entire BeagleBone Black essentially in a
16:02
package that was sort of their first product.
16:05
And then they moved on to doing the STMicroelectronics
16:09
STM32MP1. So they have
16:11
a system-in-package with that. So
16:13
previously I did a board with that chip on
16:15
it. I didn't show that at Hackaday Supercon.
16:19
But you know, those processors are nice for sort
16:21
of
16:22
industrial control and those sorts of
16:24
embedded applications. They don't have great
16:26
video capabilities. You're talking about
16:28
the Octavo stuff specifically? Yeah,
16:31
the current... Oh, I was going to say, those were huge when they first
16:34
started doing those. They're like 0.8 millimeter
16:36
pitch BGAs, right? Right. So
16:39
they still have large BGA pitches. So
16:41
they're still easy to work with. Yeah, the
16:44
SoCs they're putting in them currently aren't
16:46
really multimedia focused because that's not
16:49
exactly their target customer
16:51
right now, you know, for my understanding.
16:54
So I used those in the past and I liked them a
16:56
lot.
16:57
For what I showed at Supercon,
16:59
you know, though Allwinner's stuff is terribly
17:01
documented, that was, you know, a more
17:04
exciting choice for my display application
17:07
at that point. Yeah,
17:08
totally. Okay.
17:10
So just to wrap up the system
17:12
view here. So we have a
17:14
display from Sony. We have this D1
17:18
system-in-package from Allwinner
17:20
with DDR and the RISC-V processor.
17:23
What about other
17:25
battery charging? What else is there? I mean,
17:27
it sounds like that's the core of the thing, but what else is there?
17:29
Yeah, so there's a single cell lithium
17:32
ion polymer battery charger. That's
17:34
just a typical linear battery charger.
17:37
And then there's Wi-Fi as well.
17:39
That, well, I'm still using the same
17:41
Wi-Fi module from Silicon Labs that
17:43
I put on that design, which is quite
17:45
nice because it has good sleep
17:48
modes and low power consumption in the sleep
17:50
modes. So that's why I had chosen
17:52
it. I was choosing Wi-Fi over Bluetooth
17:55
because of the increased data rates, because
17:57
with Bluetooth, you know, having... a
18:00
practical limit of one megabit
18:02
per second or less for throughput, that's
18:05
pretty limiting in terms of sending content
18:07
to the device. Because since I have limited
18:10
processing on the device, providing
18:12
content from a phone connected over Wi-Fi
18:15
or some server on the internet,
18:17
that's a big opportunity
18:19
for improving the capabilities
18:21
of the device because you have limited compute
18:23
on board.
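(A rough back-of-envelope, not from the episode, of why that 1 Mbit/s limit matters, assuming an uncompressed 1280x720 stream at 16 bits per pixel and 30 fps:)

$$1280 \times 720 \times 16\,\tfrac{\text{bit}}{\text{px}} \times 30\,\tfrac{\text{frames}}{\text{s}} \approx 442\ \tfrac{\text{Mbit}}{\text{s}}$$

Even with 100:1 compression that's still about 4.4 Mbit/s, several times Bluetooth's practical throughput, while Wi-Fi absorbs it easily.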
18:25
Great. That's good. What
18:27
about other things? So interconnect, other power
18:29
handling? What else is on there?
18:31
So most of the interesting
18:34
design challenges came with the display.
18:36
At that point, I was using this
18:38
display from Sony, the ECX337. So
18:42
that requires a 10-volt
18:44
analog power supply. So there's a
18:48
boost converter for that, a
18:50
few other random power management
18:52
things.
18:53
I was using on that board these
18:56
little sort of packages,
18:58
they're like co-packaged inductors
19:01
with a boost... well, inductors
19:03
with a DC-DC converter IC, a buck
19:05
converter, under them, from Torex
19:07
Semiconductor. So they're these nice little
19:09
packages. They basically have the inductor stacked
19:12
on top
19:13
of the chip. So it provides a
19:15
really nice small footprint DC-DC
19:17
converter with pretty good efficiency.
19:20
So I was using those around the
19:22
entire board for various power supplies
19:25
and such.
19:26
The Sony display, it has
19:29
its 10 volt supply, and
19:31
it has this LVDS video
19:33
input. That's kind
19:35
of the most annoying part about the Sony displays,
19:38
they're not using MIPI-DSI
19:40
inputs. So you
19:42
either have to have LVDS
19:44
outputs on your SoC, which the
19:47
Allwinner D1
19:48
does, or you have to use
19:50
a bridge chip.
19:51
So previously, when I did a design with
19:54
Octavo's system-in-package, the
19:56
STM32MP1, I had, you know,
19:59
parallel RGB signals going
20:02
to a bridge chip to LVDS
20:04
then to this Sony microdisplay. And
20:07
of course, the bridge chip occupies
20:10
board area, consumes power, it's just
20:12
an annoyance to
20:14
design around it. So then
20:16
you start deciding to just choose SOCs
20:19
with
20:20
the correct display interface already
20:22
and not bridging it. Yeah, yep,
20:25
yep. But I mean, LVDS, I
20:27
suppose... technically the better
20:29
name is FPD-Link, which came
20:31
from TI and National
20:34
Semiconductor, I think, but it's, you know, kind of
20:36
old. Not many people are using it in mobile
20:38
devices currently.
20:40
Yeah, it's interesting hearing, like, I mean, what that
20:42
sounds to me like is like, so you chose the display
20:44
for its specs, for its size, and
20:47
then that just like had this cascading effect throughout
20:49
your design, which is, I feel
20:51
super common, you know? Right, and in this
20:53
case, it's like that
20:55
because of the miniaturization
20:58
required, right? I don't want to spend board area
21:01
on a bridge. Yep. I don't want
21:03
to deal with routing it. It's too much of an annoyance
21:05
in this tiny form factor I'm going for,
21:07
yes.
21:08
Interesting. Okay. So
21:10
then, so you got the board, this is what you're
21:12
talking about. Did you get all the way up through
21:15
sending data? I mean, able
21:17
to send data, able to send images
21:19
over Wi-Fi?
21:21
Yeah, so I've been doing
21:23
also currently a lot of work
21:26
in
21:26
figuring out the most efficient ways to
21:28
do that for user interfaces, because
21:30
of course you can easily just stream
21:33
video over Wi-Fi in the standard
21:35
ways people will do, you know, just taking
21:38
a transport stream and,
21:40
you know, throwing it over the network.
21:42
But then of course, you know, maybe you want to turn
21:44
the display off sometimes, you want to start
21:47
and restart this, you want to maybe not
21:49
send entire frames, just refresh
21:51
part of the display. All right.
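(A hypothetical sketch of what such a partial-refresh protocol header might look like; this is not the project's actual wire format:)

```c
#include <stdint.h>

/* Hypothetical wire format: describe a dirty rectangle and send only
 * those pixels, instead of streaming whole frames. */
struct update_hdr {
    uint16_t x, y;     /* top-left corner of the dirty rectangle */
    uint16_t w, h;     /* rectangle size in pixels               */
    uint8_t  format;   /* e.g. 0 = RGB565, 1 = 8-bit grayscale   */
    uint8_t  flags;    /* e.g. bit 0 = display off, bit 1 = vsync */
    /* ...followed by w * h pixels in the stated format */
} __attribute__((packed));
```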
21:54
So at that, you know, I've kind of had this
21:56
trend of finishing a hardware
21:58
design like that, you know, then working
22:00
on the software and then wishing I had different
22:03
hardware. Right. Cause you're,
22:05
you're always, always thinking of what
22:07
could be better and what could be next. So it,
22:09
you know, it would certainly be nice
22:12
to eventually get more people developing
22:14
software for this, because I just keep
22:16
going back to doing another hardware revision.
22:20
So I mean, I didn't, I didn't spend too much
22:22
time doing like application level
22:24
software development for that, you know, beyond
22:26
bringing the thing up and getting stuff running on the
22:29
display and such, because then
22:31
I was onto another hardware design. Yeah.
22:34
This is an interesting idea to kind of just almost have a mirror
22:36
of
22:37
like, you could have like a Wi-Fi connection
22:39
and then stream like a mirrored view
22:42
of a computing device, like
22:44
a
22:45
laptop that might be, you know,
22:47
sending stuff over Wi-Fi as well,
22:49
because then it becomes a
22:51
computer side software problem versus
22:54
a
22:55
display side problem.
22:56
Like it feels like a lot of the, again, my main
22:58
context for all this is Google Glass, but they
23:00
used the OMAP on that, right? That was the TI
23:03
chip that's in the Beagle boards as well. And like
23:05
they basically rewrote, I mean, they
23:07
basically wrote a quasi-Android
23:09
thing for that as well, where, you know,
23:11
the UI is all cards and it's all embedded
23:13
processing. You, if you wanted a new feature, you had to push
23:16
firmware down to it. And then it could talk
23:18
over Bluetooth to your app on the phone sort
23:20
of thing. Instead of the phone, just
23:22
doing the heavy lifting and streaming it up to the device.
23:25
Right. Yeah. I think offloaded
23:28
processing to a laptop, if
23:30
you're by a laptop or your phone, if you're walking
23:32
around, I think that's going to be a big
23:35
trend in these sorts of designs, because
23:37
people do want to just put a low power, you
23:39
know, microcontroller or very basic
23:42
Linux system into something like this without
23:45
spending too much power on doing
23:47
a very complex OS, at least people
23:49
who are interested in this sort of low power all
23:52
day wearable.
23:53
Of course, if you're doing a big bulky thing that
23:55
you don't expect the person to wear all day, then
23:57
you have tons of leeway. Yeah.
24:00
What about power? So let's talk about power too, because
24:03
again, my Google Glass had like a four
24:05
or 500 milliamp hour battery on it and it just
24:07
didn't last. But
24:10
I imagine Wi-Fi streaming as well, even
24:12
in low-power Wi-Fi, there's
24:14
just a lot of electrons flowing
24:16
around here.
24:18
Yeah. Yeah. That's really a challenge
24:21
with choosing between Wi-Fi and Bluetooth
24:23
as well, because there's not
24:26
really a very satisfying low
24:29
power personal area network beyond
24:31
Bluetooth if you want higher data
24:34
rates. Because Bluetooth is
24:36
great for low power consumption, and, like, Nordic's
24:39
nRF series of chips are
24:41
awesome for that.
24:44
But if you want to transfer video over it,
24:46
Bluetooth is not the best link to choose for
24:52
that. If you're forced into Wi-Fi
24:55
then yes, you have this sort of order
24:57
of magnitude power consumption jump. Yeah,
25:00
right. My friend actually was at a startup that
25:02
did that.
25:03
They basically tried to do,
25:05
they successfully did video over Bluetooth, but
25:08
they didn't abuse the spec, but they basically
25:10
kind of
25:11
squeezed a lot of efficiency
25:14
out of the spec because it's just like not what Bluetooth is designed
25:16
for, right? Even then it's like low bandwidth,
25:19
that sort of thing.
25:20
Right. Right. That's not
25:22
what it's designed for. And I suppose in this
25:24
area, you can make a lot of
25:27
progress in some cases by using things
25:30
in ways they weren't intended to
25:32
work.
25:33
On the display side, I've also been doing
25:36
that a lot recently with this other
25:39
non-Sony display that I switched to
25:41
using. Okay. Yeah,
25:45
that is interesting about that.
25:47
I think just packaging up frames
25:50
and sending them over Wi-Fi is an
25:52
interesting idea. But like we said, the
25:54
power is a concern. What
25:57
was the battery size you were kind of targeting and
25:59
what was...
25:59
what was the relative...
26:02
you said an all-day device; what is, how do you,
26:04
how do you kind of define that or benchmark it?
26:06
So my goal now is
26:09
for most of the power consumption to be
26:11
in things like the display and
26:13
Wi-Fi. So I want to try
26:15
to minimize power consumption in processing.
26:18
So that design that I was showing at
26:20
Supercon doesn't really align
26:22
with that goal yet, you know, because just
26:24
running that chip with the DDR
26:27
and running Linux and power management
26:30
for the
26:30
RISC-V core, all
26:33
that stuff consumes more power than I would like.
26:36
You know, I've since moved to
26:38
doing really low power microcontroller systems
26:40
with the goal of having the display
26:43
and Wi-Fi data transport
26:46
being the main consumers of power. So then
26:48
in that view, you only want
26:50
to consume power on things
26:53
that are facing the user or delivering
26:56
content to the user. I want to try
26:58
to get as much processing as possible
27:00
off the device. So you don't have
27:02
to spend much power on that. So
27:05
with, with that sort of mindset,
27:07
when power consumption is dictated by the display,
27:10
then it's really the type of content
27:12
that determines how long you can run it for.
27:14
Of course, if most of the pixels on
27:17
the display are off, you
27:19
know, in these OLEDs, then those
27:21
pixels are not consuming power. So if you just
27:23
have some small UI,
27:26
you know, floating in the center of the screen
27:28
and the rest of those pixels are off, right? Then
27:30
the display consumption is pretty low compared
27:32
to the maximum, if all the pixels were white.
27:35
So I want to be able to, you know, eventually
27:38
get to the point of, you know, eight hours
27:40
or so, that's what I considered all
27:42
day with that sort of lightweight
27:45
information. Of course, if you decide
27:47
to, you know, mirror your computer screen
27:49
in these cases, it's
27:51
hard to get, you know, solid
27:54
eight hours, you know, from a display that's
27:56
fully lit.
27:57
Without having, like, a backpack
28:00
or a hip pack of batteries or something as
28:02
well. Right. Yeah. The design I had at Supercon
28:04
had a 500 mAh lithium ion cell. And then
28:06
the
28:08
prototypes I'm doing now,
28:10
I'm trying to cut that down to more
28:12
around 200 mAh or so. Oh,
28:15
okay. Yeah.
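(For scale, some arithmetic of my own, assuming a 3.7 V nominal cell and ignoring regulator losses: those capacities set the average power budget for an eight-hour day.)

$$\frac{0.5\,\text{Ah} \times 3.7\,\text{V}}{8\,\text{h}} \approx 231\ \text{mW} \qquad\qquad \frac{0.2\,\text{Ah} \times 3.7\,\text{V}}{8\,\text{h}} \approx 92\ \text{mW}$$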
28:17
That's great. That's
28:19
a good kind of... I think this is the thing where like
28:23
you said, you started from a blank page and
28:25
one of the problems there is that you're defining
28:28
your own specs as well. And
28:30
we haven't even talked about like
28:32
cost and, you
28:34
know,
28:35
the requirements you're
28:37
creating for yourself sounds like they're very reasonable,
28:40
but they are also
28:41
for a user of one versus if you were like
28:43
making this a product. So just thinking about like the
28:46
kind of the requirement space that you're creating
28:48
for yourself. It's interesting
28:50
to hear your process around that, honestly.
28:52
Yes. I suppose in,
28:55
well, in my more recent design
28:58
since Supercon cost
29:01
became more of a consideration because this
29:03
was my first HDI board. And
29:05
so it was a sort of unknown
29:08
area in terms of
29:09
fabricating things.
29:11
For my design that I showed
29:14
at Supercon, those were also the
29:16
first boards that I did, you know, did full
29:18
turnkey assembly and just had a contract
29:20
manufacturer build them for me. Yeah.
29:22
Because at that point I had kind of become
29:25
tired of the stress of hand assembling,
29:27
you know, 0402, 0201. Just wanted to make
29:29
sure that... I
29:32
put enough flaws into my own designs,
29:34
the soldering doesn't need to be one of them. That's how I think
29:36
about it, you know? Right. Right. Yes.
29:38
And then I, you know, I wasn't going to try, well,
29:41
you know, I probably should at some point, but at
29:44
that point I didn't feel like spending the time,
29:46
you know, assembling three or
29:49
four, you know, 0.4 millimeter
29:51
pitch chip scale packages on this board.
29:53
Yeah.
29:54
And you should eventually, but after you
29:56
have a working unit, like a golden unit that you can compare against.
29:58
Because otherwise,
29:59
if you do it on your first board,
30:02
it's like, is it the soldering or is it the
30:04
code or is it something else? You know, like if
30:06
you at least have one working, that's a good starting
30:08
place, I feel like.
30:09
Exactly. Yes. So fortunately,
30:13
these first prototypes that I did
30:15
earlier this year for this new design, they
30:17
did essentially completely work on the first
30:20
revision, which I was happy about. Yeah.
30:23
And given that there were some things
30:25
on this board that were not tested on like
30:27
a larger evaluation style
30:29
platform first,
30:31
kind of the particular risk that
30:33
I took with this design was
30:36
with this new display I chose, which
30:38
is a MIPI-DSI display. And then I'm connecting
30:41
it to this SOC from NXP,
30:43
which also has a two lane MIPI-DSI
30:46
interface. The
30:48
really interesting thing to talk about
30:50
with this design is the fact that the display
30:52
is not really supposed to support
30:55
two lane MIPI-DSI. They
30:57
intend to use four data lanes
30:59
or eight data lanes
31:01
rather than two.
31:03
So there was a kind of interesting
31:06
roller coaster ride in configuring the display
31:09
to work in
31:12
this system. Could
31:13
you explain the lanes thing?
31:16
So I'm not
31:17
a display person. I don't know if I've ever
31:19
done, aside from like hooking up a CM4
31:22
on a project, like I've never gotten
31:24
that stuff. I've never dug into it enough that I've
31:26
needed
31:27
to do this sort of thing. So what does that, what do the lanes refer to?
31:30
Sure. So in MIPI-DSI,
31:32
the display serial interface, the
31:35
data is transmitted as, you know, a serial
31:37
stream of bits over these differential pairs.
31:40
Each pair transmitting
31:42
data, they call it a data lane. So typically
31:45
like on the Raspberry Pi's connectors, you'd
31:48
find four. So four bits are
31:50
being transmitted in parallel. And then of course,
31:52
you know, in a large serial stream overall.
31:55
So there are four differential pairs for data,
31:58
one pair for the clock signal, for the high-speed
32:00
clock. And most
32:03
displays, you know, for
32:05
the
32:06
typical DSI standard will
32:08
run, you know, those four data lanes at around
32:11
one gigabit per second.
32:13
And you can get, you know, full HD at 60 frames
32:16
per second through that just fine. But
32:19
then, you know, some displays use
32:21
more non-standard configurations
32:24
in some like watch designs. You'll
32:26
find people with one data lane
32:29
for their MIPI-DSI interfaces
32:31
or two data lanes.
32:33
So in this design,
32:35
I was really interested in using
32:38
this chip from NXP, which
32:40
is part of the i.MX RT
32:43
family.
32:44
So these are kind of high-end microcontrollers
32:46
with some nice
32:49
graphics capabilities as well. So
32:51
this chip I'm using is the RT500.
32:54
And when you look at it, your
32:57
first thought is this is designed
32:59
for Garmin, because Garmin has
33:01
been using NXP microcontrollers,
33:04
well, previously Freescale, in their
33:06
watches for quite a while.
33:08
And this particular one has five megabytes
33:11
of SRAM on board.
33:13
So, you know, wow,
33:15
for a micro, that's big. Yeah, right.
33:18
And the reason it's there is for a display
33:20
frame buffer, because Garmin puts these in
33:22
their devices and they don't have external memory,
33:25
you know, but they still want to be able to drive reasonably
33:27
decent
33:28
displays over MIPI-DSI on
33:30
their watches. So, you know,
33:32
it also turns out this is a nice fit for what I'm
33:34
trying to do. You know, I also want to have
33:37
reasonable display capabilities. It also has like
33:39
a simple 2D GPU and such in
33:42
this really low power, you know, small package.
33:45
So it has, you know, two MIPI-DSI
33:47
lanes out, right. And this particular
33:49
display I'm using, you know, you
33:51
go ask the manufacturer, you know,
33:53
can we use it with two lanes? They're like, no. So... Real
33:57
quick on the lanes thing too, just to do some
33:59
additional...
33:59
math on, like, board layout
34:02
and stuff like that. So you said most
34:04
displays will do four lanes at one gigabit
34:06
per second. So that would be eight
34:09
conductors for those four lanes. Cause they're differential
34:11
pairs and then an additional set of conductors for a
34:13
clock. Is that right? So 10 total conductors.
34:16
Right. Yeah. So that's what you'd typically find.
34:18
Okay. And then so, and that's at one gigabit
34:21
per second on each pair, but there's
34:24
four lanes being four bits. So is that
34:26
an equivalent of, like, 500
34:28
megabytes per second?
34:30
Because it would be
34:31
double. There's like a nibble.
34:32
You have essentially
34:35
four giga nibbles per second. Yes.
34:38
Giga nibbles per second. Oh, I like that. If
34:40
you want to measure in nibbles. Yes.
34:43
You know, I have to, uh,
34:46
make the name of the show something
34:48
that, like, makes people understand who you are and what you
34:50
do. But if this had been a normal Amp Hour
34:52
show, giga nibbles per second would totally be the title.
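(Checking the arithmetic in this exchange, ignoring DSI protocol overhead: four lanes at 1 Gbit/s each is

$$4 \times 1\ \tfrac{\text{Gbit}}{\text{s}} = 4\ \tfrac{\text{Gbit}}{\text{s}} = 500\ \tfrac{\text{MB}}{\text{s}},$$

and a full-HD 60 fps stream at 24 bits per pixel needs $1920 \times 1080 \times 24 \times 60 \approx 2.99\ \text{Gbit/s}$, which fits within that four-lane budget.)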
34:56
Yeah. That's very nice. Okay.
35:00
Great. So, so then when it gets kind of
35:02
decoded on the display side, then something
35:04
you have to do inside the display
35:07
tells it I'm sending you four lanes or eight
35:09
lanes or whatever the available things are. Here's
35:11
how you kind of unwind that to get
35:13
the data into your localized memory.
35:15
Is that how it works? Right. So
35:18
some displays, you can't configure
35:20
the controller IC to use,
35:23
you know, a different number of lanes
35:25
than, you
35:26
know, really was intended. So some
35:28
you find, you know, can be switched
35:30
between four lanes and two lanes. You
35:32
know, some are only intended to work with the standard
35:35
sort of four lanes set up and
35:37
these display controllers can
35:39
sometimes be annoying to deal with because of course
35:41
they have tons of registers for configuration
35:44
and the documentation of what the registers
35:46
do can be quite hit
35:48
or miss. So like this particular,
35:50
Got it. Cause these are, like, custom
35:52
silicon, often out-of-China sort
35:54
of thing, where you just,
35:56
you're usually working with an FAE or something like that.
35:58
But
35:59
if you're not.
35:59
then you don't have someone to just set it up for
36:02
you. Right. Yeah.
36:04
So this particular display, it's
36:07
a silicon backplane with the OLED
36:09
structure on top of it. And then this silicon
36:12
backplane has the driver IC
36:14
on it. And then this driver IC is responsible
36:17
for taking those lanes,
36:19
shuffling the data around, and then driving
36:22
the actual array of pixels. And,
36:25
you know, from this
36:27
particular company, this company
36:29
is Viewtrix Technology, who
36:32
makes this display,
36:34
particularly the backplane they design.
36:36
Do you think they say the name out loud before they
36:39
decide this is going to be a company? Yeah,
36:41
I don't know. I mean, yeah,
36:43
it's, they have a website which sometimes loads and
36:46
sometimes doesn't. Viewtrix.com.
36:48
V-I-E-W-T-R-I-X.
36:51
Yeah. Wow. So,
36:53
yeah, this particular display,
36:56
there are eight data lanes in
36:58
hardware that you can have,
37:00
because they want to be able to refresh it at like 90
37:03
hertz and such. But
37:05
then you can also run it with a typical
37:07
four lane interface. And
37:10
the interesting thing about it is you can configure
37:12
it to only refresh parts of the panel.
37:15
You don't have to send an entire
37:17
panel's worth of data to all the pixels
37:19
at once. So there
37:22
are registers that can configure it for two lane
37:24
mode, but I had to poke around quite a bit in
37:27
order to find where they were and, you know,
37:29
email back and forth with Viewtrix.
37:32
So basically it's: beg
37:34
for help as a small time person, but
37:37
then sometimes they do help. Right.
37:40
So, yes, sometimes they do help. I mean, at
37:42
first I was sort of reverse engineering the
37:44
thing and I dumped out all the registers
37:47
and I was trying to look, you know, where can I flip bits and
37:49
potentially make this happen? Eventually,
37:51
they did help me enough to get it to work. Got
37:54
it.
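(A sketch of that register-dumping approach using Zephyr's MIPI-DSI host API; the virtual channel number and the 8-bit register range are assumptions, since the actual Viewtrix register map is undocumented:)

```c
#include <zephyr/kernel.h>
#include <zephyr/drivers/mipi_dsi.h>

/* Dump the driver IC's register space over DCS reads so you can look
 * for bits to flip. Channel 0 and the 0x00..0xFF range are assumed. */
static void dump_display_regs(const struct device *dsi_host)
{
    for (int reg = 0x00; reg <= 0xFF; reg++) {
        uint8_t val;

        if (mipi_dsi_dcs_read(dsi_host, 0, (uint8_t)reg,
                              &val, sizeof(val)) >= 0) {
            printk("reg 0x%02x = 0x%02x\n", reg, val);
        }
    }
}
```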
37:55
But then, you know, since
37:57
the display is so poorly documented,
37:59
there is
37:59
essentially an entire
38:02
separate set of problems
38:05
in getting the microcontroller,
38:07
the display controller in this MCU
38:10
to talk to it
38:11
because of just
38:13
all the other considerations and how this data
38:15
is formatted. Some displays require
38:18
you to keep the clock running
38:21
during the blanking periods. Some displays
38:23
require the clock lane to be
38:25
put in the low power mode during the blanking period.
38:28
So you have to figure out
38:30
what the proper configuration is. It's like the
38:32
magical sequence of all the
38:34
events you have to do, right? Exactly. So
38:37
in this case, NXP's driver
38:40
didn't support putting the clock lane in low power
38:42
mode during the blanking period. So
38:44
I had to go and add that.
38:46
This is in the Zephyr real-time
38:48
operating system.
38:50
Oh, it's in Zephyr. Nice. Yeah.
38:53
So I've been using Zephyr for this, which has been quite
38:55
fun so far. Good promo.
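(In Zephyr's MIPI-DSI API, which borrows Linux's device and mode-flag model, the behavior in question maps to requesting a non-continuous high-speed clock when attaching the peripheral. A sketch, assuming the host driver honors the flag, which is exactly the support being described as missing; field values are illustrative:)

```c
#include <zephyr/drivers/mipi_dsi.h>

/* Sketch: ask for the clock lane to drop to LP mode during blanking
 * (a non-continuous HS clock). Values here are illustrative only. */
struct mipi_dsi_device panel_cfg = {
    .data_lanes = 2,                               /* the two-lane setup */
    .pixfmt     = MIPI_DSI_PIXFMT_RGB888,
    .mode_flags = MIPI_DSI_MODE_VIDEO | MIPI_DSI_CLOCK_NON_CONTINUOUS,
};

/* ...then attach: mipi_dsi_attach(dsi_host, channel, &panel_cfg); */
```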
38:57
Okay. Yeah, that's
38:59
great. Okay. So now you have this
39:01
new processor, the RT500. And
39:04
if people don't know, the RT1060 series, the 1062, I think,
39:07
or 1064, that's what's on
39:09
the Teensy 4.
39:13
So that's the size, that's the beefiness of those processors.
39:15
They're called crossover MCUs. And I know some of
39:18
that team. Yeah, no, it's definitely,
39:20
it's got some heft to it. That's great.
39:22
So now you have a two lane setup to
39:26
the screen. What does it take
39:28
to actually write a driver that makes
39:30
it all kind of go? Well, of course,
39:32
NXP already
39:35
have in their
39:37
MCUXpresso SDK,
39:40
they have drivers for all the peripherals on this chip.
39:43
Often you find things that
39:46
weren't completely thought through for the particular
39:48
application you're working on. So of course, one of
39:50
those is this sort of configuration
39:52
detail I had to deal with. Zephyr
39:55
is also kind of a young project in
39:57
the sense that support for those NXP
39:59
chips is kind of developing as
40:02
we speak. The particular driver
40:04
for the display controller on this chip, this
40:07
DC Nano display controller, just showed
40:09
up in Zephyr earlier this year. So you
40:12
end up, you know, poking around a lot
40:15
as things change in order
40:17
to get things working how you like.
40:19
So, you know, there were some, you
40:21
know, minor bugs in terms of managing
40:23
frame buffers and things like that. So
40:26
I haven't, you know, done complete drivers
40:29
really for anything in this system
40:31
yet because NXP did a lot of that work.
40:34
But you have to be willing to dig into
40:37
the details to fix problems.
40:39
I suppose the one unsupported thing currently
40:41
in Zephyr is the GPU.
40:45
The GPU driver, I don't believe, has any
40:47
sort of integration with Zephyr. So that will
40:49
be the next step and that will be fun.
40:52
And what is that? So, you know, I hear GPU
40:54
a lot in that they get repurposed
40:56
for AI crap. But in this case,
40:58
the GPU, is that to
41:01
process the frames that you're kind of sending
41:03
out to the display? How is that actually used in
41:06
a display context?
41:07
So in this SoC, there are kind
41:09
of, well, I would say three
41:12
parts at work in this sort of, well,
41:15
four if you count memory in this typical
41:17
display sort of setup. So
41:19
you have the actual display controller
41:21
itself, which is a block, you
41:24
know, connected to a bus
42:26
in the SoC so it can access memory, it
41:28
can go read the frame buffer from memory, and
41:31
then it formats that, you know, with the
41:33
proper timings, sends it out
41:35
to the MIPI-DSI PHY, which
41:37
then transmits into your display. So
41:39
you have the display controller, its job
41:42
is just to,
41:43
you know, pipe frames out to the display. Of
41:46
course, you have the CPU,
41:48
you know, managing the display controller, setting all
41:50
of this up, you know, potentially touching frames
41:52
in memory and whatnot. There's the
41:54
memory itself. And then over on the side,
41:56
you have the GPU
41:58
and the GPU in this
42:00
is from Vivante, now
42:03
called VeriSilicon. I think they
42:05
were acquired. And it's called-
42:07
More great names, more great names. It's called, well,
42:09
it's one of the GC Nano series.
42:12
I think the full name is GC
42:14
Nano Ultra Light 5 or
42:16
something. Yeah,
42:19
it also has a numerical name, GC,
42:21
and then three numbers, I forget what it is. But
42:24
it's essentially a small 2D
42:26
GPU intended for vector graphics.
42:29
So it has some limited
42:30
2D rasterization capabilities, but
42:35
it's supposed to support OpenVG, I
42:37
think 1.1 or something. I'm
42:40
not totally familiar with the OpenVG
42:42
standards.
42:43
It's meant for those vector graphics applications.
42:46
And so these exist, so it
42:49
has a different brand name, but it does, it exists
42:53
inside the silicon that you're already using, right? Yup,
42:56
yeah. So NXP of course goes to them and
42:58
uses their IP. Yeah,
43:00
on the silicon, yeah. Got it, oh, okay. So
43:03
it's like a partnership type thing or
43:05
whatever. And then eventually NXP will buy them. Maybe.
43:09
As large companies are likely to do, yeah. Yeah,
43:12
I mean, you find that with a lot
43:14
of blocks in these
43:15
SoCs, they're designed by other
43:18
companies other than the SOC maker, and then
43:20
they get dropped in. So like the display
43:22
controller itself is from the same
43:24
family of products. It's called DC Nano.
43:29
I thought DC, I thought it was the name of
43:31
the display driver, is that different?
43:33
Or is that the same thing? Well, DC Nano
43:36
is the name of the IP
43:38
block, which does the display control.
43:41
Then of course, there's a driver, which is the DC
43:43
Nano driver, which
43:46
NXP has put together. Yeah,
43:49
interesting. All right, so then, so 2D GPU
43:51
in this case would be like, would that be to create
43:54
animations that overlay on top? So
43:56
a GPU in that scenario is being
43:58
used to...
43:59
actually process.
44:02
I'm really out of my element here, Aidan. So like, what
44:05
does the GPU do? So in
44:07
this, in this sense, the 2D
44:09
GPU, they intend for doing,
44:12
you know, sharp text rendering as
44:14
vectors, right? So if you want
44:16
to do, you know, text that you can scale to
44:18
any size, have it look clean, you can render
44:20
that really nicely. Of course you can do arbitrary
44:23
paths, you know, filled shapes,
44:26
other sorts of,
44:28
you know, solid-colored UI
44:30
elements with various outlines
44:32
and shapes like that. In terms of raster
44:36
stuff, you know, if you have bitmaps,
44:38
you can transform them, do all sorts
44:40
of rotation and scaling stuff. So
44:43
it's, you know, it's meant for producing the sort
44:45
of GUIs you'd find on a smart watch
44:47
where maybe you have some buttons, some texts,
44:50
some small icons. Okay.
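(For a flavor of that smartwatch-style vector work, a small OpenVG sketch that fills one rounded button; it assumes an EGL/OpenVG context is already current on the GPU, and error handling is omitted:)

```c
#include <VG/openvg.h>
#include <VG/vgu.h>

/* Draw one filled, rounded "button" as resolution-independent vectors. */
void draw_button(VGfloat x, VGfloat y, VGfloat w, VGfloat h)
{
    VGPath path = vgCreatePath(VG_PATH_FORMAT_STANDARD,
                               VG_PATH_DATATYPE_F, 1.0f, 0.0f,
                               0, 0, VG_PATH_CAPABILITY_ALL);
    vguRoundRect(path, x, y, w, h, 8.0f, 8.0f);   /* VGU helper */

    VGPaint paint = vgCreatePaint();
    vgSetParameteri(paint, VG_PAINT_TYPE, VG_PAINT_TYPE_COLOR);
    vgSetColor(paint, 0x3080FFFF);                /* RGBA fill  */
    vgSetPaint(paint, VG_FILL_PATH);

    vgDrawPath(path, VG_FILL_PATH);

    vgDestroyPaint(paint);
    vgDestroyPath(path);
}
```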
44:52
So then how does this all translate to, so
44:54
if it's the 2D GPU, what
44:57
you're going to be pushing
44:58
actual like fully formed frames over
45:00
wifi. So like, what is, what is the, what
45:03
is the connection there? Like, how does, how does that all, how
45:05
would you be able to use a GPU if not in those like
45:08
vectorized ways? So that's
45:10
a good question. And that's to some extent
45:12
still, you know, an open question. How
45:14
do you best utilize this sort
45:16
of on device processing with
45:19
content sent over wifi? So
45:22
that's definitely a question I'm still thinking
45:24
about
45:24
with the transformation
45:27
that you can do on the GPU. Of course,
45:30
you can imagine things like when
45:32
the user rotates their head, you potentially
45:35
want to move the content very quickly, you know,
45:37
transform it so that it appears to
45:40
stay in a certain place. And
45:42
that, you know, those sorts of very
45:44
fast closed loop interactions
45:47
between, you know, motion to
45:49
a
45:50
frame on this, on the display, those
45:52
have to be done on the device. So
45:54
you could, you know, you can certainly use the
45:57
GPU to shuffle things around like that,
45:59
even if it's
45:59
not producing the
46:02
actual sort of rich content. And
46:05
then how you encode or compress
46:08
this content to send it over Wi-Fi is
46:10
another question. And
46:13
some people will do things like just
46:16
simple drawing instructions, like put
46:18
text here, put vectors here,
46:21
draw this path, pipe those
46:23
over Wi-Fi, have the GPU draw it on
46:25
the device.
46:26
Of course, the advantage there is you use
46:28
your nice little GPU. The disadvantage
46:31
is, well, now you have to program in this
46:34
language of your small GPU rather
46:37
than using what you have on your phone.
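(A hypothetical sketch of that "drawing instructions over the network" idea, purely illustrative and not the project's format:)

```c
#include <stdint.h>

/* Hypothetical "display list" packet: the phone sends compact drawing
 * commands; the on-device GPU rasterizes them. */
enum draw_op { OP_CLEAR = 0, OP_RECT = 1, OP_TEXT = 2 };

struct draw_cmd {
    uint8_t  op;          /* one of enum draw_op                   */
    uint16_t x, y, w, h;  /* geometry, in pixels                   */
    uint32_t color;       /* RGBA fill/text color                  */
    uint8_t  len;         /* OP_TEXT: UTF-8 bytes following header */
} __attribute__((packed));
```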
46:39
Got it. Okay. So let me paint a picture for
46:41
listeners then maybe. Okay. So we're
46:43
watching an episode of Rick and Morty on
46:46
my computer and I want to push it to my display.
46:48
What are you calling this thing, by the way?
46:50
Oh, it doesn't really have a name yet. I mean, have
46:53
you thought of Viewtrix? Oh, it's taken. All
46:55
right. Yeah.
46:59
I just have like random numerical
47:01
code names for my boards to keep track
47:04
of them, but I don't have any flashy marketing
47:06
style names yet. All right. We can
47:08
do a contest as part of the amp hour
47:10
episode if you want. And it would probably be called,
47:13
like, ViewFace or something like that. Right. Yep.
47:16
Yep. That's how it all goes. Okay.
47:19
Well, whatever this thing is going to be called.
47:20
So I'm watching a Rick and Morty
47:22
episode on the computer. I'm like, all right, I want to switch it to
47:24
the eyepiece. Now
47:27
something on the computer
47:28
packages up an individual frame. So if it's,
47:30
like, watching at 30 frames per second,
47:32
frame one out of 30 in a second. It
47:35
packages that up, sends it over Wi-Fi. The
47:37
device gets it and then it can like transform
47:40
it. Almost like just a
47:42
flip book of JPEGs. Is
47:46
it coming through how little I know about displays? I hope
47:48
it is. In
47:50
reality, compressing video is a lot more
47:52
complex than that because you want to be able to handle similarities
47:56
between frames and take advantage of those
47:58
to reduce the information
47:59
you have to transfer. So that's already
48:02
all handled by very
48:04
smart people designing codecs. You
48:04
know, with H.264 and H.265,
48:10
AV1, all the fancy codecs, which I know
48:12
really nothing about how they work.
48:17
So those are very impressive
48:19
at what they accomplish.
48:21
In this particular piece of hardware I have,
48:24
I don't have a hardware decoder for those
48:26
codecs. So decoding
48:28
them in software is not feasible
48:31
on a microcontroller. So my
48:34
goal essentially
48:36
would be to come up with some other lightweight
48:39
forms of compression for simple
48:41
graphical content so you can still pipe
48:43
that over Wi-Fi. So you wouldn't
48:46
expect to be doing full
48:49
complex videos with very...
48:51
Okay.
48:52
So watching Rick and Morty on my full display
48:55
might not be the right fit for this design.
48:58
Watching movies is not really
49:00
the intended application. The application would
49:02
be graphical user interfaces
49:04
and such where you may be able to compress them
49:07
more effectively. Taking
49:09
advantage of the fact they have these structured elements.
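(One plausible example of such a lightweight scheme, since flat-shaded UI content is mostly long runs of identical pixels: run-length encoding a scanline of RGB565 pixels. Illustrative only, not the actual codec here:)

```c
#include <stdint.h>
#include <stddef.h>

/* Run-length encode a scanline of RGB565 pixels. `out` must hold up
 * to 3 * n bytes (worst case: no runs at all). */
size_t rle_encode(const uint16_t *px, size_t n, uint8_t *out)
{
    size_t o = 0;

    for (size_t i = 0; i < n; ) {
        size_t run = 1;
        while (i + run < n && px[i + run] == px[i] && run < 255)
            run++;

        out[o++] = (uint8_t)run;             /* run length */
        out[o++] = (uint8_t)(px[i] & 0xFF);  /* pixel, LSB */
        out[o++] = (uint8_t)(px[i] >> 8);    /* pixel, MSB */
        i += run;
    }
    return o;  /* compressed size in bytes */
}
```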
49:12
Okay. So more like the heads up display
49:15
that Tony Stark uses in the Iron Man film, something like that
49:17
where it's like a graphical thing
49:19
that's a part of the screen is not sending everything,
49:22
that sort of idea? Right. Yeah. I'm
49:25
not a big movie person, but yeah, doing
49:27
sort of assistive user
49:29
interface overlays. That's really... Right.
49:31
That's the goal.
49:33
Right. That's what I was doing instead of
49:35
doing this kind of design in my free time. I watch movies
49:38
instead of this. I can imagine
49:40
the stuff you're doing takes a lot of time though. That's it's very impressive.
49:42
Yeah, it does. Yeah.
49:44
Okay, cool. So
49:46
in that case though, so what is actually sending...
49:50
Is there like a helper program that would be on the computer or
49:52
the phone? Is that the idea? Right. Yeah.
49:54
So you need a helper program like that.
49:57
And I've been working on some things pretty
49:59
similar to that. And of course, the disadvantage
50:02
there is, you know, you consume some extra power on
50:05
whatever device is transmitting this
50:07
data, you know, so you're saving
50:09
power on the actual wearable device,
50:11
you know, by doing this concept
50:13
of transmission over Wi-Fi, but
50:15
then, you know, your phone's battery life is going to have
50:18
to take a bit of a hit. But these are the
50:20
trade offs we have to deal with, you know, trying
50:22
to make any progress in this area. So
50:25
then do you create like, either
50:28
on past versions or revisions of this
50:31
most recent version, do you create like a
50:33
simulator to
50:35
kind of, I don't even know what it would look like, honestly.
50:38
Like, how do you prototype stuff before you actually
50:40
send it down to the device?
50:42
You mean prototype this sort
50:44
of processing?
50:45
Like what a single frame would be. So if you had like
50:47
a HUD style kind of
50:50
like a heads up display type frame that you're going to send
50:52
over this link to this device,
50:55
how do you know what it's going to look like? How
50:57
do you code that up, I guess, on the
50:59
computer side or on the phone side, and then
51:01
kind
51:01
of visualize what it's going to look like when you view
51:04
the world? Do you build a simulator first? Ah,
51:06
so like, I don't have a simulator of that sort yet.
51:09
The, you know, the optics on
51:11
the device are definitely a big challenge
51:13
in
51:14
making things look nice, determining how
51:16
things are going to look. And that's
51:19
really still a big area of exploration for
51:21
me, because I'm not a big optics
51:23
design person. Currently,
51:26
you know, the display in this device is oriented,
51:29
you know, as a typical 16:9 ratio
51:32
landscape display. So
51:35
if you have content that, you know, fits that
51:37
sort of form factor, of
51:39
course, you could also put it in portrait orientation,
51:42
you could expect it to look, you
51:44
know, reasonable, just prototyping
51:46
it on your computer screen. But in terms
51:48
of like the actual usability
51:51
of this, you know, in the real world, you
51:53
have to, you know, certainly wear the device and
51:56
try it.
51:58
And that also comes into
51:59
the mechanical design of it, you know, where
52:02
the display is placed, whether it's comfortable
52:04
to wear, right? There are all these
52:06
problems. Right. Yeah.
52:09
That's definitely interesting challenges. I mean, like, I guess one,
52:12
one use case that I've seen a similar thing
52:14
in the past, and I mentioned before the show is Zach
52:16
Friedman from Voidstar Labs. Like
52:19
he has a display that he created just mostly
52:21
just for, I think, a teleprompter and
52:24
similar kind of thing though. Just like
52:26
an overlay that he then kind of scrolls through
52:29
content, that sort of idea. But
52:31
it's known content because it's just written and he's kind
52:33
of looking through that written content. Right.
52:38
And I think he also brought this up before the show. The
52:40
big questions, you know, turn out to
52:42
be where should the display be
52:44
placed in your vision? You know, do you need
52:46
the display to be sort of
52:48
positioned so that
52:50
it's, you know, sort of front and center? Can you
52:52
put it in the corner of your vision? It all depends
52:55
on, I suppose, the sort of use cases
52:57
that you could imagine for this. And
52:59
that does get into these sort of considerations
53:02
with the display flex, like you mentioned earlier.
53:05
Oh yeah. How
53:08
you lay out this sort of thing into
53:10
a physical device,
53:11
especially if you're trying to, you know, 3D print it, how do
53:14
you assemble this sort of thing? Right.
53:16
Right. So how did you, what was your process
53:19
when you were kind of figuring out the flex circuits because you were
53:21
kind of getting this all put together?
53:23
I think I've probably spent
53:25
too much time on this to some extent,
53:27
because when I get into
53:30
CAD and start doing 3D models
53:33
for the enclosure of this device,
53:35
it's difficult to decide.
53:39
I mean, it's very difficult to
53:41
decide where to put things because you don't
53:43
have, at least not yet on the computer,
53:45
a good simulation of what your
53:48
actual eye would see as a human. You
53:50
have to actually print the thing, assemble
53:52
it. Things are squishy. Yeah. Especially
53:55
eyes. So the process ends up being,
53:57
you know, build
53:58
something
53:59
in CAD around a 3D model
54:02
of a human head, you know, positioning things
54:04
the best you can,
54:06
print it, assemble it, and then just, you
54:08
know, I have an entire bin
54:10
of, you know, failed revisions,
54:13
which all look the same, but are like millimeters
54:15
different in various areas, you
54:17
know, in order to try to
54:20
tweak this so that it, you know, so the
54:22
display is positioned reasonably. So
54:25
in terms of planning,
54:27
when I do the board design, I really,
54:30
I do have an idea in mind
54:32
of where the display should go, how the
54:35
flex is going to need to be folded to put the display
54:37
in that correct orientation.
54:40
But I'm also trying to keep it as
54:43
open-ended as possible so that I could
54:45
change plans and try different things
54:47
without, you know, re-spinning the board. Yeah.
54:50
So like this most recent board
54:52
I did, it kind of has the display
54:54
connector on the end so that
54:56
the flex goes, well,
54:58
this board is sort of long and narrow, 58 by 10 millimeters.
55:02
And then the flex, you know, leaves the end
55:05
perpendicularly to the length of the
55:07
board. So if you put the board sort
55:09
of horizontally across the top
55:11
of the front of your glasses, sort of above
55:14
one eye where your eyebrow is, then
55:16
the display flex can come down nicely
55:19
and put the display just beside your nose.
55:22
So that's what my current prototype is doing.
55:25
But then you can also put the board
55:27
lengthwise on the side of the glasses
55:29
in the sort of frame by your temple
55:32
and then fold the flex behind
55:34
the board, make a 90 degree turn in the
55:36
flex by folding it and then have
55:39
the display come forward kind of where Google
55:41
Glass used to have it. Right.
55:43
Yep. Yeah. And
55:46
then they had a prism as well. You'd need one on yours as well to
55:48
get, like, the shooting-into-your-eye kind of thing.
55:51
So that's an interesting problem.
55:53
Because if you do want to have a see-through,
55:56
you know, transparent display
55:58
where the display
55:59
is partially
56:02
artificial content and partially the real world
56:04
that you can see through, you do need some sort
56:06
of combiner. So people do prisms.
56:08
They do birdbath-style combiners,
56:10
like the Nreal devices.
56:13
Now Nreal is called Xreal. And then
56:15
there are people who do diffractive waveguides,
56:17
reflective waveguides.
56:19
And all of these have advantages
56:21
and disadvantages.
56:23
Of course, the combiners, like
56:25
waveguide-style things, diffractive and reflective
56:28
are very slim and nice-looking
56:30
in terms of replicating a real
56:33
lens that you'd find in a typical pair of
56:35
glasses. The disadvantage
56:37
is, well, currently they're expensive and hard
56:39
for me to get my hands on to prototype.
56:42
And their optical efficiency
56:45
is pretty terrible. So you lose a ton
56:47
of energy by
56:48
feeding more light into the system
56:50
than you will get out of the system. Right.
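As a rough, back-of-the-envelope illustration of that brightness budget (the numbers here are assumed, not from the episode): if a waveguide only passes about 1% of the light and you want 500 nits at the eye, the panel has to emit enormous amounts of light.

```python
# Illustrative optical budget for a waveguide combiner; numbers are assumptions.
target_eye_nits = 500          # assumed brightness needed at the eye
waveguide_efficiency = 0.01    # assumed ~1% optical throughput

required_panel_nits = target_eye_nits / waveguide_efficiency
print(f"Panel must emit ~{required_panel_nits:,.0f} nits")
# ~50,000 nits, far beyond typical OLED-on-silicon panels, which is why
# waveguide systems tend to use LCoS or DLP/laser light engines instead.
```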
56:53
So then the battery suffers
56:55
and? Exactly. Exactly. So
56:58
most of those waveguide systems
57:00
you find, they will not be using OLED
57:03
microdisplays
57:04
because this OLED on silicon technology
57:07
is not really currently bright enough for
57:09
the requirements of those waveguides. So you'll
57:11
see people doing DLP things with
57:14
essentially tiny projectors in your
57:17
glasses frames or
57:19
liquid crystal on silicon with bright white
57:22
LEDs as the light source. Sci-fi
57:24
has told me that I was supposed to be getting lasers that
57:26
directly draw on the back of my retinas. And I
57:28
have not yet gotten that one. Oh. Not
57:31
sure I actually wanted, but that's what sci-fi told
57:33
me I would be getting. Right. And there
57:35
are people who do laser beam scanning too.
57:37
No way, really. Where they'll
57:39
have red, green, and blue lasers,
57:42
combine those, then have a MEMS
57:44
mirror, sometimes
57:46
one mirror scanned in two axes
57:48
in a raster pattern or two mirrors.
57:51
And then you just get persistence of vision you
57:53
just depend on that to actually maintain
57:55
the image. Right. So at least
57:58
one of the HoloLens designs used laser beam scanning.
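To make the scanning idea concrete, here is a tiny sketch, with assumed frequencies, of where a laser-beam-scanning display points its beam over time: the fast axis is a resonant mirror moving sinusoidally, the slow axis is a sawtooth ramp, and persistence of vision fuses the moving dot into an image.

```python
# Illustrative laser-beam-scanning trajectory; frequencies are assumptions.
import math

FAST_HZ = 20_000                # assumed resonant MEMS mirror frequency
FRAME_HZ = 60                   # assumed frame (slow-axis) rate

for i in range(8):              # a few sample points along the scan
    t = i / (FAST_HZ * 4)
    x = math.sin(2 * math.pi * FAST_HZ * t)   # fast axis, -1..1
    y = 2 * ((t * FRAME_HZ) % 1.0) - 1        # slow-axis sawtooth, -1..1
    print(f"t={t * 1e6:7.1f} us  x={x:+.3f}  y={y:+.3f}")
```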
59:59
There are trade-off decisions in a lot of these things. And
1:00:02
just when you start to really look
1:00:04
at
1:00:04
a wide variety of faces, just, you
1:00:06
know, Google 'human face' on the internet and like,
1:00:09
just like how much differentiation there
1:00:11
is with pupillary distance and just
1:00:13
head size and all these, all these factors,
1:00:16
because people just look different. And like that actually
1:00:18
really impacts how your experience
1:00:20
would be too, which sucks. Right.
1:00:23
And I suppose with the monocular
1:00:25
display, I'm trying also to
1:00:27
avoid some of the issues with the inter pupillary
1:00:30
distance adjustment that other headsets
1:00:32
have to deal with. If you have
1:00:33
two displays, it's
1:00:36
definitely interesting to try sort of wild
1:00:38
things like using your brain as the combiner
1:00:41
between the two eyes. Like I'm trying
1:00:43
here, just
1:00:44
some things like that, you
1:00:45
know, perhaps they could be useful. It depends
1:00:48
in my most recent design, it
1:00:50
really depends whether the entire
1:00:53
optics setup can be small enough that
1:00:55
it doesn't look too bulky being beside
1:00:57
your nose. Cause if it, you know, if it becomes too
1:00:59
bulky, it starts to come out in front of your eye,
1:01:02
which you don't want. You know, you still
1:01:04
want your eyes to be clearly visible from
1:01:06
the front. So if someone's talking
1:01:09
to you, it doesn't look like there's something.
1:01:11
Are you sure you don't want to just project
1:01:13
a copy of someone's eyes onto the front of your device?
1:01:15
Creepy digital eyes.
1:01:18
I don't know what the hell Apple was thinking with that.
1:01:20
Like it looks, even in the promo video where
1:01:22
they like could make it look the best it possibly
1:01:24
could, that lady looked weird.
1:01:26
It just, you know, like my brain was just like,
1:01:29
that doesn't look natural to me. I mean,
1:01:31
I suppose, I suppose if you want to have,
1:01:34
you
1:01:34
know, the chance to do fully immersive
1:01:37
stuff with, you know, high resolution
1:01:39
displays in VR, you have no choice,
1:01:41
right? If you want to be able to see the person's
1:01:43
eyes, you have to do that. I
1:01:46
mean, that's the other thing too, is like all of these things, they just
1:01:48
look so dumb. I'm not, I'm not
1:01:50
saying yours in general, but I'm just like,
1:01:52
like the Apple thing, it's very nicely
1:01:55
done for what it was, but like HoloLens
1:01:57
looks so, so dumb. And
1:01:59
the,
1:01:59
Magic Leap looks so, so dumb.
1:02:02
And honestly, Google Glass looks so, so dumb. And
1:02:04
that was nicely designed too, I thought, you know, but all
1:02:06
of these things, it's just like you mess with the view
1:02:08
of what a human normally looks like. And it's like, wow, that,
1:02:11
that doesn't look right. So it's starting
1:02:13
from a bad place. Right. Which was, you know,
1:02:15
part of the motivation to just get, you
1:02:17
know, if you can get the electronics really, really tiny
1:02:20
and the display really, really tiny, you know,
1:02:22
then maybe, you know, it can fit in something
1:02:24
that looks like a normal pair of glasses, but
1:02:26
that's always, you know, the big challenge.
1:02:30
I suppose I find at a lot of events,
1:02:32
even
1:02:33
industry conferences in this area, like
1:02:35
I was at the Augmented World Expo
1:02:38
at the end of May,
1:02:40
you don't see people wearing these sorts of devices
1:02:42
around, you know, even though you're
1:02:45
all the people interested in this technology,
1:02:47
you know, no one is wearing anything around
1:02:50
and I suppose that's a good indication of the
1:02:52
status of that industry.
1:02:55
Yeah. That's a really
1:02:57
good measure. Yeah. Yeah. So
1:02:59
do you wear these at home or, or do you just
1:03:02
not want to wear them because you don't want to look stupid
1:03:04
in public? Actually
1:03:06
yesterday I was wearing them,
1:03:09
you know, out at a sort of social event,
1:03:12
pretty much the entire time for several hours.
1:03:14
So I'm trying to wear them
1:03:16
around to get a good idea of, you know,
1:03:18
dogfooding it, the comfort aspects of it. Yeah.
1:03:23
In terms of actual applications, you know, obviously
1:03:25
I spend so much time on hardware
1:03:27
and then I have to be like, okay, now I have to switch
1:03:29
into software mode, right? And I want to put some
1:03:31
actual good applications for it, you know,
1:03:34
or, or
1:03:34
get some other people to do it, ship them
1:03:36
some hardware, maybe they might have to start
1:03:38
doing that. Yeah. Well,
1:03:40
let's, let's, uh, let's start to wrap up
1:03:43
on that. I mean, what is your, what's your vision for the future of this stuff?
1:03:45
I mean, this is super awesome so far.
1:03:47
I, I don't know if I've complimented it enough yet. I'm
1:03:49
very, very impressed. Thank you.
1:03:52
Yeah. The vision for the future is
1:03:54
essentially to, well, at this
1:03:56
point, I think the hardware prototyping
1:03:59
is going to be on a bit of a pause for the
1:04:01
rest of this year while I do software
1:04:03
development for this. Yeah. You know, microcontroller
1:04:06
platform, cause I'm pretty happy with the status
1:04:08
of that and its capabilities.
1:04:11
Long-term, you know, I, I'm
1:04:13
definitely going to try to continue working
1:04:15
on this in some form. It's, it's
1:04:17
always an exciting sort of hobby project
1:04:19
to pursue. I've been quite happy
1:04:22
so far because in the, the original
1:04:25
goal for it was really to learn about this
1:04:27
sort of microelectronics design, you
1:04:30
know, when I started college, I really, I hadn't
1:04:32
even done like assembly of a BGA
1:04:35
or really anything beyond through-hole, simple
1:04:37
through-hole boards.
1:04:39
So it was pretty nice to just, you know, spend
1:04:41
a few years and now feel confident that
1:04:43
if I want to design a board, you know, I can
1:04:46
do it. I can, you know, have a set
1:04:48
of requirements and fulfill them and bring
1:04:50
up something that actually runs software
1:04:52
and works.
1:04:53
In some sense, the, you know, the purpose of the project
1:04:56
is fulfilled, but you know, the long-term
1:04:59
purpose I suppose would be to find
1:05:01
applications for it. Maybe I open
1:05:03
source some of this stuff at some point. Everyone,
1:05:06
you know, always asks me if I've considered doing a
1:05:08
startup related to this technology and of course
1:05:10
that's also a possibility. Yeah.
1:05:12
The downside to that is that if you, if you started, then
1:05:14
that means you got to run it and that's not the fun
1:05:16
part. Yeah. For
1:05:18
me, the fun part is the, is the engineering
1:05:21
and solving problems. So I'll always
1:05:23
be chasing that. Yeah. Yeah, definitely.
1:05:25
No, that's a, that's great. I, I, hopefully
1:05:28
people hear this, they can reach out and watch
1:05:30
the video and, you know, see if there's
1:05:32
other people interested. Yeah. It's, it's definitely
1:05:34
a great hobby,
1:05:36
hobby project and maybe, and maybe more. I'm
1:05:38
sure. Maybe I could
1:05:40
ask this, I mean, if you don't mind me asking:
1:05:42
if you're interviewing or have interviewed, did you
1:05:45
bring this
1:05:45
to interviews with you?