Episode Transcript
0:02
This is MIT Technology
0:04
Review.
0:11
From the cornfields of Iowa
0:13
to outer space, scientists are
0:15
building a world where plants and machines
0:17
communicate their needs with us and
0:20
each other in real time.
0:21
Are you ready? I'm
0:24
ready. Okay.
0:28
We're inside an experimental tractor on a test
0:30
farm that belongs to John Deere. So
0:33
we're sitting in an 8RX tractor, 410 horsepower.
0:38
This machine we're in is loaded with
0:40
tech. It has self-driving capability.
0:42
Joe Leifer is the senior product
0:45
manager of Autonomy. This machine
0:47
also has the capability to run full infield
0:49
autonomy. Just through the click of a
0:52
button here on this lever, right, I can change
0:54
ultimately the ground speed that we're running at.
0:56
It can run in straight rows or curved
0:58
ones, stay within the boundaries of a field,
1:01
and knows where it should place seeds. So
1:03
you've got a fair number of sensors,
1:05
cameras. Absolutely, yeah. It's
1:07
equipped with six stereo camera pairs, three
1:10
on the front of the tractor, three around the back of the
1:12
cab. We have two NVIDIA
1:14
Jetson GPUs that are classifying
1:17
the world around us as we run in full autonomy
1:19
mode. So we just came to a stop,
1:21
and I'm going to power down the tractor here now.
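To make that concrete: the stereo pairs supply depth, the onboard GPUs classify what the cameras see, and the autonomy software decides whether it is safe to keep moving. Below is a minimal Python sketch of that kind of decision loop; the labels, threshold, and function names are hypothetical, not Deere's actual stack.

```python
# Simplified sketch of a camera-based stop decision (hypothetical).
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "person", "vehicle", "crop"
    distance_m: float  # estimated from stereo disparity

STOP_LABELS = {"person", "vehicle"}  # safety-relevant classes (assumed)
STOP_DISTANCE_M = 15.0               # hypothetical stop radius

def should_stop(detections: list[Detection]) -> bool:
    """Halt if any safety-relevant object is inside the stop radius."""
    return any(d.label in STOP_LABELS and d.distance_m < STOP_DISTANCE_M
               for d in detections)

frame = [Detection("crop", 4.0), Detection("person", 12.5)]
print(should_stop(frame))  # True -> stop and alert the operator
```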
1:29
In some ways, AI
1:30
has arrived at scale on the farm. A
1:33
majority of farmers now use sensors and
1:35
digital technologies that collect data,
1:37
and most tools that analyze that data also
1:40
use AI. But it's still early
1:42
days for other forms, including machine
1:45
vision.
1:46
I imagine a world within the next 10
1:48
years where all of these vehicles
1:51
literally have eyes and maybe
1:53
ears. Those eyes and ears,
1:55
those cameras, those sensors
1:58
are powered by machine learning.
3:57
years.
4:01
What it's looking for is whether the machine is
4:03
being a little bit too rough on the inside and
4:05
damaging the kernels. And if it is,
4:07
it begins to make adjustments to its mechanical
4:09
systems. And that's what you'll see back
4:11
here. All these mechanisms are meant to just
4:14
distribute all of that residue
4:16
in an even fashion because another
4:18
job of the combine is to get the ground ready
4:21
for next year. So by evenly distributing
4:23
all the residue, that becomes essentially
4:26
fertilizer for next year's crop.
4:28
Now, all of this is done without the vehicle
4:31
ever stopping. And then this big tube,
4:33
this big auger above us, folds
4:35
out basically 90 degrees
4:38
perpendicular to the vehicle. And
4:40
then a tractor pulls up next to it with
4:42
another wagon. And then the grain
4:44
that is stored here is transported from
4:47
the machine through that auger to another vehicle
4:49
pulling a wagon. Looks like
4:51
cameras or sensors up on the arm as
4:53
well. Oh yes.
4:54
One of the pain points of farmers in the past
4:56
was from the cab of this vehicle,
4:59
from the operator station of this vehicle, it's
5:01
sometimes hard to see because the
5:03
tractor may be far away and it's dusty. So
5:05
we've built a machine vision system that helps
5:08
aim
5:09
where that auger dumps the corn.
5:12
So shall we hop on? Please. Let's
5:14
do it. All right. But it's more
5:17
of a climb than a hop. This machine towers over the both
5:19
of us, and the ladder feels at least
5:21
a few stories tall. And
5:24
why don't I let you jump in the driver's seat?
5:26
Ooh. Thank you.
5:29
We are sitting now in the cab,
5:31
the operator station. This vehicle
5:34
has 17,000 parts and all of those 17,000
5:36
parts have to work in synchrony
5:38
to essentially
5:42
create what we refer to as a factory on wheels.
5:44
Up inside the combine, I'm able to get a
5:47
better sense of how it works. He
5:49
describes how AI talks to mechanical
5:51
systems in real time to make
5:53
sure the harvest, in this case corn kernels,
5:56
isn't damaged. It's also recording
5:58
the crop's quality and yield,
6:00
and it provides farmers with a report card,
6:02
so to speak, called a yield map that
6:05
allows them to assess their performance.
6:07
In the 90s, we talked about one meter
6:09
accuracy. In the 2000s, we would talk
6:12
about a few inches. We
6:14
are now at a point where the accuracy of
6:17
our GPS solutions is
6:19
less than one inch, 99% of the time.
6:22
Which matters when you're placing
6:24
something as small as a seed. It matters a
6:26
ton. And actually, if you put
6:29
two seeds
6:30
next to each other, but you
6:32
put them too close to each other, they
6:35
begin to compete for nutrients right away.
6:38
If you space them apart just enough,
6:40
then each one kind of gets their own little ground
6:42
and they grow the healthiest.
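The spacing arithmetic behind that is simple enough to sketch: a planter converts a target population and row width into an in-row gap between seeds. A rough Python illustration with hypothetical numbers, not Deere's software:

```python
# Back-of-the-envelope seed spacing: how far apart seeds must land
# in the row to hit a target plant population. Numbers hypothetical.

SQ_IN_PER_ACRE = 6_272_640  # 43,560 sq ft/acre * 144 sq in/sq ft

def in_row_spacing(seeds_per_acre: float, row_width_in: float) -> float:
    """In-row spacing (inches) between seeds for a target population."""
    area_per_seed = SQ_IN_PER_ACRE / seeds_per_acre  # sq in per seed
    return area_per_seed / row_width_in

# e.g. 34,000 seeds/acre in 30-inch corn rows -> about 6.1 inches
print(f"{in_row_spacing(34_000, 30):.1f} in between seeds")
```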
6:47
A
6:47
common misperception of farmers is that
6:49
they reject new technology or they're
6:52
late adopters. But he says that's far
6:54
from true. After all, farmers
6:56
accepted self-driving technology much
6:58
earlier than the rest of us. But they
7:00
do have what he calls a show-me culture.
7:03
They need to know these tools work and will
7:05
work for them on their land and
7:07
for their crops.
7:10
Let's say there's a farmer that farms
7:13
a thousand acres just to make it a round number. What
7:16
you'll see farmers do is test
7:19
out technologies in small portions
7:21
of their farms. So they'll say, hey, I'm gonna take this small
7:24
field that I have and try this
7:26
out for this year and see if it scales. They
7:28
don't just do that with equipment. They do that also
7:30
with seeds and with chemicals and
7:33
even new agronomic practices. So
7:35
they wanna understand how
7:37
a new practice or a new technology would scale
7:39
across their operation. He
7:42
believes AI on the farm through
7:44
the lens of data analytics has arrived
7:46
at scale. But
7:48
we're still just getting started with machine
7:50
vision and other forms of AI.
7:53
Right now, the machine
7:56
reactively makes decisions based
7:59
on what it ingests,
7:59
and it optimizes itself based
8:02
on what's inside of it. As we look toward the
8:04
future, with more machine vision capabilities,
8:07
the plan is to begin to look ahead of the machine
8:09
with cameras and be able to predict
8:11
what's ahead of it. So you can actually,
8:14
for example, measure the height of the
8:16
crop ahead of the machine, and the machine
8:18
can almost begin to configure itself
8:21
proactively.
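In control terms, this is a shift from feedback to feed-forward: instead of adjusting after the crop is already inside the machine, look-ahead measurements pre-set it. A minimal sketch of that idea follows; the settings and thresholds are made up for illustration.

```python
# Hypothetical feed-forward configuration from look-ahead crop height.
from dataclasses import dataclass

@dataclass
class CombineSettings:
    header_height_in: float
    ground_speed_mph: float

def proactive_settings(crop_height_in: float) -> CombineSettings:
    """Pick settings from crop height measured ahead of the machine."""
    if crop_height_in > 90:    # tall, heavy crop: raise header, slow down
        return CombineSettings(header_height_in=30, ground_speed_mph=3.5)
    elif crop_height_in > 60:
        return CombineSettings(header_height_in=24, ground_speed_mph=4.5)
    else:                      # short or thin crop: speed up
        return CombineSettings(header_height_in=18, ground_speed_mph=5.5)

# cameras (or a satellite pass) report 84-inch corn ahead
print(proactive_settings(84))  # machine configures itself before arrival
```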
8:22
Another way to do this is through satellites.
8:25
That information
8:28
can be then transferred to the machine via
8:30
the operations center, and now the machine
8:32
has knowledge of what that satellite
8:34
saw, and again, can begin to proactively
8:37
configure itself before it actually
8:39
gets to the crop. We believe that
8:42
persistent and reliable connectivity
8:44
is critical to agriculture. It's not just moving
8:46
data off the vehicle, it allows them to
8:48
have access to data that could be moved
8:51
from other places into the vehicle so that they
8:53
become more intelligent in real time.
8:54
Right, so that if you have multiple pieces of equipment,
8:57
they can essentially talk to each other. Exactly, exactly.
9:03
After the break, plants that have been genetically
9:06
modified to communicate their needs by
9:08
giving off new forms of light that can be
9:10
seen from outer space. You
9:15
can find links to our reporting in the show notes,
9:17
and you can support our journalism by going to
9:19
techreview.com slash subscribe.
9:23
We'll be back right after this. We're
9:30
mining soils at a rate
9:31
that's never been seen before. Those soils took thousands of years to develop, and
9:34
they might not be usable in 30 years,
9:37
and
9:37
people do not understand that. So we gotta do something, and
9:40
it's likely technology and the win-win that's
9:42
gonna save it, right? You have to make farmers more productive,
9:45
more profitable, more sustainable. You
9:48
can't take anything from the soil and
9:50
just throw it away. So we have to do something about it. develop
10:00
essentially complicated solutions that give them an
10:02
easy workaround. I'm
10:05
Shely Aharonov, I'm the CEO and founder
10:07
of InnerPlant. We
10:10
create crops that can communicate what they need. So
10:12
let's start at the beginning, actually. We use $250
10:15
billion worth of chemicals every year, fertilizers,
10:18
pesticides. At least 30 percent of those
10:20
are misapplied or overapplied. At
10:22
the same time, we lose 20 percent of potential yields
10:25
to pathogens. So we have an inefficient system.
10:27
And the reason it happens is because farming is really
10:29
large. Right. An average farm is probably
10:31
the size of San Francisco. And
10:34
it's impossible to find problems at the right time. So
10:36
what farmers do is they hedge their bets, they
10:38
apply chemicals in advance on entire fields, and
10:41
the data clearly shows that that doesn't give you better
10:43
results. It just gives you more chemicals. So
10:46
what we do is we enable crops
10:48
to tell the farmers what they need.
10:50
This is achieved by genetically engineering
10:53
plants to give clear signals about what's
10:55
wrong with them. So it starts with the plant
10:57
itself. Plants already
10:59
react natively to stress. So
11:02
people think that plants are just stuck
11:04
and do nothing. They're super active in their environment
11:07
because they're stuck and they can't move. So,
11:09
for example, if a plant is eaten by insects,
11:12
it will actually start producing a new compound in
11:14
its leaves to make it taste bad. Or if
11:16
it doesn't have enough nitrogen, it's going to start mobilizing
11:18
its roots to be able to capture more nitrogen
11:21
from the soil. All these things happen
11:23
very early on and they're different reactions.
11:26
And we know them very well
11:27
because people have been studying them for a long time. But
11:29
they're all on the biological level, so you can't see
11:32
them. What we do is we code the plants
11:34
so that as they're reacting to that native stress, they're going
11:36
to start producing a new protein that signals
11:39
optically from the leaves.
11:40
There is a problem. I'm attacked by insects. I
11:42
have fungal pressure. I need more water. And
11:45
then we use sunlight, optical
11:48
equipment and algorithms to be able to
11:50
see those signals anywhere from satellite imagery
12:52
down to, you know, a tractor sensor
11:54
in the field.
11:55
And this is how it works. What
11:57
we do is we go in, take
12:00
that piece of DNA and
12:02
we add something to it. We add the fluorescent
12:04
protein. And then what happens is that
12:07
when the plant starts activating that piece of DNA,
12:09
it's going to start producing the new fluorescent protein
12:11
that we taught them how to make. So that's really it. And
12:14
then it's embedded in the plants. So in
12:16
the next generation, they already have this line
12:18
of code. And they just know that when
12:21
they're responding to stress, they're going to start producing this
12:23
new protein. And
12:25
also, when the stress goes away, the protein
12:27
stops,
12:28
which means you have a signal that's on and off and
12:30
you know when problems are resolved.
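Because the engineered protein is only produced while the stress response is active, the readout reduces to a simple state signal. A toy Python illustration, with an invented baseline and threshold:

```python
# Toy on/off readout: fluorescence above a threshold means the stress
# response is active; dropping back below it means the problem is
# resolved. All values here are invented for illustration.

THRESHOLD = 1.5  # hypothetical cutoff above a healthy plant's baseline

readings = [1.0, 1.1, 1.8, 2.4, 2.1, 1.3, 1.0]  # one reading per day

for day, value in enumerate(readings):
    state = "ON (stressed)" if value > THRESHOLD else "off"
    print(f"day {day}: fluorescence {value:.1f} -> signal {state}")
```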
12:32
What she's talking about is called solar-induced
12:35
fluorescence. It was discovered about
12:37
two centuries ago by a Scottish preacher
12:40
who figured out that when sunlight hits a green
12:42
alcoholic extract of laurel leaves, it
12:45
brings out a bright red light.
12:47
It's part of the photosynthesis process, but
12:50
not something the human eye can usually
12:52
see without help.
12:53
And the reason that's important is because if you just need
12:55
the sunlight, then you can do it from satellite
12:58
imagery. If you need another light source,
13:00
then you're going to have to be closer to the ground
13:02
and it's not scalable. And if you want to cover
13:04
oceans and the entire globe and large agricultural
13:08
land, you need that.
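For context, the standard way to pull fluorescence out of reflected sunlight is to exploit dark absorption lines in the solar spectrum: the plant's own glow partially fills those lines in. A sketch of the classic Fraunhofer Line Discrimination (FLD) estimate, with made-up readings:

```python
# FLD estimate of fluorescence radiance:
#   F = (E_out * L_in - E_in * L_out) / (E_out - E_in)
# where E is solar irradiance and L is the radiance measured off the
# canopy, inside (_in) and just outside (_out) an absorption line.

def fld_fluorescence(E_in: float, E_out: float,
                     L_in: float, L_out: float) -> float:
    """Fluorescence from in-line vs. out-of-line irradiance/radiance."""
    return (E_out * L_in - E_in * L_out) / (E_out - E_in)

# hypothetical readings around the O2-A band (~760 nm), a band
# commonly used for fluorescence retrieval
print(fld_fluorescence(E_in=0.2, E_out=1.0, L_in=0.11, L_out=0.45))
```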
13:09
Before she got into the agricultural business,
13:12
she was in the food business.
13:13
An Israeli-style hummus brand in the Bay Area. And
13:16
at the peak of that, we were selling in about 300 stores. And
13:19
then I had my first daughter, and I thought,
13:21
I want to do more with life. I want to
13:24
make an impact. I want to do something that matters. I
13:26
really want to work with agriculture. I love food,
13:28
but I wanted to go back to where things start and
13:31
where a lot of impact could be done.
13:34
Do you want to see the demo? What you'll see,
13:36
if you put on the glasses.
13:40
And I'm putting on the
13:42
headphones. Feels like sunglasses
13:44
I'm putting on here. Oh, you kind of are. So
13:47
basically, what you'll see is we have two kind
13:49
of plants here. These are the regular plants,
13:51
and these are the sensor plants. Oh, wow. Exactly.
13:54
So they really glow. And here, if
13:56
you think about it, the laser and
13:58
the color.
15:38
This
16:00
one has the problem and I can treat just
16:02
in the places where I have the problem
16:05
and skip the guys that
16:07
do not.
16:08
First, he needs to understand the
16:11
way a plant responds to problems. One
16:13
way to do this is by grinding up infected
16:15
plants and comparing their gene expression
16:17
levels to healthy plants,
16:19
as the stressed plants will have turned on
16:21
a bunch of genes that help combat the
16:23
problem.
16:24
And once they know what those genes are,
16:27
they can add their color signal onto that process.
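A simplified version of that comparison can be written down directly: rank genes by how strongly their expression jumps in infected plants relative to healthy ones. A Python sketch with invented counts:

```python
# Toy differential-expression ranking: which genes turn on under
# infection? Expression counts below are invented for illustration.
import math

healthy  = {"geneA": 100, "geneB": 80, "geneC": 500}
infected = {"geneA": 950, "geneB": 85, "geneC": 520}

def log2_fold_change(h: float, i: float) -> float:
    """log2 ratio of infected to healthy expression (pseudocount 1)."""
    return math.log2((i + 1) / (h + 1))

ranked = sorted(healthy,
                key=lambda g: log2_fold_change(healthy[g], infected[g]),
                reverse=True)
for g in ranked:
    print(g, round(log2_fold_change(healthy[g], infected[g]), 2))
# geneA jumps ~3.2 on the log2 scale -> a candidate stress gene whose
# promoter could drive the fluorescent reporter
```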
16:30
Yeah, go ahead, go in here. There's
16:33
no pathogens or anything in this one. At least we try
16:35
to keep the pathogens down. There are no
16:37
intentional pathogens in this growth chamber.
16:40
So we have tomatoes, rice
16:42
and soybeans in here right now. So
16:45
we're developing sensors for all of
16:47
these. If they come in here, we grow them in
16:49
here, we do the tests in the lab, and
16:51
then the best performing ones go out into
16:53
the field.
16:57
So that's sort of dirty. So that's
16:59
where we grow the plants in the soil. This is
17:01
where we grow the plants in tissue culture. So
17:03
this is more, well, called sterile,
17:06
but it's a cleaner environment. So here's
17:09
where they do the actual integration
17:12
of the genes into the plant. And
17:16
so we need to keep these guys sterile because they grow in
17:18
these tissue culture environments until
17:21
they're big enough to be moved over
17:23
into soil.
17:29
They are generating the plants. They
17:31
start from like little tiny clusters of cells
17:33
and then, through different manipulations
17:38
of media, they grow up. So I have
17:40
this little cup here with plants.
17:43
Some of the plants that are
17:43
non-edited or non-changed
17:45
or non-transgenic, they turn white. They
17:48
don't live, but the ones that are starting to turn green, those are the
17:50
ones that will make it through. So they're in these little
17:52
sterile cups. So all this stuff in here is sterile. We
17:54
don't like to have to get the soil in here, not
17:57
until we can move across the way. And then
17:59
we take
17:59
them across the hall, and then we put them in dirt, and then
18:01
eventually take them out to the greenhouse where we can collect seeds
18:04
and then in the next generation we can start
18:06
doing our
18:06
tests.
18:11
We do two detections currently,
18:14
so two different colors, and then we can do up to five.
18:17
Once again, Shely Aharonov.
18:19
And the idea is to get to the point where we can tell farmers
18:21
everything they need in one seed. So
18:24
fungal, insect pressure, nitrogen,
18:26
phosphorus, potassium. And
18:29
the plan is to do this at scale. The
18:31
technology is embedded in the seeds, right? When the farmers
18:34
plant seeds, they're harvesting data and
18:36
they don't need to do anything else. It's just a line of code
18:38
in the seeds. So every single plant in
18:40
the field is emitting this information. It's
18:42
a living sensor. And then the idea
18:44
is first, satellite imagery, because we want to make
18:47
sure that we can cover the globe really cost effectively.
18:49
And we're talking two and a half acre pixel
18:51
size. So we can look really, really small and
18:54
start seeing an infestation in the early days, for
18:56
example, if it's fungi.
18:58
So this is the starting point. We call that the scouting
19:00
tool.
19:00
But then once you get into the field, right, we're talking about
19:03
the OEMs and the tractors, then
19:05
you have another opportunity because you already know
19:07
you have a problem, but you have equipment
19:09
now that can literally look at every individual plant
19:12
and give it just what it needs.
19:15
Right? So really, a few years into
19:17
the future, we can go into the field
19:19
and scan every plant knowing, for example,
19:21
there's a nitrogen stress somewhere,
19:24
but then still look at every individual plant and only
19:26
give nitrogen to the plants that need it,
19:28
and the plants that are healthy, for whatever reason, those
19:31
can just be left alone. And
19:33
that will be, I believe, the most efficient
19:36
way we'll ever farm.
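The per-plant logic she describes is, at bottom, a filter over the field: treat only where the signal says so. A minimal sketch with hypothetical plant data and dose:

```python
# Spot-treat only the plants whose sensor signal reports nitrogen
# stress; healthy plants are skipped. Data and dose are hypothetical.

plants = [
    {"id": 1, "nitrogen_stress": False},
    {"id": 2, "nitrogen_stress": True},
    {"id": 3, "nitrogen_stress": False},
    {"id": 4, "nitrogen_stress": True},
]

DOSE_LB_PER_PLANT = 0.002  # assumed per-plant application rate

applied = 0.0
for plant in plants:
    if plant["nitrogen_stress"]:
        applied += DOSE_LB_PER_PLANT  # apply to this plant only

treated = sum(p["nitrogen_stress"] for p in plants)
print(f"treated {treated} of {len(plants)} plants, {applied:.3f} lb total")
```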
19:43
This mini series on satellites and farming
19:45
was reported and produced by me, Anthony
19:48
Green and Emma Cillekens. It was edited
19:50
by Mat Honan, and our mix engineer is
19:52
Garrett Lang with original music by
19:55
Garrett Lang and Jacob Gorski. Thanks
19:58
for listening. I'm Jennifer Strong. This
20:05
is MIT Technology Review.