
Meet the Founder: Neatleaf’s Elmar Mair discusses the critical role of autonomy in ag: “It takes a lot of anxiety out of cultivation”

April 19, 2024

Disclosure: AgFunderNews’ parent company, AgFunder, is an investor in Neatleaf.

As AgFunder partner Tom Shields recently wrote, it’s time to take greenhouses into the next era, where we’ll need to “optimize yield and reduce costs. To do that, growers need more data, faster, and they need to obtain it without human labor.”

This is exactly what California-based Neatleaf hopes to do with its flagship product, the Spyder, a fully autonomous robotic platform that scans crops and can generate millions of data points on plant health and growth metrics.

It’s a relatively simple setup — Neatleaf co-founder and CEO Elmar Mair likens it to a “football stadium camera” that operates above the canopy, gathering and relaying information to growers that’s more granular than humans alone could gather.

Below, Mair discusses why this technology could be a “game changer” for greenhouses, why autonomy is critical for agriculture, and why Neatleaf isn’t a cannabis company, despite starting out in that market.

AgFunderNews (AFN): How did you get into robotics?

Elmar Mair (EM): I grew up in the Dolomites, the Italian Alps. I spent my summers on my auntie’s farm and that’s where I got into Dutch horticulture. Then I [started] a career in computer science and that’s where I fell in love with robots. I also worked as a senior researcher at the German Aerospace Center, where there is a variety of robots — humanoids, flying robots, etc.

I ended up joining the Robert Bosch Research and Technology Center, where I worked on self-driving cars. I joined a startup, Lucid Motors [which went public in 2021], then ended up at Google X, where I led a large engineering team and worked on the Everyday Robot project and everything around perception: sensor fusion, mapping, localization, scene understanding, object detection and machine learning.

It was incredibly exciting, all these robots moving around and grasping things. But self-driving cars, the Everyday Robot project, they all had this far-fetched roadmap.

AgFunderNews (AFN): What led you to agriculture?

EM: I live in Santa Cruz [California], where there’s so much agriculture going on. Everyone talked about “the fourth agriculture revolution” and automation, data-driven farming, and controlled environment agriculture.

I visited a greenhouse and [saw] dangling sensors in the middle of the greenhouse that measured temperature and humidity, and saw how [the CEA process] heavily relied on humans. In every other industry, every time you want to optimize a process, the first thing you do is get data. Somehow in agriculture we haven’t done that yet.

So in 2020, I left Google and started Neatleaf. Within a month, we had a first prototype in the field with stationary cameras and sensors. Then we realized we needed a moving platform, a sensor box that moves around and captures all the information humans do when they walk the field.

We came up with this cable-based system, which is like a football stadium camera, that operates above the canopy.

Neatleaf Spyder flying over a cultivation facility monitoring plant health. Image credit: Neatleaf

AFN: Explain a little more about how your technology works.

EM: We capture the full canopy overview: temperature, humidity, CO2, images, leaf temperature, lighting conditions.

[With a full canopy overview], you can really see systemic issues [in the greenhouse]. Our AI picks up on things for the human, highlighting certain patterns and showing how things have changed. It’s incredibly powerful for someone to ask, “When did [the problem] start? Why did it start? Where did it start?”

The most important thing is, you get this in numbers. You can [for example], count every single yellow leaf and see something like, “Oh, I have 10% more yellowing.” You can’t tell that if you go into a growing space and just look at it.
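To make the arithmetic behind that "10% more yellowing" figure concrete, here is a minimal sketch in Python. The function name and the leaf counts are hypothetical illustrations, not Neatleaf's actual pipeline; the point is simply that counting yellow leaves in successive scans lets you express plant stress as a relative change:

```python
def yellowing_rate(yellow_leaves: int, total_leaves: int) -> float:
    """Fraction of the canopy showing yellowing in one scan (hypothetical metric)."""
    return yellow_leaves / total_leaves

# Suppose yesterday's scan counted 500 yellow leaves out of 10,000,
# and today's scan counted 550 out of 10,000.
yesterday = yellowing_rate(500, 10_000)   # 0.05
today = yellowing_rate(550, 10_000)       # 0.055

# Relative change between scans: (0.055 - 0.05) / 0.05 = 0.10,
# i.e. "10% more yellowing" than the previous scan.
change = (today - yesterday) / yesterday
print(f"{change:.0%} more yellowing")
```

A shift like this is invisible to someone walking the rows, but trivial to detect once every leaf is counted on every scan.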

It’s such a complex space. You have all these basic parameters — humidity, CO2, light, nutrients, irrigation, airflow — impacting the plant. If you have to change one thing, you have to change everything.

We know that machine learning is incredibly good at understanding and extracting these patterns. Being able to leverage AI also for identifying the buds, bud sizes, diving into yield forecasting — you can’t do that as a human, and it’s where the challenges in agriculture come from.

AFN: How is this helping growers?

EM: You can’t run around with a hand-held camera and [capture all this data] like microclimates across the full canopy. [With the system] constantly having an eye on these things, it takes a lot of anxiety out of cultivation.

[For example], you can see plant stress in numbers, like a stock ticker [on the system’s dashboard], which can show something like 3% less yellowing on plants. That’s a game changer because we never had that information and humans had to run around and check on the plants.

AFN: What do current customers say they like about the tech?

EM: The remote access. Especially for consultants, if they get called, they have to fly [to the facility] to see a damaged plant and spend all this time interviewing and providing guidance based on the interviews. [With Neatleaf’s system], they can stay remote and go back in time and see what happened and provide better feedback.

In one specific case, the manager said, “Management is giving me so much shit because they see that the plants are unhappy. Now I can tell them what happened. We had this outage and the plants were yellowing.”

Neatleaf co-founder and CEO Elmar Mair

AFN: Is your focus more indoor or outdoor ag?

EM: The technology is crop agnostic but we’re focusing right now on indoor. At the same time, our data pipeline is agnostic about where the data comes from, so we can feed in [information] from other sources like drones or something else.

Our beachhead right now is cannabis. It has incredibly high margins and it’s incredibly hard to cultivate because it grows so fast, it attracts pests, etc.

If you develop a new technology, you want to start with the highest-margin market and make your way down to address the broader market. You bring down the cost as you scale up.

But we’ve already planned an installation with one of the biggest blueberry nurseries in the world and we are also installing a system in a leafy greens facility, so it’s exciting to get out of the cannabis market. We’re definitely not a cannabis company.

AFN: Where are you at in terms of deployment?

EM: We started iterating on the system in 2021. Last summer, we finally said, “Let’s ramp up.” So we started producing more systems and building up the production lines.

Since the beginning of the year we have doubled our systems in the field and we have a backlog of orders. We’re still working on getting the word out, but so far we haven’t lost a single customer. We’re still at small numbers. We have 30 systems in the field and that will be more like 70 by the end of the summer.

AFN: Indoor crop agriculture has had a rough couple of years. How do you view the current state of the industry?

EM: I think CEA [controlled environment agriculture] isn’t doing that badly; it’s actually still growing 20% year over year.

The bottleneck with CEA is that you need expert knowledge and we don’t have enough experts. Companies fly experts around from one facility to the next. Here, you can do it remotely. At some point we’ll be able to provide expert knowledge to cultivators without human intervention, with just the AI.

CEA makes total sense, but the problem is that CEA still has 10% to 20% crop loss depending on the crop. How can that be, in such a controlled environment? It’s because we rely on humans to make decisions and we don’t capture what’s going on.

I think we have huge potential to bring that number down and increase the potential. Right now we’re just scratching the surface of what’s possible but I think data collection is the missing puzzle piece [that will drive the next] agricultural revolution.

AFN: Why did you choose Santa Cruz as your location?

EM: It’s in between Silicon Valley and a lot of our clients [in Monterey County].

Now we have deployed systems all over the US and Canada, but [initially] it seemed like a good fit.

We do also have an office in Munich, where we do a lot of engineering. That also allows us to have a footprint in Europe, which is a big market for CEA. We haven’t deployed a system yet, but I hope we’re going to soon.
