

Startup Using Deep Learning to Fight Food Waste Raises $2m Seed Round

March 26, 2018

AgShift, a California-based food inspection technology startup using deep learning to assess the quality of produce, has raised a $2 million seed round led by Exfinity Ventures, an Indian VC focused on frontier technologies.

AgShift applies deep learning to the inspection of fresh produce through a mobile app. At every point in the supply chain from farm to wholesaler, to distributor, to packer or processor, to retailer, fresh produce is inspected for freshness, damage, size, and color to fit within the specifications the industry has set as well as the USDA’s grading system.

Currently, this process, according to AgShift founder Miku Jha, is time-consuming and subjective. Receivers accept a pallet of produce, unwrap the pallet, remove a few packages and inspect them either with the human eye or using rulers, evaluating size, color, and amount of bruising. This process leads to inconsistency in the quality arriving at the final destination, which ultimately leads to food waste.

“Whether it’s strawberries or cashews, there is a well-defined spec for what parameters are needed to make the cut. We can make that whole process autonomous,” said Jha, who recently completed her first pilot study with a major berry brand.

After photographing strawberries, the AgShift app tells the user what percentage of red is in the photo, which speaks to how much shelf life is left. “If this goes to 90% then you have to make a call because you have very little life left,” Jha told AgFunderNews.
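The article does not describe how AgShift computes the red percentage; its actual system is a deep network. As a purely illustrative sketch, though, the measurement it describes amounts to counting the fraction of pixels that are clearly red, as in this toy function (the threshold values are invented for illustration):

```python
import numpy as np

def red_fraction(rgb_image: np.ndarray) -> float:
    """Estimate the fraction of 'red' pixels in an H x W x 3 uint8 RGB image."""
    r = rgb_image[..., 0].astype(float)
    g = rgb_image[..., 1].astype(float)
    b = rgb_image[..., 2].astype(float)
    # A pixel counts as red when the red channel clearly dominates the others.
    red_mask = (r > 100) & (r > 1.4 * g) & (r > 1.4 * b)
    return float(red_mask.mean())

# Example: a tiny 2x2 image with one bright-red pixel out of four.
img = np.array([[[200, 40, 40], [90, 90, 90]],
                [[50, 120, 60], [30, 30, 150]]], dtype=np.uint8)
print(round(red_fraction(img), 2))  # 0.25
```

A fixed color threshold like this is fragile under varying lighting, which is one reason a learned model is used instead.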

The app also measures size and proportion of bruising – which determines the USDA grade. Jha said that the app cuts the inspection time roughly in half for each individual item inspected. It does not replace any workers currently required in the process, but it does increase the capacity of any receiving team by up to 100%.
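Grading of this kind boils down to checking measurements against spec thresholds. The sketch below shows the shape of that logic; the numeric cutoffs here are invented for illustration and do not reflect the actual USDA standards or AgShift's specs:

```python
def grade(diameter_mm: float, bruised_fraction: float) -> str:
    """Toy threshold-based grading; cutoff values are hypothetical."""
    if diameter_mm >= 25 and bruised_fraction <= 0.05:
        return "U.S. No. 1"
    if diameter_mm >= 19 and bruised_fraction <= 0.10:
        return "U.S. No. 2"
    return "off-grade"

print(grade(28, 0.02))  # U.S. No. 1
print(grade(20, 0.08))  # U.S. No. 2
print(grade(15, 0.30))  # off-grade
```

The well-defined specs Jha mentions are what make the task automatable: once the model measures size and bruising from a photo, assigning a grade is a deterministic lookup like this.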

Deep learning is the same technology that makes self-driving cars possible, explained Jha. It is a specialized form of machine learning: where conventional machine learning improves at a specific task by analyzing large amounts of data, deep learning stacks its algorithms into layered networks loosely modeled on the human brain, letting the system learn its own features rather than relying on hand-crafted rules.
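In concrete terms, the "network of algorithms" is a stack of layers, each transforming the previous layer's output. A minimal sketch of one forward pass through a two-layer network in NumPy (real systems stack many more layers and learn the weights from training images):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Nonlinearity applied between layers; without it the stack
    # would collapse into a single linear transform.
    return np.maximum(0, x)

def forward(x, w1, w2):
    hidden = relu(x @ w1)   # first layer extracts simple features
    return hidden @ w2      # second layer combines them into a score

x = rng.normal(size=(1, 4))    # e.g. 4 pixel-derived features
w1 = rng.normal(size=(4, 8))   # weights, layer 1 (learned in practice)
w2 = rng.normal(size=(8, 1))   # weights, layer 2 (learned in practice)
score = forward(x, w1, w2)
print(score.shape)  # (1, 1)
```

Training consists of adjusting `w1` and `w2` so the output score matches labeled examples, which is why Jha's team must seed each new crop with images.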

“Deep learning can be applied to any problem statement,” said Jha.

With each new crop, Jha and her team need six weeks to fine-tune the deep learning algorithms and allow the app to learn what to look for. After strawberries, the team will next complete a pilot study with cashews and, in the future, wants to expand into seafood for quality and fraud inspection.

“With deep learning, you have to train it. You have to seed it with images and that takes time,” said Jha, who explained that the market will inform which crops she focuses on and when. Since there is no publicly available data set to help train her system to assess new crops, all data must be gathered in the field — or at the tailgate.

“That is good and bad because the industry itself has never done inspections this way. They haven’t taken photos ever — even today. Even if they take pictures, they do it to put it in the claim report,” said Jha, who hopes her app can decrease or eliminate claims (when a shipment must be refused for quality reasons). These claims are a major contributor to food waste in the supply chain because of the time they consume relative to the short shelf life of a delicate fruit like strawberries.

Jha will be hiring more deep learning experts on the back of the new funding, as well as getting ready for full commercialization with more pilot studies.

“We are looking at either the commodities that are very high margin, or ones that suffer in transit because of the form factor of that fruit,” said Jha.

Though the current app uses the camera in any modern cell phone, Jha said that other types of photography may allow her app to be even more precise in the future — namely near-infrared imagery, which may allow the app to detect issues the human eye cannot see.

AgShift completed the THRIVE Agtech Accelerator program in 2017 and is the first agrifood tech investment for Exfinity Ventures.
