A Blog by Jonathan Low


Dec 12, 2018

How Companies Are Using AI To Detect Unfresh Food

Improving safety and reducing waste. JL

Paul Sawers reports in Venture Beat:

Machine learning and hyperspectral imaging, a technique that combines spectroscopy and computer vision, can assess the quality of food in factories. While the human eye perceives the color of light in three wavelength bands, hyperspectral cameras cover a far wider spectrum and see beyond what is visible to humans. Cameras and machine learning software can serve up information on the quality of food, including how fresh it is, its expected shelf life, and any contamination that may be present, simply by scanning the food from the outside.
Computer vision is infiltrating just about every industry — to bring analytics to retailers’ shelves, identify early signs of Alzheimer’s, and even help security cameras identify weapons.
Then there is ImpactVision, which is leveraging machine learning and hyperspectral imaging, a technique that combines spectroscopy and computer vision, to automatically assess the quality of food in factories and elsewhere.
The San Francisco-based startup previously raised around $1.6 million in funding, and today it’s lifting the lid on another $1.3 million in a round led by transport and logistics giant Maersk. Participants in the round include the Yield Lab, Acre Venture Partners, AgFunder, and Xandex Ventures. A spokesperson told VentureBeat that the fresh funds will be used to “accelerate product development” and grow the company’s sales and engineering teams.

Food for thought

The roots of hyperspectral imaging can be traced back decades to NASA, which developed the technology for use in aerial imaging.
While the human eye broadly perceives the color of light in three wavelength bands (red is long, green is medium, and blue is short), hyperspectral cameras cover a much wider spectrum and see beyond what's visible to humans. Thus, ImpactVision's cameras and machine learning software can serve up information on the quality of food, including how fresh it is, its expected shelf life, and any contamination that may be present, purely by scanning the food from the outside. A core selling point is that the process is entirely non-invasive: no food is damaged along the way.
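ImpactVision's actual models are proprietary, but the basic idea of scoring pixels by comparing reflectance in different wavelength bands can be sketched with synthetic data. The band positions and values below are invented; the ratio is an NDVI-style index of the kind remote-sensing work uses, since healthy plant tissue reflects strongly in the near-infrared:

```python
import numpy as np

# Toy hyperspectral "cube": 4x4 pixels, each with a 100-band spectrum
# (real cameras capture hundreds of narrow bands; these values are synthetic).
rng = np.random.default_rng(0)
cube = rng.uniform(0.2, 0.8, size=(4, 4, 100))

# Hypothetical band indices: band 30 ~ visible red, band 80 ~ near-infrared.
red = cube[:, :, 30]
nir = cube[:, :, 80]

# An NDVI-style ratio index: fresh plant tissue reflects strongly in the
# near-infrared, so higher values loosely suggest fresher produce.
index = (nir - red) / (nir + red)

print(index.shape)  # one freshness score per pixel
```

A production system would learn which bands matter from labeled examples rather than hand-picking two, but the per-pixel spectrum is the raw material either way.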
One way of determining the maturity of an avocado, for example, is to carry out a dry matter (DM) content analysis. Traditionally this is slow and sample-based, so it cannot guarantee the quality of each individual avocado. ImpactVision claims its automated vision-based classification system produces far more accurate results while also being able to scan 100 percent of the products.
Above: ImpactVision scanning avocados
Above: ImpactVision's avocado scan shown on screen
ImpactVision told VentureBeat that it’s working with avocado distributors to replace their current systems, which lead to a great deal of waste and consume a lot of resources.
“Hyperspectral imaging allows us to perceive qualities the human eye cannot detect by accessing information from across the electromagnetic spectrum — for example, the freshness or ripeness of food products,” said ImpactVision CEO Abi Ramanan. “We recognize the potential this data has to transform the way supply chains process and distribute food.”
It’s not just about checking for ripeness and contamination, however. ImpactVision also said it’s in discussions with large berry distributors to potentially automate some manual processes, such as counting strawberries to keep tabs on the amount of produce created.
Above: ImpactVision counting strawberries
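The article doesn't describe how the counting works, but a standard computer-vision approach is to segment berry pixels from the background and then count connected blobs. A minimal sketch with a hand-made binary mask (a real pipeline would produce the mask by thresholding camera images first):

```python
import numpy as np

# Toy binary mask from a segmentation step: 1 = "berry" pixel, 0 = background.
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 1, 0, 0, 0],
    [0, 1, 0, 0, 1],
])

def count_blobs(mask):
    """Count 4-connected components of 1s (each blob ~ one berry)."""
    seen = np.zeros(mask.shape, dtype=bool)
    h, w = mask.shape
    count = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                count += 1
                stack = [(i, j)]  # flood-fill the whole blob
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x] and not seen[y, x]:
                        seen[y, x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

print(count_blobs(mask))  # → 4
```

Libraries such as SciPy provide the same labeling step (`scipy.ndimage.label`); the hand-rolled flood fill here just makes the mechanism visible.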

Cutting waste

Around a third of the food produced globally each year never reaches a human mouth, according to the United Nations' Food and Agriculture Organization. That translates to roughly $1 trillion of edible food ending up in landfills.
ImpactVision’s underlying promise fits into a broader waste-cutting trend across the technological landscape.
“Hyperspectral imaging technology is a game-changer for the food system, and ImpactVision’s machine learning-first approach positions them competitively as the sensors increasingly commoditize,” said Peter Jorgensen, venture partner at Maersk. “We see clear potential in enabling food supply chains to be more predictive, in terms of waste reduction but also increasing quality and safety for consumers worldwide.”
Numerous startups raised VC cash to help companies cut waste this year. Santa Barbara-based Apeel Sciences raised a chunky $70 million to tackle food waste by applying a second layer of skin to fruit and vegetables, reinforcing their protection and prolonging shelf life by up to three times. Apeel Sciences also commercialized its product this year, with Costco and Harps Food Stores selling its treated avocados and reportedly garnering a 65 percentage-point margin increase and a 10 percent sales increase.
Elsewhere, Swedish startup Karma raised $12 million for a consumer marketplace that enables restaurants and supermarkets to sell their surplus food at a discount, while Full Harvest raised $8.5 million for a B2B marketplace that helps farmers sell surplus and imperfect goods to food and beverage companies.
Last year, Volvo revealed that it was testing self-steering trucks to help sugarcane farmers improve their crop yield. Roughly 4 percent of crops can be lost to trucks driving over them, due to human error. With self-steering trucks, drivers don’t have to worry about keeping the vehicle in a straight line.

Software-as-a-service

It’s worth noting here that ImpactVision’s product is actually the software and machine learning algorithms that interpret the images captured by the hyperspectral cameras. However, the company provides everything needed for factories and food companies to use its software — it sources off-the-shelf hyperspectral imaging cameras, rather than developing its own expensive hardware, and sells everything as a bundle. It also charges to install the camera sensors, and there is a recurring software fee.
“We’re taking this technology into the food industry in order to digitize food supply chains, which are today overwhelmingly analogue — [involving] visual inspections and destructive sample-based tests,” Ramanan continued.
ImpactVision has carried out a number of pilot projects around the world to predict quality attributes of meat, fish, fruit, and salad. But Beta San Miguel, a sugar processor based in Mexico, represents the first commercial installation of ImpactVision’s technology. The tools are used to spot foreign objects in the sugar that may be missed by X-ray, metal detectors, and the human eye.
“ImpactVision’s foreign object detection system gives us full confidence that potential contaminants will be detected in real time during processing,” added Ismael Santiago Aguirre, director of special projects and innovation at Beta San Miguel. “This means we can always guarantee premium quality sugar to our clients, enhancing our brand and product.”
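The details of the Beta San Miguel installation aren't public, but one common way to flag foreign material in a spectrally uniform product such as sugar is anomaly detection: model the "normal" spectrum and flag pixels that deviate too far from it. A toy sketch with fully synthetic numbers (the band count, spectra, and threshold are all invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic scene: 1,000 pixel spectra of pure sugar (a tight cluster) plus
# five "foreign object" pixels with a shifted spectrum. Entirely made-up data.
sugar = rng.normal(loc=0.5, scale=0.02, size=(1000, 50))
foreign = rng.normal(loc=0.7, scale=0.02, size=(5, 50))
pixels = np.vstack([sugar, foreign])

# Score each pixel by its distance from the mean sugar spectrum; pixels far
# outside the normal spread are flagged as potential contaminants.
mean = sugar.mean(axis=0)
dist = np.linalg.norm(pixels - mean, axis=1)
threshold = dist[:1000].mean() + 6 * dist[:1000].std()
flagged = np.where(dist > threshold)[0]
print(flagged)  # indices of the injected foreign pixels
```

Because the decision is per pixel and per frame, this kind of check can run in real time on the processing line, which is the property the quote above emphasizes.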

Future applications

For now and the foreseeable future, ImpactVision will be largely focused on food companies. But with smartphone cameras constantly evolving, don't be surprised if you find an ImpactVision app in Google's or Apple's app store within the next few years.
As you can see from these mockup screenshots, ImpactVision envisages a day when you will no longer have to fondle produce to determine ripeness — you’ll just point your phone at the fruit aisle and receive a freshness score on the spot.
Above: ImpactVision's mockup consumer app
Researchers have also demonstrated hyperspectral imaging on mobile phones, which could have ramifications beyond assessing food quality, potentially being used to scan humans and detect conditions such as cancer.
