Google's AI can now identify food in the supermarket, in a move designed to help the visually impaired.
It is part of Google's Lookout app, which aims to help those with low or no vision identify things around them.
A new update lets a computer voice say aloud what food it thinks a person is holding, based on its visual appearance.
One UK blindness charity welcomed the move, saying it could help boost blind people's independence.
Google says the feature will "be able to distinguish between a can of corn and a can of green beans".
Eye-catching, not easy
Many apps, such as calorie trackers, have long used product barcodes to identify food. Google says Lookout also uses image recognition to identify a product from its packaging, not just its barcode.
The app, for Android phones, has some two million "popular products" in a database it stores on the phone - and this catalogue changes depending on where the user is in the world, a post on Google's AI blog said.
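Google has not published how Lookout's recognition works under the hood, but its ML Kit toolkit for Android developers gives a flavour of the barcode half of the task. The sketch below is an illustration only, not Lookout's code - the catalogue entries and the speech callback are placeholder assumptions.

```kotlin
import com.google.mlkit.vision.barcode.BarcodeScanning
import com.google.mlkit.vision.common.InputImage

// Hypothetical on-device catalogue mapping barcode values to product names.
// Lookout's real two-million-item database is not public.
val catalogue = mapOf(
    "0041500000114" to "Tomato ketchup",      // assumed example entries
    "5000168001371" to "Yeast extract spread"
)

fun announceProduct(image: InputImage, speak: (String) -> Unit) {
    val scanner = BarcodeScanning.getClient()
    scanner.process(image)
        .addOnSuccessListener { barcodes ->
            // Announce the first barcode that matches a catalogue entry
            val name = barcodes.firstNotNullOfOrNull { catalogue[it.rawValue] }
            speak(name ?: "Product not recognised - try turning the package")
        }
        .addOnFailureListener { speak("Could not read the label") }
}
```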
In a kitchen cupboard test by a BBC reporter, the app had no difficulty in recognising a popular brand of American hot sauce, or another similar product from Thailand. It could also correctly read spices, jars and tins from British supermarkets - as well as imported Australian favourite Vegemite.
But it fared less well on fresh produce such as onions and potatoes, and on irregularly shaped containers such as tubes of tomato paste and bags of flour.
When it had trouble, the app's voice asked the user to turn the package to another angle - but it still failed on several items.
The UK's Royal National Institute of Blind People (RNIB) gave a cautious welcome to the new feature.
"Food labels can be challenging for anyone with a visual impairment, as they are often designed to be eye-catching rather than easy to read," said Robin Spinks from the charity.
"Ideally, we would like to see accessibility built into the design process for labels so that they are easier to navigate for partially sighted people."
But along with other similar apps - such as Be My Eyes and NaviLens, which are also available on iPhones - Lookout "can help boost independence for people with sight loss by identifying products quickly and easily".
Lookout uses similar technology to Google Lens, the app that can identify what a smartphone camera is looking at and show the user more information. Lookout already offered a mode that reads aloud any text it is pointed at, and an "explore mode" that identifies objects and text.
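That kind of general scene description can also be approximated with ML Kit's on-device image labelling. Again, this is a hedged sketch of the same family of technology, not Lookout's or Lens's actual method - the confidence threshold and speech callback are assumptions.

```kotlin
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// A rough sketch of an "explore mode": label whatever the camera sees
// and hand the most confident guesses to a speech callback.
fun describeScene(image: InputImage, speak: (String) -> Unit) {
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            val confident = labels.filter { it.confidence > 0.7f }  // assumed cut-off
            if (confident.isEmpty()) speak("Nothing recognised")
            else speak(confident.joinToString(", ") { it.text })
        }
}
```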
Launching the app last year, Google recommended placing a smartphone in a front shirt pocket or on a lanyard around the neck so the camera could identify things directly in front of it.
Another new function added in the update is a scan document feature, which takes a photo of letters and other documents and sends it to a screen reader to be read aloud.
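A document-scan flow of this kind can be approximated by pairing on-device text recognition with Android's text-to-speech engine, as in the sketch below. Lookout itself hands the text to the system screen reader rather than speaking it directly; the function name and utterance ID here are illustrative.

```kotlin
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Sketch of a document-scan flow: run on-device OCR over a photo,
// then pass the recovered text to the text-to-speech engine.
fun readDocumentAloud(photo: InputImage, tts: TextToSpeech) {
    val recogniser = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recogniser.process(photo)
        .addOnSuccessListener { result ->
            tts.speak(result.text, TextToSpeech.QUEUE_FLUSH, null, "doc-scan")
        }
}
```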
Google also says it has made improvements to the app based on feedback from visually impaired users.