Google Photos – Lens

I have been waiting for some time to explore Google Lens. This Google Photos capability was first available for Android devices and was supposed to be rolling out for iOS. Cindy has the capability on her iPhone, but I, as yet, do not.

Anyway, I found that I can use Google Lens on my Chromebook, and the larger screen offered advantages in recording a demonstration of what Lens can do.

My demonstration may paint too positive a picture, but the service is impressive. One of the capabilities I keep searching for relates to my background in teaching biology. I am a sucker for apps that purport to identify plants and animals, and Google Lens might be expected to have similar capabilities.

What I have found about this application of AI is that plant identification is very challenging. This makes sense, as the images provided may or may not reveal the critical features the AI needs to make an accurate identification. What I remember from classes requiring that I identify unknown species with a "key" is that even with guidance this process is difficult. I like to test the AI capability of these products by visiting a zoo or botanical garden that offers examples I do not know but also provides labels for the exhibits. Does the technology's identification match the label?

What Lens does in such circumstances is make its best guess, but it also shows you images of other matches it considered. This seems a reasonable combination of AI and human intelligence. As a learning experience, considering the options may actually offer a superior opportunity: you have to be involved. The technology scaffolds the experience by limiting the options, and you end up making the decision.
