If the first version of Lens was about pets and plants, this version might be defined by clothes and home decor. An earlier version of Lens might simply identify an object as a sweater, or a pillow, or a pair of shoes. The new Lens has something Google calls "Style Match": it found a match for all three items, showed options for where to buy them, and recommended similar items. It even knew where the pillow I brought with me for the demo was from. And in general, the shopping results were impressive. But the version of the app I saw was still in beta, and Google says any misidentification will be fixed by the time it rolls out at the end of the month.

The new Google Lens will also support Spanish, Portuguese, French, German, and Italian, which, it's worth noting, is different from translation. Lens has always been able to translate languages supported by Google Translate. This update just means that if you're a native speaker of one of those new languages, you can run a version of Lens that's specific to that language.

Of course, Google already has all of that information indexed, whether it's puppy breeds, restaurant menus, clothing inventory, or foreign languages. So why is it so hard to bring it all to Lens search? Chennapragada insists that it's quite difficult to provide on-the-fly context for visual objects in what she calls a "very unstructured, noisy situation." "We've always used vision technology in our image recognition algorithms, but in a very measured way," she says.

Bavor says it's also the sheer number of objects that exist in the world that makes visual search a unique challenge. In the English language there's something like 180,000 words, and we only use 3,000 to 5,000 of them, so if you're trying to do voice recognition, there's a really small set of things you actually need to be able to recognize. "Think about how many objects there are in the world, distinct objects, billions, and they all come in different shapes and sizes," Bavor says. "So the problem of search in vision is just vastly larger than what we've seen with text or even with voice."

It's a problem that many others are trying to tackle as well. Facebook, Amazon, and Apple have been building their own visual search platforms or acquiring technology companies that analyze photo content. Last February, Pinterest launched its own Lens tool, which lets users search the site using the Pinterest camera. Pinterest Lens also happens to power Samsung's Bixby Vision.

If it's too dark, you can tap the light icon in the top left to switch on your device's flash. You can even use Google Lens on pictures you've already taken by tapping the gallery icon in the top right.

The company admits the technology works best for identifying books, landmarks, movie posters, album art, and more. Still, we were impressed when it offered up reviews, social media accounts, and business information when we pointed it at the awning of a small store. Point it at a business card and it will let you save the person as a contact, filling in all the details on the card for you.

While Google Lens is still in its infancy, it shows a lot of promise. Its deep learning capabilities mean we should only expect it to get better in the future. Google Lens is currently available on most Android smartphones that support the Google Assistant, and you can expect it to be incrementally upgraded with new features as Google adds to its suite.