How to Use Visual Lookup in iOS 15 to Identify Plants and Pets

Select iPhone models can take advantage of Visual Lookup, an iOS 15 feature that can identify plants and animals in photos, though it is currently limited to the United States.

Certain iPhone models running iOS 15 or later can identify plants and animals in photos using Visual Lookup, a feature that lets users learn more about famous landmarks, art, plants, flowers, and pets. It was introduced alongside Live Text as part of an iOS 15 release focused on augmenting photos with on-device intelligence. Given the amount of processing required on the iPhone itself, it makes sense that only specific models support the feature. By keeping that processing on the device, Apple protects user privacy, since the photo data never has to pass through company servers.


Live Text and Visual Lookup both analyze objects in photos saved to the camera roll, but their functions are different. While Visual Lookup aims to provide information about a subject – identifying it or adding background detail – Live Text extracts text from a photo. Users can select the text in a Live Text-compatible image, which will most likely appear on signs or in captured documents. That text can be copied, used to search the web, or shared with other people and apps. For unidentified plants and animals in a photo, however, Visual Lookup fills in the gaps with more information about the flora and fauna captured on the iPhone.
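Apple has not documented how Live Text works internally, but the same kind of on-device text recognition is available to developers through the Vision framework. The Swift sketch below shows how an app could pull recognizable text out of an image without sending anything to a server; the `photo` parameter and the `recognizeText` function name are placeholders for illustration, and this is not Apple's own Live Text implementation.

```swift
import UIKit
import Vision

/// Recognizes text in a photo entirely on-device, similar in spirit to Live Text.
/// `photo` is a placeholder for any UIImage the app has loaded.
func recognizeText(in photo: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = photo.cgImage else {
        completion([])
        return
    }

    // VNRecognizeTextRequest performs optical character recognition locally.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the single best candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The returned strings could then be copied, searched, or shared, which mirrors what Photos exposes through the Live Text selection interface.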


Related: iOS 15 Live Text for iPhone: Tap a Phone Number on a Photo & More

New Features And How To Use Them


Live Text in action

To see whether a photo contains elements Visual Lookup can detect, open it in full screen in the Photos app. Every photo saved in the Photos app has an “i” button at the bottom of the screen when it’s open. When a photo contains elements detectable by Visual Lookup, however, the “i” button is partially covered by two small floating star icons. Tap this icon to view the photo details, which offer the option to add a caption and can display information about the lens used to take the photo. When a plant or animal is detected, a ‘Look Up — Plant’ or ‘Look Up — Animal’ button appears directly below the caption bar. Tap these buttons to view Siri’s findings about the subject, including the plant or animal’s name, similar photos, its scientific name, and other information.
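Visual Lookup’s plant and animal knowledge base is not exposed to third-party apps, but the Vision framework does ship a general-purpose, on-device image classifier that gives a sense of how this kind of identification can run entirely on the phone. The Swift sketch below is only an approximation of the idea; the `classify` function, the `photo` parameter, and the confidence threshold are illustrative assumptions, not the pipeline Photos and Siri Knowledge actually use.

```swift
import UIKit
import Vision

/// Returns the top on-device classification labels for an image,
/// e.g. a dog breed or a flower, with confidence scores.
/// This approximates the idea behind Visual Lookup; it is not the same model.
func classify(_ photo: UIImage, completion: @escaping ([(label: String, confidence: Float)]) -> Void) {
    guard let cgImage = photo.cgImage else {
        completion([])
        return
    }

    let request = VNClassifyImageRequest { request, _ in
        let observations = request.results as? [VNClassificationObservation] ?? []
        // Keep only reasonably confident labels and report the top five.
        let top = observations
            .filter { $0.confidence > 0.3 }
            .prefix(5)
            .map { (label: $0.identifier, confidence: $0.confidence) }
        completion(top)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```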


While Visual Lookup’s current features are available in iOS 15, the feature is expected to receive improvements as part of the upcoming iOS 16 release in the fall. With the new version of Visual Lookup, users can copy and paste the subject of a photo for use in other apps. It is also possible to drag and drop a photo’s subject into other apps, such as a Messages field or a document. In the iOS 16 beta, the implementation works well but sometimes misses part of the subject in the cutout. The feature will likely be refined as iOS 16 develops ahead of its public release in fall 2022.
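For developers, the iOS 16 beta SDK also introduces a VisionKit API that can attach similar Live Text and Visual Lookup behavior to an app’s own image views. The sketch below is a minimal example based on that API as presented at WWDC 2022; `imageView`, `photo`, and the `enableImageAnalysis` function are assumptions for illustration, and details may change before the public release.

```swift
import UIKit
import VisionKit

// Attaches Live Text / Visual Lookup style interaction to an existing
// UIImageView. Assumes `imageView` and `photo` are provided by the app.
@MainActor
func enableImageAnalysis(on imageView: UIImageView, with photo: UIImage) {
    guard ImageAnalyzer.isSupported else { return }  // Requires a supported device.

    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.text, .visualLookUp])

    Task {
        // Analysis runs on-device; the result drives the interaction overlay.
        if let analysis = try? await analyzer.analyze(photo, configuration: configuration) {
            interaction.analysis = analysis
            interaction.preferredInteractionTypes = .automatic
        }
    }
}
```

Once an analysis is attached, the interaction handles the selection and lookup overlay itself, so the host app does not have to build its own UI for it.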


Next: iOS 16 first impressions: what works and what needs improvement

Source: Apple, Apple Support


