
Get information about objects and places on your iPhone instantly.
With iOS 18.2 or later, Visual Intelligence gives Apple its own answer to Google Lens, and it's one of the standout features of the release. Showcased at the iPhone 16 series launch event, Visual Intelligence lets you learn about and gather information on objects and places around you using Camera Control. Keep reading to discover what Visual Intelligence is and how to use it on your iPhone.
Visual Intelligence is an Apple Intelligence-powered feature introduced with the iPhone 16 series. It uses the A18 chip's enhanced Neural Engine to process visual data directly on the device.
With Visual Intelligence, you can recognize and find information about objects and places around you simply by pointing your iPhone's camera at them. This makes it useful when you want to, say, identify a plant or animal, extract text, translate languages, run a Google image search, or send an advanced query to ChatGPT.
Visual Intelligence is part of the Apple Intelligence suite, which not all iPhones support. Although other Apple Intelligence features are available on iPhone 15 Pro or later models running iOS 18.2 or later, Apple has reserved Visual Intelligence for the iPhone 16 series, at least for now. This is primarily because the Camera Control button, currently available only on the iPhone 16, 16 Plus, 16 Pro, and 16 Pro Max, is required to invoke Visual Intelligence.
From looking up details about a place to asking ChatGPT about the subject in your frame to running an image search on Google, there's a lot you can do with Visual Intelligence and Camera Control on your iPhone 16. Let's take a look.
Visual Intelligence can help you find more information, such as hours of operation, available services, and contact details, about businesses around you. If a business is service-based, Visual Intelligence will also suggest actions like viewing reviews and ratings, making a reservation, or placing an order for delivery. Here's what you need to do:
Besides these actions, you’ll also find options to call the business, view its website, and more.
The arrival of Visual Intelligence has taken the Live Text feature to the next level. Visual Intelligence combined with Camera Control lets you interact with text in multiple ways. For instance, you can translate text that's in a language different from your device's, summarize it, or have your iPhone read it aloud.
Besides translation, Visual Intelligence can also identify contact info in text, like phone numbers, email addresses, and websites, and suggest actions based on the text type. Here’s what interacting with text using Visual Intelligence looks like:
With Siri and ChatGPT integration on iPhone, you can ask ChatGPT to fetch more information about the subject in your frame using Visual Intelligence. Here are the steps you need to follow:
Besides getting information about the subject, you can also use Visual Intelligence to identify the object and then run a Google search for similar items. Here's how:
Starting with iOS 18.3, you can use Visual Intelligence to create Calendar events simply by pointing your iPhone's camera at a flyer or poster.
To do so:
With iOS 18.3, you can use Visual Intelligence to identify animals and plants in a jiffy; you no longer need to tap the Ask ChatGPT or Search Google button.
Earlier, you could only access Visual Intelligence via the Camera Control button, which is available only on the iPhone 16 series (except the iPhone 16e). However, iOS 18.4 (currently in beta) extends Visual Intelligence support to iPhones that lack the Camera Control button but support Apple Intelligence: the iPhone 15 Pro, 15 Pro Max, and iPhone 16e.
With iOS 18.4, Apple lets you launch Visual Intelligence via the Action button. This means you can customize the Action button on your iPhone to access Visual Intelligence. Here's how:
Customize the Action button to launch Visual Intelligence
Access Visual Intelligence
Once you've assigned Visual Intelligence to the Action button, all you need to do is long-press it to bring up the Visual Intelligence interface. From there, you can use Visual Intelligence as you normally would.
Signing off…
Visual Intelligence on your iPhone 16 is a great way to look up details about objects and places nearby. I’ve been using the feature for quite a while now and definitely believe that it’s a game-changer for productivity and convenience.
What are your thoughts on Visual Intelligence on the iPhone 16 series? Let us know in the comments.