Visual Intelligence

This is what Apple's AI-powered photo search can do, with help from Google and ChatGPT.

At its best, Visual Intelligence tells you what the camera sees.

Visual Intelligence is a feature you access via the phone's camera controls, and it lets you get information about what the camera sees. At least, that's the idea. Today, Visual Intelligence is quite limited. You can point the camera at, say, a flower, and with a bit of luck the iPhone will guess which flower it is. I also tried other subjects, such as books, wine bottles, kitchen utensils, furniture, and paintings, basically everything I have at home, but flowers were the only thing it reacted to.

At its core, Visual Intelligence has three parts. The foundation is what I've just described, and Apple provides it: when the phone recognises a flower, you can tap for more information, which often comes from Wikipedia.

If you want additional information about whatever is on screen, you can call on external services. Here, both Google and ChatGPT are available. Google is essentially an image search: it brings up images similar to the one you've taken, and you can then ask follow-up questions or refine your search in words.

With ChatGPT, you can ask questions about the image and what's visible in it. Both ChatGPT and Google also work in Swedish.

This tip is an excerpt from a longer article, Mobil's big guide to Apple Intelligence, previously published exclusively for Plus members on Mobil.se. To test Apple Intelligence (our getting-started guide to Apple's AI is here), you need to set your phone's language and region to English and have one of the latest iPhone models.