When Apple introduced the iPhone 16 last year, the company said it would have an exclusive Apple Intelligence feature tied to Camera Control called Visual Intelligence. With Visual Intelligence, you can quickly learn “more about the places and objects around you.” You can look up details about a restaurant or business, or have text translated, summarized, or read aloud.
Apple says you can use Visual Intelligence to get details about a business in front of you, such as hours of operation, available services, and contact information. Depending on the business, you can also view reviews and ratings, make a reservation, or place an order for delivery.
At first, this feature seemed tied to the Camera Control hardware, since you need to long-press that button to invoke it. However, with the recently announced iPhone 16e, Apple added Visual Intelligence to the Action Button instead.

That made people wonder why the iPhone 15 Pro didn’t have it. Now, Daring Fireball’s John Gruber reports that Apple will bring it in a future software update, most likely iOS 18.4. Here’s what he wrote:
Apple representatives also told me today that owners of the iPhone 15 Pro will soon be able to bind their Action Button to visual intelligence, in a future software update. I suspect that future software update is iOS 18.4, which should be launching in beta any day now, but Apple wouldn’t comment, on or off the record, on when exactly this feature will come to the iPhone 15 Pro. They also confirmed that the Control Center button to launch visual intelligence is also coming to iPhone 15 Pro (and presumably iPhone 16 models, too).
As it has done with other features, Apple is once again bringing a generation’s “exclusive” feature to the previous model. Of course, BGR will let you know once Visual Intelligence arrives on the iPhone 15 Pro and how it works.