Visual Intelligence got an upgrade in iOS 26
In iOS 26, Apple Intelligence will turn screenshots into a powerful tool for shopping, planning, and asking questions. Here’s how.
Apple is giving iPhone users a smarter way to interact with what they see on screen. With iOS 26, the company is expanding its Visual Intelligence feature to go beyond photos and the camera app.
Now, pressing the same button combination used to take a screenshot brings up a new tool that analyzes whatever is currently on the display. The update brings Visual Intelligence into everyday use.
After taking a screenshot, users can ask ChatGPT about what they’re seeing, search Google, Etsy, and other supported apps for similar images or products, or pull event details straight into their calendar.
For example, if someone screenshots a concert poster or flight confirmation, Apple Intelligence can automatically extract the date, time and location, then offer to create a calendar event.
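To give a rough sense of how that kind of extraction can work, here is a minimal Swift sketch, assuming text has already been recognized from the screenshot. It is purely illustrative and not Apple’s implementation: it uses NSDataDetector to find a date in the text and EventKit to create the event; the sample text, function name, and two-hour duration are assumptions.

```swift
import Foundation
import EventKit

// Hypothetical sketch: find a date in text pulled from a screenshot,
// then create a calendar event from it. Not Apple's actual pipeline.
func addEvent(fromScreenshotText text: String, title: String) throws {
    // NSDataDetector finds dates (and addresses, links, etc.) in plain text.
    let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
    let range = NSRange(text.startIndex..., in: text)
    guard let match = detector.firstMatch(in: text, options: [], range: range),
          let date = match.date else { return }

    // A real app must request calendar permission before saving; omitted here.
    let store = EKEventStore()
    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = date
    event.endDate = date.addingTimeInterval(2 * 60 * 60) // assume a two-hour event
    event.calendar = store.defaultCalendarForNewEvents
    try store.save(event, span: .thisEvent)
}

// Example: text recognized from a screenshotted concert poster.
try? addEvent(fromScreenshotText: "Doors open June 14, 2026 at 8:00 PM",
              title: "Concert")
```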
The goal is to make iPhones more helpful in the moment. Instead of copying text or jumping between apps, users can interact with content directly on screen. Apple says the process happens on device for speed and privacy.
Visual Intelligence can also help with online shopping. If a user sees a jacket they like on social media, they can screenshot it and get visual matches from retail sites.
It’s also possible to highlight just part of the image to refine the search. Highlighting gives users more control and context without needing to retype or search manually.
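For a sense of how “find visually similar items” can work under the hood, the Swift sketch below uses the Vision framework’s image feature prints to score how alike two images are, for example a highlighted region of a screenshot and a catalog photo. It is a hedged illustration of one possible approach, not the matching system Apple or any retailer actually uses; the function names are assumptions.

```swift
import Vision
import CoreImage

// Hypothetical sketch: compute a visual similarity score between two images
// using Vision feature prints. Smaller distance means more visually similar.
func featurePrint(for image: CGImage) throws -> VNFeaturePrintObservation? {
    let request = VNGenerateImageFeaturePrintRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    return request.results?.first as? VNFeaturePrintObservation
}

func similarity(between a: CGImage, and b: CGImage) throws -> Float? {
    guard let printA = try featurePrint(for: a),
          let printB = try featurePrint(for: b) else { return nil }
    var distance: Float = 0
    try printA.computeDistance(&distance, to: printB)
    return distance
}
```

In a real search, scores like this would be computed against a large index of product images and the closest matches returned, which is one plausible way a cropped screenshot could be turned into shopping results.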
ChatGPT integration is built into the experience, letting users ask natural-language questions about what’s on screen. That can mean definitions, background information, or even help understanding forms and documents.
Part of a larger shift in iOS
The software will be released publicly in the fall of 2025 as a free update for supported devices. Visual Intelligence and other Apple Intelligence features require at least an A17 Pro chip, meaning they are only available on the iPhone 15 Pro, iPhone 15 Pro Max and newer models.
A public beta will be available in July through Apple’s Beta Software Program. Apple’s move to integrate screen-level AI tools is part of a broader push to compete with Android devices like Google’s Pixel and Samsung’s Galaxy phones, which already offer similar on-screen help features.
Apple’s move here feels overdue but smart. Screenshots have quietly become one of the most common ways people save information, especially when it isn’t easily copyable.
Until now, iOS treated all those images like any other photo. Giving screenshots a brain and a purpose is the kind of quality-of-life upgrade that makes Apple Intelligence feel useful rather than flashy.
Instead of bouncing between apps or relying on clunky share sheets, you just take a screenshot and follow your curiosity. It won’t be perfect, especially early on.
But this is Apple leaning into the idea that the screen itself is the interface, not the app you’re in. That shift might end up being more important than any of the AI buzzwords it’s wrapped in.