iPhone 15 Pro owners will soon gain access to Visual Intelligence, an AI-powered camera feature, bringing the device in line with the newer iPhone 16 series, which already includes it.
Visual Intelligence, comparable to Google Lens, lets users point their camera at objects and receive real-time, AI-driven analysis. On the iPhone 16 and 16 Pro models, the feature is invoked with a long press of the Camera Control button. Because the iPhone 15 Pro and Pro Max lack this hardware control, users will instead need to assign the feature to the Action button or launch it from a Control Center shortcut.
Apple has not confirmed which iOS version will bring Visual Intelligence to the iPhone 15 Pro series, but speculation points to an upcoming release, likely iOS 18.4, which is expected to reach beta testers shortly.
Visual Intelligence is part of the broader Apple Intelligence suite of AI features. Users can employ it to search for products, translate text, have text read aloud, and summarize information. For example, a user who wants to find a particular towel pattern can activate Visual Intelligence, tap the Google search shortcut, and browse purchasing options online.
This update stands to meaningfully improve the experience for iPhone 15 Pro and Pro Max owners, giving them advanced AI tools for everyday tasks.
For further details, visit the original article at Engadget.