- Apple is developing a “screen awareness” feature that will allow Siri to understand and interact with the content currently displayed on the screen.
- Apple will also provide an API for developers to integrate screen awareness capabilities into third-party applications, and is currently testing integration with ChatGPT, which allows Siri to answer questions based on screenshots.
- Although the “screen awareness” feature is not available in the iOS 18.2 beta, it may arrive with iOS 18.4 in 2025.
Among digital assistants, Siri stands out, especially next to Microsoft’s disappointing Cortana. Now Apple is working to improve it further with a new feature called “Screen Sense,” designed to help the assistant better understand what you are viewing on your device’s screen.
Apple provided more details about the feature in a dedicated section of its developer documentation, explaining that it will be included in iOS betas for testing. The basic idea is that if you’re looking at a file or web page and have a question about it, you can ask Siri to answer or take action in a more targeted way. For example, Siri might provide specific information about the content you’re viewing or route that content to a compatible third-party application.
Apple first introduced the idea of “screen awareness” in June 2024 and has indicated that the feature is still under development and testing. If it works as expected (an important “if”), Siri will become smarter, and you won’t need to spell out every action you want it to perform. For example, you could open a document and ask for a summary without copying and pasting text or describing exactly what you want.
Essentially, Apple’s goal is to make Siri more interactive and context-aware, allowing the assistant to answer questions more accurately and naturally related to the actual content you’re viewing on the screen.
Apple’s near-term plans for Siri
MacRumors reports that Apple has made new APIs available to developers that allow them to integrate content from their apps with Siri and Apple Intelligence. The goal is to give developers enough lead time before the feature ships, so that screen-awareness-compatible applications are ready as soon as it is officially released.
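For developers, the relevant surface is Apple’s App Intents framework, which is how apps already describe their content and actions to Siri and Apple Intelligence. The sketch below is only an illustration of what exposing an app’s content can look like; the `DocumentEntity`, `SummarizeDocumentIntent`, and their behavior are hypothetical stand-ins, not Apple’s documented screen-awareness API.

```swift
import AppIntents

// Hypothetical entity representing a document the app exposes to Siri / Apple Intelligence.
struct DocumentEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Document")
    static var defaultQuery = DocumentQuery()

    var id: UUID
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Lets the system look up documents by identifier in the app's own store (stubbed here).
struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [DocumentEntity] {
        []
    }
}

// Hypothetical intent that lets Siri ask the app to summarize one of its documents.
struct SummarizeDocumentIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Document"

    @Parameter(title: "Document")
    var document: DocumentEntity

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would generate the summary from its own content here.
        .result(dialog: "Here is a summary of \(document.title).")
    }
}
```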
Apple is currently known to be testing ChatGPT integration in the latest iOS 18.2 beta (and in the corresponding betas of its other operating systems). Through this integration, Siri can answer questions about content displayed on the screen, such as images, PDFs, and videos. The way it works today is that Siri takes a screenshot of what’s visible and sends it to ChatGPT for a response. For now, the feature is limited to screenshots, but it represents a first step toward deeper interaction between Siri and visual content.
However, Screen Sense itself is a more advanced and, for the user, more seamless feature. According to MacRumors, while the ChatGPT integration relies on screenshots, screen awareness will allow Siri to analyze and interact directly with the content visible on the screen without extra steps. For example, if someone sends a message that includes their phone number, you can simply ask Siri to create a new contact without supplying additional details or walking through a series of intermediate steps.
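To make that contact example concrete, here is a rough sketch of what creating a contact from a phone number involves, using Apple’s Contacts framework. How Siri would detect the name and number in on-screen content is Apple’s internal machinery and not public API; the helper below is purely illustrative and assumes the app already has Contacts permission.

```swift
import Contacts

// Hypothetical helper: save a new contact from a name and phone number that were
// (in this scenario) detected on screen. Requires the user to have granted Contacts access.
func createContact(named name: String, phoneNumber: String) throws {
    let contact = CNMutableContact()
    contact.givenName = name
    contact.phoneNumbers = [
        CNLabeledValue(label: CNLabelPhoneNumberMobile,
                       value: CNPhoneNumber(stringValue: phoneNumber))
    ]

    let saveRequest = CNSaveRequest()
    saveRequest.add(contact, toContainerWithIdentifier: nil)
    try CNContactStore().execute(saveRequest)
}
```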
In short, screen awareness is designed to make Siri a more intuitive, responsive assistant that interprets and responds to commands more naturally based on visual context, simplifying and speeding up many everyday operations on Apple devices.
Can Siri survive or even thrive in the age of artificial intelligence?
Screen awareness is apparently not yet available in the iOS 18.2 developer beta, and MacRumors speculates that it may be one of the many Siri features we won’t see for a while. Still, there is reason for optimism: according to some predictions, Screen Sense may be included in iOS 18.4, expected in the first half of 2025.
If that prediction holds, Siri could become a genuinely more useful digital assistant, and given Apple’s design prowess, it might even become the assistant of choice for many people. The effort is reminiscent of Microsoft’s ambitions for Copilot, which, at least so far, has not been well received, leaving room for Apple to take the lead.