While iOS 18.4 brings modest improvements to Siri, including on-screen awareness for processing visible information and multi-app integration, it still falls behind Perplexity AI's capabilities. Apple's voice assistant can now handle phone numbers and email addresses displayed on screen, and language support has been expanded. However, technical challenges have delayed Apple's more ambitious AI features for Siri, and the gap between Siri's limited responses and Perplexity's dynamic, real-time intelligence continues to widen.
Apple continues to refine its digital assistant, and Siri picks up several notable improvements in iOS 18.4. The update adds capabilities aimed at making the assistant more useful for everyday tasks. Despite these additions, however, Siri still trails more advanced AI assistants such as Perplexity in real-time capabilities and depth of responses.
One of the most significant improvements in iOS 18.4 is Siri’s new on-screen awareness. This feature allows the assistant to respond based on what’s visible on the user’s screen. For example, Siri can now process information like phone numbers or email addresses that appear on screen and offer to save them as contacts.
Siri’s new on-screen awareness intelligently processes visible information, helping users save contacts from displayed phone numbers and emails.
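To give a rough sense of what this kind of feature involves, here is a minimal Swift sketch that scans a block of on-screen text for phone numbers and email addresses using Apple's public NSDataDetector API and drafts an unsaved contact with the Contacts framework. This is a hypothetical approximation for illustration only, not Apple's actual Siri implementation; the function name and labels are invented.

```swift
import Foundation
import Contacts

// Hypothetical illustration: pull contact details out of text visible on screen.
// Phone numbers come back as .phoneNumber matches; email addresses are detected
// as mailto: links. The result is an unsaved CNMutableContact the user could confirm.
func draftContact(fromOnScreenText text: String) -> CNMutableContact? {
    let types: NSTextCheckingResult.CheckingType = [.phoneNumber, .link]
    guard let detector = try? NSDataDetector(types: types.rawValue) else { return nil }

    let contact = CNMutableContact()
    let range = NSRange(text.startIndex..., in: text)

    detector.enumerateMatches(in: text, options: [], range: range) { match, _, _ in
        guard let match = match else { return }
        if let phone = match.phoneNumber {
            contact.phoneNumbers.append(
                CNLabeledValue(label: CNLabelPhoneNumberMobile,
                               value: CNPhoneNumber(stringValue: phone)))
        } else if let url = match.url, url.scheme == "mailto" {
            let address = url.absoluteString.replacingOccurrences(of: "mailto:", with: "")
            contact.emailAddresses.append(
                CNLabeledValue(label: CNLabelWork, value: address as NSString))
        }
    }

    // Return nil if nothing usable was found on screen.
    let foundSomething = !contact.phoneNumbers.isEmpty || !contact.emailAddresses.isEmpty
    return foundSomething ? contact : nil
}
```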
The update also deepens Siri's integration with multiple apps. Users can ask Siri to perform tasks that span different applications, which makes interactions faster and task completion smoother.
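For context, Apple's App Intents framework is the public mechanism third-party apps use to expose actions that Siri and Shortcuts can invoke and chain together. The sketch below defines a hypothetical "SaveNoteIntent" and a voice phrase for it; the intent, parameter, and phrase are invented for illustration and do not describe how Siri's own multi-app integration works internally.

```swift
import AppIntents

// Hypothetical sketch: an App Intent that Siri could invoke by voice.
struct SaveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Save Note"
    static var description = IntentDescription("Saves a short note passed in by Siri.")

    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would persist the note in its own data store here.
        return .result(dialog: "Saved your note: \(text)")
    }
}

// Registering a phrase lets users trigger the intent by voice with no setup.
struct NoteShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SaveNoteIntent(),
            phrases: ["Save a note in \(.applicationName)"],
            shortTitle: "Save Note",
            systemImageName: "note.text"
        )
    }
}
```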
Apple has expanded language support in this update as well. Siri now works with additional languages, making it accessible to more users around the world. This global approach helps more people use voice commands regardless of their native language. As multimodal AI continues to evolve across the industry, Siri must eventually catch up to systems that can seamlessly integrate text, audio, and visual inputs for more contextual understanding.
Siri's context understanding has also improved. The assistant can better recognize user preferences and provide more personalized responses based on previous interactions, and iOS 18.4 adds a typing option as an alternative to voice commands. Richer language understanding lets Siri process complex queries with greater accuracy than before, making conversations feel more natural and relevant to individual users.
Despite these improvements, reports indicate that some of the more advanced AI features originally planned for Siri may have been delayed. These postponed enhancements were meant to be part of Apple Intelligence, the company’s broader AI initiative.
Technical challenges have apparently forced Apple to hold back certain features until future updates, possibly iOS 18.5 or later. Industry observers note that while Siri is becoming more capable, it still doesn’t match the real-time intelligence and extensive responses offered by more advanced AI assistants like Perplexity, which can process and analyze information from across the web instantly.