Apple’s AI Evolution: New Siri, iOS Updates, and iPhone Innovations
Apple is scheduled to unveil a redesigned version of Siri on June 8 at WWDC 2026. This update, arriving with iOS 27, iPadOS 27, and macOS 27, represents a fundamental overhaul of the personal assistant, transitioning it from a tool for simple tasks into a conversational chatbot.
Transition to LLM-Based Architecture
The redesigned Siri will move to a custom foundation of large language models (LLMs) based on Google Gemini. This shift is intended to modernize the assistant's capabilities, allowing it to handle "world knowledge" queries in a manner similar to ChatGPT or Claude. Unlike the current version of Siri, which primarily handles common questions and basic commands, the iOS 27 version is designed to engage in back-and-forth conversations and process complex, multi-step queries within a single interaction.
As part of this technical expansion, Apple is testing the ability for Siri to execute multiple actions from one command. For example, a user will be able to request that Siri simultaneously adjust a thermostat and turn off bedroom lights without needing separate prompts.
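The multi-action behavior described above can be pictured as a single request being resolved into an ordered list of actions that are then dispatched together. The sketch below illustrates the idea only; the class and function names are invented, since Apple has not published the API behind these reported capabilities.

```python
# Hypothetical sketch: one spoken request resolved into multiple actions.
# All names here are illustrative, not real Apple APIs.
from dataclasses import dataclass

@dataclass
class SetThermostat:
    degrees: int

@dataclass
class SetLights:
    room: str
    on: bool

def dispatch(actions):
    """Execute every action from a single combined command, in order."""
    results = []
    for action in actions:
        if isinstance(action, SetThermostat):
            results.append(f"Thermostat set to {action.degrees} degrees")
        elif isinstance(action, SetLights):
            state = "on" if action.on else "off"
            results.append(f"{action.room} lights turned {state}")
    return results

# "Set the heat to 68 and turn off the bedroom lights" as one request,
# with no separate prompts:
print(dispatch([SetThermostat(68), SetLights("Bedroom", on=False)]))
```

The key design point is that the assistant parses the utterance once and executes the resulting action list as a batch, rather than requiring one prompt per action.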
New Functional Capabilities
The upcoming overhaul includes three primary upgrades that were previously delayed: personal context knowledge, onscreen awareness, and new in-app and cross-app actions. These features are intended to make the assistant more aware of the user's current activity and data across the Apple ecosystem.

To maintain competitiveness with other AI platforms, Apple is also expanding its “Extensions” system. This will allow users to connect Siri to third-party AI assistants, specifically including Google Gemini and Claude, providing an alternative for those who prefer different AI models for specific tasks.
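Conceptually, an extensions system like the one reported would route each query to whichever backend the user selects. The following sketch shows one way such a registry could be structured; the names (`AssistantExtension`, `ExtensionRegistry`, `route`) are assumptions for illustration, as Apple has not published this API.

```python
# Hypothetical sketch of a pluggable assistant-extension registry.
# Class and method names are invented for illustration.

class AssistantExtension:
    def __init__(self, name, answer_fn):
        self.name = name
        self.answer = answer_fn  # callable that turns a query into a reply

class ExtensionRegistry:
    def __init__(self):
        self._extensions = {}

    def register(self, ext):
        self._extensions[ext.name] = ext

    def route(self, query, to):
        """Send the query to the user's chosen backend, or None if absent."""
        ext = self._extensions.get(to)
        return ext.answer(query) if ext else None

registry = ExtensionRegistry()
registry.register(AssistantExtension("Gemini", lambda q: f"[Gemini] {q}"))
registry.register(AssistantExtension("Claude", lambda q: f"[Claude] {q}"))
print(registry.route("Summarize my notes", to="Claude"))
```

A registry like this lets users pick a different model per task, which is the stated purpose of the Extensions system.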
User Interface and Application Changes
The interaction model for Siri is changing alongside its intelligence. Apple plans to launch a standalone Siri app that will display a list or grid of past conversations, mirroring the interface of existing AI chatbot applications. The user interface will integrate more deeply with the Dynamic Island.
These changes follow the current iteration of Siri, which is built into Apple devices including the Apple Vision Pro and supports back-to-back requests and CarPlay integration. The new system aims to move beyond these utility-based functions toward a more comprehensive AI platform integrated into the iPhone and other hardware.
The full specifications of the redesigned assistant and the standalone application are expected to be detailed during the June 8 presentation.
