Apple’s visionOS 26.4 beta, released today, introduces foveated streaming support for the Apple Vision Pro headset, a feature designed to improve the quality and efficiency of wireless virtual and augmented reality experiences. The update allows applications to display high-resolution, low-latency immersive content by streaming it to the device, according to Apple’s release notes.
Foveated streaming differs from foveated rendering, though the two can be used in conjunction. Whereas foveated rendering draws the area the user is looking at in higher resolution, foveated streaming concentrates encoding and transmission bandwidth on that area, sending the highest-quality data where the user's gaze lands. This technique is already used in platforms like Valve's Steam Frame, and Apple's implementation aims to ease porting of existing VR applications to the Vision Pro.
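The core idea can be sketched with a toy example. This is purely illustrative and not Apple's API: it assumes the streamed frame is divided into a hypothetical tile grid and assigns each tile a quality level based on its distance from a normalized gaze point.

```python
# Illustrative sketch only: per-tile stream quality driven by gaze.
# The tile grid, quality labels, and fovea radius are all assumptions,
# not part of any visionOS or CloudXR interface.

def allocate_tile_quality(gaze, grid=(4, 4), fovea_radius=0.25):
    """Return a grid of quality levels ('high'/'low') for stream tiles.

    gaze: normalized (x, y) gaze point in [0, 1] x [0, 1].
    A tile streams at 'high' quality if its center lies within
    fovea_radius of the gaze point; every other tile streams 'low'.
    """
    cols, rows = grid
    quality = []
    for r in range(rows):
        row = []
        for c in range(cols):
            cx = (c + 0.5) / cols   # tile center, normalized
            cy = (r + 0.5) / rows
            dist = ((cx - gaze[0]) ** 2 + (cy - gaze[1]) ** 2) ** 0.5
            row.append("high" if dist <= fovea_radius else "low")
        quality.append(row)
    return quality

# Gazing at the top-left corner concentrates bandwidth there.
print(allocate_tile_quality((0.1, 0.1)))
```

In a real pipeline the "low" tiles would still be encoded, just at a lower resolution or bitrate, which is what keeps the stream within the headset decoder's limits.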
The new framework is described as “low-level host-agnostic,” a departure from the macOS Spatial Rendering introduced in visionOS 26, which was limited to local Mac hosts. Apple’s developer documentation highlights NVIDIA’s CloudXR SDK as a compatible host, but also indicates support for local PCs. Notably, Apple has provided a Windows OpenXR sample on GitHub, marking the company’s first public acknowledgement and use of the industry-standard XR API.
The technology works around limitations in headset video decoders, which cannot decode full-resolution video across the entire field of view at high frame rates, by streaming high-quality content only where the user is looking. Developers receive information about the approximate region of the user’s gaze, enabling them to render or stream higher resolution content in that area. This is a change from the typical visionOS privacy approach, which generally does not provide developers with specific gaze tracking data, instead offering only event-based information like pinch gestures.
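One plausible way a runtime could expose an approximate region rather than the exact gaze point is to quantize the gaze to a coarse cell. The sketch below is a hypothetical illustration of that privacy trade-off; the cell size and the function itself are assumptions, not Apple's documented behavior.

```python
# Hypothetical sketch: coarsening a precise gaze point into a region.
# Snapping the point to a coarse cell lets the app know roughly where
# to spend resolution without learning the exact fixation point.
# The 0.2 cell size is an arbitrary assumption for illustration.

def approximate_gaze_region(gaze, cell_size=0.2):
    """Snap a normalized gaze point to the bounds of a coarse cell.

    Returns (x0, y0, x1, y1): a normalized rectangle the app may treat
    as the high-resolution region.
    """
    x0 = (gaze[0] // cell_size) * cell_size
    y0 = (gaze[1] // cell_size) * cell_size
    return (round(x0, 3),
            round(y0, 3),
            round(min(x0 + cell_size, 1.0), 3),
            round(min(y0 + cell_size, 1.0), 3))

# The app sees a 0.2-wide cell, not the exact point (0.47, 0.81).
print(approximate_gaze_region((0.47, 0.81)))  # (0.4, 0.8, 0.6, 1.0)
```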
Apple’s documentation also details the ability to simultaneously display content rendered on the device and streamed remotely. The company suggests a scenario where a racing game could render the car’s interior using its RealityKit framework while streaming the more demanding external environment from a remote computer. This approach aims to reduce latency and improve stability by offloading processing-intensive tasks to a more powerful machine.
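Conceptually, that hybrid approach amounts to compositing a locally rendered layer over a remotely streamed frame. The sketch below shows the idea with tiny pixel grids; none of it reflects Apple's actual compositing API, and `None` standing in for transparency is an assumption made here for illustration.

```python
# Conceptual sketch of hybrid composition; nothing here is Apple API.
# A locally rendered layer (e.g. a car's interior rendered on-device)
# is overlaid on a frame streamed from a remote host. None marks a
# transparent local pixel that lets the streamed frame show through.

def composite(local_layer, remote_frame):
    """Overlay non-transparent local pixels onto the streamed frame."""
    return [
        [loc if loc is not None else rem
         for loc, rem in zip(local_row, remote_row)]
        for local_row, remote_row in zip(local_layer, remote_frame)
    ]

# Tiny 2x2 example: "L" pixels come from the device, "R" from the stream.
local = [["L", None], [None, None]]
remote = [["R", "R"], ["R", "R"]]
print(composite(local, remote))  # [['L', 'R'], ['R', 'R']]
```

Keeping latency-sensitive, nearby content on-device while streaming the heavier distant scene is what lets the split reduce perceived latency.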
Developers are already exploring integration with the new framework. Max Thomas, lead developer of ALVR for Apple Vision Pro, an open-source wireless SteamVR tool, has indicated that adding support for foveated streaming will be a significant undertaking, but could potentially enable foveated rendering within ALVR itself. ALVR recently added support for PlayStation VR2 Sense controllers in its TestFlight build.
The visionOS developer community is expected to explore enterprise applications of the foveated streaming framework in the coming months.