This is pretty big news! It basically means anyone can write nearly any* app they want without going through the App Store or needing approval from Apple in any way.
It also allows the headset to launch with an existing ecosystem of VR apps to play around with.
*Modulo possible performance limits of web-based VR apps and access to the same types of APIs that native apps will get
Great news indeed. The announced support, though, is only experimental (disabled by default behind a flag). It'll probably take some time after the Vision Pro launches before it's enabled by default.
It's curious that it only supports immersive-vr and not immersive-ar XRSessions, given the headset's focus. I wonder if that has something to do with their rendering backend; maybe it's about camera access, or about the features immersive-ar sessions require?
Regardless, it'll be interesting to see whether this spurs more hand tracking support in upcoming WebXR apps.
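For anyone curious what the capability probing looks like in practice, here's a minimal sketch of checking which WebXR session modes a browser exposes. The `isSessionSupported` and `requestSession` calls are the real WebXR Device API; the `XRLike` type and `supportedModes` helper are just illustrative names so the logic can be exercised outside a browser:

```typescript
// Minimal shape of navigator.xr we rely on (illustrative type,
// so the function can also be driven by a stub outside a browser).
type XRLike = {
  isSessionSupported(mode: string): Promise<boolean>;
};

// Probe the three WebXR session modes and return the supported ones.
// On current visionOS Safari (behind the flag) you'd expect to see
// "immersive-vr" but not "immersive-ar".
async function supportedModes(xr: XRLike): Promise<string[]> {
  const modes = ["inline", "immersive-vr", "immersive-ar"];
  const results = await Promise.all(
    modes.map((m) => xr.isSessionSupported(m))
  );
  return modes.filter((_, i) => results[i]);
}
```

Hand tracking, where available, is requested per-session as an optional feature, e.g. `navigator.xr.requestSession("immersive-vr", { optionalFeatures: ["hand-tracking"] })`, so an app can still start if the browser doesn't grant it.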