I just had a holy-shit moment reading what you said. I hadn't considered that AR overlays can't be blurred without sampling what's behind the overlay/glass.
People are in for a world of pain when they realize this.
That kind of sampling is the only way to get a frosted-glass effect.
Imagine an overlay in front of a red circle. If you want the red circle blurred in the overlay, you need to know about the red circle and sample from it for each pixel. The Vision Pro can't sample the entire viewport at 120 fps (or whatever frame rate they're running at). It would be a janky mess.
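To make the point concrete, here's a toy numpy sketch (not how visionOS actually renders, just an illustration of the data dependency): a naive box blur over the backdrop region under an overlay. Every output pixel is an average of backdrop pixels, so the overlay can't be drawn in isolation.

```python
import numpy as np

def frosted_overlay(backdrop, x0, y0, w, h, radius=2):
    """Blur the backdrop region under an overlay (naive box blur).

    The key point: each blurred pixel averages a neighborhood of
    *backdrop* pixels, so the compositor must be able to read what
    is behind the overlay before it can shade the overlay itself.
    """
    region = backdrop[y0:y0 + h, x0:x0 + w].astype(float)
    out = np.zeros_like(region)
    H, W = region.shape[:2]
    for y in range(H):
        for x in range(W):
            y1, y2 = max(0, y - radius), min(H, y + radius + 1)
            x1, x2 = max(0, x - radius), min(W, x + radius + 1)
            out[y, x] = region[y1:y2, x1:x2].mean(axis=(0, 1))
    return out

# Demo: a red shape on black. Any blurred pixel near the shape's
# edge mixes red and black backdrop pixels -- information the
# overlay simply doesn't have without sampling.
backdrop = np.zeros((20, 20, 3), dtype=np.uint8)
backdrop[5:15, 5:15, 0] = 255   # the "red circle" (a square here)
blurred = frosted_overlay(backdrop, 0, 0, 20, 20, radius=2)
```

A real compositor would do this in a shader with a Gaussian kernel and a downsampled backdrop texture, but the dependency is the same: no backdrop samples, no blur.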
The Vision Pro UI is not transparent/translucent but frosted.
> Vision Pro can't sample the entire viewport at 120 fps
Even worse than that. Each pane of glass that's blurring needs to do its own sampling of what's behind it. That means you have to render from back to front, sampling in 3D, for each eye, to get realistic blur effects. Your vision isn't a 2D grid coming from each eye, it's conic. A line from your retina out through two equally sized panes placed one behind the other will likely pass through two different points on each pane.
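The geometry is easy to check with a few lines of numpy (a sketch, assuming a pinhole eye at the origin and panes as planes of constant depth): the same off-axis view ray pierces two parallel panes at different (x, y) offsets, so a single screen-space blur sampled at one coordinate can't serve both panes.

```python
import numpy as np

def plane_hit(eye, direction, z):
    """Point where a ray from `eye` along `direction` crosses the plane at depth z."""
    t = (z - eye[2]) / direction[2]
    return eye + t * direction

eye = np.array([0.0, 0.0, 0.0])
ray = np.array([0.2, 0.1, 1.0])   # slightly off-axis view ray

near = plane_hit(eye, ray, 1.0)   # pane 1 m away
far  = plane_hit(eye, ray, 2.0)   # same-size pane directly behind it

# The ray hits the near pane at (0.2, 0.1) but the far pane at
# (0.4, 0.2): different points on each pane, per ray, per eye.
```

That per-ray divergence is why a physically faithful version looks like ray tracing: each blurred pixel needs samples taken along its own cone through every pane behind it.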
You'd probably need ray tracing to make this truly accurate, or at least convincing, and to keep your device from slowing to a crawl as you open more and more windows.
People are in for a world of pain when they realize this.