Though Apple announced the Vision Pro two weeks ago, yesterday felt even more significant for Apple developers. Until then, we had only a brief glimpse of the new product through marketing videos, technical talks, and secondhand accounts of short demos. But yesterday, people finally started to get their hands on it. Not the hardware; the actual headset isn't set to hit shelves until next year. But Apple updated Xcode with support for building for visionOS, along with a simulator that approximates the experience of using visionOS in the home.
What followed was a feeding frenzy of activity from Apple devs, as people with live apps on the App Store started tweaking their builds to support the new OS. No one was expecting perfection, and no one got it, but I saw at least half a dozen one-person operations get a working app running on visionOS with minimal tweaks. That's very promising for the first 24 hours of the platform's availability; I'm guessing we'll see a robust app ecosystem on day one of the hardware hitting the market.
Apparently some of the augmented reality features can't be simulated yet, which limits devs to what are essentially iPhone and iPad apps hanging in the air in virtual space. But we're literally on day one of this tool's availability, so I expect rapid iteration from Apple as we begin the slow march to the Vision Pro's debut.
This seamless interoperability makes me even more excited to dive into learning Apple development! I've messed around with their augmented reality APIs, and getting into those weeds is pretty intimidating, so it's very encouraging to know that even a basic iPhone app can be useful on this brand-new platform.
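To give a rough sense of why existing apps port over so easily: a plain SwiftUI app needs little or no code change to build for the simulator, since standard windows just render as panes floating in space. This is a hedged sketch, not a verified build; the app name and view contents are made up for illustration.

```swift
import SwiftUI

// A minimal SwiftUI app. The same WindowGroup scene builds for iOS,
// iPadOS, and (after adding the destination in Xcode) visionOS,
// where it appears as a window floating in the user's space.
@main
struct HelloVisionApp: App {  // hypothetical example app
    var body: some Scene {
        WindowGroup {
            VStack {
                Text("Hello from the simulator!")
                #if os(visionOS)
                // Platform-specific tweaks can be gated with
                // conditional compilation where needed.
                Text("Running on visionOS")
                #endif
            }
            .padding()
        }
    }
}
```

The basic workflow, as I understand it, is just adding visionOS as a supported destination in the target's settings and rebuilding; the `#if os(visionOS)` check is only needed when you want behavior specific to the headset.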