Time to get a jump on all of that spatial computing. Apple today announced that its visionOS software development kit is now available, allowing third parties to begin building content for the Vision Pro. The SDK arrives at least half a year before the headset officially goes on sale in the U.S., priced at $3,500.
The company is banking on developer interest to help drive excitement around the system, which was met with a lukewarm reception when it was unveiled at WWDC earlier this month. Content has been a major sticking point throughout years of VR and AR development, and Apple is no doubt counting on a stocked App Store by the time the system arrives in early 2024.
“Developers can get started building visionOS apps using the powerful frameworks they already know, and take their development even further with new innovative tools and technologies like Reality Composer Pro, to design all-new experiences for their users,” VP Susan Prescott said in a release. “By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers, and enables them to imagine new ways to help their users connect, be productive, and enjoy new types of entertainment.”
The SDK is built on top of the same basic framework as Apple’s various other operating systems, utilizing familiar dev tools, including Xcode, SwiftUI, RealityKit, ARKit and TestFlight. The company is clearly hoping to lower the barrier to entry for existing developers. The path of least resistance seems to be effectively porting existing software over to the new platform (see also the company’s Game Porting Toolkit for the Mac).
Spatial computing windows are built in Swift, for example. Apple notes on its development page:
By default, apps launch into the Shared Space, where they exist side by side — much like multiple apps on a Mac desktop. Apps can use windows and volumes to show content, and the user can reposition these elements wherever they like. For a more immersive experience, an app can open a dedicated Full Space where only that app’s content will appear. Inside a Full Space, an app can use windows and volumes, create unbounded 3D content, open a portal to a different world, or even fully immerse people in an environment.
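In practice, that scene model maps onto SwiftUI scene types. Below is a minimal sketch of how a visionOS app might declare a standard window, a volume, and a Full Space; the view names (`ContentView`, the `"Globe"` asset) and identifiers are hypothetical placeholders, not anything Apple ships:

```swift
import SwiftUI
import RealityKit

@main
struct SampleSpatialApp: App {
    var body: some Scene {
        // A standard 2D window that lives in the Shared Space,
        // side by side with other apps. ContentView is a placeholder.
        WindowGroup {
            ContentView()
        }

        // A volume: a bounded 3D region the user can reposition.
        // "Globe" stands in for whatever 3D asset the app bundles.
        WindowGroup(id: "globe-volume") {
            Model3D(named: "Globe")
        }
        .windowStyle(.volumetric)

        // A Full Space in which only this app's content appears,
        // suitable for unbounded 3D content or full immersion.
        ImmersiveSpace(id: "immersive-scene") {
            RealityView { content in
                // Build and add RealityKit entities here.
            }
        }
    }
}
```

The app would then ask the system to open the volume or Full Space at runtime (for instance via SwiftUI's `openWindow` and `openImmersiveSpace` environment actions), which is how a single app can move between shared and immersive presentation.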
Questions remain around how effective such ports will ultimately be in a three-dimensional plane or “infinite canvas” — borrowing a phrase from comics scholar Scott McCloud. To ease the growing pains even further, the company will begin opening “developer labs” in a variety of cities next month, including Cupertino, London, Munich, Shanghai, Singapore and Tokyo.
That’s designed, in part, to address one of the biggest pain points at the moment: getting an extremely expensive and unreleased headset in front of developers. Teams will be able to test their app on the hardware at the site, or apply for hardware developer kits to test outside of the official locations.
In addition to existing developer tools, Apple is introducing Reality Composer Pro. The Xcode feature makes it easier to preview 3D models, images, sounds and animation on the headset. There’s a simulator, as well, which offers a virtual approximation without the actual hardware. Unity development tools will be added to the mix starting next month. That’s good news, as gaming experiences were conspicuously missing from the original presentation.
Today’s announcement also lends credence to the notion that enterprise is going to be a key focus for the Pro’s first iteration.
“Manufacturers can use AR solutions from PTC to collaborate on critical business problems by bringing interactive 3D content into the real world — from a single product, to an entire production line,” said Stephen Prideaux-Ghee, AR/VR CTO of digital product development firm PTC. “With Apple Vision Pro, stakeholders across departments and in different locations can review content simultaneously to make design and operation decisions. This capability will unlock a level of collaboration previously not possible.”
Apple has promised more information and tools in the coming months.