# linking-together
i
Alright, so we have the above visionary approach to spatial computing where you can actually touch and feel the computable things, your body is fully present in the computable space — assuming the dream comes true and it ever takes off — versus the now all-too-inevitable Vision where you don't even get to touch glass, your body is so absent that they have to make a meta human out of you. Here's hoping the latter doesn't take the wind out of the sails of the former. I'm curious to see what sorts of programming tools Apple has come up with for Vision. I'm assuming it'll be the same old Swift, SwiftUI, and Xcode (now in a spatial window), but maybe there'll be some hint of novelty in Reality Composer Pro or some other accessory tooling. But I'm far, far more interested in seeing what we all can come up with.
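If it really is the same old SwiftUI, just floating in a spatial window, a minimal guess at the entry point might look something like this (a rough sketch; the type names are invented and nothing here is checked against the actual SDK):

```swift
import SwiftUI

// Hypothetical minimal visionOS app: the familiar SwiftUI structure,
// except the window floats in the room instead of on a screen.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Hello from a floating rectangle")
                .padding()
        }
        // Presumably optional: ask for a volume instead of a flat panel.
        .windowStyle(.volumetric)
    }
}
```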
c
Hopefully Swift Playgrounds lets us play around here (and one day soon just add an app to our homescreen)
j
Also saw something about Unity XR being available for it?
i
Curious if that's just support for hand tracking, head tracking, etc, or if there's anything more to it. I mean, it won't be "a new interface to Unity for authoring content spatially within Vision"
j
I was really surprised by how little the system incorporates physical objects. Seems like augmenting reality should include augmenting actual reality, not just overlaying objects in your vision. Given the cameras it has and the processing they're capable of, it can obviously do it. I'm wondering what will be exposed to developers. Are we going to be able to use the Apple Vision Pro to imbue the world with computation? Or will we be stuck with floating rectangles?
c
I think I'm putting on this facething when my present context is feeling dilute / void of stimulus – i.e., go ahead and wipe out this current scene – but I get the "imbue" question. I want my memory palace
i
On the "what sort of dev tools?" front, there's a bit of smoke from an AI/ML fire
n
VisionPro just seems to be simulating old media in a slightly more convenient form
i
I'm more optimistic — I think it's a new medium, being bootstrapped off the existing app ecosystem so as to avoid feeling empty on day 1. Remember how the iPhone (and Apple Watch) launched with effectively no apps?
c
> VisionPro just seems to be simulating old media in a slightly more convenient form
If eye tracking is supported at the OS level as a first-class concept, like cursor and touch, then that would be interesting. The rest is impressive but meh.
I wrote that thinking about a Minority Report-style desktop, then immediately realised the dystopian ad-tracking implications
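For what it's worth, the plausible privacy-preserving version of gaze-as-first-class-input is that the OS mediates it entirely: apps mark views as interactive, the system highlights whatever you're looking at, and raw eye data never reaches the app. A rough sketch of what that could look like in SwiftUI (the view name is made up; this is a guess at the model, not confirmed behavior):

```swift
import SwiftUI

// Sketch: gaze as a first-class input that apps never see directly.
// The system draws the "you're looking at this" highlight; a pinch
// then triggers the same action a click or tap would.
struct EyeTrackedControls: View {
    var body: some View {
        HStack(spacing: 24) {
            // Buttons respond to gaze + pinch with no extra code
            Button("Open") { print("opened") }

            // Arbitrary views opt into the system highlight
            Text("Inspect")
                .padding()
                .hoverEffect()
        }
    }
}
```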
j
I think this would be a good time to revisit the videos in the “tangible” category here: https://jackrusher.com/classic-ux/ … which offer many ideas that would work well in a VisionPro context.
s
There's a decent amount of info on the Apple developer site, but yes, it's mostly the same stuff
For a device that's supposed to be for productivity, there is very little on-device development support. You can live-update apps running on the device, though, which is cool
I guess it's for productivity like iPad Pro is, not like MacBook Pro?
Unity is available also. The platform effectively has two modes: a mixed-reality pass-through mode where apps "render" by giving a system-level 3D engine scene descriptions in USD
And a more traditional "immersive" (VR) mode where apps render using Metal
Apparently Unity supports both modes
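To make the two modes concrete, here's a rough sketch of how a native app might declare them (type and identifier names are made up; a fully custom Metal renderer would hook in where the RealityView sits, while this sketch sticks to the scene-description path):

```swift
import SwiftUI
import RealityKit

// Hypothetical app showing both modes: a pass-through scene the system
// composites into your room, and the same space cranked up to full immersion.
@main
struct TwoModesApp: App {
    @State private var style: ImmersionStyle = .mixed

    var body: some Scene {
        WindowGroup {
            Text("Open the space from here")
        }

        ImmersiveSpace(id: "demo") {
            RealityView { content in
                // Hand the system a scene description; it does the rendering.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial()]
                )
                sphere.position = [0, 1.2, -0.5]   // roughly eye height, half a metre out
                content.add(sphere)
            }
        }
        // Same scene, two modes: pass-through (.mixed) or VR (.full).
        .immersionStyle(selection: $style, in: .mixed, .full)
    }
}
```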
Apple has their own 3D editor called Reality Composer Pro https://developer.apple.com/augmented-reality/tools/ which is an updated version of Reality Composer that supports Apple Vision Pro
So all the dev tools are very incremental
Theoretically you can port an existing iOS AR app with very little change
If you just watched the keynote, you wouldn't really know any of this. They specifically downplayed the VR aspects and never called it a VR headset
Which seems to have mostly worked from a marketing standpoint