# share-your-work

Ivan Reese

11/07/2021, 8:09 PM
Future of Coding • Episode 53: Scott Anderson • End-user Programming in VR

@Scott Anderson has spent the better part of a decade working on end-user programming features for VR and the metaverse. He’s worked on playful creation tools in the indie game Luna, scripting for Oculus Home and Horizon Worlds at Facebook, and a bunch of concepts for novel programming interfaces in virtual reality. Talking to Scott felt a little bit like peeking into what’s coming around the bend for programming. For now, most of us do our programming work on a little 2D rectangle, with a clear split between the world around the computer and the one inside it. That might change — we may find ourselves inside a virtual environment where the code we write and the effects of running it happen in the space around us. We may find ourselves in that space with other people, also doing their own programming. This space might be designed, operated, and owned by a single megacorp with a specific profit motive. Scott takes us through these possibilities — how things are actually shaping up right now and how he feels about where they’re going, having spent so much time exploring this space.

• https://futureofcoding.org/episodes/053
🍰 11

Gray

11/08/2021, 3:01 AM
I absolutely loved this episode! Thanks for putting it together, and I loved all the Dreams shoutouts
🍰 3

Mariano Guerra

11/08/2021, 9:54 AM
Do you explain somewhere what your music composition process/tools are?

Ivan Reese

11/08/2021, 6:13 PM
@Mariano Guerra — You're the first to ask (<3), and no, I haven't talked about it anywhere. I used to play in bands, and have recorded a few dozen albums (eg), so I've accumulated a lot of music... stuff. I have ~hundreds of instruments (and "instruments") packed into every nook and cranny of my office and around my house. So while working on my editing process for the podcast, I also just naturally started making unique intros/outros/stingers for every episode.

I have an Ableton Live project called "Startup Chime" that has roughly a hundred layers, and I just bash on sounds in there for an hour or so each time I do a new episode. I've settled on a core set of samples that I base the sound design on, so that it feels somewhat tonally consistent across all the episodes. But that's the only consistency I'm really aiming for — I don't have a particular mood or "genre" that the sound design should have. It's just whatever I gravitate toward when prepping an episode.

I also don't bother saving a copy of this project or anything — I just open it up in whatever state it was in when I left it, mess around with what's there until it's similar but different, and then save out a render. Repeat that a few times, and boom: 2-6 sound cues for an episode.
❤️ 1
👏 1
👏🏽 1

wtaysom

11/09/2021, 1:31 PM
Let me add that I love the whimsical feel of the podcast's sound.
🍰 1
@Scott Anderson "tangible computing" — I'm going to remember that. For me, the space, and the sense of place and situatedness in it, are the best parts of VR. With better haptics, I think I'd be happy without the headset.

Scott Anderson

11/09/2021, 4:30 PM
It's definitely possible without the headset. One issue is that, using existing tech, it's a lot harder to get all the pieces right without VR. We do have some pretty nice holographic displays, but you can't reach inside any of them. If you're using motion controls outside of VR, you lose some of that 1:1 volumetric interaction and you're basically interacting with a mirror. Another option would be to build tracked physical tangible components and remap them in digital space. There have been some research examples of this (I think by Microsoft Research). This would work with or without an HMD.
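To make the "remap tracked physical components into digital space" idea concrete, here's a minimal Python sketch. None of it comes from the systems mentioned above; the tracker interface, the `Pose` and `VirtualProxy` names, and the 1:1 mapping are assumptions for illustration. The core of the technique is just copying (or deliberately distorting) a tracked 6DoF pose onto a virtual stand-in every frame:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple   # (x, y, z) in meters, tracker space
    rotation: tuple   # quaternion (x, y, z, w)

class VirtualProxy:
    """A digital stand-in for a physical knob, slider, block, etc."""
    def __init__(self, name):
        self.name = name
        self.pose = Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))

    def apply_tracked_pose(self, tracked, offset=(0.0, 0.0, 0.0)):
        # 1:1 mapping: the virtual proxy follows the physical object exactly.
        # A non-1:1 remap (scaling, snapping to a grid, retargeting to a
        # different spot in the scene) would go here instead.
        px, py, pz = tracked.position
        ox, oy, oz = offset
        self.pose = Pose((px + ox, py + oy, pz + oz), tracked.rotation)

def update(proxies, tracker_frame):
    # Called once per rendered frame with the latest reading for each
    # tracked marker: {marker_id: Pose}.
    for marker_id, pose in tracker_frame.items():
        if marker_id in proxies:
            proxies[marker_id].apply_tracked_pose(pose)
```

The interesting design space is inside `apply_tracked_pose`: keeping the mapping 1:1 gives you a tangible controller, while snapping, scaling, or retargeting the pose lets the same physical object stand in for many different virtual ones. Either way, the sketch works with or without a headset, since nothing in it depends on where the result is displayed.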

Ivan Reese

11/09/2021, 5:35 PM
Dynamicland did a little bit of the tracked physical tangible components stuff too (and, IIRC, stuff like servo-positioned magnets under the table to move things around)
🤔 1

Scott Anderson

11/09/2021, 5:45 PM
Oh yeah I'd definitely put Dynamicland in that group
they had prototypes of "robots" with their own compute also

wtaysom

11/10/2021, 2:41 AM
What I mean is, for example, say you're wearing haptic gloves (not even fancy ones with resistance, just a dozen or so tappers on each hand): the feedback from invisible objects on a table can be remarkable. And for input, buttons that are just contact between thumb and finger are a lot more precise than the pinch-inferring AI of the HoloLens or the Quest these days — though honestly, that's gotten remarkably good recently.
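As a rough sketch of the thumb-to-finger "button" idea, assuming a hand-tracking API that exposes fingertip joint positions (the joint names and distance thresholds below are invented for illustration, not any SDK's actual values):

```python
import math

PRESS_THRESHOLD = 0.015    # meters: closer than this counts as a press
RELEASE_THRESHOLD = 0.025  # meters: farther than this counts as a release

class ContactButton:
    """Treats thumb-to-fingertip contact as a discrete button, with a bit
    of hysteresis so tracking jitter doesn't make the press flicker."""

    def __init__(self, finger):
        self.finger = finger   # e.g. "index", "middle"
        self.pressed = False

    def update(self, joints):
        """joints maps joint names to (x, y, z) positions in meters.
        Returns True on the frame the button becomes pressed."""
        d = math.dist(joints["thumb_tip"], joints[self.finger + "_tip"])
        just_pressed = False
        if not self.pressed and d < PRESS_THRESHOLD:
            self.pressed = True
            just_pressed = True   # a good moment to fire a haptic tap, if gloves are worn
        elif self.pressed and d > RELEASE_THRESHOLD:
            self.pressed = False
        return just_pressed
```

The gap between the two thresholds is what makes the contact feel like a real button rather than a noisy distance check.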

Scott Anderson

11/10/2021, 5:12 PM
Oh yeah, I strongly agree with that. One of my Rift Jam (an internal game jam at Oculus) projects used a (very, very early) version of the hand tracking that eventually shipped on Quest, together with a bootleg haptic glove (an Arduino plus some small haptic motors). I didn't get very far into it (I spent most of my time just trying to get the early hand tracking working), but it was fun to work on.
😀 1
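For a sense of what the glue for a glove like that might look like, here's a guess at the PC-side half: hand tracking decides when a fingertip touches something virtual, then a one-byte command goes over serial to the Arduino driving the vibration motors. The serial protocol, port name, and class here are invented for illustration; this is not Scott's actual code.

```python
import serial  # pyserial

FINGERS = {"thumb": 0, "index": 1, "middle": 2, "ring": 3, "pinky": 4}

class GloveLink:
    """Sends per-finger haptic pulses to an Arduino over serial."""

    def __init__(self, port="/dev/ttyUSB0", baud=115200):
        self.conn = serial.Serial(port, baud, timeout=0.01)

    def pulse(self, finger, intensity):
        """intensity 0-15; the finger index and intensity are packed into
        one byte, which the Arduino sketch would unpack to drive a motor."""
        packet = (FINGERS[finger] << 4) | (intensity & 0x0F)
        self.conn.write(bytes([packet]))

# e.g. when hand tracking reports the index fingertip entering a virtual
# surface: glove.pulse("index", 12)
```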