# linking-together
c
imagine coding inside of "the volume", basically a giant LED room: https://ascmag.com/articles/the-mandalorian
👍 4
i
Yeah, that gives "room-scale VR" a whole new meaning.
c
That is cool. I did my PhD inside of something like that, except with the bottom half of the sphere as well. https://www.optimistdaily.com/2018/10/enter-the-allosphere-with-dr-joann-kuchera-morin/
…and it was stereo, and mostly used realtime rendering.
worth seeing if you’re in Santa Barbara
e
amazing technology.
w
"Due to the 10-12 frames (roughly half a second) of latency from the time Profile’s system received camera-position information to Unreal’s rendering of the new position on the LED wall, if the camera moved ahead of the rendered frustum (a term defining the virtual field of view of the camera) on the screen, the transition line between the high-quality perspective render window and the lower-quality main render would be visible. To avoid this, the frustum was projected an average of 40-percent larger than the actual field of view of the camera/lens combination, to allow some safety margin for camera moves. In some cases, if the lens’ field of view — and therefore the frustum — was too wide, the system could not render an image high-res enough in real time; the production would then use the image on the LED screen simply as lighting, and composite the image in post [with a greenscreen added behind the actors]. In those instances, the backgrounds were already created, and the match was seamless because those actual backgrounds had been used at the time of photography [to light the scene]."
g
Looks cool. Kind of reminds me of the Heroes and Legends theater at the Kennedy Space Center, though of course that one is pre-recorded 3D, not real time https://www.kennedyspacecenter.com/explore-attractions/heroes-and-legends/featured-attraction/heroes-and-legends