# of-graphics
s
Just found out about https://hexler.net/products/kodelife even though I've started working on a similar project for graphics prototyping :)
🤔 1
Has anyone used it? I know there are some other web-based Shadertoy-style environments, some made by people here, like vertex shader art. My project is inspired mostly by Shadertoy; one thing is I want to experiment with new features like raytracing and mesh shaders in a Shadertoy-like environment. Kodelife is still "stuck" at DX11-level capabilities
Note that in Vulkan most of those features were, until recently, exposed only as proprietary Nvidia extensions, and DX12 only supports them on Windows Insider builds with beta Nvidia drivers :)
And AMD doesn't have hardware support yet
c
My project as shown in the video on #C0120A3L30R was partly inspired by Kodelife, and I have experimental backends for DX12 and Vulkan. Still a ways off supporting them properly though!
👍 2
That’s the plan though...
c
👍 1
would love to have some type of higher level tool for playing with RTX…
looking forward to seeing what you make @Scott Anderson
s
The raymarching playground looks cool
I have actually not added RTX support yet
But it's next on my list, right now I only support compute shaders
Used in a pixel shader style fashion
I plan on adding support for inline raytracing with acceleration structures loaded from vox files (my main goal with this is to prototype voxel renderers) and maybe add a simple editor as well
And mesh support (obj and/or gltf)
Right now I'm live coding HLSL, so there is no higher-level language, but I will start to add a programming language to control dispatches, and ultimately I think it will end up being a general-purpose high-level language that defines a render graph
Probably graphical, although I'm going back and forth on that, I might just integrate Lua or a Lisp dialect and call it a day
d
@Scott Anderson "inline raytracing with acceleration structures loaded from vox files" This is something I'd like to understand in more detail for my own project. What is a "vox" file (ie, what file format should I use), and how do you encode an acceleration structure in a "vox" file? Can anybody suggest a link, or an existing project that implements this, that I can study?
s
Vox file is the native file format exported by Magica Voxel, a voxel editing tool
Magica Voxel has a scene editor
So vox files support a list of instances that have transforms (3x4 matrices, representing translation and rotation). In RTX there are two kinds of acceleration structures, top level and bottom level, that are used to speed up ray tracing. The top-level acceleration structure input looks very similar to a Magica Voxel scene: instances of bottom-level acceleration structures with transforms
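For reference, here's a minimal sketch (in Python rather than the project's HLSL) of reading the basic chunk layout of a .vox file. Chunk ids and field layouts follow the published MagicaVoxel format; the scene chunks (nTRN/nGRP/nSHP) that carry the per-instance transforms discussed above are omitted for brevity:

```python
import struct

def chunk(cid, content, children=b''):
    # A .vox chunk: 4-byte id, int32 content size, int32 children size, bytes.
    return cid + struct.pack('<ii', len(content), len(children)) + content + children

def parse_vox(data):
    # Walk the chunk stream, collecting (size, voxels) per model.
    assert data[:4] == b'VOX ', 'not a vox file'
    models, size, pos = [], None, 8  # skip magic + version int32
    while pos < len(data):
        cid = data[pos:pos + 4]
        content_len, _children_len = struct.unpack_from('<ii', data, pos + 4)
        body = data[pos + 12:pos + 12 + content_len]
        if cid == b'SIZE':                      # model dimensions (x, y, z)
            size = struct.unpack('<iii', body)
        elif cid == b'XYZI':                    # voxels: (x, y, z, colorIndex)
            n, = struct.unpack_from('<i', body)
            models.append((size, [struct.unpack_from('<4B', body, 4 + 4 * i)
                                  for i in range(n)]))
        pos += 12 + content_len                 # MAIN has no content, so this
    return models                               # steps straight into its children

# Build a tiny one-voxel file in memory and parse it back.
vox_bytes = (b'VOX ' + struct.pack('<i', 150) +
             chunk(b'MAIN', b'',
                   chunk(b'SIZE', struct.pack('<iii', 2, 2, 2)) +
                   chunk(b'XYZI', struct.pack('<i', 1) + bytes([0, 0, 0, 1]))))
models = parse_vox(vox_bytes)
```

A real importer would then feed each model's voxels into a BLAS build (e.g. as AABBs for an intersection shader) and the scene transforms into the TLAS instance descriptions.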
Any tool that uses RTX needs a way to define acceleration structures, could be an editor or import from a file (gltf can also define scenes and works for the "normal" RTX use case of triangle raytracing)
But I could allow for scripting or pick some arbitrary restriction for a hardcoded AS as well, might be interesting to always build a regular grid structure or use a UAV and define the AS in a special compute shader
d
Thanks Scott. I don't have a Nvidia RTX graphics card, but I would like to eventually do something related to ray tracing using acceleration structures. Hopefully using portable code that runs in WebGPU.
s
WebGPU won't support ray shaders
maybe WebGPU 2 or 3 in 2030 🙂
but compute shaders will be an option and relatively fully featured
d
Actually I've been doing shadertoy stuff using sphere tracing. Nothing photorealistic that requires ray shaders. I think that acceleration structures will speed up sphere tracing for complex scenes.
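The core sphere-tracing loop is only a few lines. A hedged Python sketch with a single analytic sphere SDF and no acceleration structure (in a shader this would run per pixel):

```python
import math

def sphere_sdf(p):
    # Signed distance from p to a unit sphere centered at (0, 0, 5).
    dx, dy, dz = p[0], p[1], p[2] - 5.0
    return math.sqrt(dx * dx + dy * dy + dz * dz) - 1.0

def sphere_trace(origin, direction, sdf, max_steps=64, eps=1e-4, max_t=100.0):
    # Step along the ray by the SDF value: the largest step guaranteed not
    # to overshoot the nearest surface. Returns hit distance t, or None.
    t = 0.0
    for _ in range(max_steps):
        p = (origin[0] + t * direction[0],
             origin[1] + t * direction[1],
             origin[2] + t * direction[2])
        d = sdf(p)
        if d < eps:
            return t
        t += d
        if t > max_t:
            return None
    return None

hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf)
```

The cost is dominated by SDF evaluations along each ray, which is why bounding structures that skip empty space help so much in complex scenes.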
s
They do. IQ wrote an article about it (not using RTX) https://www.iquilezles.org/www/articles/sdfbounding/sdfbounding.htm
The DXR intersection shader sample has some raymarched shaders
d
@Scott Anderson It's going to take a while to reverse engineer IQ's code from his blog post, but that is indeed what I want. Another general approach I'm considering follows from Raph Levien's architecture for GPU-accelerated 2D graphics, applied to 3D. It's a 2-stage compute pipeline. The first stage partitions the viewport into 16x16 pixel tiles, and for each tile, intersects all of the primitives in the scene against that tile, and constructs a command list containing only those primitives. The second stage renders the command list for each tile. • https://raphlinus.github.io/rust/graphics/gpu/2019/05/08/modern-2d.html

https://www.youtube.com/watch?v=eqkAaplKBc4
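The binning stage of that two-stage pipeline can be sketched in a few lines of Python (circles as the only primitive, 16x16 tiles, conservative bounding-box overlap; names are hypothetical, just to show the shape of stage 1):

```python
TILE = 16

def bin_primitives(prims, width, height):
    # Stage 1: for each 16x16 tile, build a command list of the primitives
    # whose bounding box overlaps it. prims are circles: (cx, cy, radius).
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    lists = [[] for _ in range(tiles_x * tiles_y)]
    for idx, (cx, cy, r) in enumerate(prims):
        x0 = max(0, int(cx - r) // TILE)            # tile range covered by
        x1 = min(tiles_x - 1, int(cx + r) // TILE)  # the circle's bbox
        y0 = max(0, int(cy - r) // TILE)
        y1 = min(tiles_y - 1, int(cy + r) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                lists[ty * tiles_x + tx].append(idx)
    return lists

# Two circles in a 32x32 viewport: one inside tile (0,0), one straddling
# the boundary between tiles (0,0) and (1,0).
tile_lists = bin_primitives([(8.0, 8.0, 4.0), (16.0, 8.0, 4.0)], 32, 32)
```

Stage 2 would then launch one workgroup per tile that walks only that tile's list, which is where the win over brute-force per-pixel iteration comes from.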

s
Yeah that's also how modern game engines do lighting
In 3D
If you haven't seen Alex Evans talk about Dreams I suggest you check it out
c
Do you mean in general or a specific presentation he did?
s
A specific presentation
Learning from failure
d
There is some newer information about SDF rendering in this 2019 video, around the 20 minute mark. More details about their sparse voxel octree data structure. Plus, the Brick renderer that was declared abandoned in the 2015 talk is used in the production version of the game, for performance reasons.

https://www.youtube.com/watch?v=1Gce4l5orts

👍 1
s
Nice, hadn't seen that one
I'll check it out later