<https://www.unrealengine.com/en-US/blog/a-first-l...
# of-graphics
i
I think it's interesting that these two new technologies (Nanite and Lumen) are designed with artist workflows in mind, taking manual or slow batch processes and making them automatic / realtime. Sure the results look impressive, but I like that it cuts both ways.
💯 1
c
Even though a lot of these features are already available in UE4 (although not Nanite and Lumen, which look fantastic), it's super impressive to see them all combined, running in realtime on soon-to-be commodity hardware.
👍 1
w
Getting the subsystems to play well together though...
n
I agree with a comment by someone in my Twitter feed: whilst the technology is impressive, this demo has absolutely no soul. Once upon a time, games were fertile ground for new kinds of experiences. Nowadays, though, all the big companies are building Call of Duty 27 or Tomb Raider 14, and this demo has that same vibe.
We can probably excuse this demo of course, because it's all about the tech. But I wouldn't call it an inspiring glimpse of the future...
w
@Nick Smith to me it seems that the market has simply grown enormous. The so-called AAAs definitely try to live in a sort of giant/dinosaur/blockbuster/spectacle niche, but there are many smaller productions pushing novel interaction ideas.
i
Tech that makes it easier for artists to go from zero to live in-engine assets, removing hurdles like baking maps or building LODs, should help indies just as much as AAAs. While it's true that most indie devs don't push the limits of graphics, there are plenty of cases where the indie world comes up with a novel way to make visuals that go toe-to-toe with the giants, and uses them in a meaningful way. No Man's Sky is an easy example, but I'd love to have seen what Memory of a Broken Dimension would have looked like with that many triangles available to warp and permute.
s
Some details on how Nanite and Lumen work:
Nanite is really interesting because it primarily uses software rasterization
Software rasterization on the GPU, for clarification
The criticism that the demo looks like a generic AAA game is kind of missing the point btw
Ivan's comments about this improving artists' workflows and iteration time are spot on; even at indie scale there could be massive time savings
Fortnite will be the first major customer for this tech, and love it or hate it, it doesn't have the look of a traditional AAA game (it's heavily stylized), and they continue to do weird social experiments in it
i
@Scott Anderson Anything you can share about how this demo is going over in the Unity team? Do you know if anyone there is feeling pressure to do your own GI / micropoly features in a conceptually similar way, or are you safe to pursue these goals from a totally different angle?
s
Can't say and I honestly don't know
People are discussing it, of course; many of the Unity graphics engineers have talked about it publicly
But so far it's the same as anyone else in the industry
It'll be interesting to see whether or not this approach gets widely adopted in the future
The GI solution isn't so out there, other engines have similar approaches, and it's not too far from Nvidia VXGI or RTX GI (it doesn't use RTX hardware though)
s
@Scott Anderson Do you think the new features are actually for AAA games? It seemed to me that game engines have been looking to expand their market over the last few years, and movie production (virtual sets) and architecture seemed like areas they were trying to explore more — the new features seem even more interesting for those use cases. Sure, games will benefit as well, but strategically it strikes me as an attempt to establish their presence in other markets.
🤔 1
@Scott Anderson Would you mind explaining what you mean by “software rasterization on the GPU”? I never really fully grasped the “software/hardware” distinction in graphics rendering — in the end there’s always hardware crunching the bits, but so far I thought “software” means it runs on the CPU and is not using any specific hardware features for acceleration (but then even that distinction was somewhat muddy when the first MMX extensions appeared in Intel Pentiums…).
I always liked graphics engine demos, and the Unreal Engine ones in particular — they hit a sweet spot for me: a glimpse at The Future™, but in a “no, really, that’s going to be the actual future a few months from now” kind of way. And realtime 3D graphics still seems to be the most intuitive visualization of raw computing power we have found so far.
s
In this case hardware means fixed-function hardware: a hardware designer built the rasterizer, it's part of the GPU hardware, and the software engineer has no (or very little) control over the rasterization algorithm
When you call DrawIndexedPrimitive/glDrawArrays in your graphics API, the vertex transforms and shading of those triangles are driven partly by software, but rasterization is fixed, apart from which primitive to draw (triangles, points, lines) and maybe some interpolation parameters or whether or not the rasterization is conservative
Software rasterization means the code for drawing triangles is written by a software engineer
And it runs in a compute shader
The compute shader has limited access to the fixed function graphics pipeline, it can sample textures but it mostly reads and writes to buffers and textures
Because GPUs are general purpose parallel processors there are a lot of interesting software rendering techniques possible that don't use fixed function hardware or use it sparingly
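To make that concrete, here's roughly the shape of a software rasterizer running on the GPU. It's a minimal sketch written as a CUDA kernel purely for illustration; Nanite's actual rasterizer isn't public, and a real one would add a depth test, clustering, and a visibility buffer. The `Tri` struct, the `rasterize` kernel, and the one-thread-per-triangle layout are all invented for the example:

```cuda
#include <cstdint>
#include <cuda_runtime.h>

// A screen-space triangle with a flat color. Illustrative only.
struct Tri { float2 a, b, c; uint32_t color; };

// Signed area test: >= 0 means point r lies on the inner side of the
// directed edge p->q for a counter-clockwise triangle.
__device__ float edge(float2 p, float2 q, float2 r) {
    return (q.x - p.x) * (r.y - p.y) - (q.y - p.y) * (r.x - p.x);
}

// One thread per triangle: walk the triangle's screen-space bounding
// box and test each pixel center against the three edge functions.
// This is the "rasterization algorithm written by a software engineer"
// part; nothing here touches the fixed-function rasterizer.
__global__ void rasterize(const Tri* tris, int numTris,
                          uint32_t* framebuffer, int width, int height) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numTris) return;
    Tri t = tris[i];

    // Clamp the bounding box to the framebuffer.
    int x0 = max(0, (int)floorf(fminf(t.a.x, fminf(t.b.x, t.c.x))));
    int y0 = max(0, (int)floorf(fminf(t.a.y, fminf(t.b.y, t.c.y))));
    int x1 = min(width - 1,  (int)ceilf(fmaxf(t.a.x, fmaxf(t.b.x, t.c.x))));
    int y1 = min(height - 1, (int)ceilf(fmaxf(t.a.y, fmaxf(t.b.y, t.c.y))));

    for (int y = y0; y <= y1; ++y)
        for (int x = x0; x <= x1; ++x) {
            float2 p = make_float2(x + 0.5f, y + 0.5f);
            if (edge(t.a, t.b, p) >= 0.0f &&
                edge(t.b, t.c, p) >= 0.0f &&
                edge(t.c, t.a, p) >= 0.0f)
                framebuffer[y * width + x] = t.color;  // no depth test, no AA
        }
}
```

The takeaway is that pixel coverage, traversal order, and depth handling all become ordinary code you can change, which is exactly the part the fixed-function pipeline keeps out of reach.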
s
Thanks, @Scott Anderson! I also came across this in the meantime: https://twitter.com/raphlinus/status/1261351524703170562?s=20
s
There was another thread talking about GPU programming where I hinted at this (I think it was a Jon Blow Twitter rant) lol
😂 1
But a high-level standard API came up as a solution and I mentioned that GPUs give you very few primitives that are actually accelerated (basically triangles)
So there is a lot of effort in getting fast 2D rasterization working on the GPU for font and UI rasterization
Like Slug and Pathfinder
👍 1
But the UE5 announcement might mean that a lot of 3D rasterization in games goes software as well
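For a flavor of the 2D case, here's a toy per-pixel polygon fill using the nonzero winding rule, again as a CUDA kernel just for illustration. This is emphatically not Slug's or Pathfinder's algorithm (they handle curves analytically and bin work into tiles); the `fillPolygon` kernel and its brute-force loop are invented to show that "2D rasterization as GPU software" is ordinary parallel code:

```cuda
#include <cstdint>
#include <cuda_runtime.h>

// One thread per pixel: compute the winding number of the pixel center
// with respect to a closed polygon (screen-space points), then fill
// with the nonzero rule. Curves, antialiasing, and tiling, the parts
// Slug and Pathfinder actually innovate on, are all omitted.
__global__ void fillPolygon(const float2* pts, int n,
                            uint8_t* coverage, int width, int height) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;
    float px = x + 0.5f, py = y + 0.5f;

    int winding = 0;
    for (int i = 0; i < n; ++i) {
        float2 a = pts[i], b = pts[(i + 1) % n];
        if ((a.y <= py) != (b.y <= py)) {            // edge crosses the scanline
            float t = (py - a.y) / (b.y - a.y);      // where it crosses
            if (a.x + t * (b.x - a.x) > px)          // crossing right of the pixel
                winding += (b.y > a.y) ? 1 : -1;
        }
    }
    coverage[y * width + x] = (winding != 0) ? 255 : 0;
}
```

Roughly speaking, the real glyph renderers replace that inner loop with analytic curve tests and bin paths into tiles so each pixel only examines nearby segments.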
n
Raph Levien (the tweet linked above) is doing amazing work on a super-efficient vector graphics renderer for 2D GUI apps. I'd definitely recommend everyone follow him on Twitter and at his blog: https://raphlinus.github.io/
He should be posting a huge update in the next week or two
s
Yeah he's been posting good stuff
s
Thanks for the explanation, @Scott Anderson, and the link to that article, @Nick Smith. I think I just had a bubble-burst revelation. I wasn’t aware that the divide between classic and mobile graphics is still so large, and my views were completely one-sided on tile-based deferred rendering (TBDR) and Apple’s Metal API — which is frankly all I know how to use.

It seems that this was an early bet for Apple to make in 2014 (well, that’s when they revealed the results of their decision, so they must’ve made it years before). For mobile it now seems very obvious to go the TBDR route, mostly to save energy, but it’s interesting to see how there’s now an angle that this architecture is potentially coming full circle back to desktops. That makes me even more excited for Macs with ARM-based custom processors in them.

This was another good article that was linked from the one above: https://c0de517e.blogspot.com/2017/08/tiled-hardware-speculations.html?m=1
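If it helps to picture the bandwidth/energy argument for tiling, here's a toy continuation of the CUDA sketches above (reusing the `Tri` struct and `edge()` helper): each thread block owns one 16×16 tile, composites in fast on-chip shared memory, and touches DRAM exactly once per pixel at the end. Real TBDR hardware builds per-tile primitive lists for you; the brute-force loop over every triangle and the `shadeTiles` name are invented to keep the sketch short:

```cuda
// Reuses Tri and edge() from the rasterizer sketch above.
// Launch with 16x16 thread blocks, one block per screen tile.
#define TILE 16

__global__ void shadeTiles(const Tri* tris, int numTris,
                           uint32_t* framebuffer, int width, int height) {
    __shared__ uint32_t tile[TILE * TILE];   // on-chip tile framebuffer

    int lx = threadIdx.x, ly = threadIdx.y;
    int x = blockIdx.x * TILE + lx;
    int y = blockIdx.y * TILE + ly;
    tile[ly * TILE + lx] = 0xff000000u;      // clear (opaque black, ARGB assumed)
    __syncthreads();

    // All overdraw lands in shared memory, never DRAM. A real TBDR GPU
    // would iterate only the triangles binned to this tile.
    float2 p = make_float2(x + 0.5f, y + 0.5f);
    for (int i = 0; i < numTris; ++i) {
        Tri t = tris[i];
        if (edge(t.a, t.b, p) >= 0.0f && edge(t.b, t.c, p) >= 0.0f &&
            edge(t.c, t.a, p) >= 0.0f)
            tile[ly * TILE + lx] = t.color;
    }
    __syncthreads();

    if (x < width && y < height)             // one DRAM write per pixel
        framebuffer[y * width + x] = tile[ly * TILE + lx];
}
```

Shared-memory traffic is far cheaper than DRAM traffic, which is the energy argument for tiling in a nutshell.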