Thank you, this was a fun listen. I also really appreciate that you have dropped the disrespectful nicknames. Thank you.
When comparing the priorities of the authors of Unix versus the authors of ITS, I think it's worth remembering some of the technical differences between the hardware each system grew up on.
Unix grew up on a minicomputer, the PDP-11 (yes, it started life on a PDP-7, but the real growth into the operating system we would recognize today occurred on the PDP-11), where the kernel had to fit in 64K and each program had to fit in its own 64K (later models let you use 64K for code and a separate 64K for data). This environment naturally encourages you to prioritize performance and simplicity of implementation.
On the other hand, ITS grew up on a mainframe, the PDP-6 (later the PDP-10), which had a 36-bit word and 18-bit addressing, so a single address space could hold 256K words, substantially more memory than the PDP-11 allowed. It's much easier to put more complexity into your kernel in that environment.
As a result, I'm not convinced that the differences in prioritization were fundamentally the result of the two cultures in question. I suspect the different priorities arose at least partly from the technologies themselves.
With respect to the question of how well the Unix interface hides complexity, I would argue that many Unix tools provide a good, simple interface for utilizing some pretty deep complexity. In fact, I think the relevant comparison is not Lisp versus C, but Lisp versus sh. The ubiquitous data type in Lisp is the list, while the ubiquitous data type in sh is the file full of variable-length one-line records. Pipelines in sh fill the role of function composition in Lisp. The equivalent of C in ITS was the MIDAS assembler, which WAS really nice for an assembler. See https://wiki.c2.com/?SymbioticLanguages for some elaboration on where I'm coming from with this comparison.
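To make the pipelines-as-composition analogy concrete, here's a small made-up example: a word-frequency counter, where each pipeline stage plays the role of a function applied to the previous stage's result.

```shell
# Word-frequency counting as a pipeline: each stage transforms the
# stream of one-line records produced by the stage before it, much
# as nested function calls transform a list in Lisp.
printf 'the cat sat on the mat\n' |
  tr ' ' '\n' |   # one word per line
  sort |          # bring identical words together
  uniq -c |       # count each run of identical lines
  sort -rn        # most frequent word first
```

In Lisp you'd write this inside-out as nested calls; in sh you read it left to right, one transformation per stage.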
For example, make provides a relatively easy way to exploit a topological sort, without having to understand the implementation details or even what a topological sort is.
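A toy Makefile (the file names and commands here are hypothetical) makes the point: you declare only the edges of the dependency graph, and make derives a valid build order by topologically sorting them.

```makefile
# Hypothetical dependency graph; make topologically sorts it, so
# parser.o and main.o are always built before app is linked.
app: parser.o main.o
	cc -o app parser.o main.o

parser.o: parser.c parser.h
	cc -c parser.c

main.o: main.c parser.h
	cc -c main.c
```

The user never orders anything by hand; touching parser.h alone is enough for make to rebuild both objects and relink.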
The sort command lets you sort files substantially larger than RAM, handling all the work of breaking the input into chunks that fit in memory, sorting each chunk, storing intermediate results on disk, and merging the sorted chunks back together.
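A quick way to see this in action (the -S flag for capping sort's in-memory buffer is a GNU extension, so treat this as a sketch assuming GNU sort):

```shell
# A million lines in reverse order, then sort them with the
# in-memory buffer capped at 1 MB.  sort quietly spills sorted
# chunks to temporary files and merges them; the interface is
# identical to sorting a file that fits in RAM.
seq 1000000 -1 1 > big.txt
sort -n -S 1M big.txt > sorted.txt
sort -cn sorted.txt   # exits 0: the merged result is in order
```

The external merge sort underneath is completely invisible at the command line.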
The diff command provides a simple interface to a longest common subsequence computation over two sequences of lines.
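For instance (with made-up files), everything diff stays silent about is part of the longest common subsequence it found:

```shell
# old and new share the subsequence a, c, d; diff prints only the
# lines outside that common subsequence: a change hunk for b -> x
# and an append hunk for y.
printf 'a\nb\nc\nd\n'    > old.txt
printf 'a\nx\nc\nd\ny\n' > new.txt
diff old.txt new.txt
```

The user just asks "what changed?" and never sees the dynamic-programming machinery underneath.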
lex and yacc pack a lot of powerful computer science into a relatively straightforward interface for specifying tokens and grammars.
join, comm, awk, dc, bc, and many other tools that were already available in v7 Unix also present simple interfaces for making use of powerful code.
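As one small example from that family (file names invented), comm computes set intersection and differences over sorted files in a single pass:

```shell
# comm's three columns: lines only in the first file, lines only
# in the second, and lines in both.  -12 suppresses the first two
# columns, leaving just the intersection.
printf 'apple\nbanana\ncherry\n' > a.txt
printf 'banana\ncherry\ndate\n'  > b.txt
comm -12 a.txt b.txt   # prints: banana, cherry
```

Again the interface is just "two sorted files in, columns out"; the merge-style algorithm is hidden.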
Speaking as a huge Smalltalk and Lisp fan, as well as a huge Unix fan, I think the question of how Unix won extends far beyond Gabriel's analysis, though I appreciate the factors that he identified.