I can’t help but find your approach of connecting UI/UX to how humans think very promising. I got really excited when I started reading the blog post, thinking, “could this possibly be going in the same direction I’m thinking about?”, but then you veer off into the nervous system and the freeze response. Which makes it even more exciting, as we probably have things to talk about…
I’m looking at categorization and research on idealized cognitive models to draw parallels to what people (want to) do with computers. Currently, programmers clearly favor a symbolic approach, which requires learning abstract categories and concepts that are hard for end users to pick up; they just don’t want to put in the time to learn all that. UIs use metaphors and other structures to make computers more accessible. I think there is tremendous potential in taking what we know about metaphorical structuring, metonymy, kinesthetic image schemas, etc. and seeing how we can take advantage of these in UI. Ideally, we’ll massively reduce the learning curve because the right metaphors “just click” with users, leveraging fundamental patterns of how we think.
I’ve started to write about this, hoping to get more technical people interested in the domain and to make the research more accessible to us programmers:
https://stefan-lesser.com/2019/12/06/structure-and-behavior/
I think we’re very similar in approach (leveraging neuroscience) and have just found different areas in that huge domain to focus on. Would love to compare notes and see how this all fits together.