# linking-together
s
> The complex processes we call “systems” present special challenges to our uneducated imaginations. We tend to think additively, and are constantly surprised when something that seems to be “just added in” causes surprising and often disastrous changes.
> One of the reasons the consequences were not imagined is that our human commonsense tends to think of “stability” as something static, whereas in systems it is a dynamic process that can be fragile to modest changes.
> In our world, we have enough power to topple our most important systems, but not the power to restore most of them.
> Being heroic in the face of disaster—as humans often are—will not help in most of these cases. This means that we have to “learn about consequences before they happen”. We have to be able to summon vivid enough imaginations of the disasters to be heroic long before they happen. And we have to educate our imaginations how to do this without introducing superstitions and paranoid delusions.
http://worrydream.com/EnlightenedImaginationForCitizens/
❤️ 4
Not quite sure where I picked this up — Bret didn't tweet it recently, and I couldn't find it posted here…
Every now and then we have a discussion about values here. This text has a strong Why that I am sure many here will align with.
j
I love that they use the term heroic. I see similar behaviors at the company where I work. People are very ready to go to extreme measures (long hours, risky changes, manual intervention) when the service is down, but not prepared to take extreme measures (deep refactoring, extensive testing, cleaning up tech debt) when the service is healthy. I think there’s an economic aspect to it: it’s (at least perceived as) cheaper to slow feature velocity briefly during an outage than to slow it all the time to prevent outages. But that’s not a very inspiring take (and might not be correct).
❤️ 1
w
Societies generally celebrate people who save the day when disaster strikes. We rarely see a monument for somebody who prevented a disaster from happening. Their actions may be many times more impactful, but the evidence, by its very nature, is absent.
☝️ 1
s
Not only is the value of those trying to prevent disasters not recognized, it’s often dismissed or treated with hostility. Few, in 2006, wanted to hear that home prices (increasing far faster than population or density) were unsustainable, and that government-backed debt was fueling a debt bubble that would inevitably implode. Likewise, few since have wanted to hear that the $12T in QE vastly increased (nearly tripled) global debt levels of underpriced risk, or what this would mean for the future.
👏 1
r
It seems nearly impossible to get from the computer systems we have now to the computer systems we dream of, and outright fantasy to reform education to the degree that’s needed. But if the media we use really do transform us, then maybe it’s not such a fantasy. I think we suffer from this lack of imagination to foresee disaster because it’s really hard and generally left to the experts; it takes a lot of specialized training. As a result, a lot of people are left to merely trust the experts, and that isn’t convincing for many of them. So one solution is to use computers so more people can be experts, or at least proficient enough to have that imagination. This solution seems fairly visible here. Another possible solution is that computers could impart in us powerful ideas about systems, e.g. about feedback and stable or unstable equilibria, which would let us imagine analogous systems in climate, for example. This solution is less visible here.
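The stable/unstable-equilibrium idea mentioned above fits in a few lines of code. Here is a minimal sketch (the function names and the gain value are my own choices, not from the thread): iterating x ← x + gain·x·(1−x) gives a system with an unstable equilibrium at 0 and a stable one at 1.

```python
# Hypothetical illustration of feedback and equilibria, not from the thread.
# The update rule x_{t+1} = x_t + gain * x_t * (1 - x_t) has two fixed points:
#   x = 0  (unstable: any tiny perturbation grows away from it)
#   x = 1  (stable: perturbations decay, feedback pushes x back toward 1)

def step(x, gain=0.5):
    """One update of the feedback loop."""
    return x + gain * x * (1 - x)

def simulate(x, steps=200, gain=0.5):
    """Iterate the loop from an initial state x and return the final state."""
    for _ in range(steps):
        x = step(x, gain)
    return x

print(simulate(0.0))   # exactly at the unstable equilibrium: stays at 0.0
print(simulate(0.01))  # a tiny nudge grows until it settles near 1.0
print(simulate(1.05))  # a nudge above 1 decays back toward 1.0
```

The punchline is the one Kay’s essay gestures at: “stability” here is not static. The system sitting at 0 looks just as still as the system at 1, but a modest change reveals that one of them is fragile.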
k
@Jared Windover @wolkenmachine back in 2014 I made an analogy to monkeys watching a storm slowly approaching: https://www.ribbonfarm.com/2014/02/12/consensual-hells