Dependencies.
If you create your own brush, a physical tool that doesn’t depend on anything other than the atoms it’s made of (and ok, maybe some laws of physics), you can be pretty sure it’ll last because it doesn’t depend on anything else that’s constantly changing.
Same for writing on a piece of paper or paint on a canvas. There might be some slow chemical processes happening to your substrate and ink over time that make your work harder to read, or even make it disappear. But that’s not even close to what it’s like with digital artifacts.
We can’t create digital artifacts without tons of dependencies. Hardware, OS, file formats, libraries, frameworks, languages, environments — even the often-praised web standards that should help solve this problem apparently don’t (the first example in the thread is about web browsers).
The same applies to data, not just executables. Yes, we can drag piles of bits into the future with us, as long as the decoding and parsing libraries keep working and are constantly adapted and recompiled for the current OSes and hardware architectures. And even then, some bits might flip due to physics, and if your stack of checksumming layers in hardware and software doesn’t account for that, then that’s it.
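The integrity check that last point relies on can be sketched in a few lines (a minimal illustration using Python’s standard hashlib; the data and variable names are made up):

```python
import hashlib

def checksum(data: bytes) -> str:
    """Return the hex SHA-256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# Pretend this is an artifact we want to carry into the future,
# with its digest stored alongside it.
original = b"some precious digital artifact"
stored_digest = checksum(original)

# Simulate a single bit flip caused by physics.
corrupted = bytearray(original)
corrupted[0] ^= 0b00000001
corrupted = bytes(corrupted)

# The check catches the flip -- but only if some layer actually runs it.
print(checksum(original) == stored_digest)   # True
print(checksum(corrupted) == stored_digest)  # False
```

The point stands either way: the digest itself is just more bits, and verifying it depends on yet another library that has to keep working.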
Should we consider all these dependencies as just props, or should we be a little more concerned about which libraries we link to when we publish something, and how many parts of the runtime environment we take for granted?