# thinking-together
k
An interesting Twitter thread on the preservation of digital art, including commentary on how it hurts that digital artists can't fully manage the media they use: https://twitter.com/Polackio/status/1223303212775739392
❤️ 1
k
I'm not sure artists can solve this by themselves. Monet made his brushes in a certain cultural milieu, and for software we're just not there. If the goal is centuries, nothing we have is likely to help. When it comes to software, we need to rely on future archeologists. And this sucks. By coincidence, I was trying to run my copy of LightBot today and it no longer works on my phone. (I've moved phones a couple of times since I last used it. Each new phone helpfully transfers all my apps over, but it doesn't ensure they run, and that's really the most important part, as Seinfeld would say.)
😆 1
i
The about-to-be-released episode of the podcast talks about this exact subject in detail.
Neat to see it resonating with folks here. I was a little worried this wouldn't be the right audience for this subject. Frankly, I'm not bothered by artworks being short-lived. A lot of art is an implicit or subtextual commentary on impermanence, and I think one of the qualities that makes high art gauche is the insistence on its lasting value to society. I like my art the way I like my comedy — extremely of the moment and brutally uninteresting after the moment has passed.
🖕 1
k
I'm a digital packrat. But yes, preservation shouldn't be the responsibility of artists. I wager even Monet when he made his brushes wasn't thinking, "which one will last longer?"
i
Well, the brushes aren't the work. But in any case, I agree that artists shouldn't need to concern themselves with this problem. The technology should be built to last, not built with a 3 year expected lifecycle. That's so wasteful, and narrow. Now, on the other hand, look at what digital technology has done for recorded music and video — flawless eternal preservation is on the table for the first time ever, and it's not an unequivocally good thing either.
💡 1
In the case of music, video, and similar purely-data media, though, at least the default of perfect preservation gives us the choice about whether to archive and curate the work. It is tragic when there's work that we can't archive. Though... this is such a big topic. What does it mean to archive an artwork divorced of its cultural context?
k
Without having thought about this much: archival by definition seems to be about preserving things divorced from their cultural context. Sending things to the future so they can be seen in new contexts.
i
Or letting those things be the mooring to past culture.
k
Herewith, my piece of purely-data, short-lived art, entitled _"The moon is WHERE right now?!"_
👏 1
❤️ 1
i
Consider a piece like this: http://www.simonweckert.com/googlemapshacks.html
You can't archive it, because the outcome of the art isn't a physical object; it's a series of actions in the world, depending heavily on the current state of the world. It's a performance. You can't typically archive performances, but you can document them in a form that can be archived (in this case: photos, video, a writeup). Perhaps we should think of the artworks that the archivist in the linked Twitter thread is bemoaning as performances, rather than as artifacts. If an exhibited piece requires a defunct web service, it was a software performance, not a software artwork.
There might be a desire to preserve the software, and the needed hardware, and to emulate the web service. At that point, what you are doing is attempting to preserve the materials needed to stage another performance of the piece. Those materials aren't themselves the piece; they're the script, costumes, props, etc.
That's why, I think, the best message to take from the thread is this one: https://twitter.com/Polackio/status/1223311516772052992 Some of these software art pieces need to be treated like theatre, not like painting or sculpture.
👍 1
s
Dependencies. If you create your own brush, a physical tool that doesn't depend on anything other than the atoms it's made of (and okay, maybe some laws of physics), you can be pretty sure it'll last, because it doesn't depend on anything else that's constantly changing. Same for writing on a piece of paper, or paint on a canvas. There might be some slow chemical processes happening to your substrate and ink over time that make it harder to read or even make it disappear. But that's not even close to what it's like with digital artifacts.

We can't create digital artifacts without tons of dependencies: hardware, OS, file formats, libraries, frameworks, languages, environments. Even the often-praised web standards that should help solve this problem apparently don't (the first example in the thread is about web browsers). The same applies to data, not just executables. Yes, we can drag piles of bits into the future with us, as long as the decoding and parsing libraries keep working and are constantly adapted and recompiled for the current OSes and hardware architectures. And even then, some bits might flip due to physics; if your stack of checksumming layers of hardware and software doesn't account for that, then that's it.

Should we consider all these dependencies as just props? Or should we be a little more concerned about which libraries we link to when we publish something, and how many parts of the runtime environment we can take for granted?
💯 2
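To make that bit-flip point concrete, here is a minimal sketch (mine, not from the thread) of one layer in such a checksumming stack. A CRC-32 can tell you that bits flipped somewhere in a blob; it can't tell you which ones, or put them back. That takes redundancy or error-correcting codes on top.

```c
#include <inttypes.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* One layer of a checksumming stack: CRC-32 (the zip/png polynomial),
 * computed bit by bit. It detects flipped bits in archived data;
 * repairing them is a separate problem. */
static uint32_t crc32(const uint8_t *data, size_t len) {
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int bit = 0; bit < 8; bit++)
            crc = (crc >> 1) ^ (0xEDB88320u & (0u - (crc & 1u)));
    }
    return ~crc;
}

int main(void) {
    const uint8_t blob[] = "some archived bits";
    /* Store this value alongside the blob; a mismatch on a later
     * read means the physics won. */
    printf("%08" PRIx32 "\n", crc32(blob, sizeof blob - 1));
    return 0;
}
```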
r
Is this really a problem? Wasn't there a thread going around about someone who opened a website they made 20 years ago with just HTML/CSS/JavaScript, and it worked perfectly? Wouldn't the same be true for png/jpg/mov/wav/aif/mp3/pdf, etc.? I understand this is a problem if you're using certain technologies, but I can't help but consider this a self-inflicted wound. People choose different mediums because those mediums offer advantages that the more preservable mediums lack. The problem isn't that technologies capable of preserving their work don't exist; the problem is that people don't value the preservation of their work enough. Why not? Who knows. But if they did, they'd choose a different medium.
💯 1
s
Part of the problem is that many people who are using technology as a medium might not know how to use it in a way that lets their artifacts be preserved. Many people start learning web technologies because that is supposed to solve the problem of being tied to a particular platform, but then they only test in the latest version of Chrome, keep adding JavaScript libraries, and shoot themselves in the foot without realizing it.
👍 1
r
I agree 100%, but that means it's a people problem, not a technology problem, and people problems are harder to solve. You can't just make people have different values, and it takes Herculean effort to "sell" people on values. It seems to actually be almost impossible (e.g., see the FSF). I think that would be an interesting topic in and of itself: when have new software values been successfully sold to people? The most successful example I can think of is Apple and privacy; the only other one that comes to mind is Free Software, and open source in general, which has been far less successful (outside of developer technologies). Are there others?
k
@robenkleene That seems uncharitable. We're talking about artists who aren't tech-savvy. They shouldn't need to be! I'd much rather blame the marketing customs of our society, which focus excessively on how easy it is to get something working and not enough on the externalities and responsibilities it involves. "Side effects may include: tearing your hair out, security vulnerabilities, getting pwned by script kiddies in Kazakhstan, identity theft, loss of credit rating, and death."
👍 3
@Stefan The paper I just wrote basically argues that minimizing dependencies is the number one responsibility of professional programmers.
👍 1
💯 1
❤️ 1
r
@Kartik Agaram Agreed, that would be great, but it still just shifts the blame to a different group of people, which still makes it a people problem. And unfortunately, while technology problems have solutions, people problems have a much worse record of being successfully solved... I would love to be convinced otherwise, though. Are there examples of groups that have been convinced to be more responsible, even against their own interests, because it's for the greater good? (Heh, feels like I'm describing all of the world's problems right there...)
k
Well, complaining about people problems is also a people thing to do 🙂 Sometimes it even has an effect.
🍰 2
i
Bit rot and digital encodings of audio/video are not concerns on the same scale as the dependencies of executable software. It's trivially easy to archive digital music and video; that's a well-understood, well-addressed problem. Preserving executable software might be impossible for the same reason that preserving the exact performance of an actor in a play is impossible. We can't have our "computing is a process" cake and eat it as "an object we can archive" too. See, for instance, the CRT example in the thread. The performance of an actor is an execution of the play by the human performer. The performance of a software work is an execution of the code by the machine performer. The capability of the performer, in both cases, comes as a result of their unknowable complexity.
See also: the trouble creating cycle-accurate game emulators, and how the speedrunning community continues to limp along on original hardware as much as possible. Those classic games had zero dependencies other than a very tightly bounded hardware spec, and even when open source they are difficult to preserve. You can port DOOM to run on your toaster, but it's not the same as playing it on a 486 with a CRT.
k
I certainly agree that bit rot in 'pure data' and in executable software operates on different scales, but perhaps not in the same way as you do. Data always requires a reader, and there are encodings to be understood. This can be simple, but compression complicates things. The simpler part of the state space degrades over a longer time, and so I actually consider it the harder problem: things will get imperceptibly worse, and we won't notice until it's too late. This is how you get Linear B. Executable software is like the canary in the coal mine. Solving for it is like a gateway drug that puts us in much better shape to solve the harder problem.
i
That's an excellent point. Though I'm pretty sure digital audio and video are archived as raw samples / frames. There's as much encoding trickery there as there is on a vinyl record.
👍 1
s
@Ivan Reese Can you help me understand why you consider digital data encoding and executable software to be on different levels? Making data encoding work reliably across different platforms isn't something I'd describe as "trivial", unless we're only looking at i386-compatible architectures. You'd also need to make sure you agree on a standard format, and know how to detect it if all you have is a blob of bits. And then there are file formats that keep evolving, not unlike executable software (e.g., the Microsoft Office data formats over the years).

For executable software, emulation seems to work quite well. It usually takes a few years until emulators and the underlying technology are able to recreate an older platform reliably, including the option to run at original speed. Emulation might not be a perfect solution, but it isn't impossible either.

Maybe the framing as artwork plays a role here. I understand that, for instance, an artist might only consider their work properly recreated if exactly the same CRTs are used. That I can see as performance, and there I'd agree that it shouldn't (can't?) be about recreating performances. But I was immediately thinking in broader terms, e.g. old games that don't get updated anymore. I see tremendous value in trying to preserve these executables and still be able to run them, even if the experience changes over time as the original hardware becomes unavailable and potentially different interaction mechanisms have to be used. It might not be exactly the same thing anymore, but for me that is far from pointless to preserve anyway. And the way we write software currently is not just suboptimal but downright hostile towards preservation.
k
I've been chatting on this thread on Twitter, and Jon Ippolito pointed me at a very interesting paper: http://www.digitalhumanities.org/dhq/vol/12/2/000379/000379.html
i
@Stefan Your last comment is a gradient of feelings for me, starting from "completely skeptical" at the top and progressing to "completely on board" by the bottom :) To the best of my knowledge, digital audio when encoded as uncompressed linear PCM WAV (for example) is nothing more than a series of numbers representing how far to displace a transducer. It's simpler than even ASCII text, since you don't need a reference against which to compare the numeric values — they're just a measurement of physical displacement, like a seismograph. Sure, you need to know the number of bits per sample and the byte order, but that's likely the absolute easiest thing in the world to reverse engineer if needed. Audio is so profoundly simple. You can encode it in complex ways, but I don't believe archivists do that.
❤️ 1
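To make the simplicity concrete, here's a minimal sketch (mine, not from the chat) of a decoder for exactly that case: uncompressed 16-bit little-endian PCM, the payload of a plain WAV file. The only assumptions are the bits per sample and the byte order Ivan mentions.

```c
#include <stdint.h>
#include <stdio.h>

/* Decode raw 16-bit little-endian PCM from stdin: every two bytes
 * form one signed sample, a displacement to drive the transducer.
 * A real WAV file just prefixes a small header (sample rate, channel
 * count, bits per sample) before a blob of exactly these samples. */
int main(void) {
    uint8_t b[2];
    while (fread(b, 1, 2, stdin) == 2) {
        int16_t sample = (int16_t)(b[0] | (b[1] << 8));
        printf("%d\n", sample); /* displacement, -32768 .. 32767 */
    }
    return 0;
}
```

Pipe the data chunk of a 16-bit mono WAV through it and the waveform comes out as plain numbers, which is about as reverse-engineerable as digital formats get.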
@Kartik Agaram
🙃 1
Oh wait, turns out it's just an error in the paper. http://brandon.guggenheim.org is live.
💡 1
k
@Ivan Reese @Stefan The difference between audio and executables is gradual, not fundamental, but it's not small either. One way to quantify it is by the Kolmogorov complexity of the software needed to interpret the data. Pragmatically, approximate the Kolmogorov complexity by the number of lines in a good C implementation. The audio decoder for a WAV file is a lot shorter than an x86 emulator with a minimal operating system.
👍 2
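For a rough sense of that scale (approximate numbers, mine): the PCM sketch above decodes the audio payload in about a dozen lines, and a complete linear-PCM WAV reader fits comfortably under a hundred lines of C, while x86 emulators like Bochs or QEMU run to hundreds of thousands of lines before you've even added a minimal operating system. By that proxy, the gap is several orders of magnitude.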