"On the usability of editable software" <https://f...
# thinking-together
j
"On the usability of editable software" https://flak.tedunangst.com/post/on-the-usability-of-editable-software https://lobste.rs/s/qkpwpa/on_usability_editable_software Lots of room for thinking about how language design affects the ability to customize software without the anticipation of the original developer. Eg emacs lisp allows redefining functions without having to fork the original library. Eg languages with private/public settings that are enforced by the compiler completely prevent that kind of reuse/rediting, forcing the user to fork which is a pretty heavy-weight operation.
💯 1
k
I don't follow why public/private prevent 'rediting' (nice coinage there). Isn't it just a simple edit to change 'private' to 'public'? Redefining functions that weren't really designed to be extended is just as much a reason to fork as anything else, because you often want to reuse some part of the body of the function. It becomes a game of Russian Roulette if you start updating the library without updating your copy of the redefined function. So the leap from editing copies of things to editing things in place isn't that big, I think. The key here is to make forking a lightweight operation. And version control systems have already done a lot of heavy lifting for us by refining the erstwhile-heavyweight operation of branching into something light and inexpensive. I suspect making forking lightweight is mostly just a switch in mindset. There are no chains here except those we place on ourselves.
j
rediting
Been a long day 😄
don't follow why public/private prevent 'rediting'
The distinction I was getting at is editing something "from the outside" vs having to fork. E.g. if there is a function in a library whose behavior I want to change in...
• ...Julia, then I define a new method of that function in my code
• ...Rust, then I search for the repo, check out the correct version as a git submodule, change the Cargo entry to point at my local repo, edit the code in the repo, then rebase my changes when I want to upgrade the library
The end result is the same - I've changed some logic and I have to maintain that diff. But in the latter case there is a lot more busywork involved. Making forking more lightweight would definitely help, but I think there is an additional point of friction in maintaining the diff as text vs maintaining the diff as a language mechanism. E.g. if the code gets moved around a bunch in the file then rebasing the diff is painful, whereas nothing has to change with the override. Also, version control and package managers don't play together very well at the moment. If the original version of the package gets updated, the Rust package manager won't even warn me about it, let alone help me rebase my changes. I think we're agreeing in principle - we both want forking to be a lightweight mechanism. You could come at this from either end - a version control system that understands the language and helps you manage forks, or a language with built-in mechanisms for composing code with changes. This kinda sounds like http://akkartik.name/post/wart-layers - did you continue working on that idea?
👍 1
I guess https://www.unisonweb.org/ has gone pretty far down this road - providing built-in tools for editing a function and updating all of its call sites to point to the new version.
k
I still use layers in my projects. But the idea with layers is to emphasize convenience and rely on the programmer to preserve composability-related properties. In other words, you can easily make changes to a function that create arbitrarily-difficult-to-debug holes for yourself. I rely on the people using layers to use them tastefully. I think this is an irreducible trade-off. If you want extending functions to always be nice and safe, you'll be restricted in the number of places the safe mechanism is available to you. In my current project I instead focus on catching any possible breakage in the project using some combination of tests, types and correct-by-construction design. This is the top priority, and I give up as much as possible while preserving it. Now others can modify functions all they want, safe in the knowledge that something will complain if they break something when forking a function.
I absolutely agree with this:
I search for the repo, check out the correct version as a git submodule, change the Cargo entry to point at my local repo, edit the code in the repo, then rebase my changes when I want to upgrade the library... there is a lot more busywork involved.
To me this is the lethal problem with packages: by making them easy to consume we make them harder to modify. And that seems like a hard, black-or-white, us-vs-them trade-off. And if that's right, if there's no place for compromise here, I prefer keeping it easy to modify. Even if that is a harder sell, even if it means most people will gravitate towards the competition. Same goes for binary distributions of anything. Always keep source along for the ride, have the binary rebuild automatically if the source ever changes. Anything else makes the world a worse place, IMO. (I've been feeling a lot more strongly about this in the past couple of months.)
❤️ 2
I ❤️ every single one of your comments on that Lobste.rs thread, @jamii. We are of absolutely one mind here. I responded to a few comments there, but I have nothing to add to yours.
❤️ 1
i
@jamii the system Josh and I are working on revolves entirely around a remixing model 🙂
Not sure if you remember, but this was one of the footnotes in my essay on modeling: https://www.chris-granger.com/2015/01/26/coding-is-not-the-new-literacy/#fn4
❤️ 1
s
@Kartik Agaram
I feel very cynical/fatalistic about Emacs/Lisp/Smalltalk lately. Yes, the core design choices exhibit a high trust for users. But then it seems to be inevitable that the layers on top start chipping away at this trust. Vim's package managers introduce hurdles for anyone who wants to modify sources. (See my recent war story.) How does the Emacs ecosystem compare? Is it really common for people to modify packages? Racket's raco feels the same: additional complexity between me and the libraries they want me to 'use'.
(Responding here since I don't have a lobsters account.) When I first started using emacs I had the same thought, since it's really hard to override a package with the default package manager, package.el. But there is actually a quite sane package manager that makes modifying packages trivial: straight.el. It's based on the nix/guix model, but it's even simpler: to edit a package, you just edit the files in the git checkout it creates, that's it. Maintaining your own branch is as simple as... making a git branch, and so on. Recently I had a problem when I upgraded my packages, and to switch to an older version, I just checked out an older version of the package from git, and the package manager handled rebuilding and everything automatically. It felt very nice. In a way, it's like the package manager uses git metadata as its database, so you don't need extra cruft on top.
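(For anyone curious what that looks like in practice, a rough sketch from memory; `some-package` is a placeholder and the exact paths may differ, so check the straight.el README.)

```elisp
;; Tell straight.el to clone and build a package. The recipe can point at a
;; personal fork/branch instead of the default upstream (names here are
;; placeholders).
(straight-use-package
 '(some-package :type git :host github :repo "me/some-package" :branch "my-tweaks"))

;; The clone is a plain git repo, roughly at:
;;   ~/.emacs.d/straight/repos/some-package/
;; Edit files there, commit or check out whatever branch/version you like,
;; and straight.el rebuilds the package from that checkout
;; (M-x straight-rebuild-package to force a rebuild).
```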
💡 3
k
https://github.com/raxod502/straight.el Wow, look at that Readme! Thanks for showing me this.
j
@ibdknox Are you guys still at rai?
Wow, straight.el does address a lot of the things we were complaining about.
k
@Kartik Agaram Easy to consume and easy to modify lead to almost opposite criteria for software architecture. I see that as the essence of Knuth’s reusable vs. re-editable.
i
@jamii nope! We stepped away at the beginning of the month. Going to try our own thing for a bit and take it from there. 🙂
🆒 3
r
This is another great comment from the Lobste.rs thread https://lobste.rs/s/qkpwpa/on_usability_editable_software#c_qxnh4i
User modifications in computer games are a lot more common than in other types of software, and are even possible in some big-budget proprietary software. I think this can offer some insight into how this can work for other software.
The most basic version is to move all content/assets (3D models, textures, sound files, etc.) into its own subfolder tree and give it clear, human-readable names. Users can then simply swap out files.
The next level is to abstract how the metadata for high-level objects that use these assets is stored out into a text-based format like JSON or XML, so that the assets can be reused and new high-level objects defined.
This can be extended to more and more engine content, including defining behaviours and simple functions in JSON/XML.
After text editors and terminals (which have the unfair advantage of explicitly targeting programmers, who are in the best position to do the editing), computer games might be the most widely-edited software there is? I did an analysis of the most popular creative apps across industries, and one of the patterns is extensibility (https://blog.robenkleene.com/2019/08/07/apples-app-stores-have-failed-creative-apps/). Now that I look for it I start seeing it everywhere. For example, as far as I can tell, this chain is possible: open an After Effects project, add a Cinema 4D 3D model to it via a plugin, then open a Houdini project in Cinema 4D via another plugin, then render it all via a third-party renderer like Redshift. Now I'm not sure if all of that would actually work, but it's fascinating how flexible the workflows for professional creative apps are, especially compared to consumer software. Like, compare that to just trying to get something like Apple Notes to talk to Excel...