# thinking-together
o
I’ve been thinking about text recently — as many here often do 😛 Something that comes up a lot here is that we represent things with text, building atop it as a kind of “lower-level” representation. For this to work, there needs to be some degree of coordination and communication in how these lower-level representations should be interpreted. When digital text first emerged, it was standards organisations who made these coordination decisions, and they were able to convince or compel other groups to normalise around them. This had limits even back then — see the history of the USA, ASCII, and the non-English-speaking world for examples. The computing landscape today is radically different, and it seems unlikely that similar approaches would succeed. So my question is this: if we want to someday go “beyond text” as it currently exists, or build other low-level representations, is this even conceivable today without addressing that low-level coordination? As in: perhaps the only viable way forward is to enable the collective development of these systems of representation. This puts the challenge of (general-purpose, low-level) representation firmly into the space of distributed systems, socio-technical systems, digital governance, coordination, communication, etc. I’m looking to sharpen my thinking here; all thoughts welcome!
🤔 2
l
Redefine "language" as information with interaction, presentation, translation, and nesting, with other languages. Separate the specification from implementation/concept from substrate. Compress specification across languages. Result: Common mode of input/editing, unified while specialized data model, combined with [the rest], also portable, translatable, adaptable.
o
@Leonard Pauli I’m not sure I understand what you’re getting at there or how it applies to what I wrote above, could you elaborate?
g
I firmly believe that we need to go beyond text. At least to using rectangles, ellipses, lines, and text (i.e. minimalist SVG).
- Rhetorical Q: How did Galileo get us to change from Ptolemaic cosmology to Copernican?
- Rhetorical Q: How did we make the switch from assembler to C and Pascal? Assembler programmers hated HLLs and gave all sorts of reasons not to use them.
- Rhetorical Q: How did we (normal people, not programmers) switch to using spreadsheets instead of ???

1. I think the “new” way has to be an order of magnitude “better” than the old way.
2. I think that, in the interim, one has to incorporate the “old” into the “new”. Somehow. To appease the “assembler programmers”. (IMO, today’s “assembler mindset” is (1) the belief that there is “one language to rule them all”, (2) “everything must be synchronous”, (3) CALL/RETURN.)
3. Maybe we need to ignore the current crop of “programmers” and target a different kind of programming?

On the technical front, we need to overcome the fear of compiling non-text into running programs (I have example WIPs of compiling draw.io diagrams; technology isn’t the biggest issue; see the sketch below). We need to overcome the mindset of using ASCII art instead of drawing boxes (“{...}” == box). Programming is hitting an asymptote caused by in-the-box thinking. There are things we could express if we didn’t limit ourselves to text. Rhetorical Q: why do CEOs draw things out on whiteboards? Why can’t we compile whiteboards? “a = b+c” is well covered by textual programming. DaS (mini-VPL) ain’t “a = b+c”. Rhetorical Q: what “tells” are there that current programming is not good enough? The accepted mindset that “multitasking is complicated” is a tell...
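To make “compiling non-text” a little more concrete, here is a minimal sketch in Python. The diagram format, box names, and component functions are all hypothetical stand-ins (this is not my actual draw.io WIP): boxes become components, arrows become dataflow wires, and “compiling” is just ordering the boxes and pumping each box’s output along its lines.

```python
from collections import defaultdict, deque

# A whiteboard-ish "diagram": rectangles are components, lines are dataflows.
# (Hypothetical toy representation; a real version would parse e.g. draw.io XML.)
BOXES = ["read", "parse", "emit"]
ARROWS = [("read", "parse"), ("parse", "emit")]  # (from_box, to_box)

# Each box label maps to an ordinary function; the diagram IS the program.
COMPONENTS = {
    "read":  lambda _: "a b c",
    "parse": lambda text: text.split(),
    "emit":  lambda words: print(words),
}

def toposort(boxes, arrows):
    """Order boxes so every arrow points forward (no feedback loops here)."""
    indeg = {b: 0 for b in boxes}
    succ = defaultdict(list)
    for src, dst in arrows:
        succ[src].append(dst)
        indeg[dst] += 1
    queue = deque(b for b, d in indeg.items() if d == 0)
    order = []
    while queue:
        b = queue.popleft()
        order.append(b)
        for nxt in succ[b]:
            indeg[nxt] -= 1
            if indeg[nxt] == 0:
                queue.append(nxt)
    return order

def run(boxes, arrows):
    """Interpret the diagram: feed each box's output along its outgoing lines."""
    inbox = {b: None for b in boxes}
    for b in toposort(boxes, arrows):
        out = COMPONENTS[b](inbox[b])
        for src, dst in arrows:
            if src == b:
                inbox[dst] = out

run(BOXES, ARROWS)  # prints ['a', 'b', 'c']
```

The execution model is the interesting part, not the syntax: swap the hard-coded BOXES/ARROWS for a parser over a real diagram file and nothing downstream changes.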
y
The idea of magnitude / the gradient of adoption and the question of the whiteboard could be really interesting if thought of in tandem. What is the gradient between modern IDE programming and Wacom programming? How does one measure the impact of that evolution? What tools could exist in the modern IDE programmer’s workflow, somewhere along that gradient, that they would actually adopt? My mind instinctively goes to the classic CLI commands. What could be built where a programmer’s desk includes a keyboard, a mouse, and a Wacom tablet?
k
Formal standardization, as with ASCII, Unicode, or PNG (to name something non-text), happens when there are many small players but no dominating one. De-facto standards originate with dominant players who find themselves at the center of an ecosystem that is beneficial for them. The worst environment for emergent common representations is probably the digital feudalism that we see in today's platforms. Another dimension is formal vs. informal representations. Formal representations encourage formal standards, informal representations encourage informal standards with many variants. Markdown is probably the best illustration of the latter. My guess is that this is the preferred mode of converging on representations for today's Open Source communities. It's well aligned with the spirit of forking and adapting existing things for your own needs.
o
> The worst environment for emergent common representations is probably the digital feudalism that we see in today’s platforms.
Yes, I’d be surprised to see these kinds of things emerge under the current environment; I’d assume changes to the dominant model of tech development would correlate with new (hopefully better) common representations emerging. Going beyond the Silicon Valley style, the platform model, etc. becomes necessarily political, so I’ll avoid that tangent in this thread. Your point makes me think about the ways we can spur this kind of change. I’ve been trying recently to articulate the difference between the Smalltalk-style theory of change, a kind of “build a system based on vision/principles and show the world it’s a good idea” approach, and the approach of systems that eat away at the old, culminating in a similar kind of transition through different means. This was recently articulated far, far better than my attempts by Stephen Kell in a talk (slides). He notes that the ‘Smalltalk school’ has leaned towards a ‘design and replace’ approach, whereas Unix was able to “infect” its host system and eventually became so dominant that it became the host itself. He advocates the latter as the more viable approach in the modern day, and I agree. In this community there are differing theories of change, and it’s always interesting to see how people view these things. Personally, I think the “design and replace” approach isn’t viable (at least when it comes to systemic shifts in how we represent things).
k
"Design and replace" was reasonable in the 1970s, but it no longer is today, except for well-defined niche use cases. We have a huge ecosystem of information processing systems that new technology has to interoperate with. We can try to orient the evolution of that ecosystem, but we cannot decide to replace it. Stephen's approach works along these lines, but there is room for many different ones.
💯 1
w
Hot take: the first and lowest bar is adversarial interoperability / competitive compatibility. In flowery language, combat digital feudalism by tearing down the walls of the lords against their will. No technical niceties matter if making stuff work breaks the law.
2
o
@wtaysom fully agree here. Interoperability is ultimately a political issue, not a technical one.