# thinking-together
m
What do you think about this?
> Despite all the effort that has gone into it, it doesn’t look like programming language design has any real compounding power. There are better and worse languages, but other market and technical forces can swamp language choices.
https://twitter.com/ID_AA_Carmack/status/1689297553408315392
k
That's true, but not only for languages. Network effects in particular are often dominant for the adoption of technology. This reminds me of a discussion on Mastodon about Git being awful (from a UI perspective) but difficult to improve on because of network effects: https://mastodon.social/@gvwilson/110859585196043772
d
Meh, if you have a job to do and 90%+ of the work has been done "good enough" in some language you'll probably use it. Ecosystem and developer availability trump most things.
c
git was awesome from day one, and that's why it won. Hard ≠ hostile: FP is hard too, and people don't say it hates devs. On langs, I like this reply https://twitter.com/0xmmo/status/1689330851694391296
w
> Network effects in particular are often dominant for the adoption of technology.
My language choice these days is almost 100% dictated by the community contributed library ecosystem. That said, Carmack was talking specifically about languages themselves, so I think I agree with him
a
What precisely is meant by "compounding power"? Exponential as a function of what? Anyway, it's important to remember the staggering levels of programming power that have become our baseline expectation. We take higher order functions for granted, forgetting that people used to program in assembly, to say nothing of plugboards. So, "compounding"? I don't know. And it's not impossible that we've picked all the order-of-magnitude fruits. But to say that there are no order of magnitude gains in language tech at all ignores history.
i
> But to say that there are no order of magnitude gains in language tech at all ignores history.
I hadn't thought about it on a long time scale, but that's a good point. It also made me think even if "better" languages are developed and not widely used, they can still have a big impact since they can act as proof of concept and good ideas from them can make their way into more popular languages.
d
exactly, and every switch to a better language requires extra effort before it pays off
j
Yeah, will just echo that it’s a phenomenon common across tech (e.g. Windows being the most common OS) and that to be fair most popular languages have been evolving and adopting stuff from PL research, it’s just that the pipeline is something like Theory -> Research implementation (Haskell language extension, SRFI, or a whole language like Koka) -> Adopted by a more mainstream language -> Added to Java (10 years after this last step)
The most obvious case being garbage collection, but even strong typing + type inference seems to be going through that, with things like TypeScript, type annotations + Python type-checkers, Sorbet (Ruby), Concepts in C++, Rust, Dart, etc.
s
I agree that reuse is the real multiplier. How many language or framework projects are genuinely focused on reuse? How many writers of these tools ask which bits of code get rewritten the most, and why?
b
Git CLI is hostile, but it has so many survivors that discounting it as merely "survivor bias" misses something 😄 Yes, it under-invested in simple polish (e.g. consistent flags), and it's very much a case study in worse-is-better and network effects, but I feel more is going on. The plumbing/porcelain boundaries are fuzzy, and it pretty much forces you to grok the messy reality of History Space being a graph, and to learn to formulate your actions in terms of that reality. In so doing, git arguably gave the survivors more power than a user-friendly tool could(?) It's what https://www.ribbonfarm.com/2022/02/10/tools/ calls a "physics-friendly tool". (Or maybe I just want to believe that, as a survivor with significant sunk cost... I do hope that smooth learning curves to equal power are possible, but to me Git stands out as a success story of the opposite philosophy.)
Don't remember if it was on the FoC podcast or somewhere else that somebody complained how unfriendly `git rebase -i` is, compared to some hypothetical direct-manipulation UI, and I thought to myself "what do you mean, `rebase -i` *is* direct manipulation on the conceptual level I care for".
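For readers who haven't used it: `git rebase -i` opens the planned slice of history as an editable text file, so reordering or combining commits is literally editing a list of graph nodes. A sketch of what that buffer looks like (hashes and commit messages invented for illustration; the commands `pick`, `squash`, `reword`, `drop` are stock git):

```
pick   3f1a2bc Add config parser
squash 9d4e5f0 Fix parser typo      # meld this commit into the one above
reword 7c8b9a1 Update install docs  # keep the change, edit its message
drop   1e2f3a4 Debug logging        # delete this node from history
```

Moving a line moves the commit; deleting a line drops it. That is the sense in which `rebase -i` is direct manipulation of the DAG rather than of pixels.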
On languages, it may depend on how narrowly you define "language design". Some examples where I see large variances in:
- Support for introspection & live coding.
- How much the language lends itself to internal DSLs.
- Crisp vs. fuzzy definition of the language: can YOU tweak C++ syntax/semantics in your particular project? Well, `#define`, but nowhere near as much as FORTH or Racket...
- Community & inclusivity around language evolution.
  - Python-ideas & the PEP process is, to this day, more accessible than the C standards committee [I see why "C is done" and evolution is pretty much a non-goal; not judging, just pointing out the difference].
  - GHC's goal to be a "research compiler platform", by definition wanting to accept experimental extensions, is a curious example.
  - (As Don & João said, every early-stage language feels different in this aspect from large long-lived ones, but there are elements of "community design".)
- Prerequisite knowledge. Some languages require you to learn manual memory management, or some category theory, etc. There are pros & cons, but surely a user automating their email flow should not need either of them :-) If you see end-user programming as "the great unsolved problem of computer science", then language design has real simplifying power.
- Run-time interoperability with other languages.
(The observant reader will notice that all the above axes can be brute-forced by professional tech-giant teams. If you view, say, maintaining your own compiler+debugger+IDE as a "simple matter of budgeting", then all languages become more similar 😎)
d
(add onto that training, package repository, and governance body and you can see why even big tech companies usually only have 1-2 home-built languages)
b
I didn't mean home-built from scratch so much as breaking the boundaries of a given existing language. E.g. for Facebook, once they set out to morph PHP into their own Hack VM/language, PHP became way more malleable than it is for the average PHP user.
Hmm, there is also a middle ground, where almost anybody will build some preprocessor/generator/linter around a language they don't "control". That option reduces how much "original design" matters; some designers actually shoot for it as the escape hatch from day one. Still, some choices make it harder or easier. Those can be syntax choices (part of "language design" by all definitions) but also implementation/ecosystem choices, like having the official AST [un]parser in the stdlib or easily consumable open source... I view the latter as part of my stretched definition of language design.
k
@Beni Cherniavsky-Paskin I do a lot of teaching that includes "git for scientists". An audience that has no problem with DAGs, so I start by explaining git concepts before moving on to practice. And that's where people get frustrated: they know exactly what they want to do to the commit DAG, but figuring out how to do it via the inconsistent CLI makes them waste a lot of time reading through documentation which, moreover, isn't formulated in DAG language.
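A minimal sketch of that mismatch, assuming stock git in a throwaway directory (repo contents and branch name invented for illustration): each comment states the intent in DAG language, and the command below it is the rev-spec/ref jargon git actually requires.

```shell
set -e
# Build a tiny scratch repo with two commits (nodes A and B).
dir=$(mktemp -d) && cd "$dir"
git init -q .
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "A"
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "B"

# DAG intent: "point a new ref at the parent node".
# CLI incantation: branch creation plus the ~ revision syntax.
git branch topic HEAD~1

# DAG intent: "show me the graph".
# CLI incantation: three separate flags on log.
git log --oneline --graph --all
```

None of the commands mention "node" or "edge"; the DAG vocabulary has to be translated into ref/rev-spec jargon every time, which is exactly where learners stall.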
w
I swear when I first learned Git, I had at least 200 command-lookup Google searches in a month. Now I use Sourcetree for basic things, and Chat API lookup.
a
Rereading the original post, I think it's a non sequitur. Yes, market forces can (unfortunately) swamp almost any technical concern in the wrong circumstances, and then there's a wide swathe of powerful "technical" forces that can swamp various other technical forces. Expecting any one of those technical forces, including language design, to absolutely dominate the others was always unreasonable (retrospectively obvious even if some PLT zealots believed otherwise, I'm not guilt free), but that doesn't make working on any single one a waste of time. They're orthogonal axes. So, alternate hypothesis: language research has had exponential returns (say vs effort or time invested), but it hasn't resulted in an exponential differential in influence because many competing factors have also been growing at various exponential rates at the same time.