# linking-together
r
Is the command line on the decline or rise? Hear me out. Conventional wisdom is that the command line died when the Mac was introduced in 1984 and Microsoft followed suit shortly after with Windows. And I say that narrative is accurate if you're only looking at a time span of, say, 1984-1994. But over the course of my programming career, which spans 2001-2019, the relevance of the command line seems to have been increasing. I'm guessing this stems from `git`, which was introduced in 2005; once that was in place it laid the groundwork for package manager usage to explode, especially with `npm` being introduced in 2010. Once package managers were in place, that set the stage for an explosion of new CLI applications, such as linters and style formatters, which laid the foundation for the next generation of text editors to emerge, i.e., VS Code and Atom, some of whose core features are integration with these new CLI utilities. Tack onto that the increasingly sophisticated ways of managing development environments, such as version managers like `nvm`. So what do you think, is the command line on the rise or declining?
🤔 3
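A minimal sketch of the kind of CLI-centric workflow described above, assuming nvm for version management and eslint/prettier as the linter and formatter (illustrative choices, not anything prescribed in the thread):

```bash
# Pin a Node version for the project with a version manager.
nvm install --lts
nvm use --lts

# Pull linters/formatters in through the package manager.
npm install --save-dev eslint prettier

# Run them straight from the command line -- the same tools that
# editors like VS Code and Atom integrate with.
npx eslint src/
npx prettier --write src/
```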
p
I don't have any sense that command line use is losing its share of users or use cases. The story I'd tell has more to do with Linux winning the Internet, and with OS X being built on BSD and exposing a command line.
👍 3
The unix way is one where programs share data and communicate (more so than proprietary platforms), which is absolutely critical to supporting the complexity of modern networked computing. The tools you mention are successful in large part because they provide a more modern UI while still participating in that cooperative tooling environment.
👍 1
r
Absolutely, OS X being so successful with developers would be a great starting point to this narrative before the introduction of `git`.
👍 1
p
As for git: the only important thing it introduced was making it easier to build something like github, which was the real innovation. Github introduced a radical model that made it safe and okay for people to fork projects for their own use, or to contribute back. I think this has been important for opening open source up socially to new people.
👎 2
👍 1
And as for the CLI, your original question: part of this is the legacy of unix, and part of it is that you need a really free-form environment (the filesystem, pipes, plain text files) to manageably stitch together useful things out of the plethora of tools.
👍 1
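A small illustration of the "stitch tools together" point, assuming a git repository; the specific pipeline is only an example of plain text flowing between unrelated programs:

```bash
# List tracked files, count lines in each, sort numerically, and show
# the ten largest -- four separate programs cooperating through pipes
# and plain text. (Assumes filenames without spaces; purely illustrative.)
git ls-files | xargs wc -l | sort -n | tail -10
```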
r
Agreed re GitHub's importance, disagree with the "only important thing" framing though 🙂
p
Like, it has abstractions (content addressing, cheap branching, etc) that enable specific, very useful features. But I don’t see them as structuring computing practices in the large.
k
The only important thing the Bronze Age introduced was the Iron Age 😛
😆 1
👍 1
🍰 1
p
…?
r
I think the continued significance (and arguably rising importance?) of the command line is important, because it's used as the defining example of the argument that technology becomes obsolete. E.g., saying the desktop will be like the command line, i.e., relative to the web and mobile. If being like the command line is being one of the core tools of one of the largest and most influential industries there is, that doesn't exactly seem like a failure? And if the command line is not an example of a technology becoming obsolete, then what is an example?
p
IMO obsolescence has more to do with the technology industry than technology per se.
z
Definitely command line use is exploding from what I see
d
When I was in university, all the cool people were using the Unix shell, so I did too, and I just never stopped. CLI use has been central to every development job I've ever had. My last company transitioned all its developers to Ubuntu Linux. Now I do strictly open source development with Linux as the primary development platform. I still have a Mac, but my interaction is mainly via Terminal and Homebrew open source packages. When I use a computer, I have terminal, Firefox, and 3D graphics windows open.
s
We're just talking about our own little developer bubble here, right? Because far more people using computers do so on devices that don't have a command line (eg. smartphones), or they have no idea what a command line is. For us developers it seems to make a lot of sense that it stays popular: it allows all the interaction we need, but with a massively simpler interaction model very close to what we do every day in programming. The bubble you look at plays a big role in what is considered obsolete: regular people using technology will very much feel like command line interfaces are obsolete, while we developers think it's even increasing in popularity. Just like in audiophile and DJ-ing communities vinyl and turntables are still thriving, while regular people listening to music would consider them obsolete.
r
I think you're nailing the discrepancy on the head here, but would you really describe something that is almost universal in one of the biggest and most important industries there is as obsolete? Obviously developers are a small percentage of the population, but, say for example, you wouldn't call Ableton Live "obsolete" because non-musicians don't use it. Specialized, certainly, but specialized doesn't imply that the passage of time and invention has suppressed its usefulness. And obsolete is further wrong because within its niche the importance of the command line is growing.
👍 1
People use the argument that the command line got replaced by the GUI as evidence to predict a whole range of things that the history of the command line doesn't support. You can even use the command line itself as an example, e.g., people say that we'll figure out a way to program on iOS without a command line. Wrong, the transition from the command line to the GUI does not support that; it's far more likely people will just avoid programming on that platform unless it gets a command line.
s
@robenkleene Here's a pattern at play that causes a lot of problems: you try to find an absolute truth in a general context, eg. is the command line obsolete, but it only makes sense if you consider the context, eg. for users or for developers. What you tried with the Ableton Live example is flawed because you take non-musicians as the other context. That makes as much sense as if I said people who don't use computers at all wouldn't call the command line obsolete. However, if you consider musicians who don't use Ableton, it makes more sense: it could be on a downturn because most musicians adopt other tools now, or maybe it is becoming more popular, I don't know. Will we figure out a way to "program" on iOS without a command line? Well, you might be right that it's more likely that it will eventually get a command line (I think that's unlikely), but ruling out any innovation seems like you're limiting yourself for no good reason. Why not think different(ly)? ;-) I'd say there's plenty of hope for new programming paradigms especially on that platform, precisely because of the constraints put in place. The fact that iOS doesn't have a command line and that Apple doesn't seem to want to change that is already pushing developers to explore different ways. Meanwhile, on classic operating systems there is no pressure to do so at all and we can comfortably stick to what we're used to. Where do you think innovation is more likely to occur?
r
Where do I think innovation is more likely to occur? On the platform without artificial constraints. Just like I listed in my original post, there's a clear line of continued innovation; it just seems that many people don't like, or don't want to admit, that the innovation is happening on the command line? iOS, on the other hand, has created new approaches to programming, they're just not popular (this is based on a number of observations, but mainly surveys of programmers' tools).
I’m not sure I follow the context paragraph. Wouldn't the parallel to “musicians who don’t use Ableton Live” be “programmers who don’t use the command line”? But that’s one of my main points, that that group is practically non-existent (based on similar sources as above)? So wouldn't that be more evidence that the command line is not obsolete? (Sorry if I got this part wrong, I’d love to be corrected here if there’s another way of looking at it?)
I think what leads me down a different path here is that I seem to be interested in answering a different question. E.g., if you tweak your innovation point slightly then I'd agree with it: if instead of asking "where do I think innovation will occur", it's asking "what's more likely to move some programmers off of the command line", then I'd agree with it. Offering a platform without a command line is a great way to do that. (Although I don't think you'll be able to shift an industry through artificial constraints, it'll just assure programming isn't popular on that platform. There's actually a precedent for this: that's how Classic Mac OS worked and that's what happened.) But personally, I couldn't care less whether I'm using a command line, I just want to use the best tools, and programmers as an industry seem to have chosen the command line as the best tool. Similarly, I want to know where 3D, audio editing, video editing, motion graphics, raster/vector graphics, programming, and game development are going. Because those are the things that have tools that are very difficult to learn. So if I do learn them, I want them to be the right tools. I research how people do these things and it's big complex desktop software packages, and then I hear pundits saying desktop is only "legacy workflows", so I look at mobile or web ways of doing these things and the silence is deafening.
And if you use the command line as a precedent to predict these other industries, then they're also never moving.
s
Constraints can be catalysts for creativity, and designers often use artificial constraints to push themselves further. Of course, that doesn't mean it's the only possible way to innovate. I also don't necessarily think that it's good that iOS is constrained in various ways. But all things considered, without them iOS would've just turned into an Android or Microsoft Surface kind of platform, where things are more flexible but also done in more classical ways. For instance, Android had background execution before iOS, but iOS came up with ways to significantly improve battery consumption and security by restricting certain things until better designs were available (eg. platform security, push notifications, or the URL downloading system that are available system-wide, don't need re-implementation per app, and contribute to energy savings). So I guess that's where we agree to disagree: to me it seems that without constraints you tend to get more of the same, and it needs a challenge, like a constraint, to get people to spend more energy thinking differently. Sure, some people find other ways to challenge themselves and that works too, I just don't believe that overall it's a more successful strategy. Hot take I just made up: we did see more innovation in the early days (1960s-70s) because there were a lot more constraints.
r
What’s the evidence that this constraints based approach is working? E.g., do you think there have been innovative new approaches to programming that have emerged on iOS because of these constraints?
👍 1
p
The relative lack of active security exploits for iOS, and its superior energy economy.
The responsiveness of its UI.
I expect some iOS devs have internalized the principles that promote those outcomes, but mostly it's about Apple enforcing certain policies and architectures at the platform level.
React (with Elm etc) is a good example of a constrained programming model that has really changed how many programmers approach their work. I know that Haskell’s type system and Rust’s move semantics have me writing dramatically better code, even when I’m not using those languages.
Dallying with Erlang and its implementation of the actor model taught me about structured concurrency generally.
I hadn’t read Stefan’s post carefully, sorry for just repeating what he said.
s
Oh, just came across this posted today which seems to fit here:
One of the most remarkable aspects of the original Mac in 1984 is that it shipped without any sort of character-based/terminal mode. That meant not only that it wasn’t compatible with the then-wildly-popular Apple II, it wasn’t compatible with the fundamental way most developers and users thought about a “computer”. I firmly believe that in an alternate universe where the 1984 Mac shipped with an Apple II-compatible text mode — even just a single app akin to MacOS’s Terminal app today — the product would’ve failed. Developers are lazy — a compliment! — and they would’ve been drawn to that crutch. And users, familiar with the Apple II and other command-line PCs of the era, might’ve been more comfortable at first too. With no such crutch, developers and users alike had to get on board with proper GUI Mac apps.
https://daringfireball.net/2020/01/ruddock_chrome_os_stalled_out
💡 3
r
Just to be clear, I’m only really talking about creative apps here, I think iOS has been very innovative in other areas. But what I want to figure out is the future of creative apps. To quote myself from earlier: ‘3D, audio editing, video editing, motion graphics, raster/vector graphics, programming, and game development are going. Because those are the things that have tools that are very difficult to learn. So if I do learn them, I want them to be the right tools. I research how people do these things and it’s big complex desktop software packages, and then I hear pundits saying desktop is only “legacy workflows”, so I look at mobile or web ways of doing these things and the silence is deafening.’
Re the great post about Classic Mac OS, the evidence that that model was working was immediate, and similarly the benefits for some apps on iOS were immediate. But they have been disastrous for creative apps on iOS.
👍 1
p
And to be clear, you’re asking if the pundits are right and web or mobile will consume desktop creative apps?
👍 1
r
E.g., if Classic Mac OS didn’t move people off of programming with a command line, that means not only that it’s unlikely that iOS will, but also that iOS is unlikely to move any creative industry from the desktop paradigm.
s
So the conclusion is that humans are creatures of habit and change is hard? 😉
r
Well, but we do have evidence of change: VS Code became the most popular text editor very quickly, Sketch's rise was meteoric for design, and the same with Figma now. It's just iOS for creative apps and moving programming off the command line that have been duds.
p
Are these creative apps part of your everyday work? Have you tried the mobile apps?
r
Yes, I've tried them; "maybe" for everyday work. I'm a programmer first, so everything else is lighter, but mainly I've looked at industry surveys. There's a good summary here https://blog.robenkleene.com/2019/08/07/apples-app-stores-have-failed-creative-apps/
p
To be honest, I can’t think of what mobile offers as a platform that would draw people away from desktop, and I can think of several ways mobile severely constrains interaction and workflows.
👍 1
r
For an industry transition there's actually even a rule of thumb: it takes 5 years for a new product to become the market leader. That holds up for VS Code, Quark to InDesign, Photoshop to Sketch, and if Figma overtakes Sketch this year like it's projected to, it'll hold up for that too. Those are the major creative industry transitions I'm aware of. iOS was released 12 years ago.
s
Is a platform transition from desktop to mobile of the same quality as a transition from one software product to another on the same platform? Is extrapolating the past in this scenario a good strategy for drawing conclusions about the future? If prediction is the goal here, we should talk about disruptive innovation and overlapping lifecycle S-curves as part of the predictive model. But that's not really what I'm after. I think it's important to be aware that looking at what's popular (a) depends a lot on the context within which you define success and (b) comes with the risk of overlooking where the interesting things happen before they become popular (and quantitatively measurable). Are we looking at the right things to inform our designs? Is it helpful to celebrate the command line as the epitome of what we have achieved in programming tools because we can measure its popularity, and it's really high if we pick a convenient context of just programmers who have used text interfaces their whole life? Should we stick to interactions that more and more people are leaving behind or are not even exposed to anymore? Are we at risk of losing touch with users and widening the gap between developers and users? Desktops, notebooks, keyboards, mice, and all the programming tools depending on them aren't going away anytime soon. These are the incumbent technologies today and we likely have many more years to just keep going with them and even innovate a little. And if popularity or (financial?) success are your goals, then welcome to the "milking the cow" quadrant. Reassuring success and avoiding risk is, however, what market leaders do before the "totally surprising" disruption happens. More and more people are using phones and tablets to take photos, record and edit video, produce music and podcasts, write essays and books, and sketch and draw and illustrate and… are these all professionals? No, not today. But while we discuss what makes somebody a professional, more and more people learn how to do these things, and some of them make that their profession, and some day there will be professionals using these platforms expecting their tools to just work there. Is this inevitable? Nobody knows. Place your bet. Do you want to go where the puck is or where it's going to be?
👍 1
r
All great points. There’s plenty of room for me to be wrong here. At the end of the day, I’m forced to make a guess here about what’s going to be the best investment in software to learn, and based on history and the available data I just think the best bet is for the same stuff that keeps winning to continue winning.
g
Just a random data point. I wrote some system that required users to open a Terminal/command line, install node, npm-install my system, and run it from the command line. It got chosen by a university to be part of a class mostly full of artist students using Unity for interactive art projects. It became very clear almost immediately that any interaction with the terminal/command line was a nightmare for these students. Simple things like typos, understanding current directories, etc. were all foreign concepts. I ended up re-writing my system as a Unity plugin to remove any need to touch the command line. That doesn't mean I think the command line is dead. Just passing on an anecdote. As a dev it might be normal, but looking around at non-devs I certainly don't see much command line use.

As a dev though, I have a hard time seeing the command line disappearing. Command lines are somewhat cross platform; non-command-line interfaces are usually app specific. Command lines have history, scrollback, search, search through history, command completion, filename completion, ... Maybe there is some magic UI that could do all of that kind of stuff as proficiently as the command line but I'm not sure what it would be. Picking "file open" or "execute file" and selecting from some kind of list doesn't sound efficient to me. It certainly hasn't been in the past. Further, in dev I'm often automating things via scripts. Scripts are often just multiple command lines. That is what bash really is: a REPL for bash scripts. So, to me, it makes sense that I will need to automate various build steps, I'll do that with some scripting language, and I don't see how not having a command line would help here except to make it harder to test the small things that I build my scripts up with.

I am bullish on VR or AR changing dev environments, even if just lots of virtual monitors. But, assuming they figure out finger detection and more, UIs could change a lot. Working through touchpads, mice, and touchscreens is basically like 1.5 fingers in use. With full finger recognition I can push, pull, twist, stretch, squash, and many other things for which there is no easy analog with current common input devices. Also, the 3D workspace that VR/AR opens up gives a lot more room for innovation than our 2D monitors.
👍 1
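A minimal sketch of the "scripts are just multiple command lines" point above; the file name, the Node project layout, and the particular steps are illustrative assumptions:

```bash
#!/usr/bin/env bash
# build.sh -- the same commands you'd otherwise type at the prompt, run in order.
# Assumes a Node project whose package.json defines "test" and "build" scripts
# and has eslint as a dev dependency (illustrative, not from the thread).
set -euo pipefail

npm install       # fetch dependencies
npx eslint src/   # lint
npm test          # run the test suite
npm run build     # produce build artifacts
```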
r
Agreed on all points. But the same usage difficulty is shared by almost all creative apps. Put a novice user in front of Blender, Final Cut Pro, Logic Pro, Cinema 4D, Maya, or After Effects, and you'll get the same results. The same is true for, say, a trumpet. Some creative apps and physical instruments are easier to use, e.g., it's easier to take your first steps in Photoshop or the piano. But what makes the piano so famous isn't that it's one of the easiest instruments to pick up, it's what Mozart, Beethoven, and Bill Evans could do with it. Personally, I'm only focused on predicting the future for expert apps. This is because I have to invest work in learning those apps. I don't have to invest work in learning apps for non-experts because those apps are by definition easy to use. Therefore, if I have to switch, it's no big deal because I didn't have to invest anything to learn them. But if I have to switch from say Blender to some newfangled iOS 3D rendering program, that's literally years of work down the drain. These are extraordinarily difficult programs to become an expert with, just like any artist's tools, like a piano, oil paints, etc...
@gman out of curiosity, were the artists using Unity just using the Unity GUI features, or were they also writing C# code? I’d generally put using the command line at the same complexity as writing code, but that would be an interesting data point if artists found writing code easier than using the command line.
🤔 1
d
Late to this conversation. I want to comment about text interfaces like the command line. I feel command line UX is so bad: copy/paste from it, or anything involving a cursor, is so hard. I still like it though; a lot of times it's faster to type than to move the mouse and click. A second text interface that I feel is getting adoption is Slack! By linking apps into it, I feel people are starting to get things done in a way that mostly only programmers were used to.