# thinking-together
Ricardo A. Medina
I'm sure this has been discussed here (I'd appreciate it if someone could redirect me), but... What use cases do you see for people creating their own applications? Do you think people are willing to pay for it?
Nick Smith
In my worldview, the distinction between creating an app and using an app is artificial.
• When someone uses an Excel spreadsheet, they create programs (formulas) that will be invoked whenever new data is added.
• When someone uses a note-taking app, they create a hierarchy (or a web) of pages which allows them to later navigate between pieces of knowledge. The links between these pages amount to a program: a link is an instruction that says "when someone clicks me, navigate to this page".
• When someone uses a calendar app to send reminders, they are programming the computer to provide them with customized notifications about upcoming events.
Under this lens, everybody wants to be able to create their own programs (see the sketch below). The problem with today's world is that we silo programming into two realms: the "end user" realm, where programs are small and easy to create (but are extremely limited), and the "elite coder" realm, where programs are large and are very challenging to make (but are very versatile).
The route to the Future of Coding requires us to stop believing that this dichotomy is "good" or "natural".
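To make the parallel concrete, here is a minimal TypeScript sketch of the shared shape behind all three examples: a trigger plus an action. The `Rule` type and the three rules are hypothetical illustrations, not any real app's API.

```typescript
// A minimal sketch: each "end-user program" above is a trigger plus an action.
type Rule<Event> = {
  trigger: (e: Event) => boolean; // when does this program run?
  action: (e: Event) => void;     // what does it do?
};

// A spreadsheet formula: recompute whenever a referenced cell changes.
const sumFormula: Rule<{ changedCell: string }> = {
  trigger: (e) => e.changedCell === "A1" || e.changedCell === "A2",
  action: () => console.log("recompute A3 = A1 + A2"),
};

// A wiki-style link: navigate when clicked.
const link: Rule<{ clickedId: string }> = {
  trigger: (e) => e.clickedId === "link-42",
  action: () => console.log("navigate to page 'Meeting notes'"),
};

// A calendar reminder: notify when the event is close.
const reminder: Rule<{ now: number; eventTime: number }> = {
  trigger: (e) => e.eventTime - e.now < 15 * 60 * 1000,
  action: () => console.log("notify: event starts in 15 minutes"),
};
```

Under this framing, the spreadsheet user, the note-taker, and the calendar user are all writing rules; the only thing that differs is the vocabulary each app lets them use.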
guitarvydas
Most people don’t /think/ that they want to create their own applications. Yet, a select few went for HyperCard and spreadsheets. Dentist-office software only began to appear once spreadsheets became available. Spreadsheets were invented for accountants, but innovators with domain expertise in Dentist Officery picked up on the new paradigm. This makes me think of Franz, Inc.’s business model (free, but % royalty on all apps).
Ricardo A. Medina
@Nick Smith that's a cool worldview. So, if I understand correctly, you see "end-users" extending apps instead of creating them from scratch? *at least, most of them
@guitarvydas in that example, were the dentists themselves creating the apps, or was it "programmers" along with dentists and their domain knowledge creating them? if the latter, is the end-user programming effort just lowering the barrier for developers to create apps?
guitarvydas
At first, it was the dentists - the domain experts - who began to see that they could make computer software without having to “learn how to program”. The “experts” - programmers - came later to make the apps more robust and scalable. Programmers only have domain expertise in programming, not in running medical offices, nor accountancies, nor ...

I use(d) a book-writing tool called “Scrivener”. One of its selling points is that it was not created by programmers. Having experience with software, I see all sorts of warts in Scrivener, but I wouldn’t have been able to invent Scrivener.

Spreadsheets, HyperCard, VB, etc. are like gateway drugs. Non-programmers used these tools to express automated versions of their processes. Later, expert programmers cleaned up the ad-hoc messes. This effect can even be seen in programming itself: the “experts” tell everyone to use FP, recursion, monads, etc., but the majority of “real programmers” prefer HTML, JS, Python, Perl, etc.

There /should/ be a huge market for enabling invention, but it must not appear to be complicated, nor expensive. Borland targeted developers and disappeared. VisiCalc did not target developers, and its ideas morphed into Excel, etc. Did the VisiCalc company have an exit strategy, or was it simply overtaken by newer versions of the ideas? (idk). The inventions came from so-called non-programmers.
Henry Ford (?): “If I had asked people what they wanted, they would have said faster horses.”
Nick Smith
@Ricardo A. Medina I would phrase it as: end users are likely to make simpler programs (like the examples I mentioned), purely because they don't have the time or expertise to make more complex ones.

I wouldn't think about the Future of Coding in terms of "apps". An app is a hard boundary that discourages interoperability and thus inhibits creativity/flexibility in programming. This is by design: companies want to lock you into using and paying for the programs they write and the services they provide. There's no commercial incentive to break down the artificial boundaries.

Apps aren't natural. CPUs don't execute apps; they execute instructions. Apps are a result of companies trying to adapt the notion of a _product_, something you can put into a box and sell to a customer, into the digital realm. In fact, until the late 2000s, most apps were literally sold in physical boxes with a price tag attached.

The Future of Coding is not about encouraging companies (in vain) to allow their users to make modifications to the apps they sell. At least, not if I have anything to say about it. Software should feel like a continuous fluid, not a collection of impenetrable boxes that have small holes for a user's data to trickle in and out.
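One way to picture the "boxes vs. fluid" contrast is the following hedged TypeScript sketch; `NotesApp`, `Note`, and the helper functions are hypothetical, made up purely to illustrate the two models, not taken from any real product.

```typescript
// The "app" model: data lives inside a sealed box, with one narrow hole
// through which it can trickle out.
class NotesApp {
  private notes: string[] = [];   // invisible to every other program
  exportAsText(): string {        // the single small export hole
    return this.notes.join("\n");
  }
}

// The "fluid" model: data is an open structure, and any small program
// can read, filter, or repurpose it directly.
type Note = { title: string; body: string; tags: string[] };

const byTag = (notes: Note[], tag: string): Note[] =>
  notes.filter((n) => n.tags.includes(tag));

const toCalendarEntry = (n: Note) => ({ title: n.title, when: null });

// A user can compose behaviors themselves (e.g. turn every note tagged
// "meeting" into a calendar entry) instead of waiting for a vendor feature:
const entries = byTag(
  [{ title: "Standup", body: "daily sync", tags: ["meeting"] }],
  "meeting"
).map(toCalendarEntry);
```

In the first model, interoperability exists only where the vendor drilled a hole; in the second, it is the default.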