# thinking-together
k
@Ivan Reese at https://futureofcoding.slack.com/archives/CEXED56UR/p1652719357709289?thread_ts=1632872466.023400&cid=CEXED56UR:
> what even is a computer and what will we do with it?
I've actually been struggling with this question a whole lot outside this community. Perhaps I should bring y'all in:
• https://lobste.rs/s/i1b6tw/computer_science_was_always_supposed_be#c_2mffpq
• https://forum.merveilles.town/thread/35/what-are-computers-for%2c-anyway%3f-38
• https://merveilles.town/@akkartik/107896035093628757
• https://merveilles.town/@akkartik/108043961340591227 (and thread)

The conclusion I've currently arrived at is:
• The kind of computers we have today prioritizes large organizations that pay people with money. The influence of this mindset is deep and infects almost all our tools. This "computer industrial complex" is often useful on short timescales, but it is also blind to a lot of the value people create with it. As a result, future decisions by the computer industrial complex often destroy value at people scales.
• Right from the start (suspense alert), there's been a shadow computer with a different, more convivial purpose. Confusingly, this shadow computer often looks just like other computers. You have to look closely to look past the camouflage.
• It's hard to see what people would do with these shadow computers if we weren't immersed in a world created by organizations.

After some time thinking about it, I can't find a better answer than the one (drumroll) Vannevar Bush arrived at, right at the start: one thing convivial computers are definitely for is to externalize our brains. Paper expands memory. Computers expand both memory and modeling.

This is a surprising, even shocking, conclusion for me to arrive at. I've always slightly looked down my nose at all the "tools for thought" conversations in this Slack. It's felt too close to productivity porn, most suitable to avoid doing anything productive. Suddenly they're super relevant.

But it's not enough to build more tools for thought. We have to think also about the process by which they're built. We have to ensure that the people-factory generating the convivial iPhone is also convivial. Because if it isn't, the conviviality will be short-lived as organizations kill or coopt it for their needs.

The most important property to preserve, IMO, is to keep the raw code as naked and free from packaging as possible. It should be literally begging to be opened, inspected, tinkered with. Most software today fails to fit this bill. C programs come without source code by default. Browsers require big honking machines to be built from source. We write lovely naked Ruby code but then package it up into gems that go hide in some system directory where nobody can go look inside them. This is what my future of software looks like.
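On that last point about gems, here is a minimal Ruby sketch (purely illustrative, assuming RubyGems and some installed gem such as json): the packaged source is still plain, readable Ruby; it just sits in a system directory that nobody is invited to open.

```ruby
# Minimal sketch (assumes RubyGems plus an installed gem, e.g. "json"):
# print where the gem's packaged source actually lives on disk.
require "rubygems"

spec = Gem::Specification.find_by_name("json")
puts spec.gem_dir                                   # the hidden system directory
puts Dir.glob(File.join(spec.gem_dir, "lib", "**", "*.rb")).first(5)
```

Every file it lists is ordinary Ruby you could open and tinker with; the packaging just makes it easy to never look.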
❤️ 5
p
Alan Kay had another answer. Paraphrasing here, the computer is a communications medium that allows us to share simulations of our ideas with other people which they can then run, modify, and test in order to better understand and question the ideas being communicated. It is personal in the sense that anyone can use it to create simulations of their ideas and share them. It is dynamic in that the simulation can respond to the recipient of the message who is attempting to understand it. The recipient can ask the simulation "what if?" questions of the sort that have only historically been possible when conversing directly with the person who has the idea. It can also allow others to respond to and critique messages by pointing out flaws in a simulation and publishing an improved version. Ultimately, by allowing us to think and communicate more deeply about complex issues, this could help bring about another enlightenment in the same way that the printing press helped bring about the last one by allowing people to communicate ideas and arguments that were too long to remember all at once.
💯 4
i
I can’t think of a cohesive final thought because there’s so much here. When someone builds a tool, it’s because their brain is attempting to do what you said: externalize a thought. They’re building a representation of their brain’s wiring. It’s amazing. It also means there are trillion-billion-billion ways to organize thoughts and ideas, which is also amazing but daunting. This is why I’m building my project.

My version of “tech” is that everything is a cog in someone else’s machine. Everything. I used to flip boolean flags in iOS world hoping for views to come out differently, and would be surprised when they did - because there was no mental map I could rely on to help me understand what the hell this was actually doing. I wanted - and still need - the ability to look for myself, as arbitrarily deep or as shallow as a system allows. Big corp hates that, because then you understand what they did, and then you don’t buy it anymore. Boohoo.

I want to click a file and see code. Then, knowing that every single token has a meaning - either as an identifier defined locally or by someone else - I want to understand it in whatever context I wish. If you’ve seen the movie Arrival, with the Heptapod ink language, this is what I mean. Every curve and contour of your software does something and has a direct cause and effect. That’s what all the computer science research did for us, and is doing for us: codifying cause and effect and strengthening the guarantees that paradigm offers.

I want “the mechanic that grows up around a family shop” to be in exactly the same space as “the techie that grows up around a computer store”. These people are not different - they’re experimentalists. The difference is the tools they have and can create, and who - in this current world - owns and allows new ones to be created.
> It should be literally begging to be opened, inspected, tinkered with.
The first moment I ran my project and saw the code that was running it fly up in space and just… stare back at me, I had a feeling I didn’t understand. I still don’t understand what it was. It was kind of an accomplishment, but my brain just… did something. I felt a click. I saw every single individual glyph and line of code all at once and just went, “…huh”. My brain had never before that day used its optical nervous system to simultaneously process the entire visual representation of a codebase other than as a list of “files” and “directories”, already abstractions I had to come up with visual metaphors for.
d
@Kartik Agaram when you say "large organizations," what you're referring to specifically is capitalist corporations. The development of computing in modern times has followed the same driving force as all other industrial technologies. From the absurd Keurig coffee maker to the ever-present Walmart (in the US) to the roadways that consume land to the very design of cities themselves, it all serves one primary purpose: profit. The only reason PARC and Vannevar Bush were able to get as much amazing work done in so many creative directions was a relatively absent profit motive (read: mostly unhindered research). I shudder to imagine what the computing world would look like if early development had been dominated almost exclusively by the profit motive via corporations, as it is now.
t
I feel like there are mainly two cases:
1. The scientific use of the computer: it's a tool for thought or an instrument for massive number crunching.
2. The general tool for automation: that's the business stuff we get paid for. It's not cognitive extension, it's free labour or a helper. It's just a very convenient tool for stuff.

You see the first case much more in universities; it's there, but it's not connected to business. You get random software that a prof made for enumerating mathematical objects in such and such space. That's the computer as a mind extender. It's lovely. I prefer programming computers for science, but society as a whole finds more use in the second case, and rewards accordingly.
p
I'll also throw out Ted Nelson's definition, that a computer is a general device for dealing with symbols and following plans. A generalized form of writing and paper. He argues that we only call it a computer because as a historical accident the first people to create one were using it to compute by manipulating symbols that stood for numbers.

https://youtu.be/RVU62CQTXFI

i
“That’s why it’s called a computer. It’s for computation.” Some people just refuse to see outside themselves. Everything they say and present shows a willingness only for small- and closed-mindedness.
“Don’t you have file folders?” “Yes.” “Isn’t that enough?” “… no!”
How do we make those people move out of the way of the future?
e
@David Brooks Jack Goldman, who helped run Xerox during the PARC days, once said that if it were only up to the whims of the profit motive, we would never have gotten a vaccine for polio. Instead, as he put it, we would have "gotten the best iron lungs you ever saw." I always thought that summed it up nicely. Are today's computers the polio vaccine or the advanced iron lungs?
k
I just remembered after many years this old website made by a friend of mine: http://whatarecomputersfor.net
p
I can't find a reference for it right now, but I seem to recall Seymour Papert saying something like "all adults are learning disabled." I also seem to recall this being part of Alan Kay's reason for focusing his research on children, because they were still able to learn new ways of thinking. Maybe Planck's principle applies to more than just science? https://en.m.wikipedia.org/wiki/Planck%27s_principle
k
I actually like profit. There's nothing wrong with profit. The trouble arises when our assessments of potential profit lack imagination. In particular, as group size increases, groups have a tendency to focus on short-term, stable profits. They're easier to defend in debate, and probabilistic investments become risky. It's not just that ARPA gave us the internet while corporations couldn't. PARC at its peak was similar, and it came out of a for-profit company. I have a hard time imagining DARPA today accomplishing as much, though I'm not an expert.

Power = Resources - Accountability. Sometimes people with power do amazing things. Sometimes they don't. (Sometimes they're Robert Moses, who we thought for decades did amazing things, until Jane Jacobs opened our eyes.)

(I also saw something recently arguing that the rhetoric about companies having to maximize profit is a recent thing, going back perhaps to the 70s: https://www.nytimes.com/1970/09/13/archives/a-friedman-doctrine-the-social-responsibility-of-business-is-to.html. So while maximizing profit is always a difficult problem, perhaps we've made it harder for ourselves in recent decades by forgetting the value of long-term vision.)

My response to all this is to avoid trying to decide what "we" should "all" do. Universal basic computation for all. Then make of it what you will.
i
> what even is a computer and what will we do with it?

Collect, Question, Communicate

- we eventually changed Collect to Gather.
💡 1
d
If you honestly believe there's nothing wrong with profit @Kartik Agaram , I would encourage you to research some voices that have a lot to say about that. Suffice it to say for this conversation that the profit motive has given us an extremely twisted major use for computing: social media. If profit above all else is the motive, then Facebook/Twitter et al will do whatever it takes to increase their profits. They do this by way of advertising money. The suggestion / content algorithms maximize "user engagement." And since we humans are hard-wired to pay attention to extremes ("if it bleeds, it leads"), the algorithm has no choice but to suggest more and more extreme content. We have witnessed this again and again in mass shootings and in general political divisiveness around the world. This barely scratches the surface. And if you wonder "why doesn't Facebook do something about it," an internal FB group was instructed to research the extent of FB's influence on extremist activity and came to the conclusion that yes, FB is contributing to global instability and that the best course of action for the benefit of humanity was to fundamentally change FB's business model. Mark Zuckerberg waved the warning aside and told them to never bring it up again.
k
You're preaching to the choir there! Reread what I wrote. "Profit" != "maximizing profit" or "profit above all". I explicitly called out "maximizing profit" as harmful rhetoric.
r
Partly, I think the reason we've arrived at such an obviously local optimum is that consumers of computers and software typically go for the cheapest, fastest device that can run the most stuff, and leave out the more qualitative aspects of valuation to their own detriment. B2B sales certainly doesn't lend itself to optimal purchasing decisions either, so we are all left waiting for revolutions while the products we're stuck with get moderately better over the decades they're around.

More than anything else, the discipline of engineering struggles to find a foothold in computer software, partially because the field is so lucrative and powerful that "agile" unscalable tinkering wins out, and partially because making well-engineered software is especially hard in a rapidly transforming medium - at least now it seems Moore's law has migrated to GPUs! It seems we're stuck with Capital and all that brings, including the systemic drift to low performance that is only corrected by aperiodic paradigm shifts.

What's exciting is this community seems as poised as any to offer one! I, and I think most others here, agree with your axiom to make software as free as possible - an important part of that, to me, is to extend that freedom to the end-user even if they aren't a "programmer" in the traditional sense. There are performance and complexity costs associated with that, as well as the standard ugliness of proprietary systems, but things seem to be moving in the right direction, and the idea of such a system now isn't pure fantasy. Now, we just have to compete with AI-fueled black boxes that seek to forever murkify precious computing.
d
@Kartik Agaram always enjoy seeing your thought process. i get a lot of mileage out of ‘computers as externalized modeling & simulation devices.’ note that this is distinct from the actual models they externalize/simulate, the actual systems being modeled, and the actual systems they inhabit, not to mention the strong priors imposed throughout by the ‘ur-system’ (physics) & the ‘ur-platform’ (biology). conflation of these is rampant
👍🏼 1