# thinking-together
g
anyone know of any programming models that explicitly include a first-class concept of a program’s user?
👍 3
i
Interesting. Not sure if this counts, but in my visual language Hest you can (eg) select a piece of data and move it around with the keyboard like it's a video game character. So, there isn't really a single concept that represents the user, but there is the encouragement for the programmer (or the end-user programmer) to embody parts of the code & runtime data. So, in a sense, a first-class concept of the program's programmer.
❤️ 4
g
@ibdknox going to do some googling tomorrow but it seems like there’s not a lot of info available on mobile at least wrt the implementation—do you know any other sources offhand? otherwise i’ll just poke around
@Ivan Reese i know this wasn’t your intent but this reminded me to share this video if you haven’t seen it already:

https://youtu.be/vw9vjEB1S2Y

even virtualized physical constraints get very interesting for creating with particle languages (it’s a hack here but i think points to what hest might aim for)
i
Algodoo (the program used to make this sim) looks like it kicks so much ass. I've always been curious to try it, but haven't yet. Maybe someday.
❤️ 2
g
i’m looking for something that might be strictly lamer: a language that characterizes user input in terms of its own control structure (i’m completely making up my own terms here and very open to input). so if hest lets users embody different pieces of data, the language/environment/whatever would allow the programmer to express the rules of the game for the user—which pieces of data do they get to control (and how many at a time), how can they embody them, what can they do as each piece of data (fresh thought, loose wording)
i
Nah, that's cool. In the case of Hest, this "play with your code/data like it's a game" is a feature of the editor, not the underlying model. I am hoping to implement some parts of the Hest editor as live data, so you can have some of that nice Smalltalk/emacs-esque self modification. But I don't yet know to what extent I'll end up doing that. It'll depend entirely on perf. Ideally, yeah, you'd be able to (eg) define some new kind of object, and then implement some special behaviour for how this "embody your object" system interacts with it. But who knows, I'm still a ways off from exploring that space.
w
Certainly part of the idea in Smalltalk and Logo was for the user to imagine themselves as the object or turtle. That doesn't exactly rise to the level of the user being an agent in the program, though. More along the lines of "if you want to paint fire, be fire." https://overcast.fm/+BnTo0uB-k
g
this podcast is persuading me to pull my oculus out of my closet to try quill again
j
Perhaps not quite what you are looking for, but the game “Baba is You” could be seen as an example of this. 🙂 https://hempuli.com/baba/
i
I played Baba Is You for the first time earlier today. It's absolutely superb.
w
And it keeps going and going.
g
both of these are very good suggestions
d
My Object Net is like a cyberspace where the user is a key object that observes objects it links to, and can interact with objects it sees.
@Garth Goldwater why do you ask?
g
basically trying to figure out what a language for creating interactive programs rather than batch processes would look like if we started from the ground up
the analogy i’m thinking of at the moment is games where you gather and construct greater abilities out of smaller pieces —you gain the ability to double jump or rocket jump by combining existing skills you’ve learned. in programming terms: what would it look like if we were designing things in terms of user abilities and the programming environment rather than building metaphors from scratch over and over again, so that instead of wiring together “enter” and “submit form” we were wiring together domain specific terms from the get-go with particular controls tagged on afterwards
this is super hazy and half formed
i
sounds like Dreams (PS4) and Little Big Planet before that to some degree
d
That sounds like what I'm doing in the Object Net: there is a user response object type which is at a higher level than events. Users spawn response objects when they interact with a viewed (rendered) UI or form object.
And it's like games because it's kind of a cyberspace, where the user "is" somewhere and can even see other users if that's what you want
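As a rough illustration only (the names `FormObject`, `UserResponse`, and `interact` are my invention, not Object Net's actual API), a "user response object" sitting above raw input events might be sketched like this:

```python
# Hypothetical sketch: instead of handling raw keyboard/mouse events,
# a user's interaction with a rendered form spawns a higher-level
# "response" object recording who responded, to what, and with what values.

from dataclasses import dataclass, field

@dataclass
class FormObject:
    fields: list  # field names this form renders

@dataclass
class UserResponse:
    user: str           # the user who interacted
    target: FormObject  # the viewed object they responded to
    values: dict = field(default_factory=dict)

def interact(user, form, **values):
    """The user fills in the rendered form, spawning a response object."""
    return UserResponse(user=user, target=form, values=values)

signup = FormObject(fields=["name", "email"])
response = interact("garth", signup, name="Garth", email="g@example.com")
print(response.values["name"])  # Garth
```

The point of the abstraction is that the program only ever sees domain-level responses, never low-level events.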
"State machines follow an explicit event-state-action paradigm, where effects are based on event and state, not just event."
In Onex / the Object Net, it's based on state-state-state! There are no events or actions, just evolving state.
By tying a side-effect to a state rather than just an event, we can ensure that it is "canceled" if it leaves that state.
That's how it works in Onex
(a bit like spreadsheets in the state interdependency model, but adding evolution of state and circularity)
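One way to picture the "effect tied to a state, canceled on leaving it" idea (a minimal sketch of the general pattern, not Onex's actual implementation) is:

```python
# Hypothetical sketch: a side effect starts when an object enters a
# state and is automatically canceled when the object leaves that
# state -- rather than being fired once by an event and forgotten.

class StatefulObject:
    def __init__(self):
        self.state = None
        self._effects = {}          # state -> (on_enter, on_exit)
        self._active_cancel = None  # cancel hook for the current state's effect

    def effect(self, state, on_enter, on_exit):
        """Register a side effect that lives only while `state` holds."""
        self._effects[state] = (on_enter, on_exit)

    def set_state(self, new_state):
        # Leaving the old state cancels its effect...
        if self._active_cancel:
            self._active_cancel()
            self._active_cancel = None
        self.state = new_state
        # ...and entering the new state starts its effect.
        if new_state in self._effects:
            on_enter, on_exit = self._effects[new_state]
            on_enter()
            self._active_cancel = on_exit

log = []
lamp = StatefulObject()
lamp.effect("on",
            on_enter=lambda: log.append("light up"),
            on_exit=lambda: log.append("light off"))
lamp.set_state("on")   # effect starts
lamp.set_state("off")  # effect is canceled because we left "on"
print(log)  # ['light up', 'light off']
```

Because the cancel hook runs on any transition out of the state, the effect cannot outlive the condition that justified it.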