I really like how you think about a broader perspective. Instead of going blindly toward terms like efficiency or productivity, you think about purpose. The agent-arena relationship.
I'm so torn watching this talk: on the one side I absolutely agree with the ideas presented, but on the other hand I realize it will take years to communicate all these shifting perspectives.
Thank you very much. I really hope this gets presented to a wider audience; maybe @Chris Martens (they/them) you should consider some popular conferences.
Besides all the problems and challenges presented in the talk, one huge problem I currently see is the debating culture. Even the anticipation of change seems problematic. And that's not even talking about complex topics where one would need to find common ground with a lot of foreign ideas...
Chris Martens (they/them)
12/17/2020, 6:35 PM
Thanks for watching! Yes, I decided to post it here because I felt hopeful that a community called The Future of Programming could get on board with imagining a radically different future.
12/18/2020, 10:20 AM
I watched it as well. Thank you very much for introducing me to the following ideas:
* notional machines
* competitive vs complementary cognitive artifacts
* casual creators
Here is a snippet of that comment:
The presented thoughts are inspiring, cool, disturbing, and unsettling:
- We don't use technology as a tool, but as an environment through which we perceive other humans and our "real" environment/nature, such that we as humans can only function within this technological environment.
- We replace the perception of humans with machine-related terms and algorithms. Not only do we treat machines like humans, but also humans like machines.
- Technology by itself has no agenda. Only the model of operation in which we use the technology creates that directed influence. Most technologies today are used in a market-oriented model with the goal of enabling growth. All models of operation, like markets, nation states, education, and science/academia, are created by ourselves, so it follows that we are also able to change them again. It is our obligation as humans to ensure that all models of operation serve our well-being.
I do think Rushkoff makes it a bit easy for himself (and thus for us) here.
I see a couple of problems:
• Humans tend to like creating things and starting from scratch, but it is WAY harder to change existing things than to start something new.
• Our cultural evolution has become so complex that it is simply not possible to change one thing without creating unknown problems somewhere else.
• The monopolies created through our cultural evolution have produced very problematic behaviour in our population; in combination with ever more capable technology (nuclear, gene editing, digital tech, etc.), this has made catastrophic events much more likely.
Now, while this sounds very negative, I did find some projects that give me some hope.