"Worlds of scarcity are made out of things. Worlds...
# linking-together
k
"Worlds of scarcity are made out of things. Worlds of abundance are made out of dependencies." -- https://alexdanco.com/2019/10/26/everything-is-amazing-but-nothing-is-ours
s
This is a great article. The things vs. dependencies dichotomy is a powerful insight and resonates strongly with me, as I have always been wary of too many dependencies. The “getting an old web app to run” example describes succinctly why I think today’s web has serious issues beyond repair. I’m not sure about the link to scarcity and abundance, though. That seems more like a description of the shadows the real reason throws on the wall. And as you might expect, I believe the real reason to be rooted in business models (https://twitter.com/stefanlesser/status/1188384797510586368)*. That aligns with the points he makes about SaaS vs. Bitcoin. In summary, I think each of us today is in a great position to choose: either blindly build on infrastructure that “isn’t ours” and that we only shallowly understand, but make impressive progress; or go back to refine and simplify the stack and make it easier to understand, but spend our time solving boring infrastructural problems that have been solved already, just not well enough. *) I posted the wrong tweet before and changed it now, but the preview here in Slack is still the wrong one. Sorry.
g
That post is all over the place. Tangentially, in relation to node.js dependencies I'm starting to have a change of heart. I often find the sheer number of dependencies insane. But... I think for many older techs the dependencies are just hidden. Many C/C++ projects require 50 to 100 libraries to be installed. Each of those libraries was probably dependent on other libraries when it was compiled, so its dependencies are hidden. That also means any latent vulnerabilities in those libraries are hidden, since the resulting library carries no metadata about its dependencies. With node, on the other hand, all dependencies are far more explicit. It looks insane, but it's really just seeing how the sausage is made, whereas older techs end up hiding it. Not saying I'm totally convinced, just an observation from dealing with C++ projects and having to manually and globally install libs and tools that would all have been handled locally in node.
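That explicitness is literally on disk: every package npm installs ships its own manifest declaring what it depends on, which is exactly the metadata a prebuilt C/C++ library lacks. Below is a minimal sketch (my own illustration, not from the thread) that counts the dependency edges declared under a project's node_modules directory; the file layout it assumes is just the standard npm convention.

```ts
// Counts how many dependency edges the installed packages declare.
// Run from a project root that already has node_modules installed.
import { readdirSync, readFileSync, existsSync } from "node:fs";
import { join } from "node:path";

function countDeclaredDeps(root: string): number {
  let total = 0;
  for (const entry of readdirSync(root, { withFileTypes: true })) {
    if (!entry.isDirectory()) continue;
    const pkgJson = join(root, entry.name, "package.json");
    if (existsSync(pkgJson)) {
      const pkg = JSON.parse(readFileSync(pkgJson, "utf8"));
      // Every installed package ships a manifest listing its own dependencies.
      total += Object.keys(pkg.dependencies ?? {}).length;
    }
    // Scoped packages (@scope/name) nest one directory level deeper.
    if (entry.name.startsWith("@")) {
      total += countDeclaredDeps(join(root, entry.name));
    }
  }
  return total;
}

console.log(`declared dependency edges: ${countDeclaredDeps("node_modules")}`);
```

Running the same kind of audit against a directory of compiled .so or .dll files isn't possible without external tooling, which is the "hidden vs. explicit" asymmetry described above.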
s
> I think each of us today is in a great position where we can choose to blindly build on infrastructure that “isn’t ours” and we only shallowly understand but make impressive progress
Isn’t that a summary of the history of computing, though? The shift toward cloud services and more dependencies isn’t unique to today. I would imagine people made the same arguments when new languages came along to replace assembly, and again when languages with GC became more popular than languages where you had to manage memory explicitly.
s
@Scott Werner I agree that over the last half century we as an industry have probably put the emphasis on that half, and that’s perhaps also why many of us here complain about lack of progress and exploding complexity. But I also see small and big efforts today that focus on rebuilding infrastructure from the ground up. Some of the larger efforts: Swift (and some other modern languages too) tries to tackle the Unicode and floating-point challenges we discussed elsewhere (and more) on a fundamental level, and Swift and Rust also don’t use GC. LLVM (and maybe soon MLIR) has fundamentally improved the infrastructure for building languages, compilers, and development tools and environments. Reactive frameworks are bringing declarative approaches to web and native GUIs, and some of them even treat values like internationalization and accessibility as fundamental and build them right in. And both Apple and Google are rebuilding stacks down to the hardware level, with processing hardware that is not based on classic CPUs (Metal, image processing, TensorFlow, etc.). I’m sure there are more examples.
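On the reactive-frameworks point, here is a minimal sketch of the declarative style meant above, written as a React function component in TypeScript purely as an illustration (the component name and labels are my own, not from the thread): the UI is described as a function of state, and the framework re-renders it when the state changes.

```tsx
import { useState } from "react";

// A declarative, reactive component: the view is a function of `count`,
// and React re-renders it whenever setCount updates the state; there is
// no manual DOM mutation. The aria-label is a small nod to accessibility
// being treated as a built-in concern.
export function Counter() {
  const [count, setCount] = useState(0);
  return (
    <button aria-label="Increment counter" onClick={() => setCount(count + 1)}>
      Clicked {count} times
    </button>
  );
}
```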