# thinking-together
m
can it be generalized to "if someone needs an improvement on dimension X, then they will look around and find it [on their current tool]"?
g
There's a cost to not being in the mainstream. If you're willing to run PyPy, you might also be willing to rewrite in Rust or something that has less to do with Python. If the core team wanted to replace CPython with PyPy, that could work, but it's probably incomplete.
m
Changing an interpreter and maybe fixing/reporting a compatibility issue is not in the same ballpark as rewriting your project in a different programming language.
g
Another comparison might be JRuby. It's faster, but people only switch to it when they have to target a JVM, I think.
m
I agree with you on the cost of not being in the mainstream; that's my point. If people aren't willing to take minor inconveniences for big improvements in efficiency, then asking them to take major inconveniences for the same gains (or only slightly more) should be at least as hard.
👍 1
g
I think the risk assessment of a change is like an n^2 algorithm. If it's sufficiently like stuff you have done before, it's pretty easy to assess. If not, you're going to need to put in a ton of effort. If you're rewriting in a new language, and one candidate is 2x as hard as another, that difference doesn't seem to affect the decision.
In that case, it's best to have someone experienced with the alternatives.
another way to look at this might be, people have many problems, and they have to prioritize them.
not just competition at the supply level (solutions) but at the demand level (problems)
g
I wouldn't be surprised if the situation was like JRuby, where C extensions (and occasionally better concurrency support) can cause really weird and annoying issues.
e
We standardized on Python in my company for most scripting use. It eliminated many different scripting languages, and having one main language lowers the training cost for new employees. Even though some of our Python scripts process huge data sets, we have dedicated machines for those processes, and since they are mostly batch jobs that run at specific times of day, we don't even care how long they take as long as they finish before the next day! We don't bother to compile using some fancier Python alternative, because it just doesn't matter. And when it comes to performance, whether you are running against an SSD vs. a mechanical HDD matters way more than the programming language or the compiler you used. If it takes 4 hours in plain Python and 1 hour in PyPy, who cares? Sure, there are people with giant data sets, but how much new data does even a medium-sized company generate per day? Not really that much compared to computer speeds.
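The batch-job point above is checkable before reaching for PyPy: time the CPU-bound part of a script separately from the I/O-bound part, and only the former benefits from a faster interpreter. A minimal sketch (the workload functions and sizes here are hypothetical stand-ins, not anyone's real pipeline):

```python
import os
import tempfile
import time


def cpu_bound(n: int) -> int:
    # Pure-Python arithmetic loop: the kind of work PyPy's JIT speeds up.
    total = 0
    for i in range(n):
        total += i * i
    return total


def io_bound(path: str, chunks: int, chunk_size: int) -> None:
    # Sequential disk writes: dominated by SSD vs. HDD speed,
    # largely unaffected by which interpreter runs the loop.
    data = b"x" * chunk_size
    with open(path, "wb") as f:
        for _ in range(chunks):
            f.write(data)


def timed(fn, *args) -> float:
    # Wall-clock seconds for one call.
    start = time.perf_counter()
    fn(*args)
    return time.perf_counter() - start


if __name__ == "__main__":
    cpu_s = timed(cpu_bound, 2_000_000)

    tmp = tempfile.NamedTemporaryFile(delete=False)
    tmp.close()
    io_s = timed(io_bound, tmp.name, 50, 1 << 20)  # ~50 MB scratch file
    os.remove(tmp.name)

    print(f"cpu-bound: {cpu_s:.2f}s, io-bound: {io_s:.2f}s")
```

If the I/O number dominates, switching interpreters (or languages) buys little, which matches the "SSD vs. HDD matters more" claim.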
People whinge about performance and claim that some new language is X% faster or slower than language Y, but ease of learning, ease of use, and maintenance cost far more than computer power. I can buy used Supermicro or Dell servers that are only a few years old for a few hundred dollars; they cost thousands new. We are awash in CPU power today, but human skill is hard to get hold of (and retain!). So really the future of computing is about making things easier for humans, not faster for the computer.