# thinking-together
d
Last talk by Rich Hickey. Do you agree with what he considers downsides of Maybe and Either?

https://youtu.be/YR5WdGrpoug

👍 1
s
I agree in the sense that I feel 'having or not having' is a very different thing from 'int or string', and fusing them into the same type system seems off. I'm also more comfortable with the set-oriented thinking that Hickey discusses. Relatedly, I'm also not a fan of the usual ADT types anymore. I think they impose too much of an implementation detail onto what should be a conceptually pure relationship model. They define not just the relationship but also the bit representation and normalization/denormalization strategy. I feel the entities and relationships should be defined separately from how they materialize - what parts are in contiguous bits of memory, whether they are row oriented or column oriented, etc.
💯 1
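Hickey's core complaint about Maybe can be made concrete. Here is a minimal sketch in Rust, using `Option` as the analog of Maybe; the function names and values are invented for illustration. Making a function's promise strictly *stronger* (it now always returns a value) still breaks every caller, because `u32` and `Option<u32>` are unrelated types:

```rust
// Version 1: lookup may fail, so the result is wrapped in Option.
fn find_age_v1(name: &str) -> Option<u32> {
    if name == "alice" { Some(30) } else { None }
}

// Version 2: we now guarantee an answer (falling back to a default), a
// strictly stronger promise -- yet every caller of v1 must still be
// edited, because the return type changed shape.
fn find_age_v2(name: &str) -> u32 {
    if name == "alice" { 30 } else { 0 }
}

fn main() {
    // v1 caller: forced to handle the wrapper explicitly.
    let age1 = find_age_v1("alice").unwrap_or(0);
    // v2 caller: same logic, but the unwrapping code must be deleted.
    let age2 = find_age_v2("alice");
    assert_eq!(age1, age2);
    println!("{} {}", age1, age2); // prints "30 30"
}
```

In Hickey's set-oriented framing, v2's results are a subset of what v1's callers already handled, so no caller should need to change at all.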
i
> Do you agree with what he considers downsides of Maybe and Either?
Yes.
> I feel the entities and relationships should be defined separately from how they materialize
Clojure is actually pretty good for this. The abundant data manipulation functions are all coded to interfaces that just care about things like "Is this sorted? Is this a seq? Is it associative?". You can add your own concrete datatype with, say, a different storage strategy, and conform to whichever interfaces make sense. Very a la carte.
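The a la carte idea can be sketched with Rust traits as a rough analog of Clojure's interfaces (the `ColumnStore` type is hypothetical, invented for illustration): a custom container with its own storage strategy opts in to just the `Iterator` interface, and every generic function coded against iterators then works on it unchanged.

```rust
// Hypothetical custom datatype: stores values column-style and only
// promises "I can be iterated", nothing else.
struct ColumnStore {
    data: Vec<i64>,
    pos: usize,
}

impl Iterator for ColumnStore {
    type Item = i64;
    fn next(&mut self) -> Option<i64> {
        let v = self.data.get(self.pos).copied();
        self.pos += 1;
        v
    }
}

fn main() {
    let col = ColumnStore { data: vec![1, 2, 3], pos: 0 };
    // sum() comes "for free" from the Iterator interface; the generic
    // code never sees the concrete storage strategy.
    let total: i64 = col.sum();
    assert_eq!(total, 6);
    println!("{}", total); // prints "6"
}
```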
e
It is a kind of law of the evolution of computer languages that when a language has a design flaw, particularly an omission, it will then evolve to fix that original omission. In this case you can see how Clojure, which does not have the concept of a record (something that goes back to Assembler/FORTRAN/COBOL/PL/1/Pascal/C/Modula-2, etc.), has invented spec/keys to fix this. Records are the most commonly used data structure in business programming. You might have a customer in a database, with fields name, age, etc. The record definition makes it clear what data is to be stored. If you define a_person to hold name, age, etc., then when you copy a record in the code, you have copied a whole set of fields in one operation, thus achieving some leverage. The lack of structure and strong typing that was a hallmark of Clojure's flexibility is now being perceived as a problem, and so they are augmenting the language.
Clojure and especially ClojureScript are among the most powerful languages extant today; however, I wouldn't call them that easy to read. In some aspects, the dogshit (pardon my French) simplicity of COBOL represents a simpler programming universe. Isn't the most important thing in programming helping to eliminate programmer error? Don't the errors in programming dominate our total time spent? Isn't anything we can do in the language to help catch errors early, before the program is run, a positive thing, and shouldn't we therefore start to measure how well a language prevents errors compared to another? Isn't that the real future of coding? I think if a graphical interface prevents a lot of invalid operations from occurring, then you have solved a lot of errors. No missing commas or unbalanced parentheses; so many errors go away in graphical space, which is why I believe graphical representations are so attractive to everyone.
But isn't it also true that the most difficult errors in programming are never syntactical or simple in nature, and that incorrect order of evaluation and subtle dependency errors are the real spenders of elapsed time? And now that we are doing networking and multithreading, aren't timing and synchronization issues now the big issues? Those are not really visible in a graphical representation, which emphasizes connections over all other things.
👍 1
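The "leverage" claim about records above can be sketched in Rust (the `Person` struct stands in for the a_person record; the names and values are invented): the record definition names the fields once, and a single operation then moves the whole set of fields together.

```rust
// The record definition makes it clear what data is to be stored.
#[derive(Clone, Debug, PartialEq)]
struct Person {
    name: String,
    age: u32,
}

fn main() {
    let original = Person { name: "Ada".to_string(), age: 36 };

    // One operation copies every field; if a field is added to Person
    // later, this line does not change.
    let copy = original.clone();
    assert_eq!(copy, original);

    // Struct update syntax: override one field, carry the rest over.
    let older = Person { age: original.age + 1, ..original.clone() };
    assert_eq!(older.age, 37);
    assert_eq!(older.name, "Ada");
}
```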