Hacker News

Another strategy: Instead of creating new abstractions willy-nilly (which will probably be "wrong" in some sense you won't discover until much later, when it's too late to drop it), you base your abstractions off something that selects for "correct" low-entropy abstractions, e.g. by stealing ideas from math.


By all means, steal ideas from math. (Great artists, etc.) Math has had a couple thousand years' practice in learning how (and how not) to express abstract ideas as formal written text.

But don’t abuse math. Like redefining summation to mean any old thing under the sun. Least of all when it’s a fricking union, for which the math symbol is ‘∪’, not ‘+’.


Both set union and number addition are examples of monoids, a maths concept from the field of abstract algebra. So there is a maths abstraction that unites these ideas.
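A minimal sketch of that shared structure (illustrative only; the helper name `check_monoid` is mine): a monoid is just an associative binary operation with an identity element, and both integer addition and set union satisfy those laws.

```python
# Both integer addition and set union are monoids: an associative
# binary operation with an identity element.

def check_monoid(op, identity, a, b, c):
    """Verify the monoid laws on three sample values."""
    assert op(op(a, b), c) == op(a, op(b, c))       # associativity
    assert op(identity, a) == a == op(a, identity)  # identity element

# Integer addition: the identity is 0.
check_monoid(lambda x, y: x + y, 0, 1, 2, 3)

# Set union: the identity is the empty set.
check_monoid(lambda x, y: x | y, set(), {1}, {1, 2}, {3})
```

The same three-line check passes for string concatenation and list append too, which is exactly why the abstraction earns its keep.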


Uhh, quick caveat here (although I agree that the addition symbol is fine): the existence of a unifying math concept alone does not justify using the "+" symbol. By that argument, notating permutation conjugation with a "+" would be fine, but that would be a notational sin, as "+" is conventionally reserved for commutative operations.

However, with that disclaimer: both operations form monoids and both are commutative, which does justify the "+".


Haskell doesn't use the + symbol for monoids, it uses <>.


The union symbol is not on the keyboard.

Sure, make it easy to type; then we can use it instead of +.

However, anyone who has been programming for a while should be able to puzzle out what + is doing.


Emojis aren’t on the keyboard either, but folks don’t seem to have a problem typing them, even on desktop.

Funny that software engineers can figure out complex parallelism and algorithms, but typing ∪ as a function name is next to blasphemy.


"Computer Science could be called the post-Turing decline in the study of formal systems."



