> "Don't Repeat Yourself" versus "Repeat Once or Twice"
This is a common misconception: the principle is not about code duplication but about conflicting logic.
From Wikipedia:
> The DRY principle is stated as "Every piece of knowledge must have a single, unambiguous, authoritative representation within a system".
So if the insides of that loop are tightly coupled to specific business logic that you want to be 100% sure doesn't diverge, and it applies to multiple different types of data: go for it. Otherwise, DRY supports Carmack's point here.
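To sketch what that looks like, here's a minimal, hypothetical example (the discount rule and function names are invented for illustration): one business rule applied to two different record shapes. Because the rule itself must never diverge between call sites, extracting it is exactly the "single, unambiguous, authoritative representation" the Wikipedia definition asks for.

```python
# Hypothetical business rule: orders over 100 get 10% off.
# Two unrelated data shapes (online orders, invoices) must apply the
# SAME rule, so it lives in exactly one place.

def apply_bulk_discount(total: float) -> float:
    """Single authoritative representation of the discount rule."""
    return total * 0.9 if total > 100 else total

def price_online_order(item_prices: list[float]) -> float:
    return apply_bulk_discount(sum(item_prices))

def price_invoice(line_totals: list[float]) -> float:
    return apply_bulk_discount(sum(line_totals))
```

If the rule changes (say, the threshold moves to 150), both callers pick it up automatically; they cannot drift apart.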
> When the DRY principle is applied successfully, a modification of any single element of a system does not require a change in other logically unrelated elements.
> This is a common misconception, that principle is not about code duplication but about conflicting logic.
Ah, the good ol' "Nobody really implements DRY correctly; a true DRY follower does X", also known as the No True Scotsman fallacy. OOP advocates love to use this too.
Unfortunately, there comes a point when the Wikipedia definition really doesn't matter. What matters is what's advocated in practice, and the most widely used definition of DRY is simply "Don't Repeat Yourself", with little to no qualification. In practice, this usually leads to overzealous application of the principle: two separate pieces of logic get abstracted into one shared piece of logic when they really should remain two separate pieces.
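A hypothetical sketch of that failure mode (all names invented for illustration): shipping cost and sales tax both happen to be "amount times rate" today, so an overzealous DRY pass merges them. The merged helper then has to grow flags to serve both callers, and every future change to one policy risks breaking the other.

```python
# Coincidental duplication: today both computations look the same.
def shipping_cost(weight_kg: float) -> float:
    return weight_kg * 4.5          # carrier rate per kg

def sales_tax(subtotal: float) -> float:
    return subtotal * 0.07          # regional tax rate

# The overzealous "DRY" merge: one helper, two unrelated policies,
# distinguished by parameters that only exist to serve each caller.
def apply_rate(amount: float, rate: float,
               minimum: float = 0.0, rounded: bool = False) -> float:
    result = max(amount * rate, minimum)   # shipping wants a floor...
    return round(result, 2) if rounded else result  # ...tax wants rounding
```

The two originals shared a shape, not a meaning; the moment tax rules change (brackets, exemptions), `apply_rate` either forks internally or breaks shipping.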
So I think Carmack is spot on in his assessment of how DRY is used in the day to day, and why it can be a bad idea to apply it to every situation. As with every programming "principle", the DRY principle should be used judiciously and the benefits/drawbacks of applying it in any particular scenario should be analyzed accordingly.
I mean, yes, you could argue for No True Scotsman here, but it is a little different when the initial book (or whatever spurred the movement) is the one being quoted.
Any movement distilled into only three words is gonna have a bad time right?
> I mean, yes, you could argue for No True Scotsman here, but it is a little different when the initial book (or whatever spurred the movement) is the one being quoted.
I honestly have no idea whether this makes any difference. I'll take SOLID as an example. I've debated several people about the drawbacks of SOLID and how I believe those outweigh any proposed benefits. And almost every time I do, people tell me that I just didn't use it correctly, even when I base my opinion on the original blog posts etc. that started the principle.
When that fails, I'll point to code that I've seen in production, in real-world codebases, and they'll just tell me those people are using it wrong, even though, no matter where I work, code that follows SOLID looks awfully similar to what I described. I think this still falls into the No True Scotsman fallacy.
I guess I'm basically asking: does it matter whether the original authors of a principle had a certain ideal in mind when they coined it, if the colloquial use is vastly different? I think OOP is a perfect example of this, because most people mean something very different from what Alan Kay meant. However, if I argue against OOP, it's generally understood that I'm not arguing with Alan Kay's idea of message passing, even though that was the original intent.
Edit: but overall I do agree with your sentiment haha. A 3 word principle is bound to have many interpretations. It just irks me when people try to call out an argument with a "technically it really means this..." when everybody understands that "sure while it technically means this everybody really ends up using it like this...".
And I have my own opinions, of course, but to summarize: I usually see SOLID codebases end up extraordinarily over-abstracted and impossible to debug or explore, because everything is an interface that's injected through some DI framework. Exploring these codebases is a nightmare, because every time you Ctrl+Click a function, you're taken to an interface. The best part: the interface has only one implementation, 99 times out of 100. But boy does it mess up your flow when you're actively tracing and trying to understand a code path.
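For readers who haven't hit this, here's a minimal, hypothetical sketch of the pattern being described (the repository and container names are invented; real DI frameworks are far more elaborate): an interface with exactly one implementation, wired up through a registry, so jump-to-definition on the call site lands on the abstract method rather than the code that actually runs.

```python
from abc import ABC, abstractmethod

class OrderRepository(ABC):
    """The interface every call site depends on."""
    @abstractmethod
    def find(self, order_id: int) -> dict: ...

class SqlOrderRepository(OrderRepository):
    """The one and only implementation (the 99-out-of-100 case)."""
    def find(self, order_id: int) -> dict:
        return {"id": order_id, "status": "shipped"}

class Container:
    """Minimal stand-in for a DI framework's registry."""
    def __init__(self) -> None:
        self._bindings = {OrderRepository: SqlOrderRepository}

    def resolve(self, interface):
        return self._bindings[interface]()

# Call sites only ever see OrderRepository; the concrete class is
# chosen at runtime, which is exactly why Ctrl+Click can't find it.
repo = Container().resolve(OrderRepository)
print(repo.find(42))
```

The indirection buys you swappable implementations, but when only one implementation ever exists, you pay the navigation cost without collecting the benefit.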
I would argue debugging + maintenance are the majority of a programmer's job. So any ideology that obfuscates your code and makes it increasingly difficult to maintain is not something I want people I code with to follow.
When I wrote my comment I was excited "I can help people understand a thing that was hard for me 10 years ago."
When I read your comment I feel attacked. I need people to hear that the original intent of DRY is not "no duplicate lines of code", and that applying it that way leads to exactly the problem you're describing, while the definition I'm attempting to introduce does not.
> As with every programming "principle", the DRY principle should be used judiciously and the benefits/drawbacks of applying it in any particular scenario should be analyzed accordingly.
Of course. I agree 100% and didn't realize this was in question. Therefore I think this is a good point. I would have appreciated a "yes, and" comment instead of a "no, but" argument.
This is 100% fair criticism, and I'm sorry I took the tone I did with my response. After re-reading your original comment and my reply, my reaction seems unwarranted.
> Of course. I agree 100% and didn't realize this was in question. Therefore I think this is a good point. I would have appreciated a "yes, and" comment instead of a "no, but" argument.
This is perfect advice, and put very succinctly. Sorry again for the combative nature of my first response and I'll try to do better in the future framing any remarks like these as a "yes, and" comment :)