Excessive deduplication is an anti-pattern.
The point of removing duplicates is to make maintaining the code easier. If it makes maintenance harder, then it's the wrong abstraction.
Trying to abstract everything away in multiple layers of classes and function calls, just to avoid writing two lines of code twice, is the biggest evil in programming. Well, probably not the biggest, but a huge problem in my opinion. Adding unnecessary complexity to be clever is evil.
stuff like this tends to happen if people look only at the operations the code does, but not at what it's doing them with. those abstractions are often not clean because we've lost sight of what things represent.
e.g. calculating prices for selling stuff to a business might differ enough from calculating prices for selling stuff to a private person that having two different methods is easier to understand than having a single one with a generous amount of if/else thrown in for cases that only matter for one type of transaction.
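a minimal sketch of what that looks like as two separate functions (the order type, discount, and VAT rate here are made up purely for illustration):

```python
from dataclasses import dataclass


@dataclass
class Order:
    net: float  # net price before tax/discount


def price_for_business(order: Order) -> float:
    # B2B case: apply an assumed volume discount, no VAT on the invoice
    return round(order.net * 0.97, 2)


def price_for_consumer(order: Order) -> float:
    # B2C case: gross price including an assumed 19% VAT
    return round(order.net * 1.19, 2)
```

each function stays short and states its own rules; neither has to carry dead branches for the other transaction type.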
I was there too, writing lots of generalized functions that do everything. And I always hated the code. I'm just a hobby command-line guy, but even that code is horrifying. Doing this in complex code bases with real-world impact is just bad.
which is why many (OOP) patterns exist. what i described is a prime candidate for the strategy pattern, which basically means turning the different algorithms for a task into separate classes with the same interface that can be given to the transaction depending on its kind or something else in the context. the question is always: does the pattern improve my code now, or will it improve my code soon, because i know i have to implement 3 more of "almost the same"?
blindly applying a pattern or a guideline like "DRY" always has a chance of turning into a messy ball of code.
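a rough sketch of that strategy setup (class names, the discount, and the VAT rate are invented for the example):

```python
from abc import ABC, abstractmethod


class PricingStrategy(ABC):
    """Common interface for the different pricing algorithms."""

    @abstractmethod
    def total(self, net: float) -> float: ...


class BusinessPricing(PricingStrategy):
    def total(self, net: float) -> float:
        return round(net * 0.97, 2)  # assumed B2B volume discount


class ConsumerPricing(PricingStrategy):
    def total(self, net: float) -> float:
        return round(net * 1.19, 2)  # assumed 19% VAT for B2C


class Transaction:
    def __init__(self, net: float, pricing: PricingStrategy):
        self.net = net
        self.pricing = pricing  # strategy injected based on transaction kind

    def total(self) -> float:
        return self.pricing.total(self.net)
```

the transaction itself stays free of if/else over customer types; a fourth "almost the same" case is just one more class implementing the interface.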