I never try to abstract something until I'm on my third implementation of something like it. The first time, I lack the context to know what is likely to change. The second time, I have to watch out for the second-system effect: adding all of the things I wanted in the first version. But by the third time, I tend to have the experience and knowledge to do it right.
This is a rule of thumb that should be drilled into the heads of anyone who is even thinking of drinking the patterns kool-aid.
I have the same personal rule. Too often, junior engineers think the cardinal sin is having a small amount of duplicated code, or code that is less abstract than it could be.
This really isn't the case. Until you really understand the commonality of the problem, attempting to abstractly define the solution will make your future work all the more difficult when it turns out you didn't really understand all the edge cases and exceptions.
Three use cases are a good number -- by that time you're much more likely to understand the problem, and by avoiding premature abstraction, you'll have an easier time doing the work.
Another problem is that Java and C++ developers often use class inheritance to share implementation code, even if the classes are not modeling proper is-a relationships.
Actually I think a bigger problem is that Java programmers (not C++) create hundreds of interfaces. Thus Java becomes an artifact hell (EJB 2.0 was a nightmare).
Abstract classes on the other hand are very useful but most Java programmers think... "well what if in the future I need...".
If Java just had traits/mixins, closures, literals, and some syntactic brevity it would be a pretty damn decent language... oh wait, that's Scala.
You don't have to create an interface! One of my favourite interview questions is: why do you use interfaces in Java? The majority of people I interviewed said that a good design uses interfaces, and had nothing more to say. In code review, if you have an interface and only one implementation I'd make you remove the interface. I call this the unnecessary Impl pattern, since the class is traditionally called NameOfInterfaceImpl.
Also you don't have to name property accessors getX() and setX() if you don't want to.
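For concreteness, the shape being criticized looks roughly like this (the names are made up for illustration):

```java
// OrderService.java (hypothetical names, just to show the shape)
public interface OrderService {
    void placeOrder(String sku, int quantity);
}

// OrderServiceImpl.java: the single implementation that will ever exist
public class OrderServiceImpl implements OrderService {
    @Override
    public void placeOrder(String sku, int quantity) {
        System.out.println("Ordering " + quantity + " of " + sku);
    }
}
```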
I believe your line of thinking is wrong. Interfaces are meant to act as "interfaces" in the traditional definition of the word: they provide an interface between groupings of code (for lack of a better phrase) that are meant to be loosely coupled from one another. They define a contract between these groupings, so that the grouping that uses the object implementing the interface doesn't have intimate knowledge of the other grouping of code. For example, if a project uses a 3rd party library of which only a few features are being used, and they know they are most likely going to replace it with another 3rd party library at some point, it would make sense to write an interface that wraps around this library. Then it is easier later on to replace the 3rd party library by simply writing a new implementation wrapper.
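A rough sketch of that wrapping idea; VendorGeoClient below stands in for whatever the real third-party client would be:

```java
// Stand-in for the third-party client; in reality this comes from the vendor's jar.
class VendorGeoClient {
    double[] geocode(String address) { return new double[] { 0.0, 0.0 }; }
}

// The contract the rest of the application codes against.
interface Geocoder {
    double[] lookup(String address);
}

// Thin wrapper around the current vendor. Replacing the vendor later
// means writing another Geocoder implementation; callers don't change.
class VendorGeocoder implements Geocoder {
    private final VendorGeoClient client = new VendorGeoClient();

    @Override
    public double[] lookup(String address) {
        return client.geocode(address);
    }
}
```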
I can understand that you don't like candidates not going into detail, but I have worked on projects where using interfaces, even when there is only one implementation, is still best practice, as otherwise you are just hard-coupling your dependencies.
> In code review if you have an interface and only one implementation I'd make you remove the interface
which changes the contract from "this component will accept and work with anything implementing this interface" to "this component will accept and work with anything of this class or inheriting from this class". And without multiple inheritance ... Do you feel the difference?
If there is only one implementation of an Interface, "anything implementing this Interface" and "anything of this class" are the same thing.
Coding to an interface is a good thing - but it's the concept "interface", not necessarily the language feature named "Interface", that's important. If/when you have multiple concrete objects which share the same interface, it's trivial at that point to switch references from the concrete type to that of the Interface (you were coding to a conceptual interface representing a consistent abstraction of a single responsibility in the beginning, so your Interface is interchangeable, correct?)
But you understand interfaces, you have just given an explanation. The candidates I interviewed didn't explain themselves at all, I think if a language has a feature you should be able to explain when and when not to use it.
I think that all Java IDEs make it trivial to introduce an interface when it is required, so unless you are producing an API for consumption by a third party, you should only introduce an interface when you have multiple implementations, or when a third party can legitimately produce their own implementation - not because you think you may have multiple implementations at some point in the future.
Also, if you have a single implementation, naming it InterfaceImpl is just lazy. You almost always have more information you can use to name it; it might be an InterfaceUsingJdbc or an InterfaceFileBased, for example.
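Something like this, where the name carries real information instead of a bare suffix (hypothetical names):

```java
interface UserStore {
    String findName(long id);
}

// The name says how the implementation works, which a bare "Impl" suffix never does.
class JdbcUserStore implements UserStore {
    public String findName(long id) { return "looked up via JDBC"; }
}

class FileBasedUserStore implements UserStore {
    public String findName(long id) { return "looked up from a flat file"; }
}
```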
> If/when you have multiple concrete objects which share the same interface, it's trivial there to switch references from the concrete type to that of the Interface
you're a lucky person that it has been trivial for you, your teammates and all the known and unknown clients of your code and components. I can only envy your experience.
Lots of Java development is carried out for internal business applications and like it or not they would typically be integrated with third parties using something like SOAP or a shared database. So in this type of application the choice of a concrete class over an interface probably has little impact on the known or unknown clients of your code.
However, as you point out, if you are presenting a Java API to your known and unknown clients you had better think carefully about which abstractions you are going to expose and interfaces are probably going to be very helpful.
No, that's Groovy. Scala is the equivalent of C++ - trying to be everything to everyone and creating an expansive behemoth that everyone picks their own subset from.
A dynamic language that is 100-1000x slower (http://stronglytypedblog.blogspot.com/2009/07/java-vs-scala-...) does not equal Java plus closures and a few other syntactic niceties. You would have been much more accurate referencing Xtend, Kotlin, Gosu, or Ceylon.
I used to think that way too, but inheritance doesn't actually have to imply an is-a relationship. Inheritance in its basic form, in programming language theory, can be thought of as a code inclusion mechanism. To be honest, I often find that composition is a poorer and clunkier way to reuse code.
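For what it's worth, the trade-off being described looks roughly like this toy sketch (all names invented):

```java
// Reuse via inheritance: subclasses pull in the retry logic by extending.
abstract class RetryingTask {
    void runWithRetry() {
        for (int attempt = 0; attempt < 3; attempt++) {
            try {
                run();
                return;
            } catch (RuntimeException e) {
                // swallow and retry
            }
        }
    }

    abstract void run();
}

class UploadTask extends RetryingTask {
    void run() { /* do the upload */ }
}

// The same reuse via composition: the retry logic is a collaborator instead.
class Retrier {
    void runWithRetry(Runnable task) {
        for (int attempt = 0; attempt < 3; attempt++) {
            try {
                task.run();
                return;
            } catch (RuntimeException e) {
                // swallow and retry
            }
        }
    }
}

class UploadJob {
    private final Retrier retrier = new Retrier();

    void upload() {
        retrier.runWithRetry(() -> { /* do the upload */ });
    }
}
```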
I've never seen a Java team that actually bothered to write something three times. You do it once, it sucks, you shrug because of your deadline, a ball of mud gets wrapped around it, development pace slows way down, and oh shit, it's too late to change. IMO, get it as right as you can the first time. (I work for a small, competent, enterprise Java shop)
I don't believe OP is saying to write the same thing three times. He's saying that quite often, one has to write something that is almost the same as this other thing, and later yet another thing that is almost the same.
The first time you write it, just make it work. It works or it doesn't. Chances are, yes, there's only going to be one of it. So making it "as right as you can", I assume, means "it works". "It's reusable", however, is not necessary.
As soon as a "ball of mud" is being created, that is your moment to refactor. If you aren't doing that, that is your choice. You can't argue that refactoring would cause a deadline to be missed when the alternative is letting "development pace slow way down". Not refactoring will cause that deadline, and many more, to be missed.
I'm in agreement with OP and nupark on the rule of 3.
The "big ball of mud" that gets wrapped around factoring-1 IS the re-use. It's not like a team writes one component, then says "OK, we're done!" and writes the next. You write your infrastructure hand-in-hand with the code that uses it. But as you come to understand your infrastructure problems, you don't fix them, because "we'll just write a few more things around it for this milestone" - for the next five milestones. The human process of building software doesn't happen in nice, discrete chunks which are easy to understand, talk about, and justify.
Because something designed to be reusable was probably designed with an eye towards the code being read a lot, and may also (incidentally?) be better factored?
Which circles back around to the point above; until you have a few examples, trying to make something reusable will probably cause you more pain in the long run. That's not justification for bad code, just an argument against premature factorization.
Like recommendations against premature optimization, it's just a good rule of thumb.
I was about to say the same thing myself. For some reason, a lot of developers (including me) go through a phase in which they generalize as much as possible as early as possible. This completely misses the obvious: It is harder to plug into an abstraction, especially a bad one, than it is to do the simplest thing and generalize later, when you need to.
Oh, and by the way: death to injection frameworks. I want my java code to be readable, tangible java code.
Totally agree with the death to injection frameworks. Where I work a lot of the devs love to talk about "dependency injection" and ways to have some abstracted thing in .NET handle it for you.
My preference has always been to have a constructor which takes in the interfaces required. It's very explicit, it makes testing a breeze, and with no magic it's much easier to read.
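A minimal sketch of that style, assuming a couple of made-up collaborator interfaces:

```java
// Hypothetical collaborators; in real code these would be your own interfaces.
interface PaymentGateway { void charge(long cents); }

interface InvoiceStore {
    long amountOwed(long invoiceId);
    void markPaid(long invoiceId);
}

// Plain constructor injection: dependencies are explicit, no container or magic.
class InvoiceService {
    private final PaymentGateway gateway;
    private final InvoiceStore store;

    InvoiceService(PaymentGateway gateway, InvoiceStore store) {
        this.gateway = gateway;
        this.store = store;
    }

    void settle(long invoiceId) {
        gateway.charge(store.amountOwed(invoiceId));
        store.markPaid(invoiceId);
    }
}
// Production code wires this up by hand in main(); tests just pass in fakes.
```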
I used to feel the same way, but it's really nice to be able to easily plug in a mock for testing. With @Autowired in Spring, it's really not that bad. The way Grails handles it is pretty nice, actually.
Maybe I haven't had that phase yet, but I tend to believe it occurs when there isn't a definite plan for how everything will work. You start creating abstract classes with the intention that they will do this or that, or even whatchamacallit!
When you have a strong idea about the direction of the project, you create fewer branches that lead nowhere.
DI is useful in situations where you'd like to mock a 3rd-party web service (which usually doesn't have a test environment of its own).
E.g.: integration with Twitter. If you don't mock the Twitter client API, you won't be able to write unit tests (or someone will make a lot of excuses about how hard unit tests are to write).
You can work around DI by having a factory/template method that you override, or whatnot, but that won't cover 100% of situations. You may end up writing more code.
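Roughly what that looks like; the TwitterClient interface here is an app-owned abstraction invented for the example, not the real Twitter API:

```java
import java.util.ArrayList;
import java.util.List;

// App-owned interface; the real implementation would wrap the actual Twitter API.
interface TwitterClient {
    void post(String message);
}

class ReleaseNotifier {
    private final TwitterClient twitter;

    ReleaseNotifier(TwitterClient twitter) {
        this.twitter = twitter;
    }

    void announce(String version) {
        twitter.post("Released " + version);
    }
}

// In a unit test, inject a fake that records calls instead of hitting Twitter.
class RecordingTwitterClient implements TwitterClient {
    final List<String> posted = new ArrayList<>();

    public void post(String message) {
        posted.add(message);
    }
}
```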
This reminds me of Donald Knuth’s advice in The TeXbook: anything that you find yourself repeating three times is a good candidate for wrapping up in a macro.
The "I Point the Finger at You, I Have Already Pointed it At Myself" section calls me out for sure…
I personally have to live with the shame of a very clear and vivid memory wherein I circled objects that covered nearly four sheets of this paper and wrote, in my own hand and with a clear head: "A good candidate for an abstract factory! :)"

Eventually I realized that the emoticon was mocking me. Laughing at my presumptuousness, laughing at my premature optimization.
10-12 years ago, I was that guy trying to spot "patterns". Ugh. Hopefully my Clojure hacking will tip my karmic balance back to neutral.
I've got a serious mental contingency bomb set up to explode whenever my mind wanders and thinks about reflection. I think about it, the bomb explodes, I stand up and go for a walk, and when I come back the whole thought process that led me to consider reflection is wiped clean, ready for a fresh redesign from scratch.
It's not that I don't see value in Reflection, it's just that I'm sick of debugging it.
Abstract factories are not that bad, actually. If you're in the habit (like me) of always stepping in and writing an interface before you start any abstract class, you pretty soon understand why it's a good or bad idea: interfaces of factories are so damn trivial that even an abstract-freak will see this as redundant work, and will instead want to write an interface of this factory interface, and then a factory interface factory through code injection (preferably through annotations, but it could be by other means such as a properties file or XML configuration), so they think about a factory interface abstract factory (and of course its interface). By then, even the craziest people give up and just do their job.
It doesn't need to be so bad, but in larger organizations it often is. It just takes a few bad seeds of pattern zealotry in an organization to poison the codebase into a God-awful mess. Once the patterns become common practice, and every constructor has been dependency-injected ad nauseam (despite most interfaces only ever having one implementation), there's just no going back.
It's not just larger organizations. I've seen exactly what you're citing — where even the most trivial objects were injected — in an organization with roughly 5 developers.
Non-reified generics with their ridiculously verbose syntax are the first thing that came to my mind. God forbid you should try to write your own class which uses them.
Not to mention that, after spending almost a decade of my career working with Java, I have since worked with languages that do not have generics, and have not suffered thereby. Generics are, IMO, largely unnecessary.
Yes, this is a small subset of Java, but it is one that continually annoyed me.
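For the curious, this is the kind of declaration being complained about; the Registry class is made up but representative:

```java
import java.util.List;
import java.util.Map;

// The type parameters and bounds quickly swamp the actual intent.
public class Registry<K extends Comparable<K>, V extends List<? extends K>> {

    private final Map<K, V> entries;

    public Registry(Map<K, V> entries) {
        this.entries = entries;
    }

    public <T extends V> void register(K key, T value) {
        entries.put(key, value);
        // And because generics are erased (non-reified), you can't write
        //   new T[10]   or   value instanceof V
        // inside a class like this.
    }
}
```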
> Not to mention that, after spending almost a decade of my career working with Java, I have since worked with languages that do not have generics, and have not suffered thereby. Generics are, IMO, largely unnecessary.
This needs to be drilled into the heads of all the people complaining that Go has no generics before they have even tried to use it. (Also Go's interfaces overlap with many common uses of generics, and Go's useful builtin containers take care of most of the rest).
Well, I for one frequently encounter places where the factory / interface / adapter paradigm is a good solution. And, it's not just to be clever.
This typically occurs when 1) I am doing enterprise application development, and there are many different instantiations of a particular application, or 2) I am doing enterprise integration, and I have many different libraries and services to work with, sometimes legacy, and often with complex business logic that selects some over others.
Granted, I've seen this pattern used all too often and seen over-abstraction, in general. But, I need to be brave and defend this, given that no one else seems to want to. It does have some legs.
I use this when working with a lot of different teams / companies and coordinating them all and agreeing to how things will work in production. That's especially true when some of those teams / companies may be interchanged with others. This type of solution makes a lot of sense.
However, I would much prefer a dynamic typed solution to avoid the unnecessary boilerplate. And, even better, I would prefer to work only with self-contained software companies that have complete control over their code-base and restrained domain logic. That would be awesome! (but wishful thinking).
We could have a poll... What percentage of folks use Java because they want to vs. because they must? Would you use Java for your startup? With which stack?
Yes. In fact there's one idea that I'm sort of prototyping right now in which Java would be a nice fit due to WebService integration.
Whether I want to use Java or must use it does not matter to me. The libraries (and their quality) help me write less code and make me feel comfortable (it's psychology, but it matters).