From the late 1500s onwards you start to see a trend appearing in the great estates of the time. Many a lord of the manor would construct whimsical, extravagant and typically useless structures to serve as conversation pieces or to decorate a view. They were known as follies.
The pepper-pot tower in Powerscourt Gardens, Co Wicklow, Ireland.
Some of the best follies are the sham ruins, which pretend to be the remains of old buildings but which were in fact constructed in that state (e.g. the Temple of Philosophy at Ermenonville).
Every time somebody brings up Semantic Versioning I am reminded of these follies.
Now don't get me wrong, I love the ivory tower ideal that semantic versioning gives us.
The patch version must be incremented if only backwards compatible bug fixes are introduced.
The minor version must be incremented if new, backwards compatible functionality is introduced to the public API.
The major version must be incremented if any backwards incompatible changes are introduced to the public API.
Lovely. Perfect. What could possibly be wrong with that?
Well, for a start, there is the real world...
We rely on humans to decide on version numbering. Humans make mistakes. It's very easy to make a mistake and have a method signature change in a non-backwards compatible way. And once you find out that 2.4.5 is actually not a drop-in replacement for 2.4.4 with some minor bug fixes and should really have been called 3.0.0, your trust is gone. You are no longer going to trust that the project you are depending on understands semantic versioning, and you are back to the old way.
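To make that concrete, here is a minimal, hypothetical sketch (the UserDirectory class and findNames method are invented purely for illustration) of the kind of "tidy-up" that sails through a human review while quietly breaking compatibility:

```java
import java.util.Arrays;
import java.util.List;

// Version 2.4.5 of a hypothetical library class.
public class UserDirectory {

    // In 2.4.4 this method was declared as:
    //
    //     public ArrayList<String> findNames(String prefix)
    //
    // Loosening the return type to the List interface looks harmless, and most
    // callers still compile, but the method descriptor in the bytecode changes,
    // so anything compiled against 2.4.4 fails with NoSuchMethodError the moment
    // 2.4.5 lands on its classpath. That is a 3.0.0 change wearing a 2.4.5 badge.
    public List<String> findNames(String prefix) {
        return Arrays.asList(prefix + "-alice", prefix + "-bob");
    }
}
```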
What's that I hear you calling? Tooling to enforce semantic versioning? Hmmm, yes, the tooling solution is an appealing siren's call. Especially in Java, where we can use bytecode analysis. We download the previous version of the artifact and compare the bytecode of the two to ensure that the signatures of the public API classes only mutate in backwards compatible ways unless the major version number has changed. And if there are no changes to the public API, we allow the minor version to remain unmodified. Brilliant. I like that tooling. Very helpful. Still isn't going to guarantee semantic versioning compliance though. In version 2.4.4
```java
new FooBuilder().withoutBar().build()
```
worked because you could build a Foo without a Bar in the implementation, but in 2.4.5 you can only build a Foo without a Bar if it has a Manchu, so
```java
new FooBuilder().withoutBar().withManchu(manchu).build()
```
works as does
```java
new FooBuilder().withManchu(manchu).withoutBar().build()
```
only my code is not written that way.
To state it more bluntly, the public API is not just the classes and method signatures but the allowed patterns of usage of those classes and methods. If an API now requires that you make all calls to one set of methods from the same thread or while holding a lock on some other resource, that's a backwards breaking change. Tooling cannot help you catch those changes.
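To see why tooling misses this, here is a sketch of what the hypothetical 2.4.5 FooBuilder from the example above might look like (the names come from that made-up example, and the new rule is invented for illustration). Every public signature is identical to 2.4.4, so bytecode comparison sees nothing wrong, yet the old call pattern now blows up at runtime:

```java
// Hypothetical 2.4.5 implementation: the public signatures match 2.4.4 exactly,
// but build() now enforces a rule that did not exist before.
public class FooBuilder {
    private boolean hasBar = true;
    private Manchu manchu;

    public FooBuilder withoutBar() {
        this.hasBar = false;
        return this;
    }

    public FooBuilder withManchu(Manchu manchu) {
        this.manchu = manchu;
        return this;
    }

    public Foo build() {
        // New in 2.4.5: a Foo without a Bar must have a Manchu, so the 2.4.4
        // call pattern new FooBuilder().withoutBar().build() now throws.
        if (!hasBar && manchu == null) {
            throw new IllegalStateException("a Foo without a Bar requires a Manchu");
        }
        return new Foo(hasBar, manchu);
    }
}

// Stand-in types so the sketch compiles on its own.
class Manchu {
}

class Foo {
    Foo(boolean hasBar, Manchu manchu) {
    }
}
```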
So yes, tooling is great, but it will not solve the underlying problem, namely that we have humans writing the code and humans deciding what the next version number is, and humans make mistakes.
Then there is marketing...
The version numbers that developers want to use have absolutely no relationship with the version numbers that Marketing want to use. Just look at Microsoft Windows:
| Version number | Release |
|---|---|
| 5.0 | Windows 2000 |
| 5.1 | Windows XP |
| 5.2 | Windows XP 64-bit Edition |
| 5.2 | Windows Server 2003 |
| 6.0 | Windows Vista |
| 6.0 | Windows Server 2008 |
| 6.1 | Windows Server 2008 R2 |
| 6.1 | Windows 7 |
| 6.2 | Windows Server 2012 |
| 6.2 | Windows 8 |
What's that? You have a simple solution? We let marketing call it whatever the eff they want, and we'll call it by the internal version number that we know it really is supposed to be! Hmmm, yes, another appealing siren is calling. Do you want to know what the problem is? I have a really simple example: I had to go and look up the mapping of Windows marketing names to version numbers to produce the table above. For something like an operating system it's not too big a deal, but when you get deep into dependency hell and you have to select FooBar Ultra Pro 2012.7 from the issue tracker drop-down in order to file a support ticket against foobar 2.4.3, telling them that there are issues when using it with Manchu 7R2, also known as manchu:5.3.9, you may start to feel the pain.
At a former employer of mine we had a big A0 sheet on the wall listing the supported release versions of all the various components that made up the different product lines. When you get to that level of insanity you know that something is wrong.
Finally, on JVM, there is the classpath...
For anyone who is not on the JVM: you have the great link-time issue with dynamic libraries, and though static linking can sometimes get you out of jail, it only works for so long...
This issue crops up with Major version changes. As soon as you make a Major version change you are saying
I no longer promise that the old API even exists let alone behaves the same as before
Well that is fine, but if you don't change the namespace of your API at the same time, then anyone using third-party code that happens to depend on the older version of your API is dead in the water. They have their code requiring the new API while simultaneously requiring code that requires the older version of your API. That is the ultimate dependency hell.
Changing namespace allows both versions to co-exist… but it also means that everyone has to go through the pain of adopting the new version… and you may be left trying to support two versions: the one you want to support and the one you didn't want to, with its ugly old API.
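One common way to do the namespace change on the JVM is to bake the major version into the package name (and ideally the artifact id too). A minimal sketch, with invented package and class names purely for illustration:

```java
// File: com/example/foobar/FooService.java  (the hypothetical 2.x API line)
package com.example.foobar;

public interface FooService {
    String lookupFoo(String name);
}
```

```java
// File: com/example/foobar/v3/FooService.java  (the hypothetical 3.x API line)
package com.example.foobar.v3;

public interface FooService {
    String lookupFoo(String name);
}
```

Because the fully qualified names differ, third-party code compiled against the 2.x API and your own code written against the 3.x API can sit on the same classpath without clashing… at the cost that every consumer has to touch their imports (and usually their code) to move forward, which is exactly the adoption pain described above.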
So does that mean Semantic Versioning is useless?
Nope. It is actually a noble goal. An ideal we should all aim for and strive towards. Just keep in mind that you will never reach that goal. You will make mistakes.
No amount of tooling or process you put in place will prevent the mistakes you will make, so when putting tooling or process in place be sure to evaluate the gain it really gives you against the pain it causes. For example, using tooling to validate the bytecode changes of your public API against the previous version is, when done right, a quick and easy win… so adding that to your build is nice… though you may not want it for every build, only as a pre-flight check before releases or as run by the CI build server. On the other hand, mandating code reviews of every changed code path, where each changed line is assessed for compatibility impact, may not be a process you want to introduce for every project… (I'd hope it is there for the JVM runtime libraries though, as the risk of breaking changes in that public API is very high ;-)
Hopefully this has got you thinking, and hopefully you will start to use some of the best practices encapsulated in Semantic Versioning within your own projects…
but if you think I am going to trust a version range over semantic versions to let the machine automatically decide what version of a third party dependency my code will use… you are sadly mistaken. Version ranges are a hint to the human to let them know what range they should consider when manually specifying the version to use.
—Stephen Connolly
CloudBees
cloudbees.com
Stephen Connolly has over 20 years' experience in software development. He is involved in a number of open source projects, including Jenkins. Stephen was one of the first non-Sun committers to the Jenkins project and developed the weather icons. Stephen lives in Dublin, Ireland, where the weather icons are particularly useful. Follow Stephen on Twitter and on his blog.