I don't see anything in the spec that says what a package's version should be if it needs to be published solely because a dependency changed. For example, say my package is at version 1.2.3 and one of its dependencies just moved from version 2.3.4 up to version 3.0.0. That version change signals that my package needs to be rebuilt and published, since the dependency has changed its public API.
What would my product's version now be? Is it up to whatever changes I had to make? I mean, if I changed my public API to get things working, then obviously I would release 2.0.0. But otherwise, would it be 1.2.4 or 1.3.0?
Both seem a little "off". My package is technically the same "version" it was before. I would almost rather have a "build number" at the end. My package is just a rebuild, maybe with no code changes, so it would go from 1.2.3+1 to 1.2.3+2, for example (SemVer reserves "+" for build metadata; a hyphen as in 1.2.3-1 would denote a pre-release, which sorts *before* the release). Users still know they have the same "version", but there is still something in the version number to help the build system with its dependency management.
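For what it's worth, SemVer 2.0.0 already draws this distinction: build metadata after "+" is ignored when comparing precedence, while a hyphen introduces a pre-release that sorts before the release. A minimal Ruby sketch of that precedence rule (not a full SemVer parser -- pre-release identifiers are compared as a single string here for brevity):

```ruby
# Sketch only: derive a sortable key from a SemVer string.
# Per SemVer 2.0.0, anything after "+" (build metadata) is ignored
# for precedence, and a release outranks any pre-release of the
# same version core.
def precedence_key(version)
  core = version.split("+").first          # strip build metadata
  release, prerelease = core.split("-", 2) # split off pre-release part
  nums = release.split(".").map(&:to_i)
  [nums, prerelease ? 0 : 1, prerelease.to_s]
end

precedence_key("1.2.3+1") == precedence_key("1.2.3+2")     # same precedence
(precedence_key("1.2.3-1") <=> precedence_key("1.2.3"))    # -1: pre-release sorts lower
```

So a "rebuild-only" release expressed as build metadata would, by the spec, not be distinguishable by precedence at all -- which is exactly the tension being discussed here.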
What is the "semvar way" though?
#80 covers a subset of this, but I believe that it is still unanswered and is core to what you are asking.
If you completely hide/internalize all types in the dependency, then the version update to 1.2.4 or 1.3.0 should reflect your level of changes needed to consume the dependency.
If on the other hand you expose ANY types from the dependency, and those types had any breaking changes, you now have a breaking change and should release 2.0.0.
I think it is more closely related to #146. In this case (assuming the API didn't change), you actually have a reason to publish an 'unchanged' version (unchanged externally; internally it did change because of the updated dependency).
The question, then, is how to version an unchanged external API with a changed internal implementation.
I wouldn't say more related, but equally related. Both are concerned with dependencies changing, each having different outcomes based on changes reflected in your public API.
I guess the point is that a lot of the time, the "change" required to absorb a dependency update is simply a recompile. Nothing changed in your public API, and you didn't add any new features.
The rules seem lacking here.
It isn't a bug fix/patch so 1.2.4 seems wrong.
It isn't an addition of functionality so 1.3.0 seems wrong.
It is still backwards compatible so 2.0.0 seems wrong.
Now, one could ask what the dependency's change was for. If it was a bug fix, then maybe bump your patch version. If it added new features (that you obviously aren't using), you could possibly bump the minor. If it was a major change that somehow affects you such that you are no longer backwards compatible, then bump the major.
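That policy could be sketched as a tiny mapping function. Everything here is illustrative: the function name is made up, and the `:patch` result for a *non-breaking* major bump of the dependency is my own assumption, since the paragraph above leaves that case open.

```ruby
# Hypothetical sketch of the mapping described above.
# dep_change is the SemVer level at which the dependency moved;
# the return value is the bump suggested for your own package.
def suggested_bump(dep_change, breaks_my_api: false)
  return :major if breaks_my_api   # the dependency change leaked into your API
  case dep_change
  when :patch then :patch          # pick up their bug fix
  when :minor then :minor          # their new features ride along, unused
  when :major then :patch          # ASSUMPTION: still compatible for YOUR consumers
  end
end
```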
But those all seem to be using your version to indicate changes in other packages, not yours. Odd...
If you had to recompile, then it is a patch (you edited linker metadata); otherwise you should just use binding redirects or the like.
This is the core of dependency management: if you don't have to do anything to use an updated dependency, then don't do anything at all. Let your consumer decide.
If you want to force people to use the new dependency, then you have made a change and need a new version of some form.
An example of the options available, assuming you do want to take an update to a current dependency, follows below. If you do not want to update, then do not, and you are done.
Lib A is at version 12.3.4 and has type Awesomeness with 2 public methods FixAllTheThings() and DrinkAllTheBeer().
Say you have a dependency on Lib A and your current version is at 1.1.1.
Lib A is updated with a patch release, to 12.3.5, what should you do?
Lib A is updated to 12.4.0 with some minor non-breaking additional features, what should you do?
Lib A is updated to 13.0.0 because 'FixAllTheThings()' has been removed, what should you do?
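One observation on the first two scenarios: if the dependency on Lib A is declared as a range rather than an exact pin, no action may be needed at all, because consumers can resolve the newer Lib A themselves. A sketch using RubyGems' `Gem::Requirement` (the particular constraint shown is my own assumption, not something stated in the scenario):

```ruby
require "rubygems" # Gem::Version / Gem::Requirement ship with Ruby

# Declaring the Lib A dependency as a range covers the first two
# scenarios without any new release on your side:
constraint = Gem::Requirement.new(">= 12.3.4", "< 13")

constraint.satisfied_by?(Gem::Version.new("12.3.5")) # patch release: already allowed
constraint.satisfied_by?(Gem::Version.new("12.4.0")) # minor release: already allowed
constraint.satisfied_by?(Gem::Version.new("13.0.0")) # breaking release: you must act
```

Only the 13.0.0 case -- the removal of `FixAllTheThings()` -- forces you to edit your own package, and then the question of how much to bump actually arises.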
Question: if you only need to recompile, with no code changes whatsoever, shouldn't the update go into the build metadata? The source is the same; only the build differs.
if you have to recompile, something changed
To throw a little gasoline on the fire: What if your dependency change increases the minimum required version for the dependency? And moreover, what if that dependency was not a minor, easily-upgraded component but a broad piece of the platform, such as dropping support for Ruby 1.8?
@wcooley would you consider dropping a platform a breaking change? I think I would, which would mean the next major version should be used. Don't be scared that version numbers go up, that's what their job is.
@EddieGarmon Oh yes, I definitely would consider dropping a platform a breaking change -- but I am mainly looking at this from the ops/deployment side, where "public API" is either less meaningful or means something different.
The v2.0.0 spec does not really address this in the FAQ "What should I do if I update my own dependencies without changing the public API?" or perhaps says the opposite: "That would be considered compatible since it does not affect the public API."
I am motivated to seek clarification because I am seeing GitHub's own projects dropping support for Ruby 1.8 while only incrementing the patch level, and I wanted to make sure that I am not misunderstanding something. See, for example, gollum/gollum-lib#44 (which propagates up to gollum itself).
I don't like to talk in terms of a "public API" when it comes to versioning, for just this reason. It's too limiting. Instead, I consider the entirety of what my code provides and relies on, and how others view that, from the point of view of published information (or perhaps "reasonable expectation"; I'm not sure which is more appropriate, but I do know that all published information would form part of a reasonable expectation). I don't have a single word to describe what I'm thinking of, but it is essentially the "universal everything" that isn't the code under consideration. I'll call it the "gestalt" for the purposes of this comment, because it's a cool word. You'll hate it by the time you're finished reading this, though.
To expand a little further, I like to think about this problem in terms of the set of "valid gestalts". A gestalt is considered valid if it conforms to every piece of published information and/or reasonable expectation about the code. If the other parts of the gestalt rely on the API in the manner it has been described, and the gestalt contains a set of other components that the code has stated are required for its proper operation, then the gestalt is a part of the valid set. If a gestalt is missing something essential, or it relies on undefined behaviour of the code to get its job done, then it is an invalid gestalt, and is not part of the set.
I apply this "set of valid gestalts" principle like this:
Some examples, borrowing from the gollum-lib example from @wcooley, because that's how I came across this issue and decided to drop my (very wordy) 2c in:
If I drop support for a version of Ruby, then that's a major version bump (nokogiri should have done this in the first place). The set of gestalts previously included Ruby 1.8, and now it doesn't.
If I change the dependencies in my package such that a new version of that dependency must be installed, that's a major version bump. Gollum-lib should have done this when it bumped the nokogiri dependency (whether or not nokogiri did the right thing).
I expect this to be controversial, but it does follow logically from the "gestalt view" (in that the set of valid gestalts in the old version included an old version of the package, but the set of valid gestalts in the new version does not, therefore the new set of gestalts is not a superset). This isn't as theoretical as it appears, either -- if I bump a dependency, that could have some nasty ripple effects (if my package is a dependency of something that also depends on one of my dependencies, and I major-bump my dependency, I've now broken things).
If I modify my code so that it now runs under a newer (or older!) version of Ruby when it didn't previously, then the set of valid gestalts for the new version contains all the old ones -- all the previously supported versions of Ruby still work -- and also contains gestalts that weren't valid previously (ones that are identical to the existing ones, but also include this new version of Ruby). This can be released with a minor version bump.
If I make a modification to my code such that things now run faster, this illustrates the "published information" principle. If the previous version made no statement about performance, then that's just a patch bump -- the set of valid gestalts is unchanged (because performance was not an axis along which validity was measured).
On the other hand, if I did promise some certain level of performance, then I need to make at least a minor version bump, if my promise was "it'll run at least this fast". In that case, the set of valid gestalts included all those which were OK with your code taking up to the time it used to take, and also those gestalts which needed the code to run as fast as it does now. If, however, you stated a valid time range, and you're no longer within that range, then that's a major version bump -- gestalts which previously assumed that the code would take a certain time to run will no longer be valid.
(As an aside, before anyone says, "that's a ridiculous example, why would anyone have a problem with your code running faster?", I'll just point to PC XT games running on an 80286... those machines had a turbo button for a reason...)
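The platform examples above reduce to a superset check over the set of valid gestalts. A toy Ruby sketch, reducing each gestalt to just its supported platform versions (a deliberate simplification -- a real gestalt covers far more, as the comment says):

```ruby
require "set"

# Classify a release by comparing the old and new sets of valid
# "gestalts", modeled here as just the supported platform versions.
def bump_for(old_platforms, new_platforms)
  old_set = Set.new(old_platforms)
  new_set = Set.new(new_platforms)
  if !new_set.superset?(old_set)
    :major   # some previously valid gestalt is now invalid
  elsif new_set != old_set
    :minor   # every old gestalt survives, and new ones appear
  else
    :patch   # the set of valid gestalts is unchanged
  end
end

bump_for(["ruby-1.8", "ruby-1.9"], ["ruby-1.9"]) # => :major (dropped 1.8)
bump_for(["ruby-1.9"], ["ruby-1.9", "ruby-2.0"]) # => :minor (added 2.0)
```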
Anyway, this is potentially the longest all-text comment ever to be posted to a Github Issue, and I do apologise. In summary, I think that just considering the "public API" of a piece of code is insufficient to be able to properly version the code in a way that allows others to safely rely on the semantics of the version number when making decisions.
So this just happened: quartznet/quartznet#229, anyone more familiar with semver can comment on this?
If the dependency whose major version was updated is linked or bundled statically and does not impose changes on the project's API, it should be okay to just increase patch or minor if you expect few issues to arise.
If, however, you are linking dynamically (or just working dynamically in general), I think increasing minor is a must, and probably even major, since a library can have two types of interfaces: incoming and outgoing. Upgrading the logger dependency would be an outgoing interface change and must be considered part of the public API.
I agree with @FichteFoll here:
"If the dependency whose major was updated is linked or bundled statically and does not impose changes for the project's API it should be okay to just increase patch or minor if you suspect few issues to arise."
The train of thought is: the single responsibility principle.
A dependency should only be responsible for the API it exposes itself.
(Even if its API can be augmented by another dependency.)
Say Library declares its dependency as Interpreter ~> 1.0, and a bug turns up in the current Interpreter release. So Library bumps its dependency to Interpreter 1.1.0 to side-step the bug.
What should the version of Library be?
Library 1.0.1 (patchlevel change).
Because the API didn't change - specifically, there are no code changes necessary in Tool (or apps using Tool), so there's no need for even a minor bump in Library.
Even if Library's API is augmented by a different interpreter version (Interpreter 1.1.0), it doesn't matter.
Although Interpreter 1.1.0 can "extend" the API of Library, it isn't Library's responsibility to document and manage that API. That's because Tool can further bump the dependency on Interpreter to Interpreter 1.2.0 - outside the control of Library.
The interesting thing here is that users typically see Interpreter as a "special case", which is "outside" SemVer. So they expect Library 1.0.0 to be bumped to Library 2.0.0, because the following case seems "broken":
So it might help if this case was accurately covered by SemVer.
(That is, under the single responsibility rule, a change in patchlevel does not guarantee that dependencies will stay satisfiable.)
I think it's an error to suggest a major bump in this case, because you can no longer patch the previous major version with automatic upgrades.
E.g. if Library was erroneously bumped to 2.0.0 (just to avoid the "install breakage" above), then Tool is still using Library 1.0.0, which not only contains the bug, but needs a dependency change to the major version Library 2.0.0 to be fixed. And that misleadingly suggests an API change (which never happened).
In short: a backward-compatible bugfix should be reflected by no more than a patchlevel change, regardless of the circumstances (dependencies or dependencies of dependencies needed for the patch).
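The constraint arithmetic behind this is easy to check with RubyGems. Assuming (my assumption, to make the example concrete) that Tool pins Library with the usual pessimistic constraint:

```ruby
require "rubygems" # Gem::Version / Gem::Requirement ship with Ruby

# Tool expects to receive backward-compatible fixes from Library
# automatically via a pessimistic constraint.
tool_constraint = Gem::Requirement.new("~> 1.0")

tool_constraint.satisfied_by?(Gem::Version.new("1.0.1")) # patch: the fix flows to Tool
tool_constraint.satisfied_by?(Gem::Version.new("1.1.0")) # minor: also flows to Tool
tool_constraint.satisfied_by?(Gem::Version.new("2.0.0")) # major: Tool is stranded on the buggy 1.0.0
```

This is exactly why an erroneous major bump is harmful: the fix exists, but no existing consumer's constraint can reach it.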
I'm wondering how to present this in a generic way (without the notion of an 'interpreter').