Let's install peer deps again! #43
While I could be convinced to make missing peer dependencies exit with a non-zero install code, I have significant concerns on the "automatically install peer dependencies" part.
To summarize, I think peer dependencies are one of the most elegant and powerful concepts in the way our package managers work, and I'd be careful about touching them.
Luckily, pnpm does not have the described issue caused by hoisting. In the described use case, pnpm will link react 16 to the node_modules of ink.
I would agree to clone npm's behavior into pnpm and exit with non-zero exit code on unsatisfied peer dependencies. We already have a strict-peer-dependencies config that is
I agree with arcanis that this can/should be fixed without the autoinstallation of peer dependencies. Even if this is accepted, I don't think pnpm will ever change the way peer dependencies are treated. It would require a huge rewrite on our side. pnpm resolves peer dependencies during a separate installation stage that happens after all the dependencies have been resolved and fetching has started.
In npm, a dependency on a version will already be satisfied by a git dep if it has a matching version.
I'm not sure if that means it's "tied to one registry", but installing from a different registry is fine, too. But clearly, from npm's point of view, this is not "obviously broken". It's normal, and a pretty common workflow for using git deps to rely on a fork that floats a patch while awaiting a PR merge.
It does open the door for having to do actual constraint solving, it's true. This occurs when A, B, and Z all land at the same level in the tree (i.e., they are all dependencies of the same package).
The simplest answer is that if there is a peer dep that satisfies all predicates, we use that and continue; if not, the install fails. The harder answer is to search for an instance of the predicates where there is overlap, using pubgrub or something very much like it. Again, if there is no overlap, the install fails.
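The "simplest answer" above can be sketched as follows. This is an illustrative model, not any package manager's actual code: versions are plain tuples and ranges are plain predicates rather than real semver ranges.

```python
# Sketch of the simple strategy: given the candidate versions available
# for a peer dependency and the range predicates declared by each
# dependent (A, B, Z, ...), pick the highest version that satisfies
# every predicate; if none exists, the install fails.

def resolve_peer(candidates, predicates):
    """Return the highest candidate satisfying all predicates, or None."""
    for version in sorted(candidates, reverse=True):
        if all(pred(version) for pred in predicates):
            return version
    return None  # caller treats None as a failed install

# Example: A wants Z >= 16, B wants Z < 17, the root wants Z >= 15.
predicates = [
    lambda v: v >= (16, 0, 0),   # A: "^16.0.0" (simplified)
    lambda v: v < (17, 0, 0),    # B: "<17"
    lambda v: v >= (15, 0, 0),   # root: ">=15"
]
candidates = [(15, 6, 2), (16, 13, 1), (17, 0, 0)]
assert resolve_peer(candidates, predicates) == (16, 13, 1)
```

The "harder answer" from the comment above would replace this single intersection check with a full constraint solver over the whole graph.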
Also note that this would only occur when A, B, ..., are all added simultaneously. If a user does
Most packages today only have 1 or 0 peerDependencies, so the impact here is likely to be far smaller than you're suggesting. And while failing an install is not good, failing at run-time is much worse. And that's what we do today. While you're right that there's a lot of documentation about this, there's also a lot of users who are forced to learn implementation details unnecessarily, when the package manager could resolve it just fine.
Note that peerDependencies were installed by npm when originally implemented at the end of 2012. It only changed in npm v3, first released on 2015-06-25, because it was harder to get them to play nicely with the deduping logic. So, 2.5 years of installing them, 4.5 years of not installing them, then some unknown number of years installing them again. It's not like it's some huge tradition we'd be bucking here, one way or the other.
Though I think we may just end up disagreeing, this is very helpful feedback, thanks. I intend to update this RFC with more use cases and algorithm examples prompted by this discussion.
Wouldn't it be possible in pnpm's installation approach to just treat peerDependencies as if they were normal dependencies, since everything is already maximally deduplicated with symlinks? I am sure I'm missing some subtlety, but I'm curious what problems you'd see if pnpm auto-installed peer deps.
There is no tie in this case because the package isn't installed. If the package becomes installed from the registry by default, then it's a whole other story.
Consider a package like
With auto-install, the package managers will instead try to install
While writing these words I notice it's even more problematic than I thought, because an attacker would be able to publish on the default registry a malicious package with the same name as an optional peer dependency that was meant to be fetched from a different source. Any package that omits fulfilling the peer dependency would be vulnerable to this form of "dependency injection".
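To make the scenario concrete, here is a minimal sketch (all package names hypothetical) of a plugin that declares an optional peer dependency its consumers are expected to provide from a private source:

```json
{
  "name": "example-plugin",
  "version": "1.0.0",
  "peerDependencies": {
    "internal-tool": "^2.0.0"
  },
  "peerDependenciesMeta": {
    "internal-tool": { "optional": true }
  }
}
```

Today a consumer that omits `internal-tool` simply doesn't get it. With auto-install, the package manager would fetch whatever `internal-tool` resolves to on the default registry, which is exactly the name an attacker would want to squat.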
No, this isn't the case. With regular dependencies, A, B, and Z each have their own set of dependencies that have no guarantee of being related in any way. Package managers will likely try to optimize (deduplicate) them in some capacity, but aren't forced to do so, because there are many different ways you might want to apply your optimizations: optimize for package count, optimize for package size, optimize for file count, ...
The mechanism you're referencing would significantly break this model, because the auto-fulfillment of the peer dependency would have to be the overlap of all its dependents' ranges for the semantic contract of peer dependencies to hold, making the architecture much more complex because of the parent↔child relationship that I described.
Pubgrub and other problem solvers would be potential candidates, but from my experience such systems are hard to get right, hard to maintain, hard to understand, and only provide a debatable value for a hefty cost. I'm not against them per se, but I don't think they should be the only way to implement a correct package manager in the Node ecosystem. It should be an optimization, not a requirement.
I mentioned that, but I have a different interpretation. The few numbers I can find about npm show that the number of packages and users have each doubled every year - which would imply that the ecosystem size is
When you consider that a lot of people aren't even aware that npm is a company, that reveals how hard it'll be to change this kind of behavior without causing a lot of confusion.
So, just to make sure I'm understanding you:
And, let's say that we install
Iff the repo at github:user/foo has a version matching 1.x, then the
I don't see the problem, am I missing something?
Are you saying that the one in my existing dep graph is the one from git? Ok, well, if the version in the git repo's package.json matches the peerDep's requirement, then it'll just be deduped, and never fetched. It seems like a stretch to call it an attack vector. Even if it is, I fail to see how it would be any more dangerous with peerDependencies than it would be with regular dependencies. Wouldn't development of the
Says who? That's not how it works today for any other dep type. If you want a git dep, install the git dep.
They are an optimization over "I can't install this, figure it out yourself". Which, again, is the current status quo, so I don't think anything is lost even if we don't add that optimization.
A failed install is preferable to a broken/incorrect install.
I'm curious. Please identify a case where advanced constraint solving would be strictly required, though A, B, and Z are not dependencies of the same package. As far as I can tell, the algorithm described in the RFC would land them at different points in the tree, with their different peer dep resolutions, if they were incompatible and required by different packages. And, it is only the top level where this is guaranteed to be unresolvable, though it may still be unresolvable deeper in the graph.
Likewise, if P depends on B and Z1, then there's no way for B's peer dependency to be met, because the Z at or above P's level would have to be Z2 to satisfy B's peer dep, and Z1 to satisfy P's dep.
Many of these scenarios will be very unlikely and hard to get into, however, because typically every dependency is the root of its own development project, and the restriction is strictest there. If P depends on Z1, then the installation of B will fail in the P project while in development. It'll be hard for P to ever depend on both A and B, because in the development of P as a root project, the installation of either A or B will fail once the other is present. So, P depending on three packages with incompatible peerDependencies is wildly unlikely, as it would be guarded against in the development process.
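To illustrate the "different points in the tree" case with hypothetical names: if the root depends on A (peer dep Z@1.x) and on C, and C depends on B (peer dep Z@2.x), the algorithm described in the RFC could lay out the tree roughly like this:

```
node_modules/
├── A/                    # peerDependencies: Z@1.x
├── Z/  (Z@1)             # hoisted; at A's level, satisfies A's peer dep
└── C/                    # regular dep on B
    └── node_modules/
        ├── B/            # peerDependencies: Z@2.x
        └── Z/  (Z@2)     # nested; at B's level, satisfies B's peer dep
```

Only when A and B land at the same level (both direct dependencies of the same package) does the conflict become unresolvable by placement alone.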
But peer dependencies are not like any other dep type. You don't specify where the package should come from; you only specify which version it should be. I believe that's where the problem in your example lies. The range attached to the peer dependency in
Which means that if you install
It's very different because it means that a user that would forget to list a peer dependency would automatically download from the default registry a package they don't intend to. With regular dependencies they would only get what they declare from where they declare.
The only way to fix this would be for a peer dependency declaration to somehow be able to both declare the expected version for the package (
And I agree with that, which is why I would be supportive if the RFC was to make unmet peer dependencies generate errors by default. Yarn and pnpm would both implement this, because it's a simple idea, with a known scope, that makes the ecosystem stricter and safer while being generally easy to implement. It's a win for everyone.
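For context, pnpm already exposes this stricter behavior through its `strict-peer-dependencies` setting, which can be enabled in the project's `.npmrc` (a sketch; the setting name is pnpm's own):

```ini
# .npmrc (project root) — with this enabled, pnpm fails the install
# on missing or unsatisfied peer dependencies instead of only warning.
strict-peer-dependencies=true
```

Making this the default across package managers is essentially what the "errors by default" proposal amounts to.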
By contrast, automatically installing peer dependencies would be backward incompatible with packages as they got published so far, wouldn't play well with existing implementations, could have unforeseen effects with regard to the ecosystem, and would answer no particular pain point that I've seen reported. I would not implement it in Yarn in this situation.
Aha, I think this is the fundamental nut of our disconnect here. Other than not installing them, and requiring that they be found at or above the dependent's level, npm doesn't treat
That is, in npm's model, a semver range dep is about what the thing is. It says "wherever you get this thing from, it should match this version range." If there is not some other specific origin providing that dependency already (ie, a git dep that it can dedupe to) then it'll fetch from the registry. Once it's been fetched, it has a
If you want to specify that you have a git dependency at a specific semver range or version, then you can use the
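One concrete way to express this in npm (a sketch with hypothetical names; `#semver:` is npm's committish syntax for matching a git repo's tags against a range):

```json
{
  "dependencies": {
    "foo": "github:user/foo#semver:^1.2.0"
  }
}
```

With this spec, npm resolves the dependency from the git repo but still enforces that the chosen tag satisfies the range, so the "what it is" and "where it comes from" are declared together.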
A peerDep is no different (to npm) from regular deps except in where it has to be installed in the node_modules tree relative to its dependent. Maybe that's not how Yarn works, but that's fine. Yarn isn't npm.
At this point, I think this particular issue has been fully explored. I am comfortable with disagreeing.
I'm having trouble understanding this bit. If you declare a regular dep on
I reported a pain point in this RFC. More:
In fact, searching stack overflow for "peer dependencies" is mostly people wondering what to do when they encounter the error that the peer deps are missing. This is just from the first page, but the issues started about 4 years ago when npm 3 shipped, and it's still a regularly asked question today.
This is a legitimate source of user pain that I intend to address.
Nothing stopping you from doing that :)
And that for more than four years it got specified as not being installed unless explicitly specified by the consumer. That sounds like a significant difference in the context of this discussion.
Two more questions then:
At the very least, if you're still adamant about doing this, I would appreciate you making sure it doesn't split the ecosystem in half, with packages not being installable in other projects simply because we wouldn't have taught their authors well enough that this is a fallback, not a feature.
Printing warnings when automatic installs happen seems like a great idea; that would lead users towards doing what they should Always Have Been Doing anyway: explicitly specifying peer deps as top-level deps/dev deps.
I dearly hope that
One way we could thread this needle is to not completely fail the install if a peer dep can't be installed. Instead, any unresolvable peer dep would be a warning and a non-zero exit code from
So, no package that's currently installable would become uninstallable in npm v7, but packages with a peer dep conflict would cause the install to exit with an error.
I'm not sure if that's the best approach, since it does leave the tree in a broken state, but it could be a helpful step towards strictly requiring peer dep resolution in a future npm version, while minimizing the ecosystem breakage. It would also leave the door open for a UI to walk the user through deciding on a resolution, which could then be saved to a
My intent is to just go ahead and quietly install peer deps if they can be resolved successfully. The warning leads to unnecessary user confusion.
I don't want to be annoying, but I think you have misunderstood me. When I said:
I didn't mean peer dependencies with no possible resolutions - I meant dependencies with resolutions that will end up automatically installed on npm but not on other package managers. That will mislead at least some users into thinking that omitting peer dependencies is fine, because it works on their package manager so it must work everywhere, only for things to crash on their users' setups, because no, it was never fine. And it'll be up to us to explain it to your users.
Now there are solutions to this conundrum. I see three:
The problem with that is that a strict reading of this requirement would allow a package manager to just always return an exit code 0 from
Both are fine by me, but not both at the same time...
Why would automatically installing peer deps lead people to think that it's fine to omit peer deps? It seems like it would make it less likely, in fact. Currently, if you omit peer deps, you save your users a confusing error, and can give them something useful at run time. "I see you have react 15, but this module can only be used with react 16". Or alternatively, just break, because some variable is not found (as is the case with ink).
If npm makes this work, that seems to me like npm doing its job.
I think it's clear that's what Yarn will do, and not what npm will do.
We're going to install peer deps without warnings. Part of the point of this is to make it just work most of the time, without requiring users to fix stuff by hand. If the peer deps can be resolved and installed, then the exit code will be 0.
That doesn't solve the problem that users are facing. So, no.
Pretty clear that isn't what @ljharb was suggesting.
Because they will be silently installed even though they are not added as dependencies.
But if autoinstallation adds the autoinstalled peer dependencies as direct dependencies (to the project's package.json), then it will be clear that peer dependencies should be installed. And these projects will work with pnpm/yarn.