
Built-in support for semvered specifiers on the deno.land/x registry #17495

Closed
dsherret opened this issue Jan 22, 2023 · 40 comments
Labels
suggestion suggestions for new features (yet to be agreed)

Comments

@dsherret
Member

dsherret commented Jan 22, 2023

Extracted from #17475 and rewritten with added context

Problem

On the deno.land/x registry, it's difficult for module authors to publish a module and specify the version constraint of their dependencies. This often leads to module consumers ending up with duplicates of almost the same dependency (ex. 1.2.1 and 1.2.2 of the same module).

The recommended solution today is for module consumers to examine their dependency tree, then do the "module deduplication" themselves. This is hard for several reasons:

  1. Users need to remember to inspect their dependency tree.
  2. Users need to know if having multiple versions of a dependency in their tree will cause issues. When it does, this can be really painful and hard to figure out.
  3. Usually module authors have the best context of what versions of their dependency work well with their module.

Proposed solution

To allow users and module authors to easily (and optionally) depend on version constraints of dependencies, built-in support for semvered specifiers on the deno.land/x registry could be added to provide a recommended solution to this problem.

// https://deno.land/x/example/mod.ts
import { compile } from "deno:path_to_regexp@^6.2/mod.ts";

The deno CLI could analyze these semvered specifiers and, based on them, create an internal implicit import map:

{
  "imports": {
    "deno:path_to_regexp@^6.2/mod.ts": "https://deno.land/x/path_to_regexp@v6.2.1/mod.ts"
  }
}

This would solve all the issues above out of the box and allow specifying more terse imports for the deno.land/x registry.

Other tools wishing to use code that has these specifiers could run it via an import map generated from the output of deno info --json.

  • The registry needs an endpoint that specifies all the versions of a package. We already do this for LSP auto-completions, but we should have something explicitly for this.
  • deno: or something else?
  • How could we support other registries? Environment variable for sure, but maybe a way to do this in the specifier itself (ex. deno://crux.land/path_to_regex@^6.2)?
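As a rough illustration of the resolution step described under "Proposed solution" (the versions endpoint path and response shape below are assumptions, not an existing deno.land API), the CLI side could look something like this:

// Minimal sketch: resolve deno:path_to_regexp@^6.2/mod.ts to a pinned URL
// using a hypothetical endpoint that lists published versions.
const versions: string[] = await (await fetch(
  "https://deno.land/_api/x/path_to_regexp/versions.json",
)).json(); // e.g. ["v6.2.1", "v6.2.0", "v6.1.0"]

// Simplified matching: only 6.2.x is considered here; a real implementation
// would evaluate the full ^6.2 range with a proper semver library.
const best = versions
  .map((v) => v.replace(/^v/, ""))
  .filter((v) => v.startsWith("6.2."))
  .sort((a, b) => b.localeCompare(a, undefined, { numeric: true }))[0];

const importMap = {
  imports: {
    "deno:path_to_regexp@^6.2/mod.ts":
      `https://deno.land/x/path_to_regexp@v${best}/mod.ts`,
  },
};
console.log(JSON.stringify(importMap, null, 2));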
@ayame113
Contributor

I think the disadvantages of the complexity introduced are greater than the advantages gained by implementing this.

Since npm modules are designed with semver resolution in mind, having multiple versions of a dependency tends to cause problems there. Therefore, I think npm specifiers do need semver support.
However, deno.land/x modules were not designed with semver resolution in mind. They are designed to work even when multiple versions of a dependency are present. If anything, I'd rather avoid semver's security issues by always pinning dependencies.

If there's a use case for semver, it's when I want to bundle front-end code to reduce size instead of running it locally on Deno. In that case, I'm not running any code on Deno, so I don't think having built-in semver support would help. (Maybe a clever bundler that automatically resolves semver would be useful here.)


What if another runtime starts implementing import specifiers like node:, deno:, cloudflare-workers: or vercel:? And what if there are subtle compatibility differences between them? I think developers will probably be confused.
To make the future JS ecosystem better, instead of inventing your own specifiers here, I think it's better to adopt the same solution as web standards and Node.js, and keep module resolution as simple as possible.

@albnnc
Contributor

albnnc commented Jan 23, 2023

Ideological Concerns

Built-in support for semvered specifiers on the deno.land/x registry

Could you please remove deno.land/x from the title if this isn't exclusive to /x/?

it's difficult for module authors to publish a module and specify the version constraint of their dependencies. This often leads to module consumers ending up with duplicates of almost the same dependency (ex. 1.2.1 and 1.2.2 of the same module).

As @ayame113 noted, it should be completely OK for non-legacy code. Yes, you still have some exotic cases nowadays, but these are exceptions rather than the reason for such a controversial change.

Users need to remember to inspect their dependency tree.

Even if you do somehow have these problems, an inspectable deps tree was part of the ideology, which is good. It's good for a developer to know which deps their project has.

Usually module authors have the best context of what versions of their dependency work well with their module.

Yeah, but the main idea of semver is that any version under the same major number should be compatible. If it's not, the library author should deprecate the incompatible version and create a major release. I mean, a specific @{version}-like release in the URL is a good enough indicator of which library versions should be used.

The deno CLI could analyze these semvered specifiers and based on them, create an internal implicit import map:

This doesn't feel right, since it's neither simple nor stable: possible changes to the registry index could affect your dependency tree.

Many of the ideological concerns were stated by the community in the original roadmap issue. I personally think that @aapoalas was exceptionally strong in this comment.

Technical Aspects

There is a common use case where a module from deno.land/x uses another module from crux.land, and so on. Are you going to consider deno:lib_x@18.0.0 and custom:lib_x@18.0.0 the same module? Sometimes that's not true and still has to be resolved manually. This is probably why you can't use the same package from different registries in Node. In Deno, though, this was perfectly fine because of the manual inspection, which was considered an ideological decision.

What if one ends up with deno:lib_x and https://cdn.custom.com/lib_x in the same dependency tree? I mean, you still have to monitor your dependency tree and solve these problems manually from time to time. I feel like the usage of deno: specifiers is going to make people drop classic CDNs because of that. Won't it?

The registry needs an endpoint that specifies all the versions of a package. We already do this for LSP auto-completions, but we should have something explicitly for this.

This is going to increase the complexity of a self-hosted CDN / registry. Moreover, it seems that a self-hosted registry would be vendor-locked to Deno's implementation, which is not good. Nowadays we have a wide choice of full-featured CDN implementations, which is nice. Yes, you kind of could still use these afterwards, but at the same time you could not, because of the previous point.

deno: or something else?

It feels like deno: could be read as referring either to the Deno namespace or to deno.land/std, which might be misleading in some cases.

How could we support other registries? Environment variable for sure, but maybe a way to do this in the specifier itself (ex. deno:crux.land/path_to_regex@^6.2)?

There is a common use case where a module from deno.land/x uses another module from crux.land, and so on. So, in my opinion, an environment variable doesn't suit this very well. However, deno:crux.land/... looks very ugly and I don't think we're going to be happy with that either. Moreover, it underlines the "badness" of other registries / CDNs compared to deno.land/x. Are you going to postulate the usage of a single registry for a particular app (dependency tree)?

Maybe we could register custom: URI schemes via an extension of the import map spec, but this would add even more complexity. Any other ideas?

@aapoalas
Collaborator

aapoalas commented Jan 23, 2023

  1. Users need to remember to inspect their dependency tree.
  2. Users need to know if having multiple versions of a dependency in their tree will cause issues. When it does, this can be really painful and hard to figure out.
  3. Usually module authors have the best context of what versions of their dependency work well with their module.

I like this problem statement. Nice to have this.

First a couple of arguments against it:

2: Users needing to know if a duplicated dependency will cause issues is unavoidable: Some dependencies will, some won't. Some of those dependencies will be used by other dependencies with a static version which may clash with other static versions defined by yet another dependency. Semver resolution doesn't make this issue go away, except if everyone uses semver and they define similarly loose semver ranges. Manual deduplication (or an automatic tool to help with this) will still be needed for any other case.

3: Most modules will probably only be tested to work on the specific version that the developer used locally when they were writing it. A module with a large support crew like Nest.js or React might actually have tested at least some alternative versions in the semver range but I doubt that's a very common occurrence. eg. Is Fresh tested against all Preact v10 series versions? Most likely not: We just trust that breaking changes do not occur within the same semver. That exact same trust can be used by an end-user to deduplicate locally.

Now, it's definitely true that deduplication currently would be done manually using an import map. However, I do not see this to be a particularly horrible state of affairs in general. Import maps are a great and powerful tool, and they are a standard so it makes sense to use them. The Deno team proposal would effectively leave the world of import maps: Yes, internally it might still be an import map resolution or something that can be turned into an import map, but it would still be an internal resolution algorithm. You might as well argue that Node is using import maps since for any node_modules configuration it is possible to write an import map that resolves the same way.

Alternative proposal

I would instead propose that Deno would simply make generating this proposed internal import map easier. That is, provide a tool (either in Deno CLI or as a script in std) to both view the dependency tree information in a more human readable format (suppressing details in favour of readability), and to generate an import map from the dependency tree with flags controlling (automatic) deduplication.

A rough sketch would be something like this (all the command names and flags here are horrible, don't mind them specifically):

Human readable format for detecting duplicates

deno info --human-readable mod.ts

This would output the dependency tree (taking into account existing import map if detected) but dropping out all individual code files from the listing, as well as flattening the dependency tree: In this view we do not care if dependency X is imported by A or B or both, nor if it is X/mod.ts or X/deps.ts or both that we are importing. This would be a simplified list of which HTTP URLs we're importing code from with similar paths joined together so that X/mod.ts and X/deps.ts simply become X.

EDIT: The intention here is to provide an easy form from which a user can manually detect duplication, eg:

https://deno.land/std@0.164.0
https://deno.land/std@0.172.0

This command could also optionally highlight (potential) duplicates with a different color or such.
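A minimal sketch of that flattening and duplicate highlighting (this is not an actual deno info flag; the URL list below stands in for the module graph):

// Collapse full module URLs to "origin + package" and report packages that
// appear with more than one version.
const urls = [
  "https://deno.land/std@0.164.0/path/mod.ts",
  "https://deno.land/std@0.172.0/fmt/colors.ts",
  "https://deno.land/x/oak@v11.1.0/mod.ts",
];

const seen = new Map<string, Set<string>>(); // "origin + name" -> versions
for (const raw of urls) {
  const url = new URL(raw);
  const match = url.pathname.match(/^(.*?)@([^/]+)/);
  if (!match) continue;
  const name = url.origin + match[1];
  const versions = seen.get(name) ?? new Set<string>();
  versions.add(match[2]);
  seen.set(name, versions);
}

for (const [name, versions] of seen) {
  const flag = versions.size > 1 ? "  <-- potential duplicate" : "";
  console.log(`${name} (${[...versions].join(", ")})${flag}`);
}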

Generating import maps with automatic or semi-automatic deduplication

deno info --import-map --dedupe=automatic mod.ts

Same as above, but now write a new import map and request automatic deduplication. Given that the /x/ registry does not force semver, nor does it force deno: imports or anything else, this would need to be somewhat conservative, but that goes for deno: imports as well. Similarly, of course, this couldn't dedupe between different registries, which again goes for the proposed deno: imports as well.

This command would generate an import map where eg. /x/X@0.1.0/mod.ts and /x/X@0.1.1/mod.ts get deduplicated to X@0.1.1/mod.ts (or possibly to just X/mod.ts), ie. it would handle semver deduplication. The basis could be something like BASE_NAME@VERSION/PATHNAME, where BASE_NAME determines the "name" of a library, VERSION of course determines the version and is checked for semver-likeness, and finally PATHNAME is the "ignorable" part that determines an individual module within the library. If two imports with the same BASE_NAME but differing VERSION fields are detected, then a deduplication is done.
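As a sketch of that rule under the assumptions above (the version comparison is deliberately simplified; a real tool would use proper semver parsing):

// Parse URLs into BASE_NAME, VERSION and PATHNAME, then map every import of a
// BASE_NAME onto the highest version seen for it.
type Parsed = { base: string; version: string; path: string };

function parse(raw: string): Parsed | undefined {
  const url = new URL(raw);
  const m = url.pathname.match(/^(.*?)@([^/]+)(\/.*)$/);
  return m ? { base: url.origin + m[1], version: m[2], path: m[3] } : undefined;
}

function newer(a: string, b: string): boolean {
  const nums = (v: string) => v.replace(/^v/, "").split(".").map(Number);
  const [x, y] = [nums(a), nums(b)];
  for (let i = 0; i < Math.max(x.length, y.length); i++) {
    if ((x[i] ?? 0) !== (y[i] ?? 0)) return (x[i] ?? 0) > (y[i] ?? 0);
  }
  return false;
}

function dedupeImportMap(urls: string[]): Record<string, string> {
  const highest = new Map<string, string>();
  for (const u of urls) {
    const p = parse(u);
    if (!p) continue;
    const current = highest.get(p.base);
    if (!current || newer(p.version, current)) highest.set(p.base, p.version);
  }
  const imports: Record<string, string> = {};
  for (const u of urls) {
    const p = parse(u);
    if (!p || highest.get(p.base) === p.version) continue;
    imports[`${p.base}@${p.version}/`] = `${p.base}@${highest.get(p.base)}/`;
  }
  return imports;
}

console.log(dedupeImportMap([
  "https://deno.land/x/X@0.1.0/mod.ts",
  "https://deno.land/x/X@0.1.1/mod.ts",
]));
// -> { "https://deno.land/x/X@0.1.0/": "https://deno.land/x/X@0.1.1/" }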

This tool could then be extended to control various parts of this deduplication and import map generation, such as:

  1. Never deduplicate but print a warning instead.
  2. Always prompt for deduplication (offer choice of keep deduplication, choose higher, choose lower, ...?).
  3. Automatically deduplicate to higher, lower.
  4. Create plain import map (turn each dependency import into a plain import using BASE_NAME.split("/").pop() kind of thing).

Additionally, this tooling could be built to take into account registry-provided hints of existing import maps in dependencies (I outlined an idea of registries opting to provide a path to a library's import map through a custom header), maybe also including loose semver definitions in this. A dependency could then opt to eg. do loose semver plain imports with an import map like:

{
  "imports": {
    "library": "library@^3.0.0",
    "library@^3.0.0": "https://deno.land/x/library@3.1.2/mod.ts"
  }
}

This would still mesh with a local import map that defined a similar loose semver range.

@aapoalas
Collaborator

One thing to consider as well:

The Deno team's proposed solution uses an implied internal import map to handle the deno: imports. It's not mentioned that external import maps would then be forbidden, so I presume that external import maps can still be defined in deno.json. In this case Deno will need to implement an algorithm for composing import maps, which is not (yet) supported by the spec. That already makes this a custom, out-of-spec solution. If we were to take a page from System.js' import maps usage, they explicitly mention that import map composition via an array of import maps should be considered an unstable feature. The same would apply here: if the spec for import map composition finally arrives and it turns out Deno guessed the final algorithm wrong, then Deno will either need to break existing software or bow out of spec compliance.

@albnnc
Contributor

albnnc commented Jan 23, 2023

@aapoalas, I also tried the same (or quite similar) deduping idea here. The code is undocumented and quite unoptimized, but I just want to say that this idea seemed to be working well.

This command would generate an import map where eg. /x/X@0.1.0/mod.ts and /x/X@0.1.1/mod.ts get deduplicated to X@0.1.1/mod.ts (or possibly to just X/mod.ts), ie. it would handle semver deduplication. The basis could be something like BASE_URL@VERSION/PATHNAME

One will probably want different bases or even URL patterns for different CDNs, since some CDNs sometimes produce different prefixes for the same assets. This should probably be configurable.
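One possible shape for that configuration, purely as a sketch (the registries and patterns are made up; URLPattern is the web-standard matcher available in Deno):

// Per-registry URL patterns describing where the package name and version
// live in that registry's URLs.
const registryPatterns = [
  new URLPattern({ hostname: "deno.land", pathname: "/x/:name@:version/*" }),
  new URLPattern({ hostname: "deno.land", pathname: "/:name@:version/*" }), // std
  new URLPattern({ hostname: "esm.sh", pathname: "/:name@:version/*" }),
];

function identify(url: string): { name: string; version: string } | undefined {
  for (const pattern of registryPatterns) {
    const match = pattern.exec(url);
    if (match) {
      const { name, version } = match.pathname.groups;
      if (name && version) return { name, version };
    }
  }
  return undefined;
}

console.log(identify("https://deno.land/x/oak@v11.1.0/mod.ts"));
// -> { name: "oak", version: "v11.1.0" }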

But yes, the idea of controllable resolution to the highest version already available in the dependency graph is what I'd like to see as well.

@KnorpelSenf
Contributor

  • How could we support other registries? Environment variable for sure, but maybe a way to do this in the specifier itself (ex. deno:crux.land/path_to_regex@^6.2)?

It used to be a value of Deno that deno.land/x wasn't special to the CLI in any way. I'd like it if things could stay that way. So whatever we settle on, the solution should treat deno.land as just another registry.

I think the idea by @aapoalas is great! It does not care about the implementation details of the registry as long as there is semver in the URL.

In addition, it will work well with how libraries are written today, it is easy to understand and maintain, and it seems to align pretty well with Deno's values of not doing any magic and following web standards. No more custom deno: protocols, just a simple tool to efficiently solve the problem in a straightforward way. I like this.

Deno is easy to learn. What I especially like about this suggestion is that it can be used optionally, which makes it very approachable. When people read module code or application code, they don't have to know anything about what using deno: instead of https: implies. The code just imports something and that's it. As soon as one must dedupe some deps (which isn't exactly the first problem to solve when learning how to code), one can use an import map, and it's generated automatically for you. Done.

@4513ECHO

Please try tani/lib.deno.dev (https://lib.deno.dev) for redirecting std or /x/ registry URLs with semver.
I think built-in support for semver with deno: is not needed.
We could instead have redirects or something like these:

  • https://deno.land/semver/std@^0.170.0
  • https://semver.deno.land/std@^0.170.0

@lilnasy

lilnasy commented Jan 29, 2023

deno: specifiers are a concerning API. I would much prefer deno.land handling more complex module resolution rather than the CLI, as others have suggested.

There was a discussion maybe over a year ago, in which ry suggested that deno.land should transpile modules automatically to js, presumably to align Deno with browsers and make things simpler for those who write for both. This feature seems to be going the other direction, and frankly it's worrying as a long-time user.

@lilnasy

lilnasy commented Jan 29, 2023

Credit where it's due, esm.sh already does what deno: is supposed to do (and npm: for that matter).

@rajsite

rajsite commented Mar 20, 2023

What happens for version resolution failures?

ie:

depA@1 -> depC@>=3
depB@1 -> depC@<3

As a plugin library author (i.e. depC is a library with a plugin mechanism and depA / depB register with depC), I want this to fail. I don't want this to resolve to two different versions of depC. As an app author, I want this to fail understandably so I know which "plugins" are failing resolution and can resolve that before the app runs. What would help me is knowing whether there are upgrade or even downgrade paths for depA and depB that can resolve successfully.

As a non-plugin library author, as long as these module uses are isolated I don't think I care how many copies are in an app. As a library author I want tooling that makes it easier to keep my library up to date and tested against the latest versions of dependencies. As the app author it's nice if the app doesn't duplicate dependencies. As an app author what I want is insight into my dependency tree and tooling for easy updates of dependencies.

If the community and applications had the necessary tooling and a strong ingrained practice of staying "up-to-date", that would lead to natural de-duplication in cases where de-duplication isn't critical for execution (minimizing app size) and make resolution easier in cases where de-duplication is critical (plugin libraries sharing transitive dependencies).

@dsherret
Member Author

dsherret commented Mar 20, 2023

What happens for version resolution failures?

ie:

depA@1 -> depC@>=3
depB@1 -> depC@<3

I wouldn't consider this a "version resolution failure", as it's a valid scenario that should be allowed. This would not error; it would resolve depC >= 3 for depA and depC < 3 for depB, similar to what npm and many other package managers do. The output of deno info, deno info --json, and the lockfile (deno.lock) would describe this, and tooling could be developed on top of that to be stricter about not allowing multiple versions of dependencies.

If someone really wanted they could use an import map to override one of the specifiers to point to a different version and remove the duplicate dependency if they find the code still works.
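For illustration (package names and versions here are made up): if depB pulled in depC at 2.5.0 while depA uses 3.2.1, and the consumer has verified that depB also works against 3.x, the duplicate could be collapsed with an import map entry like:

{
  "imports": {
    "https://deno.land/x/depC@2.5.0/": "https://deno.land/x/depC@3.2.1/"
  }
}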

@rajsite

rajsite commented Mar 20, 2023

I'm arguing that, as a module author, an important constraint I need to express is being a singleton (the plugin system use-case). I'm reiterating it because the response doesn't acknowledge that use-case.

If that use-case is not intended to be improved as part of semver support, that's fine to state. I'd probably argue the feature design of semver-aware modules is missing an important use-case that should be considered / captured in a design discussion somewhere.

The npm ecosystem had to do a bunch of twisting in the wind to express the singleton / plugin use-case out-of-band from semver, as peerDependencies. It would be nice for Deno to handle that better / have recommendations for how module authors should handle it.

@lilnasy

lilnasy commented Mar 20, 2023

This optimizes the server-side bundle size, but is that much of a concern?

@KnorpelSenf
Contributor

I'm still not a fan of deno: imports and I would prefer it if the registry would resolve this for us. I have come up with a few ideas on the matter, but your recent blog post at https://deno.com/blog/package-json-support rather sounds like you've made an irreversible choice already.

Is there any point to talk about this further?

@KnorpelSenf
Contributor

KnorpelSenf commented Mar 23, 2023

I'll just drop my thoughts here in a somewhat brief fashion.

Problem

We want to deduplicate dependencies. A imports B and C, B imports D, and C imports a slightly different version of D. Now D is largely or entirely duplicated, even though B and C could very well share the same dependency D if only they knew of each other. (Peer deps basically.)

What the solution should offer

IMO we should try to:

  • solve the above problem
  • follow web standards

What the solution should avoid

IMO we should avoid:

  • inventing new protocols or other standards
  • treating https://deno.land as special in any way (not even by making it a default or a global setting or anything; it's a URL like any other!)

What I think we should do

Example time.

Let's say we have A@1.0.0 and B@1.0.0 and C@1.0.0 and D with three versions: 1.0.0, 1.0.1, 1.0.2

  • A needs B and C
  • B needs D@1.0.1
  • C needs D@^1.0.0 (any compatible 1.x version of D, that is)

Then what could happen is this:

  1. CLI requests https://url.com/a@1.0.0/mod.ts
  2. Server delivers /a@1.0.0/mod.ts which imports https://url.com/b@1.0.0/mod.ts and https://url.com/c@1.0.0/mod.ts
  3. CLI requests https://url.com/b@1.0.0/mod.ts
  4. Server delivers /b@1.0.0/mod.ts which imports https://url.com/d@1.0.1/mod.ts
  5. CLI requests https://url.com/c@1.0.0/mod.ts
  6. Server delivers /c@1.0.0/mod.ts which imports https://url.com/d@^1.0.0/mod.ts (note the version)
  7. CLI requests https://url.com/d@1.0.1/mod.ts
  8. Server delivers /d@1.0.1/mod.ts
  9. CLI requests https://url.com/d@^1.0.0/mod.ts
  10. Server replies with 302 Found to https://url.com/d@1.0.2/mod.ts and sets a header x-deno-versions that tells the CLI it also has /d@1.0.1/mod.ts and /d@1.0.0/mod.ts which are compatible with the specified semver (order implies priority)
  11. CLI finds that it already has 1.0.1 cached from before, so even though the redirect pointed at latest (1.0.2) the CLI can decide to request 1.0.1 instead
  12. CLI loads /d@1.0.1/mod.ts from cache

That way, we resolved all deps and deduped them correctly.
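A minimal server-side sketch of step 10, just to make the idea concrete (the x-deno-versions header is the hypothetical one from this comment, and the semver matching is reduced to a major-version check):

// Registry answering a ranged request with a redirect to the newest match,
// plus a header listing other compatible versions.
const published = ["1.0.2", "1.0.1", "1.0.0"]; // newest first

Deno.serve((req) => {
  const url = new URL(req.url);
  // e.g. /d@^1.0.0/mod.ts -> name "d", range "^1.0.0", path "/mod.ts"
  const match = url.pathname.match(/^\/([^@]+)@([^/]+)(\/.*)$/);
  if (!match) return new Response("not found", { status: 404 });
  const [, name, range, path] = match;
  if (!range.startsWith("^")) {
    // exact version requested: serve the file itself (contents omitted here)
    return new Response(`// contents of ${name}@${range}${path}`);
  }
  const major = range.slice(1).split(".")[0];
  const compatible = published.filter((v) => v.split(".")[0] === major);
  return new Response(null, {
    status: 302,
    headers: {
      "location": `/${name}@${compatible[0]}${path}`,
      "x-deno-versions": compatible.slice(1)
        .map((v) => `/${name}@${v}${path}`).join(", "),
    },
  });
});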

Comparison with deno:

Does it solve the problem?

  • deno:: yes
  • ambiguous redirect: yes

Does it follow web standards?

  • deno:: no (not really)
  • ambiguous redirect: yes

Does it invent yet another standard?

  • deno:: yes
  • ambiguous redirect: no

Does it treat deno.land as special?

  • deno:: yes (most likely?)
  • ambiguous redirect: no

So what?

Well, as you can see, using deno: is just a bad idea in many ways (others have pointed out good reasons before me). What's more: there are clean alternatives that actually follow web standards.

IMO npm: is a great idea, but please don't get carried away.

@roj1512
Contributor

roj1512 commented Mar 23, 2023

It may sound rude, but how about this:

  1. Change the issue title to “Support for semvered specifiers”.
  2. Transfer it to https://github.com/denoland/dotland.

@KnorpelSenf
Contributor

I don't think it is possible to deduplicate dependencies purely server-side. Only the CLI knows which other modules it needs, so it will have to be responsible for resolving the module graph.

But yes, we will still need an issue for the feature at https://github.com/denoland/dotland.

@KnorpelSenf
Contributor

In case you're now thinking “yeah so the duplicate dependency problem can be solved better without deno: but whatever it's still cute to have bare specifiers so let's just add them anyway”—let me make a few more points.

  1. URLs in imports are not bad. Yes, they use more characters, but being explicit about where your deps come from is good.
  2. If you still must support shorter, more implicit things, consider allowing deno.land/x/module (implicit protocol, redirect to /mod.ts). That's short enough and still rather intuitive because browsers implicitly add the protocol, too. I personally wouldn't need this, but it could be a compromise.
  3. The point that deps.ts flattens things can be mitigated easily by making it a folder not a file. import "./deps/module.ts" works very well and doesn't have this problem. I've seen some people do this already.
  4. Verbose imports are only a “problem” for library authors because everyone else can just use import maps. The number of people who will benefit from a new deno: protocol is small.
  5. For application developers, consider building deno add URL. It can cache the module and add/create an import map with the respective entry (see the sketch after this list). "Extract dependency to import map" would be a useful LSP feature. You can get pretty far by building better tooling.
  6. Other languages like Go also permit URLs in imports. People will get used to it.
  7. Justifying the addition of deno: by saying that we should happily invent even more protocols like github: and what not (as stated on the blog) is just bad. We went from “let's all unite on the web standard” to “let's create protocol hell” in no time.
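As a sketch of the tooling from point 5 (the command, module, and mapping shape are hypothetical): running deno add https://deno.land/x/oak@v11.1.0/mod.ts could cache the module and append an entry like this to the project's import map:

{
  "imports": {
    "oak": "https://deno.land/x/oak@v11.1.0/mod.ts"
  }
}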

I hope you dare to stay true to the philosophy of Deno, even though I understand that there's pressure to ease the migration from Node. Adding deno: would be a mistake :)

@KnorpelSenf
Contributor

@ry I've seen the announcement for your talk at NodeCongress in Berlin. Did you decide to ignore all concerns about bare specifiers and protocol hell, and turn down the above suggestions? If so, why?

In case you guys actually pull through with deno: (and thereby do something that I and everyone I talk to see as an obvious mistake), I would appreciate it if you could be transparent about this. The silence over the past weeks is not comforting. This drastic change in the project's direction tells me that the initial values of Deno were wrong. I want to understand why.

@bb010g

bb010g commented May 5, 2023

Why is Deno reaching for a custom import specifier over experimental/unstable extensions to import maps that provide composability & dependency deduplication? They're both going to be custom, but the latter has the potential to become standard web technology. "Web-standard APIs" are advertised on Deno's homepage, and it'd be disappointing to see Deno move further away from those efforts.

@dsherret
Member Author

dsherret commented May 8, 2023

Did you decide to ignore all concerns about bare specifiers and protocol hell, and turn down the above suggestions?

@KnorpelSenf no, we've just been busy working on other things lately.

Back in February, we internally discussed something similar using redirects where the CLI passes along its resolution state so the server can provide the correct redirect, but the CLI ignoring the redirect body is an interesting option. It would need to be slightly more complex to work and handle some edge cases, but I think it's something that could work.

@KnorpelSenf
Contributor

Did you decide to ignore all concerns about bare specifiers and protocol hell, and turn down the above suggestions?

@KnorpelSenf no, we've just been busy working on other things lately.

Awesome!

Back in February, we internally discussed something similar using redirects where the CLI passes along its resolution state so the server can provide the correct redirect, but the CLI ignoring the redirect body is an interesting option. It would need to be slightly more complex to work and handle some edge cases, but I think it's something that could work.

That would be very cool to see! Please LMK if I can be of any assistance here.

Do you feel like elaborating on said edge cases?

@dsherret
Member Author

That would be very cool to see! Please LMK if I can be of any assistance here.

@KnorpelSenf we discussed modifying how https imports work in more detail last week. We can't do that because it's not standard with how https is fetched and would differ from how browsers load https imports. From my understanding, both of these statements should be the opposite: "Does it follow web standards? ... ambiguous redirect: yes", "Does it invent yet another standard? ... ambiguous redirect: no", but I'm not super familiar with this area (discuss with Luca if you're interested in following up).

Do you feel like elaborating on said edge cases?

By needing to be more complex, I mean:

  1. It would need a way to tell what resolved versions it encountered so the CLI could decide which version to select in order to deduplicate. This could be done by pattern matching the URL or perhaps via response headers (ex. for https://deno.land/x/oak@1.0.0/mod.ts the server could set a response header to let the CLI know the module name and version are oak@1.0.0).
  2. It would be better if it were faster. Potential solutions that pre-emptively fetch and cache a list of versions could help achieve that.

I can't remember the other points atm, but it seems like we can't do this anyway.

@lilnasy

lilnasy commented May 21, 2023

Curious if a non-standard way to compose import maps was discussed: one where library authors provide suggested sources which will be picked if application developers don't override them.

Cascading import maps?

Fresh does something similar manually.

@dsherret
Member Author

dsherret commented May 21, 2023

@lilnasy we've discussed that in the past, but import maps don't have a way for module authors or anyone to express semver resolution. They're only a way to map specifiers from a single root config file.

(On the note of composable import maps, I personally hope import maps are never composable. It would be complicated and it creates a package.json scenario where you need to consult a dependency's manifest file instead of just importing a script. I think dependencies should be consumed with their import maps already unfurled ahead of time (denoland/website_feedback#3) and everything can be figured out from the javascript files. Again, this is just my personal opinion)

@lilnasy

lilnasy commented May 21, 2023

import maps don't have a way for module authors or anyone to express semver resolution.

Not natively. However, a library author can always include the desired semver in the keyed bare specifier.

import { colors } from "std@^1/fmt/colors.ts";

...and "std@^1/": "https://deno.land/std@1.0.2/" (as specified in the registry) gets written to the user's root import map when they invoke deno add x:dax, for example.
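Spelled out (the dax entry and both version numbers are only illustrative), the user's root import map could then end up containing both their direct dependency and the library-suggested loose-semver mapping:

{
  "imports": {
    "dax": "https://deno.land/x/dax@0.30.0/mod.ts",
    "std@^1/": "https://deno.land/std@1.0.2/"
  }
}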

This may come across as too much co-ordination, but what I want to avoid here is deno becoming the "odd" runtime that needs a lot of special treatment from third-party build tools. No one builds alone.

@KnorpelSenf
Contributor

KnorpelSenf commented May 29, 2023

  1. It would need a way to tell what resolved versions it encountered so the CLI could decide what version to select in order to deduplicate. This could be done via pattern matching the url or perhaps response headers (ex. for https://deno.land/x/oak@1.0.0/mod.ts the server could have a response header to let the cli know this is the module name and version oak@1.0.0).

I just stumbled upon HTTP 300. I don't know why I didn't suggest this initially. HTTP has a native way of telling clients to choose one of several provided redirect targets. It's even possible to indicate a preferred redirect, which can be used to point out the most recent version! I'll leave the MDN link here: https://developer.mozilla.org/en-US/docs/Web/HTTP/Redirections#special_redirections
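A sketch of what such a response could look like for the d@^1.0.0 example from earlier. RFC 9110 deliberately leaves the body format undefined, so the JSON list of alternatives here is just one possible choice; Location carries the server's preferred (newest) version:

HTTP/1.1 300 Multiple Choices
Location: /d@1.0.2/mod.ts
Content-Type: application/json

{
  "alternates": [
    "/d@1.0.2/mod.ts",
    "/d@1.0.1/mod.ts",
    "/d@1.0.0/mod.ts"
  ]
}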

From my understanding, both of these statements should be the opposite "Does it follow web standards? ... ambiguous redirect: yes", "Does it invent yet another standard? ... ambiguous redirect: no"

I see your point that not following the redirect for 301 would be non-standard. This isn't the case with 300, though.

What I meant by inventing a standard for bare specifiers is that you'd effectively come up with a protocol called deno. It would comply with the URL scheme, but it would compete with HTTP and that's what I don't like.

So no, it's not the opposite :)

@KnorpelSenf
Contributor

@lucacasonato perhaps you wanna join discussion at this point?

@lucacasonato
Member

Any extensions to HTTPS specifiers are non-standard. Even this HTTP 300 approach would NOT work in browsers.

Also, to make this actually work one would still need to teach the HTTPS loader what "packages" and "package versions" are.

Imagine I import https://deno.land/x/preact@^10/index.js and https://deno.land/x/preact@^10/hooks.js. There is a separate file that imports https://deno.land/x/preact@=10.5.1/index.js. /index.js will be resolved to 10.5.1, while /hooks.js is resolved to 10.10.5 (or whatever the latest version currently is).
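Written out as the per-file resolution an HTTPS-only loader would effectively perform (versions taken from this example), the package gets torn apart:

{
  "imports": {
    "https://deno.land/x/preact@^10/index.js": "https://deno.land/x/preact@10.5.1/index.js",
    "https://deno.land/x/preact@^10/hooks.js": "https://deno.land/x/preact@10.10.5/hooks.js",
    "https://deno.land/x/preact@=10.5.1/index.js": "https://deno.land/x/preact@10.5.1/index.js"
  }
}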

To ensure this does not happen, we somehow need to entangle all files imported from the same "package" into a single version constraint resolution. Without a non-standard extension this cannot work.

Pretending you are using HTTPS specifiers while actually having a layer of magic underneath that makes them behave differently from "standard" HTTPS specifiers is very bad. It confuses users. Deno always takes the approach of "explicit magic". We don't implicitly do some magic internally. If you want magic, you explicitly ask for it (deno: specifiers). If you don't want magic, you use https:.

@wojpawlik

wojpawlik commented Jun 12, 2023

deno: specifiers would force libraries to choose between deduping and browser support.

@lucacasonato
Member

So would https:// - if a library requires de-duping, it won't work in browsers.

Also .ts does not work in browsers, so libraries can not be directly used in browsers right now.

@dsherret
Member Author

dsherret commented Jun 12, 2023

deno: specifiers would force libraries to choose between deduping and browser support.

There will be a way to generate an import map that maps these specifiers to their resolved locations for environments that don't support them. That said, most websites will need to transpile TypeScript in remote modules and will want to bundle anyway.

@KnorpelSenf
Contributor

Any extensions to HTTPS specifiers are non standard. Even this HTTP 300 approach would NOT work in browsers.

I fail to see how using HTTP 300 in the specified way is an extension to the standard. Perhaps I'm reading the specs wrong:

The server desires that the user agent engage in reactive negotiation to select the most appropriate representation(s) […]. The user agent MAY make a selection from that list automatically if it understands the provided media type. A specific format for automatic selection is not defined by this specification because HTTP tries to remain orthogonal to the definition of its content.

https://httpwg.org/specs/rfc9110.html#status.300

All of that sounds to me like we can safely redirect to multiple targets, and the Deno CLI is allowed to pick the one it likes best. We are free to implement this redirection as we like, and all of this is within the specs (as far as I understand it). How does it deviate here?

one would still need to teach the HTTPS loader what "packages" and "package versions" are

I am not so sure about that. If the two scripts somehow reference each other in any way, be it directly or transitively, then we will discover one through the other during dependency graph resolution, so we can make the respective adjustments and resolve all files to 10.5.1. If the two scripts do not reference each other in any way, then why would it be required that they have the same version?

It feels a lot like deno: is trying to force npm mentality onto ES modules, which just feels wrong on so many levels.

So would https:// - if a library requires de-deuping it won't work in browsers.

The difference is that by using https:// your library CAN still work in browsers (unless it has as a requirement that only ever a single instance of it is loaded, which is a very broken design). In contrast, using deno: will prevent the code from ever being used anywhere else except in Deno. (This could be an effective strategy for the Deno company to create lock-in effects, btw; you should factor that in.)

Also .ts does not work in browsers

Deno can run JS.

@lilnasy

lilnasy commented Jun 12, 2023

It would be reassuring to know if there's interest in this feature, from either enterprise or independent developers.

I am assuming you guys made this decision based on conversations that the community is not privy to. Without that context, this change comes across as bullheaded.

@dsherret
Member Author

dsherret commented Jun 12, 2023

I fail to see how using HTTP 300 in the specified way is an extension to the standard.
...
The difference is that by using https:// your library CAN still work in browsers

Have you tried this? From my tests, browsers don't seem to support 300 redirects.

Again, as mentioned by Luca, and to emphasize: doing this with https would be implicit magic, but deno: specifiers are explicit.

In contrast, using deno: will prevent the code from ever being used anywhere else again except in Deno.

This is not true. As I mentioned above, specifiers can be mapped via import maps and these specifiers are optional to use.

@KnorpelSenf
Contributor

Have you tried this? From my tests, browsers don't seem to support 300 redirects.

Sorry for the delay, I have tested it now. Status 300 works with Firefox, but not with Chrome, so there is some support. Either way, I am aware that 300 is seen very rarely in the wild. I guess I would personally prioritize following web specs over shipping a solution that is convenient for Deno to implement (obligatory link). I see why you guys have different incentives, so while I still disagree that deno: is a good idea, I understand your motivation to start your own protocol :)

@dsherret
Member Author

dsherret commented Sep 15, 2023

There is an experimental preliminary PR for this here: #20517 (though it's not usable at this point)

One change is that these are no longer deno: specifiers, but rather generic jsr: specifiers. These specifiers currently load packages much faster than https: specifiers because the module analysis is done ahead of time (so in addition to this analysis not needing to be done on first run, there's also no waterfalling within a package). Under the hood, they are https: specifiers, though, so this works quite easily with existing Deno tooling, and with other tooling or browsers via a generated import map.
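For illustration only (the package name, version range, and underlying registry URL below are made up; they are not the actual jsr format or layout), the shape of the idea is:

// jsr: specifier in source code
import { compile } from "jsr:@example/path_to_regexp@^6.2";

and the generated import map for browsers or other tooling would map it back onto a plain https: URL:

{
  "imports": {
    "jsr:@example/path_to_regexp@^6.2": "https://example-registry.dev/@example/path_to_regexp/6.2.1/mod.ts"
  }
}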

@albnnc
Contributor

albnnc commented Sep 16, 2023

I want to reply to the PR description, but I think it makes more sense to continue the discussion here. Quotes are taken from the PR description.

[...] but making the specifiers non-Deno specific.

Sounds more pleasant.

  • So, will it be something like "package registry for the web / js"?
  • jsr: = JavaScript Registry?
  • Will the related documents / specs be released later? I mean, the specs that custom registry implementations should conform to.

The default registry url can be configured via the DENO_REGISTRY_URL environment variable.

  • Is the default value of the default registry URL going to be set to deno.land/x/?
  • How are the different registries going to be used at the same time in the same code? This is an extremely valuable use-case.
  • Is the self-hosted registry solution going to be released before the deno: / jsr: specifiers are stabilized? If so, what time window will the community have to check it? If not, how are we going to know that Deno registries are not vendor-locked?

The underlying https: specifiers can be imported without duplication of a module.

  • Could you please explain the rules of deduplication? It feels like this process has to be well-documented at least.
  • Is the deduplication process going to be configurable?

Tooling to generate an import map that can then be used in browsers and other tooling will be added in the future.

It looks to me like tooling of this kind is a must for jsr: to feel, to the community, at least kinda based on web specs. If this functionality is released soon, please:

  • Make corresponding documents / specs a priority.
  • Make corresponding tooling a priority.
  • Make corresponding tooling usable from the code (including JS/TS API), not just CLI. Like deno_emit, for example.

@so1ve

so1ve commented Oct 3, 2023

I think this shortcut cannot increase productivity… What about third party CDNs? Should they get their own specifiers? If not, why should deno.land get its own specifier?

Plz add something like // deno:auto-import foo library from web :D

@dsherret
Member Author

dsherret commented Mar 2, 2024

This is no longer relevant with the release of https://jsr.io

@dsherret closed this as not planned on Mar 2, 2024