This repository has been archived by the owner on Jun 29, 2023. It is now read-only.

Rethink the Third Party Modules registry #406

Closed
vwkd opened this issue May 13, 2020 · 40 comments

Comments

@vwkd

vwkd commented May 13, 2020

This issue has already been brought up multiple times in the old repo (#26, #117) and in this issue comment on the main repo. In the meantime this new repo came up and the old issues were archived. Now that v1 has dropped and users are starting to adopt Deno, this becomes more and more important, which is why I want to open an issue for it.

The Problem

Modules can be added to database.json to make them show up on the Third Party Modules page https://deno.land/x/ and become available under the URL https://deno.land/x/MODULE_NAME@BRANCH/SCRIPT.ts.
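
For reference, an entry in database.json is essentially a small JSON object keyed by the module name that points at a GitHub repository; a rough sketch of that shape (the field names are an assumption for illustration, not the authoritative schema):

// Sketch of the kind of entry database.json holds; field names are
// illustrative assumptions, not the exact schema used by deno.land/x.
interface RegistryEntry {
  type: "github"; // currently the only supported host
  owner: string;  // GitHub user or organization
  repo: string;   // repository name
}

const exampleEntries: Record<string, RegistryEntry> = {
  my_module: { type: "github", owner: "some-user", repo: "my_module" },
};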

There are a few reasons why this is a suboptimal approach.

  1. Scale

If Deno becomes even half as popular as Node, there would be hundreds of modules added each day. This will flood the commit history and make the repo hard to maintain. Every PR needs to be verified by a maintainer, otherwise people could manipulate entries at will. This is bad. Just looking at the recent commit history, most of the commits are already related to adding entries for Third Party Modules.

  2. Unique module names

Since module names must be unique, this will give rise to bad naming conflicts. Only the first module can be added, leaving anybody else with the same module name locked out. This will become like a domain name registry where all the good names are already taken. Even worse, entries for old unmaintained modules will block their names forever, because somebody out there might still depend on the URL, so they can't be removed. This is even worse than domain names, which can at least expire.

  3. GitHub only

The current system allows only links to GitHub repositories. This might not be a problem right now, since GitHub is the de facto standard Git repository host, but being locked into one host by choice of an overly narrow system is never good. Also, limiting the registry to GitHub effectively centralizes it, which is the opposite of what Deno tries to achieve by using the ES module system.

In summary, the current module registry doesn't scale. It is high maintenance, requires unique module names, and is centralized.

A solution

What people need is an overview of modules compatible with Deno and a convenient way to link to them.

Currently the Third Party Modules registry solves both, albeit in a very limited way as highlighted above. Entries in the database.json are used to build the overview, as well as generate the links available through https://deno.land/x/....

I believe these two features shouldn't be made dependent on each other, but looked at individually.

  • Module overview

There needs to be a way for developers to add (and remove) modules that scales well to lots of modules each day. Also, it shouldn't burden the Deno maintainers; there should be an automated system of some sort.

One idea would be to generate the overview dynamically by some form of search algorithm. Essentially a Google for Deno modules. For example, a table with entries for all repositories that are tagged "deno" (https://github.com/topics/deno). The single source of truth is the repository, and the generated table would adapt dynamically to any changes to the repo, like name, description, or even deletion. Repositories wouldn't need to duplicate the name and description in another place and keep maintaining it if something changes. This would scale much more nicely, as no manual work would be needed anymore and Deno maintainers could get back to writing code instead of verifying PRs. Repositories just need to add the tag "deno" to publish their modules, which most already have anyway.
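
As a rough illustration, a minimal sketch of the query such a dynamically generated overview could run against GitHub's public search API, mapping each repository to the fields the table would need:

// Sketch: fetch repositories tagged "deno" from the GitHub search API
// and map them to the fields a generated overview table would need.
interface OverviewEntry {
  name: string;
  description: string | null;
  url: string;
  stars: number;
}

async function fetchDenoRepos(page = 1): Promise<OverviewEntry[]> {
  const res = await fetch(
    `https://api.github.com/search/repositories?q=topic:deno&sort=stars&page=${page}&per_page=30`,
  );
  if (!res.ok) throw new Error(`GitHub API error: ${res.status}`);
  const data = await res.json();
  return data.items.map((repo: any) => ({
    name: repo.full_name,
    description: repo.description,
    url: repo.html_url,
    stars: repo.stargazers_count,
  }));
}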

There are still some drawbacks to this that should be mentioned. There will be false positive matches by repositories adding the search tag for an unrelated reason. This could be mitigated by a sufficiently specific tag, but never fully solved. Also it doesn't solve the centralization problem, as other hosting providers could be added to the search algorithm, but only so many. Modules hosted on a private website would be hard to add to the search algorithm.

  • Module URL

If the module overview from above had a convenient way to obtain the URL of a module, then there would be little use case for a URL shortener like https://deno.land/x/.... For example, there could be a "Copy URL" button in the dynamically generated table which directly copies the github.com URL so that it can be pasted into the code, similar to what Google Fonts does.

Direct URLs also solve the issue of centralization. Obtaining all modules through the same domain makes the registry fragile. What if https://deno.land/x/... experienced outages, or even attacks? Also, the URL shortener goes against the idea of decentralization that Deno tries to achieve with the ES module system. Making a whole ecosystem depend on the uptime of a single domain is not good. Only raw direct links to wherever the module is hosted are truly decentralized, with no dependency on a single URL proxy.

Btw, GitHub URLs are actually pretty readable already, making another point for abandoning the URL shortener https://deno.land/x/....

https://raw.githubusercontent.com/USERNAME/MODULE_NAME/BRANCH/SCRIPT.ts
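
A "Copy URL" button would then only need to assemble that pattern from the repository metadata, roughly like this (helper name and example values are just for illustration):

// Sketch of what a "Copy URL" button could put on the clipboard:
// the raw.githubusercontent.com URL built from repository metadata.
function rawGitHubUrl(
  user: string,
  repo: string,
  branch: string,
  script: string,
): string {
  return `https://raw.githubusercontent.com/${user}/${repo}/${branch}/${script}`;
}

// e.g. rawGitHubUrl("some-user", "my_module", "master", "mod.ts")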

Demo

I made a simple demo of repositories tagged with "deno" on GitHub which however still has the shortcomings mentioned above.

There are several false positives, like personal projects, editor extensions, deno itself, etc.

The mod.ts link serves as a stand-in for the "Copy URL" button. It also shows another problem of the search algorithm approach, as there is no predictable way to link to the entry script. Relying on a convention would not work, and the link would need to point to the repo, where people would need to read the actual documentation to find out what to import.

Also, the console throws an error after the 10th page, since the GitHub API limits unauthenticated search requests to 10 per minute. Since each page only loads 30 items, the script would need GitHub authentication to show more than 300 results.

TLDR

The Third Party Modules page https://deno.land/x/ could generate an overview of Deno-compatible modules dynamically from the "deno" tag on GitHub, so as not to depend on manual addition of modules to database.json. The URL service could be expanded to work as a full URL alias, or just be abandoned altogether in favor of convenient links to GitHub directly from the table.

Summary

If Deno grows as huge as we all hope, then the current Third Party Modules registry page just won't do. We should fix it rather soon, while there are only a few hundred modules and we still have a chance.

EDIT: When I started this issue I favored the dynamic search approach. However, by now I believe an actual registry would be better, since decentralization would not be limited by the search algorithm. The registry should directly link to the modules, wherever they are hosted, be it on GitHub or a tiny personal website (unlike npm, which actually hosts the modules). If the registry makes it easy to obtain the URL (imagine Google Fonts), then there is little reason to use a centralized URL shortener.

EDIT: There is a great proposal by @hacked for a registry with an example implementation.

@tkgalk

tkgalk commented May 13, 2020

Anything to stop name squatting, even if Deno packages are not dependent on being in deno.land/x/.

@AvnerCohen

  1. Re "GitHub only" - if automatic inventory is based on "https://github.com/topics/deno" (as an example), wouldn't that mean the same logic would need to be applied to all source repositories as well (Bitbucket, GitLab, Codeberg, etc.)? And then, what about self-hosted modules? Not possible at all?
  2. I think having a good and official way to help people find the best package is something Deno should support as soon as possible.
    Basing the logic on stars might not be the best way, because it means a Bitbucket package (as an example) is doomed to appear less popular (fewer platform users >> fewer stars).
    There is a lot of prior work on how to evaluate such things in a more scalable and automatic way.
    Maybe I am missing this part of the discussion, but beyond naming conflicts, a platform that has "security first" should also be focusing on a secure module environment.
    At the end of the day, quite a few critical vulnerabilities originate in userland packages.
  3. Another thing to consider (and again, not sure if it was discussed already) is the best way to avoid things like the left-pad incident. With security in mind, should there be a way for the ecosystem to enforce "integrity"? A lesson can be learned from the RubyGems system.

@axetroy
Contributor

axetroy commented May 14, 2020

The problem is similar to cybersquatting

Once a good module name is registered, others will no longer be able to submit a module with that name.

Even if this module has been deprecated and will not be updated for several years, it cannot be deleted

@denoland developers should pay more attention to the development of core modules and the standard library, instead of wasting a lot of time processing pull requests for database.json.

@AnandChowdhary
Contributor

Also, making PRs to database.json will be super hard after there are thousands of packages because the JSON file itself will be several MB in size. If Deno grows like Node, we can expect there to be hundreds of thousands of packages wanting to be published in the third-party registry.

I'm all for using a GitHub repository as the source of truth rather than a dedicated closed registry like NPM, but maybe a better way to do it is to have a repository called "x" that has a file for each package, e.g., package_name.json with the same key-value pairs for username, repository, and git host. This is similar to Microsoft's @types repository, simply because the file system will scale much better than a single JSON file.
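
As a rough sketch of how a resolver could look up such a per-package file (the repository name denoland/x, file layout, and field names are all assumptions for illustration):

// Sketch: resolve a module name by fetching its dedicated JSON file
// from a hypothetical "x" repository (one file per package).
interface PackageFile {
  type: "github";
  owner: string;
  repo: string;
}

async function resolvePackage(name: string): Promise<PackageFile> {
  const res = await fetch(
    `https://raw.githubusercontent.com/denoland/x/master/${name}.json`,
  );
  if (!res.ok) throw new Error(`Unknown package: ${name}`);
  return await res.json();
}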

If we use something like Cloudflare as the primary CDN for this registry, we can:

  1. Have npm-like download counts by measuring the number of file requests on the CDN side
  2. Create APIs in the future to make this data available (e.g., workers fetching analytics on demand and caching them, like we do with file serving) to make this setup more open to the community

Of course, none of this is necessary if we decide to eliminate the registry altogether in favor of directly using GitHub URLs. However, I believe that not having a central registry will create a vacuum that third-party registries will try to fill. Even from a "global bandwidth usage" and carbon footprint perspective, caching files in a central registry (thanks to the CDN) whose URLs everyone is using is better than the alternative of people using different registries to fetch the same files.

In summary: I agree that the current single JSON file doesn't work as we scale, but I see the need for a single source of truth for third-party modules, just because of the ease of use that brings to developers. My proposal is to have a dedicated GitHub repository with each file representing a package, just like Microsoft's @types repository.

@WA9ACE

WA9ACE commented May 17, 2020

I agree. I had assumed, my own fault, that /x/ was similar to Go's x namespace, where those packages are maintained separately from the core runtime standard library. Since one of Deno's core features is decentralized URL-based imports, it seems disingenuous to have (at the time of this writing) 293 packages under the x namespace on the official deno.land domain.

@Willamin

The Crystal community handled a similar problem by creating a few unofficial web apps that did no more than search GitHub for Crystal projects and return a list.
Anyone in the Deno community could create a similar web app that would search across popular git hosts for Deno projects.

@kidonng
Contributor

kidonng commented May 17, 2020

The current Deno registry

  • deno.land/std: Hosts the standard library. Works as intended.
  • deno.land/x: Proxy to GitHub/npm/etc. Relies on database.json.
    • Also functions as a code browser.

The problems/features

  • Security: Auditing, security alerts...
  • Scalability/Maintainability: a single JSON file won't satisfy future needs, nor will a single repo (take CDNJS as an example).
  • Integrity: Registry should be fault-tolerant, avoiding the left-pad incident and such.
  • Single source of truth: Analytics, versioning, dependency info, discoverability...
  • Name squatting, Caching/Deduplication...

Thoughts

Deno is decentralized. We are not locked into a single platform. Git providers, OSS providers, CDNs, self-hosting: there are countless choices.

Should we have a central registry? Most of us don't want another npm, but we have to tackle the problems above for the sake of a healthy and thriving ecosystem.

Solutions

  • deno.land/x: Keep it as it is (a proxy), but we should make it clear it is not endorsed officially and users are welcome to use solutions other than deno.land/x. deno.land/x functions as an identifier that a script/package is created for Deno.
  • database.json: Get rid of it. We can replace it with major community-driven projects (just like RoR for Ruby) and curation (such as https://github.com/denolib/awesome-deno).
  • Security: URL imports make it easy for people to find and verify the source. This is where the community comes in.
  • Scalability/Maintainability: We don't keep everything in one place, thus we won't encounter this type of problem.
  • Integrity: Unless we store all versions of every package and require everyone to use it, accidents can happen. But thanks to Deno's caching, previous versions are kept locally, and URL imports are simpler to handle than dealing with node_modules.
  • Single source of truth: Actually the same problem as the previous one. Either we ignore Deno's decentralized nature and build dpm (a Deno package manager), or we won't really have a single source of truth. That said, we can provide some info from the de-facto platform GitHub when people visit deno.land/x.
  • Name squatting/conflicting, Caching/Deduplication... Name squatting will disappear if we don't force a central registry. As for Caching/Deduplication, some helpers can be created in userland for merging different sources of imports (for example, Git providers and CDNs).

Conclusion

A central registry for third party Deno modules should be opposed. That said, we can still have unofficial websites that provide search and index functions for Deno modules across various providers, and tools/helpers for managing imports/dependencies, as in the examples mentioned above.
And we should not forget that npm (and other package managers) was not born with Node.js. It was invented out of the need to manage evolving Node.js projects. There's little doubt that eventually some kind of dpm will come out and become mainstream, but that is never against Deno's philosophy, because it will never be required by Deno.

@AnandChowdhary
Contributor

AnandChowdhary commented May 17, 2020

A central registry for third party Deno modules should be opposed.

I agree, we should then focus on completely independent packages. We are free to create a site (like go/x) where people can search for packages, like others can build too.

However, if we're removing "official" package names, I highly recommend getting rid of deno.land/x entirely, because it's no different from any other GitHub proxy CDN, and if the URLs are going to be deno.land/x/github/username/repo/branch/file, then even non-Deno-related projects are going to be served from that service, which doesn't make sense.

There are several options today to serve GitHub URLs, ordered most to least reliable (in my opinion):

  1. https://raw.githubusercontent.com
  2. https://cdn.jsdelivr.net/gh/user/repo@version/file
  3. https://cdn.statically.io/gh/user/repo/tag/file
  4. https://rawcdn.githack.com/user/repo/tag/file
  5. https://gitcdn.xyz/user/repo/master/file

Perhaps people should just use these, then? It might open up a healthy ecosystem of CDN URLs, and different Deno style guides can recommend specific CDNs as well, based on their usage (like how in the jQuery days, Google's CDN became the de facto choice, overtaking jQuery's own URLs). In this case, I don't see the need for Deno to endorse a specific URL.

@parzhitsky
Contributor

Hopefully, my two cents will be helpful.

Since there is a distinction between different types of database entries ("github", "npm", "url", etc.), I'd say it is reasonable to make this distinction in the module system as well, for example with some kind of qualifier:

import module from "std:path/to/file.ts";
import module from "github:user/repo[@ref]:path/to/file.ts";
import module from "npm:[@scope/]package[@tag|@version]"; // resolves to .tgz, then unpacked to local cache
import module from "https://example.com/path/to/file.ts"; // would make sense to make "url:" qualifier implicit

This would let us reuse a lot of existing stuff in an (imo) clear and elegant way.

@parzhitsky
Contributor

Some examples:

// $DENO_STD_LIB_REPO = https://raw.githubusercontent.com/denoland/deno/master/std

import module from "std:archive/tar.ts";
// "std" resolves to $DENO_STD_LIB_REPO
// $PATH = archive/tar.ts
// $DENO_STD_LIB_REPO/$PATH =
//   https://raw.githubusercontent.com/denoland/deno/master/std/archive/tar.ts

// also:
import module from "std:fs/copy.ts";
//   https://raw.githubusercontent.com/denoland/deno/master/std/fs/copy.ts

import module from "std:std/std.std";
//   https://raw.githubusercontent.com/denoland/deno/master/std/std/std.std

// $GITHUB_DOMAIN = https://raw.githubusercontent.com

import module from "github:microsoft/typescript:src/tsc/tsc.ts";
// "github" resolves to $GITHUB_DOMAIN
// $USER = microsoft
// $REPO = typescript
// $REF = master (implied)
// $PATH = src/tsc/tsc.ts
// $GITHUB_DOMAIN/$USER/$REPO/$REF/$PATH =
//   https://raw.githubusercontent.com/microsoft/typescript/master/src/tsc/tsc.ts

// also:
import module from "github:microsoft/typescript@master:src/tsc/tsc.ts";
//   (same as above, with explicit ref)

import module from "github:microsoft/typescript@v3.9.2:src/tsc/tsc.ts";
//   https://raw.githubusercontent.com/microsoft/typescript/v3.9.2/src/tsc/tsc.ts

import module from "github:github/github@github:github/github.github";
//   https://raw.githubusercontent.com/github/github/github/github/github.github

// $NPM_REGISTRY = https://registry.npmjs.org

import module from "npm:qs";
// "npm" resolves to $NPM_REGISTRY
// $SCOPE = (global)
// $NAME = qs
// $TAG = latest (implied)
// $VERSION = 6.9.4 (calculated from tag, see https://registry.npmjs.org/qs)
// $NPM_REGISTRY/$SCOPE/$NAME/-/$NAME-$VERSION.tgz =
//   https://registry.npmjs.org/qs/-/qs-6.9.4.tgz
// $PATH = lib/index.js (calculated from archive/package.json)
//   $DENO_CACHE/npm/qs-6.9.4.tgz/lib/index.js

// also:
import module from "npm:express";
//   https://registry.npmjs.org/express/-/express-4.17.1.tgz
//   $DENO_CACHE/npm/express-4.17.1.tgz/index.js

import module from "npm:@angular/core";
//   https://registry.npmjs.org/@angular/core/-/core-9.1.7.tgz
//   $DENO_CACHE/npm/@angular/core-9.1.7.tgz/bundles/core.umd.js

@lucacasonato
Member

@parzhitsky This is incompatible with the web platform. Also this is an issue about denoland/deno, not about the website.

@maximousblk
Contributor

maximousblk commented May 17, 2020

Possible solution to database.json if you really want to accept submissions through pull requests:

Break it down into separate directories. Use a structure similar to deno.land/x/USER/MODULE/mod.ts. That means, create a separate directory per user and accept pull requests to your own dedicated directory.

This would solve the review problem if you can write a check and automatically merge the pull request when the USER has only changed their own modules' information (see the sketch below).

This would create something similar to:

user1
  - module1.json
  - module2.json
user2
...

The MODULE.json file would have the same content it has now.
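
A rough sketch of the kind of automated check that could gate such auto-merging (the directory convention and function name are assumptions, not an existing bot):

// Sketch: allow auto-merge only if every changed file lives under the
// PR author's own directory, e.g. "someuser/module1.json".
function canAutoMerge(prAuthor: string, changedFiles: string[]): boolean {
  return changedFiles.every((path) => path.startsWith(`${prAuthor}/`));
}

// canAutoMerge("user1", ["user1/module1.json"])                 // true
// canAutoMerge("user1", ["user2/module2.json", "user1/a.json"]) // false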

Solves:

  • unique naming
  • large database file
  • merge conflicts (to a certain point)

Doesn't solve:

  • GitHub only
  • scale
  • commit history filled with database edits

My solution to database.json:

Create a separate website like npmjs.org but don't host the packages. Let users define the name, repo, and location of the module.

It can also be a CDN to help users and serverless functions to import modules faster.

Again, the URLs would be of the format cdn.denojs.org/USER/MODULE@VERSION/, which would return the corresponding mod.ts if a particular file is not explicitly specified.

This would not force people to register their modules like npm does, but it would be a central registry for most of the modules. People can host them wherever they want and have proxied URLs parallel to their own distribution method.
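
As a sketch of how such a proxy could resolve these URLs (the registry shape and the mod.ts defaulting rule are illustrative assumptions):

// Sketch: map a cdn.denojs.org-style path to the module's registered
// source URL, defaulting to mod.ts when no file is given.
type Registry = Record<string, (version: string) => string>; // "user/module" -> base URL

function resolveCdnPath(registry: Registry, path: string): string {
  // path looks like "USER/MODULE@VERSION" or "USER/MODULE@VERSION/some/file.ts"
  const match = path.match(/^([^/]+)\/([^@/]+)@([^/]+)(?:\/(.*))?$/);
  if (!match) throw new Error(`Bad path: ${path}`);
  const [, user, mod, version, file] = match;
  const base = registry[`${user}/${mod}`]?.(version);
  if (!base) throw new Error(`Unknown module: ${user}/${mod}`);
  return `${base}/${file || "mod.ts"}`;
}

// const registry: Registry = {
//   "alice/foo": (v) => `https://example.com/foo/${v}`,
// };
// resolveCdnPath(registry, "alice/foo@1.0.0"); // https://example.com/foo/1.0.0/mod.ts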

Solves:

  • unique naming
  • large database file
  • GitHub only
  • scale
  • all Git-related issues.

@kidonng
Contributor

kidonng commented May 18, 2020

@maximousblk thanks for sharing your thoughts, which are similar to my solution, but there are some problems:

Possible solution to database.json if you really want to accept submissions through pull requests

Anything aside from building a real database (another npmjs.org) will be a disaster in the future. As you said, it won't scale either.

Again, the URLs would be of the format cdn.denojs.org/USER/MODULE@VERSION, which would return the corresponding mod.ts.

Deno requires explicit file extensions, i.e. imports end with .js or .ts.

This would not force people to register their modules like npm does, but it would be a central registry for most of the modules. People can host them wherever they want and have proxied URLs parallel to their own distribution method.

If we ditch an official registry and build one (or two) third-party index, there are two ways to do it:

  • People submit their modules to the index and discover modules on it
  • People put modules on their preferred platforms and the index search for them automatically

They do seem alike, but here's the difference: if we let people submit their modules, we will have trouble.

Who's going to examine them? How do we update the info? Should we have an account system? What about name conflicts? Should we have namespaces? Is the namespace the same as the account username (if accounts exist) or the platform's username? What if different people have the same username on different platforms?

And by requiring submission we are implying that if you don't submit your module to some index, your efforts won't get much attention. You may know that npm can fetch packages from anywhere (Git providers, URLs...), just like Deno. Will you stop submitting your module to the npm registry? Probably not.

See? We are just building another npm if we go for the first option.

@AvnerCohen

  1. Another community to try and take input from is the brew community.
    They too take commits for each module (separately) and also deal with some level of curation.
    For example, if after X months a package is not in use, it might as well be removed.

There are many other things going on, but I think it could be very relevant if the intention is to try and keep this under some control (for example, I have noticed this one - #269 (comment) - where someone took an active stance to say "this module is not wanted").

Maybe there should be a committee or very clear rules on what goes in and what's not?

  2. Personally, I think insight into download stats and usage is critical; I already see this coming up as an issue with Deno.
    I want to use an express-like web framework; which one should I use? There are already tens of options! Some are better than others, but as a newcomer, it's a waste of time to read through all of them if I know 2-3 are the more popular ones.

@maximousblk
Contributor

Deno requires explicit file extensions, i.e. imports end with .js or .ts.

What if the URL doesn't have the extension but the server replies with the mod.ts file when one is not specified?

In one of my experiments, I discovered that this works with the CLI, but haven't tried it in a script. I'll do some tests and post it here.

@kidonng
Contributor

kidonng commented May 18, 2020

I discovered that this works with the CLI,

You mean deno run/install? It's not import at all.

EDIT: Sorry, I just tested the following code with Deno 1.0.0 and it runs fine. Either my memory is wrong or the rule got changed at some point.

import isInteger from 'https://cdn.pika.dev/lodash.isinteger@%5E4.0.4'

console.log(isInteger(42))

@maximousblk
Contributor

I discovered that this works with the CLI,

You mean deno run/install? It's not import at all.

I'm new to JavaScript, so you can't expect much from me.

If that doesn't work, then the URLs will be of the format cdn.denojs.org/USER/MODULE@VERSION/FILE.ts

@kidonng
Contributor

kidonng commented May 18, 2020

@AvnerCohen

Another community to try and take input from is the brew community.

Things are different between package managers for system software and package managers for code.

  • There is far less software than there are code modules.
  • A software manifest file needs to be written by hand and can take a lot of effort, while for code modules it can even be generated automatically.

I myself maintain an index for scoop (a Windows package manager) and know it well.

Maybe there should be a committee or very clear rules on what goes in and what's not?

Definitely better if we do have one, but the problem becomes who is qualified and willing to devote effort and time.

Personally, I think insight into download stats and usage is critical

When Deno becomes widely used, people will go for CDNs (GitHub raw is not a CDN and should only be used for simple cases), so just check CDN stats.
And it makes no sense to analyze the modules themselves to see which ones are used most. They are URLs.

I want to use an express-like web framework; which one should I use?

This is where things like GitHub stars and curated lists (like https://github.com/denolib/awesome-deno) come in.

@Nexxkinn

Nexxkinn commented May 18, 2020

I'm more interested in the idea of using the BitTorrent protocol to store the modules.
Essentially, the server acts as a tracker, and Deno can download the module from whatever peers/seeders are available. Obviously, one additional server acts as a backup if no seeders/peers are detected.

The developer gives a magnet link (or similar) with a trailing @version at the end for version control.
The user can import it with something like this:

import module from 'magnet:?xt=urn:deno:c12fe1c06bba254a9dc9f519b335aa7c1367a88a@1.0.0';

@millette

@Nexxkinn I couldn't resist plugging hyperdrive (or dat if you're familiar with that) which is essentially an updateable torrent.

@Nexxkinn

Nexxkinn commented May 18, 2020

@Nexxkinn I couldn't resist plugging hyperdrive (or dat if you're familiar with that) which is essentially an updateable torrent.

Oh yeah, the Hypercore protocol is much better suited to this case. But we might need an additional tag for version control, like in my previous post.

@anuoua

anuoua commented May 19, 2020

If there is no central registry, it means there are no registry mirrors.
It will be much slower to download files in my country.
Sometimes, a large project takes several days to download its files...

@ghost

ghost commented May 19, 2020

I've been thinking about this for quite some time now and came to the conclusion that it's pretty much impossible to satisfy everyone with a single solution. I think it's important to remember what Deno is trying to achieve at the core, which is to discard the need for something like npm.

That does however not eliminate the need for a registry that allows searching and browsing modules. For that reason I've been working on a service for some time now, trying to incorporate the idea of being able to host a module wherever you like while at the same time providing a consistent way of importing those modules in your project.

If you're interested, have a look at https://deno.to.
For now, there's only a landing page with an explanation of what the service might look like in the future.

To sum up the concept, you can publish your module on GitHub, GitLab or BitBucket and add it to the website, manually. It's more or less a vanity URL service, similar to https://deno.land/x, but improved. All it does is redirect to the URL you provided.

It also addresses most of the issues mentioned above, such as

  • Name Squatting - Duplicate module names are not a problem, as long as the namespace (the username of the account) is unique.

  • Scalability - We won't depend on a single database.json file, every user can manage their own modules.

  • GitHub Only - For now, the idea is to limit it to GitHub, GitLab and BitBucket. Theoretically, it is possible to virtually allow any website, as long as a certain URL schema is satisfied.

  • Consistent Imports - All modules can be imported with the same URL, keeping it consistent and easy to read, https://deno.to/<name>/<module>/<version>

  • Decentralized - The idea of not hosting modules in a central place remains. You are free to host your module wherever you like.

  • Searching - You would be able to browse and search through all modules that are available on the website.

The only disadvantage is the need for an account system, but it won't be possible to have a service like this without it.

I'm still working on getting an alpha version up and running, but my time is somewhat limited at the moment. The consideration could be made to just open source it and let the community build it as a whole.

If there are better domain name suggestions, feel free to let me know and I'll buy them if needed. I figured deno.to was clever because it redirects to somewhere.

I'm definitely going to keep working on it, the nice thing about Deno is that we don't depend on just one solution. If this turns out to not be the way to go, it can always be discarded.

Feel free to share suggestions, improvements or criticism.

@sithumonline

@hacked's solution is really good, but it cannot solve the scale problem. If we combine @millette's and @Nexxkinn's solutions with @hacked's solution, we can solve it.

Big companies can invest in a CDN, but indie developers cannot; therefore, if we use hyperdrive, we can solve the scale problem.

We need a redirect service, as Hyper URLs are hard to use:

hyper://a8109b835d27c30ad4d55b321dd7c4032c037eaf1f6f2b00de7b03b5a6cc504a

We can use @hacked's solution for redirects to the Hyper URL, and it will give solutions for other issues too. We just need support for the Hypercore protocol (hyper://) from the Deno runtime.

Maybe Deno can download modules using Merkle trees and seed modules itself. Hypercore works like a lightweight blockchain crossed with BitTorrent, so it is secure, fast, and decentralized.

import { Foo } from "hyper://deno.to/ryan/foo/4.17.1";

CC: @mafintosh

@millette

Regarding hyper/dat, there's https://github.com/datrs in the Rust world.

@ry
Member

ry commented May 20, 2020

I appreciate you sharing your thoughts. This isn't the highest priority thing for us, so I think we'll be continuing database.json for the foreseeable future. I don't want to complicate the system by adding a database until I really am feeling the pain of database.json.

@ry ry closed this as completed May 20, 2020
@alessandrorivero

@hacked Please keep working on https://deno.to/. I think this will make a lot of things easier for the community and help manage packages better

@RobDWaller
Contributor

I know I probably shouldn't mention PHP in these parts but @hacked's solution does sound similar to Packagist, which is essentially just a front door to GitHub. And to the scale point, I don't believe Packagist had any major issues with scale as ultimately they don't host anything either.

I feel if adoption of Deno is going to become widespread then a package manager, even if it is merely a facade, is inevitable. I would definitely encourage @hacked to continue his exploration of this topic.

@srdjan

srdjan commented May 24, 2020

I realize this issue is closed, but I wanted to add my 2 cents... maybe this can be referenced when the topic becomes higher priority or while it is being experimented with in https://deno.to

@hacked's proposal is nice, and I would just like to add to it by suggesting a way to guarantee package content integrity. I'm proposing that the name, in addition to the readable part, also contains a package Content Identifier (CID), enabling the compiler, on the consuming side, to recalculate the content hash on the first import occurrence and verify that the content of the bundle was not altered.

This would effectively have each bundle release represented by a unique and immutable token:

<org-name>@<proj-name>@cid@version.pkg-name.ts  

This token would be published together with the package name & description to the deno.registry, a public/trusted website listing packages and proxying to the hosting CDNs, by prepending the serving CDN's domain to it:

https://<cdn-domain>/<org-name>@<proj-name>@cid@version.pkg-name.ts

Examples:

https://deno.land/myOrg@myPro@JnKLwHCnL72vedxjQkDDP1mX@1.0.0.myPkg.ts
https://github.com/proj-name/dist/JnKLwHCnL72vedxjQkDDP1mX@1.0.0.myPkg.ts
...

This doesn't solve all attack vectors, but possibly provides higher resistance to tampering with bundle content than current solutions. The downside is the need for a trusted deno.registry database :(
Maybe in the future there will be an external solution that the Deno community can use.


Authoring side: publishing/releasing

deno bundle pkg.ts pkg.bundle.ts
deno publish pkg.bundle.ts --org-name=org --pkg-name=pkg --version=1.0.0

Consuming side: importing

so, for this POC, consumer would grab the URL of the package from deno.registry and:

import {MyFoo} from 'https://some-cdn/some-org@some-proj@JnKLwHCnL72vedxjQkDDP1mX@1.0.0.{pkg-name}.ts'
...

For better DX on the consuming side, a shorter, readable package name could be used, with local compiler config files used to get the full URL?

import {MyFoo} from 'org/proj/pkg.js@v1.0.0'
...
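
To illustrate the consuming-side verification, a minimal sketch using the Web Crypto API, assuming (purely for illustration) that the CID is a hex-encoded SHA-256 of the bundle; the proposal itself doesn't fix the hash format:

// Sketch: verify that fetched bundle content matches the content
// identifier embedded in its URL (assumed here to be hex SHA-256).
async function fetchVerified(url: string, expectedHex: string): Promise<string> {
  const source = await fetch(url).then((res) => res.text());
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(source),
  );
  const actualHex = Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
  if (actualHex !== expectedHex) {
    throw new Error(`Integrity check failed for ${url}`);
  }
  return source;
}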

@vwkd
Author

vwkd commented May 24, 2020

@srdjan I agree that there should be some sort of integrity check to ensure that what is downloaded is actually what was intended to be downloaded. But such a check needs to be implemented in Deno itself before any registry (or, more generally, any URL) can make use of it. See #200 for the relevant discussion.

Also, the shorter import syntax would need to be supported by Deno and is not up to the registry to decide. I doubt anything that would make imports not browser compatible will be implemented, which is a good thing for a platform-independent scripting language.

@ghost

ghost commented May 25, 2020

Thanks @srdjan @zjopy

I'm not sure whether or not I should proceed with building (yet another) registry.

Someone started working on something similar a while back; it's available at https://module.land/.
The whole idea however is built around a package manager, which in a way is not really what we're looking for, in my opinion. It might be better to start contributing there instead of releasing something new, but we'll see how it goes.

@vwkd
Author

vwkd commented May 25, 2020

As you said, the frontend is built on top of the mindset of a typical package manager. This brings us right back into npm land with all the configuration files.

But if we could consolidate efforts that would be even better.

Pinging @pagoru from @moduleland.

@pagoru

pagoru commented May 25, 2020

Hey @hacked, you can help us with @moduleland if you want. We want to respect the principles of Deno as much as we can, so if people help the project with their ideas, that can be really useful.
npm has a lot of bad things, and we can rethink how to manage the modules, but you could put your effort into the current project instead of starting a new one. We are open to new ideas; right now, the project is only an idea.

@hazae41
Contributor

hazae41 commented May 26, 2020

If we need to keep everything decentralized, why is there a registry in the first place?

A search engine that could search Deno modules would fully satisfy the needs of decentralization, namespacing and search.

deno.land could just do a custom Google (or whatever) search, or even use its own search engine.

@vwkd
Author

vwkd commented May 26, 2020

@hazae41 Searching is harder than it seems. How do you guarantee to 1) find only Deno modules and 2) find all Deno modules on the Web? No search engine can do that, not even Google. My initial post focused exactly on this. And even limited to GitHub, the search would run into the same issues.

To not disadvantage anybody, the best solution is to offer a registry where anyone can sign up and publish the URL to their module, no matter where that URL is. The URL could even be on a web page to which there are no links on the web, meaning no search engine could find it (except if you told the search engine where to look, but then you could just as well publish the URL on the registry).

@pi0
Contributor

pi0 commented May 28, 2020

Experimental support for using any GH/npm repo landed (#659). Any other provider supported by deno.land can also be mapped via virtual db patterns :)

https://deno.land/x

Experimental: Use npm:[package] or gh:[owner]:[repo] as the module name to resolve any arbitrary repository or npm package.

Example: https://deno.land/x/gh:maheshkareeya:a1

@jimmywarting

All these npm:, gh:, and magnet: ideas are cool, but they make it impossible to load a cross-browser/Deno module written in vanilla JavaScript 😕

@gold

gold commented Nov 14, 2021

First, thanks for expressing your ideas in a coherent and organized manner! I agree with mostly everything.

  • GitHub Only - For now, the idea is to limit it to GitHub, GitLab and BitBucket. Theoretically, it is possible to virtually allow any website, as long as a certain URL schema is satisfied.

I would only be interested if there are no limitations on where I publish my module. And it does not matter if nobody ever can find my module and use it -- except me. I'm not hiding my module -- I'm just not interested in promoting and advertising it. I want to be free to have no limitation on my module's lack of popularity.

Feel free to share suggestions, improvements or criticism.

The following is a use case that was real for me and how I handled it; I may be revealing my naivete, but I will share anyway in case it brings a viewpoint that helps others to solve the Deno third party module problem.

I ported an NPM module that I wrote to use with Deno in a server-only context. It's an in-memory key-value storage module with parameterized ttl functionality.

Initially I did not publish it anywhere. It existed only on my local file system (in its own local directory). And the import statement in my Deno code imported it using local file system syntax.

Since I deploy the server as a compiled binary (via Deno's compile mechanism, which is fantastic, though ymmv), which is built locally, import statements for third party modules don't execute at run-time.

However, I have two different API servers that use this ttl_storage module. In the spirit of reusability and Deno's URL-based import syntax, I just put the module's "mod.ts" file up on a web server that I own, accessible via a URL.

Deno (v1.16.1) would not accept my custom URL that wasn't coming from deno.land/x/.

"What's this?" I thought at the time -- this is hardly decentralized. Moreover, I then discovered that if you want to be able to import a third party module, it must be published on GitHub only. At this point I concluded that NPM's approach was no worse than Deno's and was superior to what Deno had implemented.

So I decided to bite the bullet and publish the little ttl storage module up on GitHub and link it to deno.land/x.

It solved my problem, but it would have been so much easier to just put my module up on my own web server and I'm done.

Again, I'm not forcing anyone to use this module. It only matters that I can reuse a library to keep things DRY.

(Additionally, being forced to publish code to Microsoft to get this to work makes me think that there are some deals between Deno and Microsoft that have yet to come to light. Call me cynical, but if you don't understand this, then it's not me that's naive.)

@Soremwar
Copy link
Contributor

@gold I'm a little confused about what your problem is here. Deno DOES support self hosted modules no matter the origin and it should be working just as you described.

This specific issue is meant to discuss origins for modules hosted on deno.land/x, not modules for Deno itself

If you want to discuss/debug the problem you are facing, why don't you hop into the discord server instead?

@gold

gold commented Nov 15, 2021

@Soremwar Thanks for letting me know that "Deno DOES support self hosted modules."

I tried again.

For some reason, the original site's URL was reachable via a browser, but Deno had encountered an unexpected TLS handshake error during a build. I just now put the module on another one of my personal sites and Deno's import statement worked perfectly. I now feel like a complete idiot! (feel free to use my simple ttl_storage module at deno.land/x)
