
SystemJS ♥️ unpkg #35

Closed
mjackson opened this Issue May 25, 2017 · 44 comments

5 participants
@mjackson
Member

mjackson commented May 25, 2017

From @alexisvincent on December 28, 2016 23:41

I smell an opportunity here to provide a beautiful SystemJS loading experience with unpkg.

If you haven't already played with it, SystemJS is a client-side module loader that already provides a nice experience for loading modules from CDNs and unpkg (Angular does this in their official guides).

I think we could make the experience even nicer however by automatically resolving the configs for raw projects using systemjs-config-builder.

The experience on the browser side would look something like this:

app.js

import { DOM } from 'react';
import { render } from 'react-dom';

render(
    DOM.div('Hello World'), 
    document.getElementById('app')
)

index.html

<!DOCTYPE html>
<html>
<head>
  <title>SystemJS ♥️ unpkg</title>
  <script src="https://unpkg.com/systemjs@0.19.0/dist/system.js"></script>
  <script src="https://unpkg.com/systemjs@0.19.0/dist/unpkg-config.js"></script>
</head>
<body>
  <div id="app"></div>
  <script>
    System.import('./app.js')
  </script>
</body>
</html>

Which is obviously really attractive, since the only dependency you need is SystemJS, which then automatically handles the rest. This makes the barrier to entry for client-side development really low. I could also write a hosted service that parses client code and makes loading modules in production efficient.

@mjackson Let me know if this is something you would be interested to support.

Copied from original issue: unpkg/express-unpkg#65


mjackson commented May 25, 2017

From @alexisvincent on December 28, 2016 23:44

Could even collapse

  <script src="https://unpkg.com/systemjs@0.19.0/dist/system.js"></script>
  <script src="https://unpkg.com/systemjs@0.19.0/dist/unpkg-config.js"></script>

to

  <script src="https://unpkg.com/systemjs-tools@1.3.8/unpkg.js"></script>

mjackson commented May 25, 2017

Is the unpkg-config.js file something that we would generate on our server? In order to generate that file, would we need to do an npm install of the package?


mjackson commented May 25, 2017

From @alexisvincent on December 29, 2016 17:56

That config would be static, but we would need to generate other config files on the server after an npm install. Those generated config files could be cached just like the other files.


mjackson commented May 25, 2017

What other config files would we need to generate on the server? Assume I know nothing about SystemJS. I'm having a hard time figuring out what would be required on our side.


mjackson commented May 25, 2017

From @sebastien on December 29, 2016 21:39

@alexisvincent I was wondering how you would manage the various pre-built JavaScript configurations we find on NPM projects. The main problem with NPM packages is that they're all implemented using a variety of precompilers (babel, traceur, tsc, etc) and when they ship with pre-built "true" JavaScript files (ie that would run as-is in the browser) these files can be placed and laid-out in many different ways.

For instance, https://unpkg.com/gl-matrix@2.3.2 is not usable as-is, as the requires won't resolve (require("./gl-matrix/common.js") points to https://unpkg.com/gl-matrix@2.3.2/common.js). The other problem is that the JS files likely won't work as-is in the browser, and we should instead use the pre-built version in https://unpkg.com/gl-matrix@2.3.2/dist/gl-matrix.js. The https://unpkg.com/gl-matrix@2.3.2/package.json does not specify a browser: entry either, so how would the SystemJS unpkg config manage to locate and load the pre-built file in that case?

Having a per-package configuration would be unviable because of the sheer quantity of NPM packages and the pace of updates, so the only other way would be to implement a custom SystemJS resolver for unpkg that would fetch a directory listing as well as the package.json to identify where to pull the pre-built files from. However, this means that loading a package would take at least three requests: one for the package.json, one for the root directory listing, and at least one more for the pre-built file.

What did you have in mind specifically?


mjackson commented May 25, 2017

From @alexisvincent on January 2, 2017 8:20

Sorry for the delay in replying. I'm taking a few days off all work


mjackson commented May 25, 2017

No worries, @alexisvincent :) We all need some time off


mjackson commented May 25, 2017

From @alexisvincent on January 17, 2017 21:00

Sorry that this has taken so long, I took a break and have been focusing on other OSS work. I've given this some thought and have some new ideas.

I want to point out at the get-go that there are two major areas we can optimize: the first is user experience, the second is cache maximization. I will explore these two areas separately, but they work elegantly together.

Coupling a smart client side module loader (SystemJS) with an automatic dependency resolver (unpkg) could create a very compelling development AND production story.

UX

The kind of workflow I'm envisioning is one where folks could seamlessly require packages and they would be automatically resolved from unpkg. A cli tool could also be provided to resolve the dependency tree upfront and provide a really solid production workflow.

Cache

Apart from the user-experience boon, we could actually gain a serious cache boost. Currently, the status quo is to download a bunch of dependencies with npm and then bundle them with your app into some big bundle. Then whenever we make changes to our app code, the whole bundle is invalidated. This story is made marginally better by progressively importing dependency trees as you need them, but still, there's often no barrier between app code (which changes frequently) and dependencies (which change infrequently). Some people address this by bundling node_modules dependencies separately from app code. While this is a step in the right direction, modern apps often have large transitive dependency trees, and bundling dependencies as a whole means that whenever any one dep updates, the whole dep bundle is invalidated. Another problem is that creating your own bundles means that dep caches can't be shared across different applications.

The most cache-efficient option is thus to use public, CDN-cached builds, unpkg for example. The problem with that is that manual dependency management is a huge pain; it's the reason we invented package managers.

Let's assume for a moment that unpkg were able to provide browser-friendly builds at the package level for all npm packages (explored further down), and that instead of bundling dependencies together with your app, you dynamically resolved them in the browser with a smart module loader. You'd get the best of both worlds: automatic dependency management as well as globally shared, package-level caching.

Enter SystemJS and unpkg.

Ideally, packages and our apps would be resolved and cached at the file level, but until HTTP/2 and Service Workers are more widespread, and browsers improve parallel file parsing/loading performance, I think package-level builds are a pretty good step forward.

Implementation

Ok, so there are a number of things that need to be built, and some more that would make the experience kickass.

Background

First, some background for folks unfamiliar with SystemJS and browser module loading in general. Assume no bundles or prebuilt files (they can be considered a convenience).

Loading in the browser is more nuanced than on the server, since we have to know up front exactly where files are; the cost of discovering them by trial and error is too high. Take for instance the case where a package author calls require('./foo'). What they implicitly mean is ./foo.js, and Node correctly resolves this, since it checks whether ./foo.js exists. However, the author could also have meant ./foo.json. The point is there's no way to resolve the filename upfront, and the performance cost of probing for it in the browser is too high.
Another issue is that Node's resolution logic is nondeterministic. An npm install could place a package in any number of places. Yarn introduces deterministic install locations, but as an outsider, without access to the yarn.lock, I would have no way of knowing where to actually find the package.
Because of this nondeterminism, to load packages through SystemJS we need to statically analyse the package on the server and generate mappings and other hints, so that loading the package becomes deterministic.
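To make the ambiguity concrete, here's a minimal sketch (hypothetical code, not part of SystemJS or unpkg) of how a pre-generated mapping from server-side static analysis makes extensionless requests deterministic:

```javascript
// Sketch: given an extension map emitted by static analysis on the server,
// a bare request becomes deterministic -- the browser never has to probe
// for './foo.js' vs './foo.json'.
function resolveWithMap(specifier, map) {
  // The analysis recorded the real filename for each extensionless
  // request it found in the package source.
  if (map[specifier]) return map[specifier];
  // Otherwise fall back to the common default Node would have picked.
  return specifier.endsWith('.js') ? specifier : specifier + '.js';
}

// Example map a config builder might emit for one package:
const extensionMap = { './foo': './foo.json' };
resolveWithMap('./foo', extensionMap); // → './foo.json' (analysis knew the truth)
resolveWithMap('./bar', extensionMap); // → './bar.js'   (default guess)
```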

SystemJS compatibility and unoptimized loader

Assuming we wanted to load files individually (not great performance, but still useful) through unpkg, we would need access to the statically generated config for the package. Assuming we have this, writing an unpkg SystemJS loader would be trivial. Usage in app code would then look something like this (loader's concern):

// explicit version and loader
import react from 'react@1.2.3!unpkg'

// latest version and explicit loader
import react from 'react!unpkg'

// pre-generated config mapping (easy) or automatic resolution
import react from 'react'

This is quite clearly a UX boost for the developer (even if not particularly performant), since they don't have to think about bundling or local dependencies through npm.

All that needs to be done on the unpkg side is to perform the static analysis on the package when it first resolves, and then serve that config file. This is easy to do, and I've already done most of the grunt work in my systemjs-config-builder package, which would just need some simple modifications. I was lucky, since I can piggyback off of the static analysis functions that JSPM uses internally.

Live module bundling and optimized loader

Now that we have a config file for the package, we could leverage systemjs/builder's universal build capability to generate builds for the package in any format (umd, amd, cjs, esm, global, ...). We could conceivably generate builds on the fly with something similar to @sebastien's ?as=umd URL extension. Now not only can unpkg serve individual package files, it can also serve arbitrary builds of the package.
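A rough sketch of how such a build request might be parsed on the server (the ?as parameter follows @sebastien's suggestion; everything else, including the format list, is illustrative):

```javascript
// Hypothetical URL scheme: parse a request like
//   https://unpkg.com/gl-matrix@2.3.2/dist/gl-matrix.js?as=umd
// into the pieces an on-the-fly builder would need.
const FORMATS = ['umd', 'amd', 'cjs', 'esm', 'global', 'system'];

function parseBuildRequest(requestUrl) {
  const url = new URL(requestUrl);
  const format = url.searchParams.get('as');
  if (format && !FORMATS.includes(format)) {
    throw new Error('Unsupported build format: ' + format);
  }
  // First path segment is "<name>@<version>", the rest is the file path.
  // (Scoped packages are ignored here for brevity.)
  const [, pkgSpec, ...file] = url.pathname.split('/');
  const at = pkgSpec.lastIndexOf('@');
  return {
    name: pkgSpec.slice(0, at),
    version: pkgSpec.slice(at + 1),
    file: '/' + file.join('/'),
    format: format || null, // null → serve the raw file, no build step
  };
}
```

With no ?as parameter the request falls through to plain file serving, so the scheme stays backwards compatible.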

This also means we can efficiently load an app with package level caching, and the UX described in my previous point. If folks jumped on to this way of doing things (as they have unpkg itself), it could also significantly contribute to globally shared caches.

In the case that a package already provides its own builds, those can be used in place of building our own. This can be determined via something like the browser key in the package.json.

This would provide an ideal development AND production workflow, and as an offering I think it could greatly ease onboarding onto the web. It would also be really useful for example apps and usage examples, since people could experiment inline without setting up a large build chain.

unpackaged (package manager)

Once these basic primitives have been implemented, we could even provide a package manager (jokingly called unpackaged) which could simply do module resolution for you, so that compatible package versions could be resolved and the browser would also know which package builds to load upfront. So package version control for unpkg :)

Local Development Server

One of the annoying things about developing against a CDN is that you need to always be 'online'. If you want to work in a plane or somewhere without easy internet access, you're stuck unless you have a local copy.

Since unpkg is open source, folks could actually just run a local version and resolve packages off of it in development.

Wrap up

Sorry for the long post but I thought it was worth putting all my thoughts on paper so that everyone is on the same page.

Am really interested to hear your feedback.


mjackson commented May 25, 2017

Thanks for all the details, @alexisvincent. This sounds very exciting! A few quick questions:

All that needs to be done on the unpkg side is to perform the static analysis on the package when it first resolves, and then serve that config file.

That means we need to install all of that package's dependencies, right? Because systemjs-config-builder looks through the node_modules directory to build the config file?

We're currently just a simple file server, and we don't actually ever run npm install (or yarn install) on our servers, so this would be a fairly significant change for us. We'd need to make sure we can do this securely and carefully manage disk space.

Now that we have a config file for the package, we could leverage systemjs/builder's universal build capability to generate builds

How does this affect package authors? Can they just use ES6 modules? Or do they have to use SystemJS.import, etc.?


mjackson commented May 25, 2017

Tell me more about the package manager. Is that the piece that people use to generate the SystemJS config file they need in production?

Also, the development server. How would it get the package sources without a node_modules dir to pull from? Would we keep our own cache on disk?


mjackson commented May 25, 2017

From @alexisvincent on January 18, 2017 7:24

That means we need to install all of that package's dependencies, right? Because systemjs-config-builder looks through the node_modules directory to build the config file?

We're currently just a simple file server, and we don't actually ever run npm install (or yarn install) on our servers, so this would be a fairly significant change for us. We'd need to make sure we can do this securely and carefully manage disk space.

My bad, when I was scanning through the codebase I saw stuff like this

      checkLocalCache(packageDir, (isCached) => {
        if (isCached)
          return next(packageDir) // Best case: we already have this package on disk.

and assumed you saved local copies of the package. I'm guessing you then just proxy through to the npm registry on a file level? How do you manage getting a hold of the files?

As it currently stands, systemjs-config-builder would need a local copy of at least all the files in the package. It doesn't need the full dependency tree, but would obviously eventually need to parse all of them to be able to resolve all files.

How does this affect package authors? Can they just use ES6 modules? Or do they have to use SystemJS.import, etc.?

Package authors wouldn't need to change anything with regards to how they normally publish. SystemJS.import is just for dynamic requires. Typically we only use SystemJS.import at the entry points to our application.

So this approach would work for all packages that SystemJS and JSPM currently support. There are certain packages that fail. For instance, some packages do the following:

const namedModules = ['one', 'two', 'three'].map( name => require(`./some/${name}.js`) )

For obvious reasons this kind of dynamic require is really difficult to statically analyze (ESM changes this, since import is static, so as a community we're moving away from it). In such cases the SystemJS/JSPM community has generally submitted PRs to the author to turn this kind of thing into something easier to load. Luckily, the static config used to load the package can easily be extended with compatibility fixes. For example:

SystemJS.config({
  packages: {
    'someProblematicPackage': {
        map: {
            './noExtension': './noExtension.js',
            './funkyRequireFile.js': '/myFunkyOverrideFile.js'
        }
    }
  }
})

And while waiting for the author to update their package, the SystemJS/JSPM community normally publishes their fix to the JSPM registry.

Luckily these kinds of patches are relatively rare, and we would be able to leverage that registry and the community effort that has gone into compatibility. To date, the registry has over 2265 commits; most of those are overrides for individual package versions that were problematic over the years.

Tell me more about the package manager. Is that the piece that people use to generate the SystemJS config file they need in production?

Without the package-manager tool, loading modules would work just fine, but SystemJS would import the packages on the fly, introducing latency into the process. This would generally be fine, but in development and production (for different reasons) you want your site to load as quickly as possible. To fix this we need to tell SystemJS (the browser) upfront all of the packages we need; then SystemJS could preemptively load them in parallel. People would also want to control package versions and have version clashes resolved by a tool that understands semver. This needn't be a CLI tool; it could also be an online tool, but having it ready as a CLI command would be useful. In either case, the 'package manager' would just enhance the experience; it's not required.
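The preemptive parallel load is essentially just this (a sketch; `loader` stands in for SystemJS, whose import() also returns a promise):

```javascript
// Sketch: if a tool tells the loader every package upfront, the loader can
// start fetching them all in parallel instead of discovering them one
// import at a time as the dependency graph unfolds.
function preloadAll(loader, specifiers) {
  // Kick off every import immediately; resolve when all are loaded.
  return Promise.all(specifiers.map(s => loader.import(s)));
}

// Usage (hypothetical): preloadAll(SystemJS, ['react@15.4.2', 'redux@3.6.0'])
```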

Also, the development server. How would it get the package sources without a node_modules dir to pull from? Would we keep our own cache on disk?

Again, my assumption was that unpkg had access to the local files. But regardless of how it works, we could just run npm-http-server on our local machines with a local file cache and use whatever mechanism it uses to serve files. Folks could also, if they wanted to, run their own production server. Though this would defeat the purpose of the global cache, they would still get per-package caching.


mjackson commented May 25, 2017

As it currently stands, systemjs-config-builder would need a local copy of at least all the files in the package. It doesn't need the full dependency tree, but would obviously eventually need to parse all of them to be able to resolve all files.

Ok, that sounds doable. We currently do keep a local cache of the files in a package, but not its dependencies. We just grab the .tgz from npm, decompress it, and serve files out of that.


mjackson commented May 25, 2017

From @alexisvincent on January 18, 2017 14:29

Cool. How do you feel about the rest?


mjackson commented May 25, 2017

I think it's a very compelling UX and I'd love to work with you on this. I've already claimed the unpkg npm package in anticipation of providing a CLI that could help people use unpkg, but hadn't fully thought through exactly what it might do until now.

My workflow is traditionally to just install all my dependencies locally and then use a bundler (webpack) to put everything together when I deploy, so I'll probably have some questions about how SystemJS does things as we go. But I think my "outsider" influence could also help us to make sure the experience is as seamless as possible for people who aren't familiar with SystemJS. We can build on SystemJS tools, but I think we can keep the new user experience really simple so that people who are just getting started on a new project won't have to know too much about the underlying infrastructure in order to use our tool.

Let's try to develop a roadmap. Where do you think we should start?


mjackson commented May 25, 2017

From @sebastien on January 18, 2017 17:37

I think @alexisvincent's use case is very compelling, and it alleviates the need for using NPM altogether, as long as the import resolver resolves to unpkg.

The main questions I would have are:

  1. how do you generate the per-package configuration file for SystemJS?
  2. how would they be retrieved and injected into SystemJS' configuration?
  3. how do you build any given NPM package so that it becomes loadable by any modern browser?

I think there's a possibility to implement a resolver plug-in for SystemJS that would work in most cases (i.e. where the NPM package includes build files) without any significant change to unpkg, so I'm curious to see how you envision these 3 points.


mjackson commented May 25, 2017

From @EricSimons on January 18, 2017 20:42

@alexisvincent "To fix this we need to tell SystemJS (the browser) upfront all of the packages we need, then SystemJS could preemptively load them in parallel." <- since all of the packages req'd would be getting resolved from unpkg, wouldn't the max concurrent connections to the same domain make the loading experience unreasonably slow for dev & especially prod? (Assuming no HTTP2)


mjackson commented May 25, 2017

From @alexisvincent on January 18, 2017 20:50

@EricSimons :/ Damn, I hadn't thought about that... Good catch.

As you mentioned, with HTTP/2 this is a non-issue. But you're right, with HTTP 1.1 this would effectively introduce latency between most requests... What is the max in most browsers? 8?

At least for the first load; thereafter it would serve from cache.


mjackson commented May 25, 2017

From @alexisvincent on January 18, 2017 22:01

@mjackson even with @EricSimons's revelation, I can still see tons of uses for this, probably most notably the on-demand bundling and the onboarding UX. The max concurrent connections thing is really annoying though. It effectively means (unless I can think of a way to work around it somehow) that, for production, this is a no-go. I have a few ideas about how we can get around this; I just need to flesh them out a bit and then I'll share. Out of interest, @mjackson, what percentage of connections to unpkg use HTTP/2?

@sebastien To answer your questions

how do you generate the per-package configuration file for SystemJS?

systemjs-config-builder piggybacks off of the install-time analysis that JSPM performs, and effectively does this at the package level, recursively through node_modules. I would need to make some trivial modifications to decouple the traversal logic from the generation step, and then we could do this very simply.

how would they be retrieved and injected into SystemJS' configuration?

Via a custom unpkg loader.

how do you build any given NPM package so that it becomes loadable by any modern browser?

We could leverage any of the existing builders, effectively doing automatically what people currently do manually. We definitely wouldn't be able to support all packages; for instance, server-side packages would fail. But the ones that people currently use in the browser would be simple to package.

@mjackson the fact that you're unfamiliar with SystemJS is definitely helpful.

With regards to the roadmap.

The first thing to do would probably be to agree on an external facing API for loading the config file and bundles.

Next would be for me to build support into systemjs-config-builder for per-package analysis. The API would effectively be generateConfig('path/to/package'), so this would need to be easy to integrate on the unpkg side. What would it take to implement this? We would also then need to do it on the fly based on a query param. Something like: https://unpkg.com/react@2.2.2?file=systemjs-config.json.

@mjackson I could provide a dummy getConfig function today or tomorrow to do the generation. Would you be able to sort out the ?file=systemjs-config.json -> getConfig mapping?
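That mapping might look roughly like this Express-style sketch (getConfig and req.packageDir are assumptions for illustration, not real unpkg code):

```javascript
// Sketch: when a request carries ?file=systemjs-config.json, hand it to a
// hypothetical getConfig(packageDir) instead of the normal file server.
function makeConfigRoute(getConfig) {
  return function (req, res, next) {
    // Anything else falls through to normal file serving.
    if (req.query.file !== 'systemjs-config.json') return next();
    getConfig(req.packageDir)          // static analysis of the unpacked .tgz
      .then(config => res.json(config))
      .catch(next);
  };
}
```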

As soon as this is done, I can write a simple loader to resolve packages (very slowly) from unpkg. But it would get the basic workflow down. The nice thing is that almost everything from this point onwards would be changes to the implementation (read: optimizations), and the external API would remain the same.

I think in parallel we could also look at ways to make running npm-http-server as a dev server easy.

Then I think the next thing to do would be to implement the on-the-fly module bundling. Something we could do to alleviate some of the max-connections problem is tie bundle granularity to the protocol. So for instance with HTTP/2 we could serve the bundled package without deps, and with HTTP/1.1 we could serve the package with its deps bundled in. Perhaps this isn't the best way to deal with it, but I'm confident we will find a viable solution. We would need to come up with a URL extension that works and then tie that to the module bundling process.
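As a sketch, the protocol-to-granularity decision itself is tiny (the names are made up; real negotiation would live in the unpkg server):

```javascript
// Sketch of tying bundle granularity to the protocol, as proposed above.
function bundleGranularity(httpVersion) {
  // HTTP/2 multiplexes requests, so per-package bundles cache best;
  // HTTP/1.x pays per-connection latency, so inline the dependencies.
  return httpVersion >= 2 ? 'package-without-deps' : 'package-with-deps';
}
```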

This would allow me to expand the loader to load on a package level instead of a file level. I think aiming for this point is a good starting goal.

After that, we would need the version resolution and upfront package loading, provided probably via a CLI and web tool.

@mjackson As a general roadmap how does this sound?


mjackson commented May 25, 2017

From @EricSimons on January 18, 2017 22:32

@alexisvincent I think you're right that bundling is the correct solution here. Even with cache hits, loading the non-dist versions of react/redux from unpkg + booting up takes approx 3-5s after page load for me, which obviously is a bit too long for a production use case. Bundling would drop that number substantially though.

Btw I think that this is an excellent idea regardless of whether it's suitable for production use cases. In fact, I think the use case is even better for non-production use cases, as this solution would lower the barrier to entry for building side projects, examples, etc - things that people might not normally start building due to the (currently) unavoidable upfront cost of rolling their own tooling (webpack, etc).

I'm actually working on something similar for a Thinkster project right now, so I can share some of the problems we've sorted out along the way if that would be helpful!


mjackson commented May 25, 2017

From @alexisvincent on January 19, 2017 8:58

@EricSimons Exactly, that's the main target market. We have a web dev course at uni, and last year I saw too many folks get really bad impressions of the web because of the implicit complexities of the ecosystem. This year I want it to be as seamless as possible. I want to have something that I can hand to people and say: here, use this. But something that is actually simple, not just artificially so.

I think I've found a rough solution for the HTTP 1.x case. There are various more optimized variations of this, but I'm going to write down the simplest so that everyone can get the basic idea.

Let's assume the project has just three dependencies, react@15.4.2, redux@3.6.0 and lodash@4.17.2, and imagine for a moment that these projects didn't have transitive dependencies (it makes the explanation easier).

We can construct a URL, where deps are ordered alphabetically, that tells unpkg upfront all the deps you will need. It would look something like this: https://unpkg.com?bundle=lodash@4.17.2:react@15.4.2:redux@3.6.0. Upon first load, that file could be requested, and unpkg could then determine whether the incoming request is HTTP/2 or HTTP/1.x.

In the case that it's HTTP 1.x, it could stream the concatenation of https://unpkg.com/lodash@4.17.2?bundle=system, https://unpkg.com/react@15.4.2?bundle=system and https://unpkg.com/redux@3.6.0?bundle=system. We can set the cache timeout to 1 year, so in a single request the app will have all the deps it needs.

In the case that it's HTTP/2, we can return a null response with either a header, a cookie or some code that informs SystemJS that it's on a host that supports HTTP/2 (I haven't found a way to do this without making a request); it can then load https://unpkg.com/lodash@4.17.2?bundle=system and the others itself. Once again the response could be cached, and subsequent loads wouldn't have to ask unpkg whether the client supports HTTP/2. Then if in the meantime they update their browser, the next time the site updates they can leverage the HTTP/2 load and caching. We could leverage server push in the first request, but I haven't yet figured out a way (without using Service Workers) to inform the server which files the client already has, so as to avoid pushing deps that are in its HTTP cache. This is almost something I wish were fixed by the protocol.

If we get buy-in from the person serving the index file (maybe by providing an express middleware or something), we could optimize this a bit by letting that server inform SystemJS whether or not the browser supports HTTP/2. It could also inline the SystemJS source into the index file, so that SystemJS gets control as soon as possible. Then, as soon as it gets control, it can check whether the browser supports HTTP/2 and immediately make the parallel requests if so.

Another variation would be to request the others simultaneously along with the request for https://unpkg.com?bundle=lodash@4.17.2:react@15.4.2:redux@3.6.0 and let the unpkg server kill the redundant requests, but you would then hit the max connection limit, which would cause a temporary block on the other asset loads.
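A minimal sketch of the canonical bundle URL described above (a hypothetical scheme; sorting alphabetically means every app with the same dependency set produces the same URL, so the CDN cache is shared):

```javascript
// Sketch: deps sorted alphabetically so the same dependency set always
// yields the same (cacheable) bundle URL, regardless of declaration order.
function bundleUrl(deps) {
  const specs = Object.keys(deps)
    .sort()                                  // alphabetical → canonical URL
    .map(name => name + '@' + deps[name]);
  return 'https://unpkg.com?bundle=' + specs.join(':');
}

bundleUrl({ react: '15.4.2', redux: '3.6.0', lodash: '4.17.2' });
// → 'https://unpkg.com?bundle=lodash@4.17.2:react@15.4.2:redux@3.6.0'
```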

Assuming a 40ms download time (latency + download) for all files from both servers, we get the following approximate first-hit load times, ignoring the time taken to parse and load the JS.

Unoptimized HTTP 1
40ms (initial index load) + 40ms (SystemJS load and bundle request) = 80ms

Optimized HTTP 1
40ms (initial index load + SystemJS load) + 40ms (bundle request) = 80ms

Unoptimized HTTP 2
40ms (initial index load) + 40ms (SystemJS load and bundle request) + 40ms (all small bundles) = 120ms

Optimized HTTP 2
40ms (initial index load + SystemJS load + protocol hinting) + 40ms (bundle request and small bundle requests) = 80ms

While in the worst case unoptimized HTTP/2 is one roundtrip slower than the others, it makes up for that by providing per-package caching. Also, with HTTP/2 you can leverage the cache populated by visiting other websites that use the same scheme.

Unless @EricSimons can once again shatter my dreams, this could be a viable way forwards. Let me know your thoughts.


mjackson commented May 25, 2017

From @alexisvincent on January 19, 2017 9:06

Even with cache hits, loading the non-dist versions of react/redux from unpkg + booting up takes approx 3-5s after page load for me, which obviously is a bit too long for a production use case.

@EricSimons Can you explain your experiment again? I didn't quite understand what you tested.


mjackson commented May 25, 2017

From @alexisvincent on January 19, 2017 10:7

@EricSimons Would love to chat regarding your Thinkster project!


mjackson commented May 25, 2017

Also worth noting in this discussion: the <script type="module"> spec supports loading directly from URLs. So e.g. you can do stuff like this. /cc @matthewp
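For illustration, that native behaviour looks like this (the package URL is hypothetical; the file served must already be valid ES module code):

```html
<script type="module">
  // Native ES modules resolve URL specifiers directly -- no loader needed.
  import thing from 'https://unpkg.com/some-es-package@1.0.0/index.mjs';
  console.log(thing);
</script>
```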


mjackson commented May 25, 2017

From @matthewp on January 25, 2017 23:28

A crazy idea that's been in my brain for the last few days that I just wanted to throw out: what if there was a babel plugin that transformed

import React from 'react';

into:

import React from '//unpkg.com/react@1.0.0/es-main.js';

assuming that this module had an ES main. And it did this recursively for all dependencies and their dependencies.

That part is easy, but what if unpkg had a mode that did this for you, ran the code through babel using this plugin (and only this plugin), caching the results of course. Then essentially all of your code in development mode would be coming from unpkg.
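A minimal sketch of what such a plugin's visitor could look like; the hard-coded version map and the `es-main.js` path convention are assumptions for illustration, not an existing plugin (a real one would read versions from package.json):

```javascript
// Hypothetical Babel plugin: rewrite bare import specifiers to
// versioned unpkg URLs, leaving relative imports untouched.
function unpkgRewrite(versions) {
  return {
    visitor: {
      ImportDeclaration(path) {
        const name = path.node.source.value;
        // Only rewrite bare specifiers we know a version for.
        if (versions[name]) {
          path.node.source.value =
            "//unpkg.com/" + name + "@" + versions[name] + "/es-main.js";
        }
      },
    },
  };
}
```

The same visitor would have to run over every dependency's source as well, so the whole rewritten module graph resolves against unpkg.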

@mjackson

Member

mjackson commented May 25, 2017

@matthewp It's crazy you said that. I've had literally the exact same thought running through my head this past week.

I was thinking about possibly using a ?module query string parameter to tell unpkg to run this transform on all your code. Then use <script type="module"> and away you go!

@mjackson

Member

mjackson commented May 25, 2017

From @nkbt on January 26, 2017 3:41

I'll leave it here just in case

https://twitter.com/malyw/status/806954044577239040

If introducing a new API, better make it like a polyfill or at least compatible.

@mjackson

Member

mjackson commented May 25, 2017

If introducing a new API, better make it like a polyfill or at least compatible.

Yep. A ?module query parameter would be completely backwards compatible. If you don't have it, you'll get the same file you've always gotten. If you do, party!

@mjackson

Member

mjackson commented May 25, 2017

From @alexisvincent on January 26, 2017 6:00

Whatever does get released, it's worth coordinating efforts, so that it doesn't end up confusing folks when a ton of ways of doing the same thing come out of unpkg.

@mjackson

Member

mjackson commented May 25, 2017

For sure. @matthewp is actually talking about moving in that direction and supporting <script type="module"> better on the unpkg side, which I believe is the natural way for unpkg to evolve. Sure, we could include better support for bundlers, which I'm also interested in doing. But ultimately I believe the future of unpkg should be as a CDN for individual files.

2 things need to happen in order to make this work well:

  • People need to start publishing packages with ES modules. Not a lot of people are publishing ES modules to npm right now. It's mostly still CommonJS. But the bundlers themselves are getting better at supporting ES modules out of the box (see Rollup and webpack 2), so I believe that more package authors will start including them. I know we're already supporting them in react-router.
  • Browsers need to support <script type="module">, which Safari TP already does. I'd expect more browsers to include support as the spec formalizes.

Anyway, to avoid hijacking this issue too much I've started a new thread in #24

@mjackson

Member

mjackson commented May 25, 2017

From @abhishiv on February 19, 2017 11:35

@alexisvincent I have been working on exactly the same thing, but have used jspm itself to install packages, upload them to S3, and then extract the config. As you can imagine, the workflow isn't quite optimal. I'm trying to come up with a way to use unpkg and your systemjs-config-builder project.

Since I'm guessing you are quite busy with other projects, would you be kind enough to document which areas need work? I have a few free days at my disposal, and would love to see unpkg work with SystemJS.

@mjackson

Member

mjackson commented May 25, 2017

From @alexisvincent on February 19, 2017 12:02

@abhishiv Interestingly enough I started working on this again last night and this morning and have been working on a large rewrite to support this issue. I don't have a lot of time to put into it so if you're keen to help that would be awesome.

There are a few things that need to happen here.

systemjs-config-builder

  • support the full npm resolution logic (working on)
  • be able to just process one package (working on)
  • support overrides via JSPM overrides registry (would appreciate help)

unpkg

  • generate configs on a package level
  • generate bundles for the packages

unpkg-loader

  • resolve the modules from unpkg via generated configs

The two biggest blockers atm are the unpkg support and systemjs-config-builder. Of the two, systemjs-config-builder would be the most helpful to receive help with, specifically the overrides.

Let me know what you're keen to help with and we can nail this thing.

@mjackson

Member

mjackson commented May 25, 2017

From @nkbt on February 19, 2017 18:58

Please check out this cool thing made by people from Atlassian and presented last week at our local SydJS; it packages scripts with webpack on the fly into one working bundle.

https://pkgzip.com

@mjackson

Member

mjackson commented May 25, 2017

From @alexisvincent on February 19, 2017 19:11

@nkbt Very cool! We're wanting to do a very similar thing! Just more customizable and without including all transitive deps. Thanks for the link

@mjackson

Member

mjackson commented May 25, 2017

@nkbt There's also https://wzrd.in/ which does the same thing, just using Browserify instead of webpack. I think Browserify is a better fit for an auto-bundler service because it works better "out of the box" with most npm modules, whereas webpack usually requires some sort of custom config.

@mjackson

Member

mjackson commented May 25, 2017

From @nkbt on February 20, 2017 7:38

Yeah, I reckon wzrd was mentioned in a talk. unpkg was also endorsed several times =). I hope they will publish the video/slides eventually.

@EricSimons

EricSimons commented Jul 21, 2017

@mjackson @alexisvincent I think you guys are gonna dig what @apai4 and I have been up to :)

TL;DR - We created an in-browser dev environment powered by Unpkg + System. You can check it out now over at https://stackblitz.com — we're planning on announcing it sometime next week once we get installs to be 100% stable (see #38). Also, best to check it out in Chrome/Firefox for the full experience 👍

As @nkbt and @mjackson mentioned, there are some hosted Browserify/Webpack based solutions for bundling deps out there. However, most requests made to them are cache misses as they can only cache entire bundles, not individual packages. This forces them to download the entire tarballs of all packages being installed from NPM, extract them, run Browserify/Webpack, and then return the result. This works fine when you're running a single npm install process on your local machine, but it doesn't scale well to hundreds (much less thousands) of requests per minute. These services typically take 15s+ to resolve for any request that's a cache miss (which most are) and you'll frequently hit server errors in the 500 range that fail to resolve at all.

So needless to say, while great for tinkering around, they just can't provide the same snappy & consistent UX that CLIs like yarn/npm running locally can.

Enter SystemJS. The beauty of System is that it runs entirely in your browser, no server side bundling required. However, it's missing one of the most fundamental components of developing software: the notion of a file system. This is where our wrapper around System comes in — it essentially exposes a filesystem and other necessary functionality to run Webpack loaders and an in-memory devserver. Check it out in action (it even works while you're offline).

So how do NPM packages get retrieved? This is where Unpkg comes in. Thanks to the ?json flag on unpkg repos, we can slurp down the entire directory listing before requesting any files. We then trace the main field, typings, etc. and download every file required for the application to work. We then boot your application from these virtual files that are being served entirely in-memory.
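A rough sketch of that first step, assuming the nested `{ path, type, files }` shape that unpkg's ?json listings return (the exact field names are an assumption and may differ between unpkg versions):

```javascript
// Flatten a nested unpkg ?json directory listing into a flat list of
// file paths, so each file can then be fetched into the virtual FS.
function listFiles(entry, out = []) {
  if (entry.type === "file") {
    out.push(entry.path);
  } else if (Array.isArray(entry.files)) {
    for (const child of entry.files) listFiles(child, out);
  }
  return out;
}
```

With the full path list in hand, the loader can fetch only the files the main field and type definitions actually require.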

In short, SystemJS & Unpkg are an incredible duo for dev UX because they reflect what makes local dev environments great: the client is doing all of the downloading, installing, bundling, and even serving the application.

Would love to hear your thoughts! We're also working on open sourcing all this over at https://github.com/stackblitz/core :)

@alexisvincent

alexisvincent commented Jul 21, 2017

@EricSimons Awesome :D Would love to hear more about your filesystem abstraction. Are you downloading ALL files in the package? Can you expand on the download and bundle phase?

@EricSimons

EricSimons commented Jul 25, 2017

@alexisvincent totally — we basically just created a mock file system that lives entirely in memory & is dumped to a flat file on S3 when you save.

Regarding package installation, we don't download all files — only the ones the app would needs to boot. Specifically, this is all files required by the main field and TS type definitions. The files are downloaded one-by-one from unpkg and stored in the in-memory FS. If you try and import a file that's not required by main (like a CSS file from the bootstrap package, for example), we go and retrieve it on-demand and persist it to the in-memory FS. This is all being done on the frontend btw, so there's 0 server cost incurred on our end of things :)

Download/bundle is also rather straightforward here, as we simply download the last state of your filesystem from our S3 bucket and then kick off the System.import for your app. We have a custom loader/plugin system in front of System that handles all of the custom FS resolution stuff, webpack loaders, etc.

I really want to open source our dependency resolver & custom loader system because it provides a lot of the dev UX you described earlier in this thread. Would love to brainstorm w/ you about this!
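The fetch-on-miss filesystem described above can be sketched with a plain Map; the class and method names here are illustrative, not StackBlitz's actual API:

```javascript
// Minimal in-memory filesystem: files live in a Map, and a read
// miss can fall through to an on-demand retrieval callback
// (e.g. fetching the file from unpkg) before being persisted.
class MemoryFS {
  constructor(fetchMissing = null) {
    this.files = new Map();
    this.fetchMissing = fetchMissing;
  }
  writeFile(path, contents) {
    this.files.set(path, contents);
  }
  readFile(path) {
    if (!this.files.has(path)) {
      if (!this.fetchMissing) throw new Error("ENOENT: " + path);
      // Retrieve on demand, then persist so later reads hit memory.
      this.writeFile(path, this.fetchMissing(path));
    }
    return this.files.get(path);
  }
  exists(path) {
    return this.files.has(path);
  }
}
```

Dumping `this.files` to a flat file on save, and rehydrating it on load, matches the S3 snapshot workflow described earlier.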

@ggoodman

ggoodman commented Aug 4, 2017

@EricSimons Any plans to open-source your work on this mock file system and webpack loader wrappers? You've gone and teased us with all this amazing new tech and now we're dying to see how it actually works.

@EricSimons

EricSimons commented Aug 6, 2017

@ggoodman We don't mean to tease! Just wanted to share our findings w/ everyone :)

Will take us a bit to get everything to a state where we can open source it; lots of optimizations/bug fixes/testing/refactoring that needs to be done. I'm planning on updating the readme over at https://github.com/stackblitz/core tomorrow with info on this 🍻

@mjackson

Member

mjackson commented Aug 31, 2017

We've currently got 2 threads going on SystemJS, so I'm going to close this one in favor of #16 which I've made some progress on.

@alexgorbatchev

alexgorbatchev commented May 3, 2018

@EricSimons thanks for sharing this really amazing stuff! I just tried out turbo-resolver and it works great! Curious if you are still going to release the SystemJS side of things, and if so, do you have a rough time frame in mind?

@EricSimons

EricSimons commented May 3, 2018

@alexgorbatchev awesome to hear! We're actually finishing up the last refactor to our module loader before it goes open source; hoping to have it done by EOM. Would you be interested in trying it out? Also curious to hear what your use case is

@alexgorbatchev

alexgorbatchev commented May 3, 2018

@EricSimons Yes, I would very much love to test it out. I'm working on an internal application at Square; we have many teams contributing multiple projects. I'm exploring the possibility of using our internal npm registry as the release and delivery vehicle for this application instead of bundling everything the webpack way. Does that make sense?
