
Allow individually publishable API client libraries. #1431

Closed
wants to merge 2 commits into from

Conversation

@jmdobry (Contributor) commented Jul 13, 2016

There's lots of discussion floating around about why a PR like this is necessary; see, for example, GoogleCloudPlatform/gcloud-common#138.

Summary of changes

tl;dr

Move the code for each API client into its own publishable package folder within this repository (rename lib/ to packages/), change a lot of relative import paths to use named package imports, and switch to a @google-cloud/* naming scheme.

Long version

What is this PR? Right now gcloud-node combines all API client library code into a single installable NPM package. This means that to update the client code for an individual API, the entire gcloud-node package must be version-bumped and re-published.

This PR makes it possible for each sub-folder within the packages/ folder to be published to NPM as its own package. Currently, lib/index.js imports all of the lib/ sub-folders using relative paths, pulling that code directly into the published gcloud-node package. Furthermore, individual lib/ sub-folders sometimes import code from each other using relative paths. In order for packages/vision/ to be cleanly publishable as an individual NPM package, any code it imports from outside of itself (e.g. require('../storage/file.js') or require('../lib/common/util.js')) needs to be changed to named package imports (e.g. require('@google-cloud/storage').File or require('@google-cloud/common').util). This decouples the versioning and publishing of the sub-packages from one another.
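To illustrate the kind of change involved (the file paths here are examples based on the description above, not an exhaustive list):

// packages/vision/index.js (illustrative)

// Before: reaching into sibling folders with relative paths,
// which pulls that code directly into this package when published.
var File = require('../storage/file.js');
var util = require('../common/util.js');

// After: depending on the sibling as its own published, versioned package.
var File = require('@google-cloud/storage').File;
var util = require('@google-cloud/common').util;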

With this change, some individual packages will be free to move forward toward 1.0.0 while others might need to stay back at alpha or beta versions. The actual version number of the gcloud-node package becomes less meaningful, as the "big" package will now just depend on a bunch of individually versioned and published packages.

Before

npm install gcloud
var gcloud = require('gcloud');
var vision = gcloud.vision();

After

npm install google-cloud
var gcloud = require('google-cloud');
var vision = gcloud.vision();

or

npm install @google-cloud/vision
var vision = require('@google-cloud/vision')();

or

npm install @google-cloud/pubsub
var pubsub = require('@google-cloud/pubsub')();

etc.

List of changes

  • package.json
    • Change "name" from "gcloud" to "google-cloud"
    • Remove the existing "dependencies" and switch to depending on @google-cloud/storage, @google-cloud/pubsub, etc.
  • lib/index.js
    • Update require calls from relative paths to package names, e.g. require('./storage/index.js') => require('google-cloud-storage')
  • Sub-packages
    • Add an individual package.json file for each lib/<folder>
    • Each sub-package has its own dependencies; unused dependencies were removed from google-cloud-common
    • Each sub-package uses a unique User-Agent header, for better usage tracking
    • Set each sub-package's package.json "version" field to 0.1.0
    • Update sub-packages' relative import paths (../lib/common) to use @google-cloud/common, e.g. require('../lib/common/utils.js') => require('@google-cloud/common').util
    • @google-cloud/common is now a proper module, rather than dependents importing file paths
  • Tests
    • When mocking dependencies, update paths to reflect the changes in lib/
    • Note: there seems to be a bug in mockery that doesn't like the symbolic links now necessary for local development, so I switched to proxyquire, which works fine with symbolic links
  • scripts/
    • Added link.sh, which sets up the symbolic links necessary for local development
    • Added unlink.sh, the reverse of link.sh
    • Added a simple publish.sh script, referenced in each sub-package's package.json, which can publish a sub-package to NPM with the appropriate legal files included
  • Docs
    • Update README.md

@googlebot added the cla: yes label (This human has signed the Contributor License Agreement.) on Jul 13, 2016
@stephenplusplus (Contributor)

Thanks for this 🎉

We should make sure we don't need to use new:

var vision = require('google-cloud-vision');
var visionClient = vision();

// or more likely, some opts will be required:
var vision = require('google-cloud-vision');
var visionClient = vision({ projectId: '...', keyFilename: '...' });

// and what I would personally do to make it a one-liner:
var vision = require('google-cloud-vision')({ projectId: '...', keyFilename: '...' });

@jmdobry (Contributor, Author) commented Jul 13, 2016

Ah yes, we don't need new; I'll update my examples.
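(For reference, the usual pattern that makes new optional is an instanceof guard at the top of the constructor; a minimal sketch, with Vision standing in for any of the client constructors:)

function Vision(options) {
  // Calling Vision(options) without `new` arrives here with the
  // wrong `this`, so re-invoke the constructor properly.
  if (!(this instanceof Vision)) {
    return new Vision(options);
  }
  this.options = options || {};
}

module.exports = Vision;

// Both forms now produce a client instance:
// var visionClient = require('@google-cloud/vision')();
// var visionClient = new (require('@google-cloud/vision'))();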

@stephenplusplus (Contributor)

Can you write up a little "Here's how this all works" bit?

Also, are there any similarities between the process outlined in #1411 and how this PR works?

@jmdobry (Contributor, Author) commented Jul 14, 2016

Can you write up a little "Here's how this all works" bit?

Will do.

Also, are there any similarities between the process outlined here and how this PR works?

In my PR I didn't really address the publish process as discussed in #1411. My PR is more about the organizational changes required in the code itself to facilitate individually publishable API packages. I'll explain this better in the summary I'm adding to the PR description.

@jmdobry (Contributor, Author) commented Jul 14, 2016

I updated the PR description.

@jmdobry (Contributor, Author) commented Jul 15, 2016

Open questions are:

  • Versioning
  • Release process
  • Documentation

@stephenplusplus (Contributor)

A couple of thoughts:

I think we are going to go with scoped packages (https://docs.npmjs.com/misc/scope), so google-cloud for the umbrella package, and @google-cloud/storage for a submodule.

var vision = require('google-cloud-vision').Vision();

We should be able to lose the Vision property so that a user doesn't have to "ask" for vision twice:

var vision = require('google-cloud-vision');

Is anything complicating that?

  • Versioning

I think the umbrella package should just be the next minor bump from where we are when this PR lands, as opposed to something like 1.0.0-beta.

For the module packages, I would start out at 1.0.0 on each.

  • Release process

I'd like Travis to be involved, but pushing to npm with Travis has consistently failed for us over the last year or so. This forces a manual release process from the terminal, so the ideas in #1411 should be free to implement after we land the separation of services from this PR.

The manual release process should look like this:

Release the umbrella package

# Start from freshly installed dependencies:
rm -rf node_modules && npm install

# Check for faulty code:
npm run lint && npm run test && npm run system-test

# Create a git tag and increment package.json
npm version major

# Publish to npm from a clean directory:
# (`rm -rf node_modules` is to work around an npm publish bug)
rm -rf node_modules && npm publish

# Push to GitHub:
git push origin master --follow-tags

Release a module package

# Start from a clean sub-module directory
cd lib/bigquery && rm -rf node_modules && npm install

# Check for faulty code:
npm run lint && npm run test && npm run system-test

# Create a git tag and increment package.json:
npm version major

# Publish to npm from a clean directory:
# (`rm -rf node_modules` is to work around an npm publish bug)
rm -rf node_modules && npm publish

# Push to GitHub:
git push origin master --follow-tags

# Because it was a major increment with breaking changes,
# we have to manually update the umbrella package's dependency
# on it:
cd ../../ # back to the umbrella package directory
npm uninstall --save @google-cloud/bigquery
npm install --save @google-cloud/bigquery
git commit -am 'Update @google-cloud/bigquery'

# Release the umbrella package as:
#   - a minor bump -- if the umbrella package is pre-1.0
#   - a major bump -- if the umbrella package is post-1.0
# [ steps outlined in previous "Release the umbrella package" section ]

Once we decide on a versioning pattern (what the umbrella package and each module package starts at), a more specific guide can be whipped up.

  • Documentation

@callmehiphop!

@jmdobry (Contributor, Author) commented Jul 15, 2016

@stephenplusplus

I think we are going to go with scoped packages

I'll fix the package names.

var vision = require('google-cloud-vision').Vision();

We should be able to lose the Vision property so that a user doesn't have to "ask" for vision twice:

var vision = require('google-cloud-vision');

Is anything complicating that?

Well, some submodules need access to constructors from other modules, with the most common example being the File constructor which is imported from the Storage submodule into other submodules.

If the overall service constructor is the exported module (i.e. module.exports = Storage;), then the only way to make the File constructor available is to attach it to the Storage constructor as a static property (i.e. Storage.File = File;), which I think is gross. This has always been a limitation of CommonJS: if you have additional exports but want the convenience of a default export, you have to attach the additional exports to the default export.

ES2015 module syntax allows a default export and named exports within a single module, so I switched the submodules to use named exports with the service constructor also being the default export as a forward-looking change (these submodules don't exist yet, so here we have a chance to set precedent), i.e.:

// export {
//   Storage as default,
//   Storage,
//   File 
// }
exports.default = Storage;
exports.Storage = Storage;
exports.File = File;

So you could do:

import { File, Storage } from '@google-cloud/storage';

const storage = Storage({
  // config
});

or

import Storage from '@google-cloud/storage';
import { File } from '@google-cloud/storage';

const storage = Storage({
  // config
});

or in CommonJS

const CloudStorage = require('@google-cloud/storage');
const { File, Storage } = CloudStorage;

const storage = Storage({
  // config
});

In general I prefer exporting constructors/factory functions and letting the user instantiate, rather than the act of importing a module having side effects (a magically instantiated client with default options).

Thoughts?

@stephenplusplus (Contributor)

Well, some submodules need access to constructors from other modules, with the most common example being the File constructor which is imported from the Storage submodule into other submodules.

I see that as our problem, one we shouldn't make the user pay for. While gross, I'd prefer exposing the sub-type constructors on the parent type (Storage.File), or any other way, as long as the thing exported is the way to use the module directly.
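(A minimal sketch of that shape; the internal file layout here is assumed, not taken from the PR:)

// packages/storage/src/index.js (sketch)
var File = require('./file.js');

function Storage(options) {
  if (!(this instanceof Storage)) {
    return new Storage(options);
  }
  this.options = options || {};
}

// Expose the sub-type on the exported constructor, so dependent
// sub-packages can reach it without a second entry point:
Storage.File = File;

module.exports = Storage;

// A dependent sub-package (e.g. vision) would then do:
// var File = require('@google-cloud/storage').File;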

the act of importing of a module having side effects (magically instantiated client with default options).

It wouldn't. The way to instantiate the module would be to invoke the function returned by the require:

var vision = require('@google-cloud/vision');
var visionClient = vision();

// (aka)
var vision = require('@google-cloud/vision')();

@jmdobry (Contributor, Author) commented Jul 15, 2016

Okay, I'll switch away from named exports.


@stephenplusplus (Contributor)

We originally used proxyquire, but switched away for some reason. As long as it's working now, I'm glad to ditch mockery, so thanks for that :)


@Gormartsen (Contributor)

Subscribed.
I just want to add some feedback on this topic.

Right now, to use gcloud.compute, I have to install all of the other APIs as well.

It would be awesome if I could install only gcloud-compute (with gce-images).

@stephenplusplus (Contributor)

Thanks for the updates! 🍻

@callmehiphop what are your thoughts on the un-checked TODOs from the opening post? My thoughts: #1431 (comment)

@Gormartsen (Contributor)

@stephenplusplus:

I'd like Travis to be involved, but pushing to npm with Travis has consistently failed for us over the last year or so.

I'm working on a CI system right now in which the release flow can be a separate task, connected (or not) to the GitHub context.

If you're up for trying it, I can invest my time in setting up an example release process on a fork, and we can move forward from there.

By the way, the release process could be run from a user-defined system via SSH.

Let me know what you think.

@jmdobry (Contributor, Author) commented Jul 18, 2016

@stephenplusplus Do you think I should further modify the directory structure to be something like what Babel has?

lib/
  bigquery/
    index.js
    package.json
  bigtable/
    index.js
    package.json
  common/
    index.js
    package.json
  index.js
system-test/
test/
package.json

=>

packages/
  google-cloud/
    src/
      index.js
    system-test/
    test/
    package.json
  bigquery/
    src/
      index.js
    system-test/
    test/
    package.json
  bigtable/
    src/
      index.js
    system-test/
    test/
    package.json
  common/
    src/
      index.js
    system-test/
    test/
    package.json

@stephenplusplus (Contributor)

@jmdobry that looks nice to me. I like keeping the tests close to the source. If someone is contributing a Bigtable bug fix, they wouldn't have to run 1,000 unrelated tests just to see if their change works. And working in an isolated context from the rest of the code sounds better to me. If you're cool with it, I'm totally on board.

@Gormartsen that sounds pretty interesting. I'm down for taking a look at an example. Worth mentioning the project owners (cc: @jgeewax) would have to be involved in the decision to bring in a new tool.

@stephenplusplus (Contributor)

@jmdobry on the versioning, JJ advises to stay in 0.x land. So, google-cloud should start at 0.1.0, as well as all of the sub-packages.

@jmdobry (Contributor, Author) commented Jul 18, 2016

Okay.

@jmdobry (Contributor, Author) commented Jul 18, 2016

Re-organized into package folders.


@stephenplusplus (Contributor)

Is the linking script going to lock out any contributors on Windows machines?

@coveralls commented Jul 25, 2016

Coverage remained the same at 100.0% when pulling 5194297 on jmdobry:google-cloud into 9cdcef5 on GoogleCloudPlatform:master.

@jmdobry (Contributor, Author) commented Jul 29, 2016

I found a really neat way to make the shell scripts cross-platform: https://github.com/shelljs/shelljs

@callmehiphop (Contributor)

@stephenplusplus is this just waiting on the updated docs stuff?

@stephenplusplus (Contributor)

That's a better question for @jmdobry :)

@callmehiphop (Contributor) commented Aug 1, 2016

I think if the docs are the only thing holding this PR back, we should just merge this in and get any pending PRs resolved. If necessary we can time the release with the docs being finished.

I have a feeling our JSON docs script is going to need quite a bit of work to accommodate the modular docs and I'd hate to delay this more than we need to.

@stephenplusplus (Contributor)

Sure, sounds good to me. Whenever @jmdobry gives the go-ahead, we can merge. The only thing that might be left is the integration of shelljs so Windows users can contribute.

@stephenplusplus mentioned this pull request Aug 2, 2016

@jmdobry (Contributor, Author) commented Aug 3, 2016

I'll finalize this today so it can get merged.

@jmdobry (Contributor, Author) commented Aug 3, 2016

Done, rebased against master, conflicts resolved. The scripts I added are now cross-platform via ShellJS (the other scripts could probably be upgraded too).

The scripts I added are:

  • scripts/install - Installs NPM dependencies in each sub-package, working in up to 5 directories at a time for speed.
    • Automatically invoked via postinstall when npm install is run at the root of the repository
  • scripts/uninstall - Reverse of scripts/install
  • scripts/link - Creates local symbolic links (via npm link) between the various sub-packages, in the correct order based on inter-package dependencies (see the sketch after this list). Useful for testing local changes to multiple inter-dependent sub-packages. These links are also required for the tests to pass while the sub-packages have not yet been published to NPM.
    • Once the sub-packages have been published to NPM, these lines in .travis.yml can be removed:

      - npm install shelljs
      - "./scripts/link"
  • scripts/unlink - Reverse of scripts/link
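A rough sketch of what a ShellJS-based link script can look like (the package names and their ordering here are illustrative; the real script derives the order from the inter-package dependencies):

#!/usr/bin/env node
// scripts/link (sketch) -- requires shelljs (`npm install shelljs`)
var shell = require('shelljs');

// Dependencies must be linked before their dependents;
// @google-cloud/common goes first because most sub-packages use it.
var order = ['common', 'storage', 'bigquery', 'vision'];

order.forEach(function(name) {
  shell.cd('packages/' + name);

  // Register this package globally as a link target.
  shell.exec('npm link');

  // Point this package's own @google-cloud/* dependencies at the
  // linked copies (the list comes from its package.json).
  var pkg = JSON.parse(shell.cat('package.json'));
  Object.keys(pkg.dependencies || {})
    .filter(function(dep) { return dep.indexOf('@google-cloud/') === 0; })
    .forEach(function(dep) { shell.exec('npm link ' + dep); });

  shell.cd('../..');
});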

Sorry if my rebasing made life harder for other PRs based on this PR.

@coveralls commented Aug 3, 2016

Coverage remained the same at 100.0% when pulling 436466d on jmdobry:google-cloud into c741869 on GoogleCloudPlatform:master.

@stephenplusplus (Contributor)

Works in up to 5 directories at a time so as to be fast.

Woo!

Is automatically invoked via postinstall when npm install is run at the root of the repository

I think this means that it will also run when users npm install google-cloud.

Sorry if my rebasing made life harder for other PRs based on this PR.

No problem, I knew I was in for it :)

@stephenplusplus (Contributor)

I think this means that it will also run when users npm install google-cloud.

^ Ignore me, I got my package.jsons confused.
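(For anyone else momentarily confused: after the re-organization there are two package.json files in play. A sketch of the assumed split follows; the exact field values are illustrative, not copied from the repo:)

// Repository root package.json -- drives local development and is
// never published, so its postinstall hook never reaches users:
{
  "private": true,
  "scripts": {
    "postinstall": "./scripts/install"
  }
}

// packages/google-cloud/package.json -- the package that
// `npm install google-cloud` actually delivers; no postinstall hook,
// just dependencies on the sub-packages:
{
  "name": "google-cloud",
  "version": "0.1.0",
  "dependencies": {
    "@google-cloud/storage": "^0.1.0",
    "@google-cloud/vision": "^0.1.0"
  }
}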

@stephenplusplus (Contributor)

Looks good to me. Made a PR with a couple tweaks that this needed to run locally: https://github.com/jmdobry/gcloud-node/pull/2

@googlebot

We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for the commit author(s). If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google.

@googlebot added the cla: no label (This human has *not* signed the Contributor License Agreement.) and removed the cla: yes label on Aug 3, 2016
@coveralls commented Aug 3, 2016

Coverage remained the same at 100.0% when pulling 618c238 on jmdobry:google-cloud into c741869 on GoogleCloudPlatform:master.

@stephenplusplus (Contributor)

Just running some tests locally, then merging in. Yay!

@stephenplusplus (Contributor)

Merged in via 1ccd918

Thank you for all of the help! 👍 🎉 The docs are still in progress, but should be completed shortly. We'll start firing off the modules then. MODULES!!!
