Introduce Gapic-generated files for language API. #1476
Conversation
So, to use it:

var Language = require('@google-cloud/language');
// hand-written client, highest level of abstraction, idiomatic/sugar methods
var language = Language();
language.annotate(...);
language.detectEntities(...);
language.detectSentiment(...);
// auto-gen client, middle level of abstraction, still has auto-retry, auth, etc.
var api = Language.v1beta1.LanguageServiceApi();
api.annotateText(...);
api.analyzeEntities(...);
api.analyzeSentiment(...);
// grpc, lowest level of abstraction, just a grpc stub, have to do auth manually
var LanguageService = Language.v1beta1.grpc().LanguageService;
var stub = new LanguageService('language.googleapis.com', credentials); // credentials acquired manually
stub.annotateText(...);
stub.analyzeEntities(...);
stub.analyzeSentiment(...);
That's right -- but two notes:
I am not sure what the best design is for exposing those gRPC constants to users -- please let me know if you have better ideas.
Wouldn't the hand-written part want to expose them to the user for easiest access? We'd also want JSDoc comments on them so they get into the library's documentation.
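For illustration -- this is my own sketch, not code from the patch -- the hand-written layer could re-export a constant with a JSDoc comment so it lands on the docs site. The grpc() accessor is just the shape from the example above; the exact home for the constants was still being discussed.

// sketch: re-export a generated enum from the hand-written client so that
// JSDoc picks it up for the library's documentation
var Language = require('@google-cloud/language');
var grpcDefs = Language.v1beta1.grpc();

/**
 * Document types accepted by annotate(), detectEntities(), and detectSentiment().
 * @type {object}
 */
Language.Document = { Type: grpcDefs.Document.Type };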
Just curious why this is a promise. Does anything async need to happen before a stub can be used? In other words, why not just:

var api = Language.v1beta1.LanguageServiceApi();
api.stub.annotateText();

I think I like accessing the "stub" through a different method better than:

// grpc, lowest level of abstraction, just a grpc stub, have to do auth manually
var LanguageService = Language.v1beta1.grpc().LanguageService;
var stub = new LanguageService('language.googleapis.com', credentials); // credentials acquired manually
stub.annotateText(...);
stub.analyzeEntities(...);
stub.analyzeSentiment(...);

How I would personally use that:

var gcloudLanguage = require('@google-cloud/language');
var grpc = gcloudLanguage.v1beta1.grpc();
var language = new grpc.LanguageService('language.googleapis.com', credentials);
language.annotateText({
// something that needs an enum
type: grpc.Document.Type.PLAIN_TEXT
}, ...);
language.analyzeEntities(...);
language.analyzeSentiment(...);

In the case where there are multiple services that a user can instantiate, isn't it common (or always the case) that they can share an auth client? If that's true, maybe we should make the config move up a level:

-var api = language.v1beta1.LanguageServiceApi();
-var language = api.LanguageServiceApi({ projectId: 'grape-spaceship-123', keyFilename: '...' });
-var otherService = api.OtherServiceApi({ projectId: 'grape-spaceship-123', keyFilename: '...' });
+var api = language.v1beta1({ projectId: 'grape-spaceship-123', keyFilename: '...' });
+var language = api.LanguageServiceApi();
+var otherService = api.OtherServiceApi();
// ^-- both share the auth client
language.annotateText(...);
otherService.method(...);

Anywhere there is an upper-camelcase identifier (e.g. var language = api.LanguageServiceApi();), my preference would be to not require ...

Would you explain more about what kind of documentation is expected for these new APIs? Specifically, will the auto-gen client have a separate page that lists all of the methods and arguments, with examples? And will the "stub" have a separate page with the same? Since we're going to have three ways of doing the same thing, I'm concerned about the added complexity on our docs site. Let me know if there's an existing vision for how these docs will intertwine.
Pushed a new patch, but it only fixes minor syntactic things.
That's because the gRPC stub needs to wait for authentication. Also ...
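Roughly, the shape being discussed looks like this -- purely illustrative on my part, and the names are hypothetical rather than the patch's actual API:

var api = Language.v1beta1.LanguageServiceApi();

// the stub can only be built once credentials have been obtained,
// which is why it is exposed asynchronously rather than as api.stub.annotateText()
api.stub.then(function(stub) {
  stub.annotateText(...);
});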
Right now google-gax has its own auth factory (https://github.com/googleapis/gax-nodejs/blob/master/lib/grpc.js#L36), so multiple service instances share the same auth client if they are under the same package. By adopting the google-auto-auth library, your design would make more sense -- but thinking further, the auth client can't be shared among multiple APIs (like the Language API and the Speech API), right?

var language = require('@google-cloud/language').v1beta1({keyFile: '...'});
var speech = require('@google-cloud/speech').v1({keyFile: '...'});
var langApi = language.languageServiceApi();
var speechApi = speech.speechApi();
langApi.annotateText(...);
speechApi.nonStreamingRecognize(...);
// langApi and speechApi don't share the auth client...

So rather than doing this, it would probably be better to accept an 'auth' parameter, wouldn't it?

var Auth = require('google-auto-auth');
var auth = new Auth({keyFile: '...'});
var language = require('@google-cloud/language').v1beta1;
var speech = require('@google-cloud/speech').v1;
var langApi = language.languageServiceApi({auth: auth});
var speechApi = speech.speechApi({auth: auth});
...
We want to have a separate page for the auto-gen clients, with the list of methods, arguments, and examples. I don't care about the stubs -- we don't need to show their documentation, and it does not affect the usability of the auto-gen clients.
That was my initial thought on the subject.
Yeah, the stubs are just a proto file turned into JavaScript by Protobuf.js, and a user can look at the proto file or the API's gRPC reference docs to understand the resulting structure of the stub.
var language = require('@google-cloud/language').v1beta1({keyFile: '...'});
var speech = require('@google-cloud/speech').v1({keyFile: '...'});
var langApi = language.languageServiceApi();
var speechApi = speech.speechApi();
langApi.annotateText(...);
speechApi.nonStreamingRecognize(...);

I would rather not require the user to install and learn an external dependency (google-auto-auth) in order to use the autogen layer. Also, google-auto-auth returns an auth client that is already bound to the required scopes, so the user would have to know which scopes the API requires to create the correct auth client. We should instead just accept the cost of the user passing duplicate config objects containing their keyfile, projectId, and the other settings the API requires.
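Concretely, that would look something like this -- my own sketch, reusing the package entry points from the earlier examples; each package builds its own correctly-scoped auth client from the same settings:

var config = { projectId: 'grape-spaceship-123', keyFilename: '...' };

// the same settings are passed to each API package; no shared
// google-auto-auth instance is required of the user
var language = require('@google-cloud/language').v1beta1(config);
var speech = require('@google-cloud/speech').v1(config);

var langApi = language.languageServiceApi();
var speechApi = speech.speechApi();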
I think our libraries somewhat clash here, since we don't support promises (yet), and I think when we do, we would put anything that returns a promise behind a function call (e.g. ...). I'm on board with:

var api = Language.v1beta1.LanguageServiceApi();
api.stub.annotateText();

But with:

var service1 = Language.v1beta1.ServiceOneApi({ auth details });
var service2 = Language.v1beta1.ServiceTwoApi(); // how does this know about `{auth details}`?

the user would have to provide the auth details twice, resulting in two auth clients being created, right? If it moves a level up, I believe those issues would go away:

var languageServices = Language.v1beta1({ auth details });
var service1 = languageServices.ServiceOneApi();
var service2 = languageServices.ServiceTwoApi();

Regarding exposing the gRPC constants, can we put those right on the autogen class as statics?

Language.v1beta1.Document.Type.PLAIN_TEXT
I didn't intend to advertise the existence of ...
I wanted to allow injecting the gRPC module itself too, to share the same C extension. This is a quite limited use case, but it would be helpful to some expert users. I am in the process of modifying ...

var language = require('@google-cloud/language');
var service = language.v1beta1({keyFile: ...});
var api = service.LanguageServiceApi();
api.annotateText({type: service.grpc.Document.Type.PLAIN_TEXT, ... });

If the user has some special need to customize the gRPC module:

var grpc = ...;
var service = language.v1beta1({keyFile: ..., grpc: grpc});
...
Forgot to mention: I didn't think about the difference in scopes, and indeed that would affect the design. Thank you for pointing it out.
Removing ...
Uploaded a new patchset. It relies on a patch for google-gax at jmuk/gax-nodejs@65052ae
I think normally users won't access the gRPC stubs, but in case they want to:

var language = require('@google-cloud/language');
var v1beta1 = language.v1beta1({keyFile: ...});
var stub = new v1beta1.grpc.LanguageService(...);
Another idea for exposing the constants:

var language = require('@google-cloud/language');
var v1beta1 = language.v1beta1({keyFile: ...});
var api = v1beta1.LanguageServiceApi();
api.annotateText({type: v1beta1.Document.Type.PLAIN_TEXT, ...});
// not v1beta1.grpc.Document.Type.PLAIN_TEXT
Got it. |
So are we pretty satisfied with how this works? Can your manual edits be automated at all? They seem pretty minimal, so it seems like they could be.
 * @param {String} opts.appVersion
 *   The version of the calling service.
 */
function LanguageServiceApi(opts) {
var gaxGrpc = gax.grpc(options);
var result = {};
extend(result, languageServiceApi(gaxGrpc));
return result;
This can be generated automatically, but I created it manually for this discussion.
I may be missing your point, but I modified index.js so that the edit is simply the addition of a line to index.js, and then this edit can also be generated automatically.
Thanks. Will do in the next API (probably by adding that to the code generation pipeline).
Your edit should work fine.
- Move the API service class to a toplevel object.
- index.js creates an auth context (through GAX) and applies it to the generated code. This way, the same context is shared among multiple instances if an API consists of multiple service classes.
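That matches the fragment reviewed above. Roughly, the generated index.js would look like the following -- a sketch on my part, with the file layout and require paths assumed rather than taken from the patch:

// sketch of the index.js pattern described above: one auth/gRPC context
// per call to v1beta1(), shared by every service class merged onto the result
var extend = require('extend');
var gax = require('google-gax');
var languageServiceApi = require('./language_service_api'); // assumed path

function v1beta1(options) {
  var gaxGrpc = gax.grpc(options);
  var result = {};
  extend(result, languageServiceApi(gaxGrpc));
  return result;
}

module.exports = v1beta1;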
Are there any other comments on #1476 (comment) or #1476 (comment)? Is there anything I'm missing?
I think we're set. We just need to find a way to keep our unit test coverage at 100%. We will need a unit test that checks ...

And since we aren't worried about writing tests for the generated files, we'll have to find a way to have our linter skip those files. The easiest way to do that will probably be to use a naming convention and add it to .jshintignore. So maybe we want to add: ...

Or, to make it a little more robust, maybe we would place the generated files in their own directory to allow: ...
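The proposed patterns themselves aren't shown here, but for illustration the glob used by the istanbul flag in the diff below would also work as a .jshintignore entry (an assumption on my part, not a settled convention):

packages/*/src/v[0-9]*/*.js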
Wouldn't we want the generated code to pass JSHint? At the least it could catch undefined variables, etc., especially since we won't have actual tests for the generated code.
Hmm, maybe I'm getting my dev tools mixed up. What's causing the coverage drop?
@@ -31,7 +31,7 @@
     "lint": "jshint scripts/ packages/ system-test/ test/ && jscs packages/ system-test/ test/",
     "test": "npm run docs && npm run bundle && mocha test/docs.js packages/*/test/*.js",
     "system-test": "mocha packages/*/system-test/*.js --no-timeouts --bail",
-    "cover": "istanbul cover _mocha --report lcovonly -- --no-timeouts --bail packages/*/test/*.js -R spec",
+    "cover": "istanbul cover _mocha --report lcovonly -x 'packages/*/src/v[0-9]*/*.js' -- --no-timeouts --bail packages/*/test/*.js -R spec",
Note that JSHint (and JSCS) don't exclude the auto-generated files. I also found their output helpful for fixing the code generator.
It's all looking good to me. Travis failed because of our repo rename. Anything left to do? // @jmdobry @jmuk @callmehiphop
Changes unknown when pulling cddd4eb on jmuk:language into GoogleCloudPlatform:master.
LGTM
LGTM too
Thanks! Now we just need to figure out the docs: #1492
The same structure as googleapis#1476.
- retire use_pbjs flag -- it's not used anymore, and it does not fit with the new pattern of Gapic code.
- update the grpc package template: this fits with the proto packages used by gcloud-node.
- update gax package template: this will work well with the new pattern of Gapic code, as you can see in googleapis/gapic-generator#392 or googleapis/google-cloud-node#1476
- add 'env' parameter to dependency_out protoc
Fixes #1463
Fixes #1464
auto-generated