
logging: convert to gax #2072

Merged
stephenplusplus merged 16 commits into googleapis:master from stephenplusplus:spp--1859-logging
Apr 7, 2017

Conversation

@stephenplusplus
Contributor

@stephenplusplus stephenplusplus commented Mar 10, 2017

RE: #1859

To Dos

  • Convert from gRPC to gax
    • Handle automatic project ID insertion
    • Logging
    • Log
    • Metadata
    • Sink
  • Tests
    • Unit
      • Logging
      • Log
      • Metadata
      • Sink
    • System

This converts the Logging handwritten API from making manual gRPC requests via GrpcService to gax.

@stephenplusplus stephenplusplus added the "api: logging" (Issues related to the Cloud Logging API.) and "Status: Don't Merge" labels Mar 10, 2017
@googlebot googlebot added the "cla: yes" (This human has signed the Contributor License Agreement.) label Mar 10, 2017
@stephenplusplus
Contributor Author

@jmuk -- I'm running into an issue with the listLogEntries method. I believe I'm calling it correctly, but I'm getting an error from the API.

{ orderBy: 'timestamp desc',
  pageSize: 1,
  projectIds: [ 'nth-circlet-705' ] }

Insufficient tokens for quota 'ReadGroup' and limit 'CLIENT_PROJECT-100s' of service 'logging.googleapis.com' for consumer 'project_number:288560394597'.

Do you know what I need to do to fix this?

@stephenplusplus stephenplusplus changed the title from "logging: convert to gax" to "[skip ci] logging: convert to gax" Mar 10, 2017
@jmuk
Contributor

jmuk commented Mar 10, 2017

The error message is about the quota; somehow the API calls could be hitting some limit? I'm still not sure, but it might be worth trying the autoPaginate: false option to check whether the error happens during automatic pagination.

Also, it's probably nicer to use resourceNames rather than projectIds -- projectIds is marked as deprecated now. However, I don't think the resourceNames/projectIds difference affects errors like this.
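Combining the two suggestions, the request might look something like this (a sketch only; the client variable and call shape are assumptions, not the actual generated API):

```javascript
// Hypothetical request shape: resourceNames replaces the deprecated
// projectIds field, and autoPaginate: false disables gax's automatic
// pagination so only a single API call is made.
const request = {
  resourceNames: ['projects/nth-circlet-705'],
  orderBy: 'timestamp desc',
  pageSize: 1,
};
const callOptions = {autoPaginate: false};
// loggingClient.listLogEntries(request, callOptions, callback);
```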

@stephenplusplus
Contributor Author

autoPaginate: false gets rid of the error.

@jmuk
Contributor

jmuk commented Mar 10, 2017

Oh, and you specified pageSize as 1, so it probably makes a large number of API calls, which runs into the quota on the number of API calls (https://cloud.google.com/logging/quota-policy: 10 per second per project).

@stephenplusplus
Contributor Author

stephenplusplus commented Mar 10, 2017

That's a bug in the generated code, right? API requests shouldn't continue after I receive my pageSize limit, regardless of autoPaginate value. This is done in @google-cloud/common paginator: https://github.com/GoogleCloudPlatform/google-cloud-node/blob/c51f14c7af7c01a3bdc520c180e767ea0448f660/packages/common/src/paginator.js#L210 (prior to that function executing, pageSize is translated to maxResults).

I can see how pageSize could mean "make as many requests as needed, but don't give me more than pageSize back per response". However, that's a library-agnostic option, i.e. a user familiar with the Logging API has no idea what autoPaginate is. So, how could the API have made the assumption we would put auto-paginating behavior in front of it? I think the expected behavior is to treat it as maxResults, and set autoPaginate to false automatically. Does that make sense?
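The maxResults behavior argued for above can be sketched as follows (illustrative only; fetchPage stands in for a single paged API call and is not a real library function):

```javascript
// Stop issuing page requests as soon as maxResults items have been
// collected, instead of paginating until the API runs out of pages.
function listWithMaxResults(fetchPage, maxResults) {
  const results = [];
  let pageToken;
  do {
    const page = fetchPage(pageToken);
    for (const item of page.items) {
      if (results.length === maxResults) {
        return results; // quota-friendly: no further requests are made
      }
      results.push(item);
    }
    pageToken = page.nextPageToken;
  } while (pageToken);
  return results;
}
```

With this shape, a user never sees more than maxResults items, and request volume is bounded, which is the behavior the quota error above calls for.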

@jmuk
Contributor

jmuk commented Mar 10, 2017

I think this is intended behavior. The pageSize parameter specifies the number of elements in a single API response object; I don't think it implies auto-pagination. Auto-pagination is controlled by the autoPaginate flag.

@jmuk
Contributor

jmuk commented Mar 10, 2017

But if it's quite common among GCN, we should probably change the behavior on the codegen side. Are there good examples of the same pattern in other APIs?

@stephenplusplus
Contributor Author

It looks like it's just Logging that uses pageSize, and it doesn't look like there is an alternative option for maxResults behavior. Since this is the only API, we can re-assign what pageSize means here. If it makes more sense to use it as the API defines it, we can switch to that. Let me know what you think we should do.

@stephenplusplus
Contributor Author

@jmuk autoPaginate seems to be ignored in streaming APIs. That might make sense, since it's a stream, but it's leading to a "run until it dies" scenario, because the API eventually shuts me down due to my quota being exceeded. I think some type of maxResults argument should be supported by the generated API, wdyt?

@stephenplusplus
Contributor Author

stephenplusplus commented Mar 23, 2017

It also seems that I can't end a stream early. We support this in GCN with the help of split-array-stream. This will take a chunk of API results, emit them onto a stream, and then execute a callback with an argument stating whether the stream has been closed or not: https://github.com/stephenplusplus/split-array-stream/blob/master/index.js

Before each item is pushed to the stream, it also makes sure the stream isn't ended. In this scenario, this would allow me to do something like this:

var entriesMatched = 0;
logging.getEntriesStream()
  .on('data', function() {
    entriesMatched++;
    if (entriesMatched === 20) this.end();
    if (entriesMatched > 20) throw new Error('I should only get 20!');
  })
  .on('end', function() {
    assert.strictEqual(entriesMatched, 20);
  });

maxResults would make the above scenario more logical, but ending a stream before all results have been received is useful as well, in the event you're looking for a specific item, and once you've found it, you don't want any more connections open or requests to be made.

});
delete reqOpts.gaxOptions;

self.api.Logging.writeLogEntries(reqOpts, options.gaxOptions, callback);


@stephenplusplus stephenplusplus added the status: blocked Resolving the issue is dependent on other work. label Mar 23, 2017
@stephenplusplus
Contributor Author

@callmehiphop PTAL -- some things to consider:

We use the {{projectId}} token replacement -- GAX has no idea about that and doesn't have anything similar. So we have to stand in front of all GAX requests and replace those tokens.

We can get the authClient from GAX, but we can't instantiate GAX unless we have a projectId, or it throws. That's why I included google-auto-auth to get the auth client and projectId, and why I lazily instantiate the GAX clients.
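A minimal sketch of the {{projectId}} token replacement described above (the helper name is illustrative, not the library's actual API):

```javascript
// Recursively walk a request object and substitute the {{projectId}}
// placeholder with the resolved project ID before handing it to GAX.
function replaceProjectIdToken(value, projectId) {
  if (Array.isArray(value)) {
    return value.map(v => replaceProjectIdToken(v, projectId));
  }
  if (value !== null && typeof value === 'object') {
    const replaced = {};
    for (const key of Object.keys(value)) {
      replaced[key] = replaceProjectIdToken(value[key], projectId);
    }
    return replaced;
  }
  if (typeof value === 'string') {
    return value.replace(/\{\{projectId\}\}/g, projectId);
  }
  return value;
}
```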

@jmuk marked this as blocked because without being able to end a stream, a method like getEntriesStream(), which can have a very large number of results, is prone to breaking the QPS limits.

@jmuk
Contributor

jmuk commented Mar 23, 2017

Sorry for the delays:

  • maxResults might be a good idea; that situation can happen anyway as the number of logs grows.
  • Not ending lists properly is a bug in the google-gax library. To make sure: that's for paged responses like the listLogEntriesStream method, correct?

@jmuk
Contributor

jmuk commented Mar 23, 2017

@stephenplusplus
Contributor Author

Yes, anywhere a stream is used and has the chance to make multiple requests. split-array-stream might be worth looking into; I made it for this exact case -- you can see that the usage example in its readme is based on our use case.

Also, this got collapsed by GH, but is also a blocker, as we are getting output printed to the screen: #2072 (comment)

Thanks!

@stephenplusplus
Contributor Author

@jmuk just to get an idea, how far off are these issues from being worked on? Feel free to let me know if you have any further questions or need a second pair of eyes on anything. Thanks!

@callmehiphop
Contributor

@stephenplusplus a sort of general thought on the implementation: we do something similar in Spanner, where we have to step in front of most requests to make sure we have a session available first. I think the implementations are a little bit different (here we use strings, there we use a bound function).

Thoughts on a small refactor for the sake of consistency?

@stephenplusplus
Contributor Author

stephenplusplus commented Mar 29, 2017

Unfortunately, I have to go with the string (and we'll have to do this in Spanner... and Speech... and Vision as well), because we can't instantiate the generated client until we have a project ID.

* });
*/
Logging.prototype.getSinksStream = common.paginator.streamify('getSinks');
Logging.prototype.getSinksStream = function(options) {


@callmehiphop
Contributor

@stephenplusplus ah, right. LGTM then

@stephenplusplus
Contributor Author

@jmuk can you ping when the new release is available? Will new files need to be generated?

@jmuk
Contributor

jmuk commented Apr 3, 2017

Published google-gax 0.13.0. This includes the fixes and the maxResults parameter.

@stephenplusplus stephenplusplus removed the status: blocked Resolving the issue is dependent on other work. label Apr 4, 2017
@stephenplusplus
Contributor Author

Awesome, thanks!

@coveralls
Coverage Status

Changes unknown when pulling 51a81f3 on stephenplusplus:spp--1859-logging into GoogleCloudPlatform:master.

@coveralls
Coverage Status

Changes unknown when pulling 81e88ce on stephenplusplus:spp--1859-logging into GoogleCloudPlatform:master.

@stephenplusplus
Contributor Author

@callmehiphop ready for another look -- tests have been added, but I believe the source is the same as your last review.

@callmehiphop
Contributor

LGTM

@stephenplusplus stephenplusplus merged commit 7166f1a into googleapis:master Apr 7, 2017
sofisl pushed a commit that referenced this pull request Feb 26, 2026
* sample: blind write

* sample: blind write

* refactor

* add class mutation

* add class mutation

* feat: blind-writes

* refactor

* fix: lint errors

* refactor

* fix: lint errors

* fix: header checks

* refactor the blind write

* feat: add support for blind writes

chore(main): release 7.9.0 (#2053)

:robot: I have created a release *beep* *boop*
---

* **spanner:** Add support for batchWrite ([#2054](https://togithub.com/googleapis/nodejs-spanner/issues/2054)) ([06aab6e](https://togithub.com/googleapis/nodejs-spanner/commit/06aab6e39bbce9e3786f1ac631c80e8909197e92))

* **deps:** Update dependency google-gax to v4.3.4 ([#2051](https://togithub.com/googleapis/nodejs-spanner/issues/2051)) ([80abf06](https://togithub.com/googleapis/nodejs-spanner/commit/80abf06ba8ef9497318ffc597b83fb63e4408f9c))
* **deps:** Update dependency google-gax to v4.3.5 ([#2055](https://togithub.com/googleapis/nodejs-spanner/issues/2055)) ([702c9b0](https://togithub.com/googleapis/nodejs-spanner/commit/702c9b0f34e6cc34233c5aa52b97601b19f70980))
* **deps:** Update dependency google-gax to v4.3.6 ([#2057](https://togithub.com/googleapis/nodejs-spanner/issues/2057)) ([74ebf1e](https://togithub.com/googleapis/nodejs-spanner/commit/74ebf1e45cddf614c180295f3a761a8f84c5cb32))
* **deps:** Update dependency google-gax to v4.3.7 ([#2068](https://togithub.com/googleapis/nodejs-spanner/issues/2068)) ([28fec6c](https://togithub.com/googleapis/nodejs-spanner/commit/28fec6ca505d78d725efc123950be978e0c84ab7))

---
This PR was generated with [Release Please](https://togithub.com/googleapis/release-please). See [documentation](https://togithub.com/googleapis/release-please#release-please).

refactor: blind write method

fix: lint errors

fix: Retry with timeout (#2071)

Use `gaxOptions.timeout` during retries in streaming calls. Earlier, the timeout value was only used for a single RPC, not for the whole operation including retries. Now, if an RPC returns an `Unavailable` error and the timeout value has been reached, the library will throw a Deadline Exceeded error.

```
const query = {
  sql: 'Select 1',
  gaxOptions: {timeout: 500},
};
const [rows] = await database.run(query);
```

chore(main): release 7.9.1 (#2072)

:robot: I have created a release *beep* *boop*
---

* Retry with timeout ([#2071](https://togithub.com/googleapis/nodejs-spanner/issues/2071)) ([a943257](https://togithub.com/googleapis/nodejs-spanner/commit/a943257a0402b26fd80196057a9724fd28fc5c1b))

---
This PR was generated with [Release Please](https://togithub.com/googleapis/release-please). See [documentation](https://togithub.com/googleapis/release-please#release-please).

refactor: blind write method

test: unit test for blind write

test: unit test for blind write

refactor

fix: lint errors

feat: add support for change streams transaction exclusion option for Batch Write (#2070)

* feat: change stream transaction exclusion option for Batch Write

* refactor

docs: add doc to blindWrite method

docs: add doc to the setQueuedMutations

refactor: doc setQueuedMutations

fix: presubmit error

fix(deps): update dependency google-gax to v4.3.8 (#2077)

This PR contains the following updates:

| Package | Change |
|---|---|
| [google-gax](https://togithub.com/googleapis/gax-nodejs) ([source](https://togithub.com/googleapis/gax-nodejs/tree/HEAD/gax)) | `4.3.7` -> `4.3.8` |

---

<details>
<summary>googleapis/gax-nodejs (google-gax)</summary>

[Compare Source](https://togithub.com/googleapis/gax-nodejs/compare/google-gax-v4.3.7...google-gax-v4.3.8)

-   **deps:** remove rimraf in favor of native node rm function ([#&#8203;1626](https://togithub.com/googleapis/gax-nodejs/issues/1626)) ([dd87646](https://togithub.com/googleapis/gax-nodejs/commit/dd87646618d5026549920e224df7f85cbb5ff6a8))

</details>

---


updated

updated

lint

refactor

* fix: presubmit error

* refactor: docs of the method writeAtLeastOnce

* test: unit test using mockspanner

* fix: lint errors

* docs refactor

* refactor

* refactor

---------

Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: surbhigarg92 <surbhigarg.92@gmail.com>