Allow plain JS object repeated fields to use toArray() method #1302
Open: glasser wants to merge 1 commit into `protobufjs:master` from `apollographql:toarray`
Conversation
We have several use cases for building up complex nested protobuf objects incrementally over time before serializing, where representing a repeated field as an array is not helpful during the construction process:

- Protobuf maps only support integral and string types as keys, so we sometimes have repeated message fields that represent a complex key and a value. To build up such a field over time in a non-quadratic way, it would be nice to use an object containing (say) a `Map`, which can be converted to a list of messages once at encoding time.
- We have a `repeated int64` field that represents a histogram (each number represents the number of events that fell in a certain duration bucket). This histogram generally contains long stretches of 0s, so we have a compressed encoding where we use negative numbers to encode stretches of zeroes. We want to convert from the uncompressed to the compressed encoding only when we're ready to encode.

These challenges can be solved by creating a hierarchy of classes parallel to the protobuf hierarchy and converting to protobufs manually when it's time to encode, but that is a lot of effort to create and maintain. One could also use JS proxies to trap indexing and trigger some sort of just-in-time conversion to an array, but that's annoying to write and probably not particularly efficient.

Instead, this change allows plain JS objects to contain non-arrays for repeated fields, as long as they have a `toArray` method. The implementations of `encode`, `fromObject`, and `verify` are updated to call `toArray` if necessary. The jsdoc in the `static` and `static-module` targets references this in field typings where appropriate, and is processed correctly by `pbts`.

This change only looks for `toArray` on fields that explicitly carry a `[(js_use_toArray) = true]` option. This is based on the unbenchmarked guess that leaving the `toArray` check out will be faster for fields that don't need it.
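To illustrate the shape of that check, here is a hypothetical helper (not the PR's actual generated code) showing the kind of duck typing the `encode`/`verify` paths could perform for a field marked with `[(js_use_toArray) = true]`:

```typescript
// Hypothetical sketch, not code from this PR: accept either a plain array
// or any object exposing a toArray() method, and normalize to an array.
function asArray<T>(value: T[] | { toArray(): T[] }): T[] {
  return Array.isArray(value) ? value : value.toArray();
}

// A plain array passes through untouched; a toArray-bearing object is converted.
const plain = asArray([1, 2, 3]);
const converted = asArray({ toArray: () => [4, 5] });
```

The point of gating this behind a field option is that unmarked fields never pay for the `Array.isArray` branch at all.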
This feature would be simpler to use if `toArray` were just hardcoded, though.

Note that protobufjs does not require you to declare options, but you'll want to declare the option if you want to use your proto file with the official protobuf compiler. Something like this should do the trick:

```protobuf
import "google/protobuf/descriptor.proto";

extend google.protobuf.FieldOptions {
  bool js_use_toArray = 50505;
}
```

Note that this import will cause protobuf.js to include thousands of lines of generated code in your output; see protobufjs#1300.

Example TypeScript usage:

```protobuf
// report.proto
syntax = "proto3";

message Report {
  repeated Entry entries = 1 [(js_use_toArray) = true];
}

message Entry {
  string key1 = 1;
  string key2 = 2;
  string value = 3;
}
```

```typescript
// run.ts
import {IReport, IEntry, Report, Entry} from "./report";

class MyEntries {
  private readonly entries = new Map<string, Entry>();

  public set(key1: string, key2: string, value: string) {
    const entry = new Entry({key1, key2, value});
    const key = JSON.stringify({key1, key2});
    this.entries.set(key, entry);
  }

  public toArray(): IEntry[] {
    return Array.from(this.entries.values());
  }
}

const r: IReport = new Report({entries: new MyEntries()});
const nestedEntries = r.entries as MyEntries;
nestedEntries.set("x", "y", "v");
nestedEntries.set("x", "y", "w");
nestedEntries.set("a", "b", "c");

const roundTrip = Report.decode(Report.encode(r).finish());
console.log(roundTrip);
```
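The histogram use case described above can be sketched the same way. This is a hypothetical illustration (class name and compression details are assumptions, not code from this PR): buckets are kept uncompressed in memory, and `toArray()` produces the compressed `repeated int64` form, collapsing each run of zero buckets into a single negative count, only when it's time to encode.

```typescript
// Hypothetical sketch of the histogram use case (not code from this PR).
// Buckets are kept as a plain uncompressed array in memory; toArray()
// produces the compressed repeated-int64 form, where a run of n zero
// buckets is collapsed into the single value -n.
class DurationHistogram {
  private readonly buckets: number[] = [];

  // Record one event in duration bucket `index`.
  increment(index: number): void {
    while (this.buckets.length <= index) this.buckets.push(0);
    this.buckets[index]++;
  }

  toArray(): number[] {
    const out: number[] = [];
    let zeros = 0;
    for (const count of this.buckets) {
      if (count === 0) {
        zeros++;
        continue;
      }
      if (zeros > 0) {
        out.push(-zeros); // a run of `zeros` empty buckets
        zeros = 0;
      }
      out.push(count);
    }
    return out; // trailing zero buckets are simply dropped
  }
}

// Uncompressed buckets [1, 0, 0, 0, 0, 2] compress to [1, -4, 2]
const h = new DurationHistogram();
h.increment(0);
h.increment(5);
h.increment(5);
```

Because the compression happens inside `toArray()`, the message can be mutated cheaply up until the single `encode` call.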
glasser commented on Sep 27, 2019:
```diff
@@ -1,5 +1,7 @@
 // DO NOT EDIT! This is a generated file. Edit the JSDoc in src/*.js instead and run 'npm run types'.
+
+import * as Long from "long";
```
I don't know why this showed up when I ran `npm run types`.

I'm noticing that this repository seems to be back under development! Is there any interest in taking this PR (if I rebase it)? We've been using it in our fork.
mattste pushed a commit to mattste/apollo_ex that referenced this pull request on May 15, 2023:
This is our initial commit that includes `ApolloEx` support for creating traces valid for Apollo Studio/GraphOS. For more details, see the module documentation and README.

## Design decisions

There were a few design decisions made during the process.

### Absinthe Phases

I couldn't figure out a way to get tracing support without adding custom phases. This comes with the downside that a user has to modify their pipeline, but I tried to make that as convenient as possible via the `Absinthe.Tracing.modify_pipeline/2` function. It also comes with the downside that we operate at the blueprint level, which can be a little harder to follow than an Absinthe plugin that operates at the resolution level.

### Testing

The initial test case was found by running [Apollo Studio's test case](https://github.com/apollographql/apollo-server/blob/main/packages/server/src/__tests__/plugin/usageReporting/plugin.test.ts) and copy/pasting their trace data. I skipped testing some edge cases for now to get something shipped. In the future, I'd like to set up testing so that I can automate comparing ApolloEx's output to the official Apollo server plugin's output for a set of queries.

The next phase of the project will add support for pushing generated traces to Apollo Studio. We can then do some manual verification that traces appear correctly in Apollo Studio.

## Issues

### Protobuf issues

Apollo's [reports.proto file](https://usage-reporting.api.apollographql.com/proto/reports.proto) that we use includes `js_use_toArray`, which is something [specific to the Apollo protobuf.js fork](protobufjs/protobuf.js#1302). I manually removed it and created the file `reports.proto` (keeping the original under `reports.original.proto`) so I could move forward. We should explore making this consistent with Apollo server's usage reporting plugin's approach, or figure out a way to automate the removal. The same issue applies to `js_preEncoded`.
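The "automate the removal" idea mentioned above could look something like the following sketch. It is hypothetical (the function name is invented, and it only handles the simple case where the fork-specific option is the field's only bracketed option):

```typescript
// Hypothetical sketch: strip fork-specific field options such as
// (js_use_toArray) and (js_preEncoded) from .proto source so the file can
// be compiled by the stock protobuf toolchain. Only handles fields whose
// bracketed option list contains nothing but the fork-specific option.
function stripForkOptions(proto: string): string {
  return proto.replace(
    /\s*\[\s*\((?:js_use_toArray|js_preEncoded)\)\s*=\s*\w+\s*\]/g,
    ""
  );
}

const cleaned = stripForkOptions(
  "repeated Entry entries = 1 [(js_use_toArray) = true];"
);
// cleaned === "repeated Entry entries = 1;"
```

A real implementation would need to handle mixed option lists (e.g. a fork option alongside `deprecated = true`), which a single regex like this does not.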