How to use grpc-gateway with dcodeIO/protobuf.js ? #80

Closed
sulliwane opened this issue Mar 26, 2017 · 23 comments

@sulliwane commented Mar 26, 2017

Hello,

I'm wondering how protobuf/js relates to dcodeIO/protobuf.js. Are these two libraries doing exactly the same thing?

Is it possible to swap protobuf/js for dcodeIO/protobuf.js?

I'm looking for a way to get an automatic translation from JS objects to protobuf messages (without having to manually set every field of the messages), which is how I do it with the node-grpc implementation.

Many thanks

@wenbozhu (Member)

@murgatroid99

@murgatroid99 (Member)

Those two libraries are both JavaScript implementations of Protocol Buffers, but they are completely different implementations, and they have completely different interfaces. Neither should be considered a drop-in replacement of the other.

Protobuf.js has an API for constructing a specific message from a regular JavaScript object. This is probably the closest you will get to "automatic translation from js objects to protobuf messages". That is what Node gRPC uses internally. As far as I know, protobuf/js does not have such an API.

@sulliwane (Author)

The API you are mentioning seems to be exactly what I'm looking for; would you mind showing a code example?

@sulliwane (Author)

Sorry, I got you wrong. I thought you meant grpc-node uses google's protobuf/js, but it actually uses dcodeIO/protobuf.js. Got it, many thanks!

@sulliwane (Author)

not so fast...

I'd like to use protobuf.js for everything except the HTTP call. As per their documentation:

Using services
The library also supports services but it doesn't make any assumptions about the actual transport channel. Instead, a user must provide a suitable RPC implementation, which is an asynchronous function that takes the reflected service method, the binary request and a node-style callback as its parameters:

function rpcImpl(method, requestData, callback) {
    // perform the request using an HTTP request or a WebSocket for example
    var responseData = ...;
    // and call the callback with the binary response afterwards:
    callback(null, responseData);
}

Example:

// greeter.proto
syntax = "proto3";

service Greeter {
    rpc SayHello (HelloRequest) returns (HelloReply) {}
}

message HelloRequest {
    string name = 1;
}

message HelloReply {
    string message = 1;
}
var Greeter = root.lookup("Greeter");
var greeter = Greeter.create(/* see above */ rpcImpl, /* request delimited? */ false, /* response delimited? */ false);

greeter.sayHello({ name: 'you' }, function(err, response) {
    console.log('Greeting:', response.message);
});

@murgatroid99 Any chance of a short example of how to make the call with grpc-web, but using protobuf.js for everything else (i.e. something like the rpcImpl() method mentioned above)?

sulliwane reopened this Mar 28, 2017
sulliwane changed the title from "How protobuf/js relates to dcodeIO/protobuf.js ?" to "How to use grpc-gateway with dcodeIO/protobuf.js ?" Mar 28, 2017
@updogliu

The current grpc-web does not have a public API for plugging in an arbitrary protobuf library, so grpc-web has to be used together with protobuf/js for now.

Can you briefly explain why you prefer protobuf.js?

@sulliwane (Author) commented Mar 28, 2017

I think protobuf.js's main advantages are:

  • API to automatically convert a JS object from/to a proto message (sketched at the end of this comment)
  • only 18 KB gzipped
  • dynamic parsing of ".proto" files
  • complete API documentation
  • speed of the implementation (suggested by the benchmarks)

Finally, it also supports parsing the proto service definitions. If one can inject an rpcImpl function (the one described above), I feel grpc-web could fit this use case particularly well (being agnostic toward the protobuf implementation and responsible only for making the HTTP call), if that makes sense.
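
For the first bullet, a minimal sketch of that object <-> message conversion with protobuf.js (the .proto file and type names here are just placeholders):

import protobuf from "protobufjs";

async function demo() {
    const root = await protobuf.load("greeter.proto");
    const HelloRequest = root.lookupType("HelloRequest");

    // Plain JS object -> message: verify() returns an error string or null,
    // fromObject() builds a Message, and encode(...).finish() yields the wire bytes.
    const err = HelloRequest.verify({ name: "you" });
    if (err) throw Error(err);
    const msg = HelloRequest.fromObject({ name: "you" });
    const bytes = HelloRequest.encode(msg).finish(); // Uint8Array

    // And back: wire bytes -> message -> plain JS object.
    const obj = HelloRequest.toObject(HelloRequest.decode(bytes));
    console.log(obj);
}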

@updogliu

Can you please point me to the benchmarks?

@sulliwane (Author)

Sure, the JS test is here, and it is referenced in the main README.md of the project.

@sulliwane (Author) commented Mar 29, 2017

Right now, I can encode the request with protobuf.js, send it using fetch, and get the response, but I can't decode the base64 string response:

import protobuf from 'protobufjs';
protobuf.load("package.proto")
.then(function(root) {
  const MessageReq = root.lookup("package.MessageReq");
  const messageReq = MessageReq.create({ symbols: ['IF1609'] })

  fetch('http://myhost:port/package.Service/Method', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-protobuf',
      'X-Accept-Content-Transfer-Encoding': 'base64',
      'X-Accept-Response-Streaming': 'true'
    },
    body: MessageReq.encode(messageReq).finish(),
  })
  .then(response => response.text())
  .then(base64_string => {
    // Is that the proper way to convert the base64 string to a TypedArray?
    const buffer = Uint8Array.from(atob(base64_string), c => c.charCodeAt(0))

    const MessageRes = root.lookup("package.MessageRes");
    var messageRes = MessageRes.decode(buffer);

    console.log('messageRes', messageRes) // not OK, scrambled code
  })
});

Isn't the response just a base64 string representation of the binary proto message? I get garbled output when decoding it:
[screenshot from 2017-03-29 showing the garbled decoded message]

@mwitkow (Collaborator) commented Mar 30, 2017

@sulliwane do you want to use protobuf.js just for encoding, and are you looking for a pure client?

We implemented a grpc-web client transport in TypeScript using the Fetch API with an XHR fallback (it transpiles to plain ES/JavaScript).
https://github.com/improbable-eng/grpc-web/tree/master/ts/src/transports

Let us know if it is useful for you :)

@sulliwane (Author)

@mwitkow Yes, you got my point.

I was away for a few days, so I'm just looking at your grpc-web implementation now.

It looks like what I'm trying to do, except that the response you are reading already seems to be of type Uint8Array! But the response from the nginx-gateway here is a base64-encoded string (as far as I know, that is the only mode), so did you test your grpc-web client with the nginx-gateway of this repo?

As you can see in my code sample above, I'm just trying to get a simple unary call working with fetch and this repo's nginx-gateway. My request is well formed, but I just can't decode the gateway's base64 response string into a correct proto object. So I suspect the base64 string is not just the binary proto message encoded as base64 (I need to parse something somewhere, I guess).

Thank you for your help :)

@wenbozhu (Member) commented Apr 6, 2017

@sulliwane, the grpc-web implementation is subject to change (frequently). For example, support for the new spec is not yet published. For the current version, we have special logic to decide when to use base64 encoding, how to apply padding, keep-alive, etc.

The gateway from @mwitkow is useful for testing your own manual client implementation. However, we'd like to limit adoption of the grpc-web protocol spec to browser clients only.

@sulliwane (Author)

@wenbozhu OK. So you mean this spec is not up to date, right? I'm in fact trying to implement a browser-only client (no XHR, fetch only).

By slicing off the first 2 bytes of the response, I could properly decode the grpc-gateway response:

...
.then(base64_string => {
    // this is the proper way to convert base64 string to Uint8Array
    const buffer = Uint8Array.from(atob(base64_string), c => c.charCodeAt(0))

    const MessageRes = root.lookup("package.MessageRes");
    var messageRes = MessageRes.decode(buffer.slice(2));

    console.log('messageRes', messageRes) // OK, that's a proper proto message
  })

I'm not clear on why grpc-gateway added an outer envelope around my proto message, though...

@mwitkow (Collaborator) commented Apr 7, 2017

@sulliwane the nginx gateway uses old code that predates the protocol spec.

@met-pub commented Apr 9, 2018

@mwitkow so is it possible to use protobuf.js just for encoding, together with your transport? I'm searching for a solution with JS object <-> proto message conversion and TypeScript support.

@justerest commented Oct 3, 2018

Here is a working setup for grpc-web + protobuf.js.
I hope it helps somebody.

import { ChunkParser, ChunkType } from "grpc-web-client/dist/ChunkParser"
import { RPCImpl } from "protobufjs";

function rpcImpl(serviceName: string): RPCImpl {
    return async (method, requestData, callback) => {
        const request = await fetch(`${BASE_URL}/${serviceName}/${method.name}`, {
            method: "POST",
            headers: {
                "content-type": "application/grpc-web+proto",
                "x-grpc-web": "1",
            },
            body: frameRequest(requestData),
        });
        const buffer = await request.arrayBuffer();
        const chunk = parseChunk(buffer);

        callback(null, chunk && chunk.data ? new Uint8Array(chunk.data) : null);
    };
}

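// ChunkParser splits the grpc-web response body into its framed chunks; we keep the first MESSAGE chunk (trailers are ignored here).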
function parseChunk(buffer: ArrayBuffer) {
    return new ChunkParser()
        .parse(new Uint8Array(buffer))
        .find(chunk => chunk.chunkType === ChunkType.MESSAGE);
}

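// grpc-web frames each message as 1 flag byte + a 4-byte big-endian length prefix, followed by the payload.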
function frameRequest(bytes: Uint8Array) {
    const frame = new ArrayBuffer(bytes.byteLength + 5);
    new DataView(frame, 1, 4).setUint32(0, bytes.length, false);
    new Uint8Array(frame, 5).set(bytes);
    return new Uint8Array(frame);
}
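
A usage sketch for wiring the rpcImpl above into a protobuf.js service client (the "greeter.proto" / Greeter / sayHello names follow the earlier example in this thread and are only placeholders):

import protobuf from "protobufjs";

async function callGreeter() {
    const root = await protobuf.load("greeter.proto");
    const Greeter = root.lookupService("Greeter");

    // The name passed to rpcImpl must match the path segment the server
    // expects, typically the fully qualified service name.
    const greeter = Greeter.create(rpcImpl("Greeter"), false, false);

    // Reflected service methods are added dynamically, so they are not
    // statically typed; cast (or use generated static code) to call them.
    (greeter as any).sayHello({ name: "you" }, (err: Error | null, response: any) => {
        if (err) throw err;
        console.log("Greeting:", response.message);
    });
}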

@clehene commented Feb 16, 2019

@justerest it looks like your example uses https://github.com/improbable-eng/grpc-web rather than grpc/grpc-web.

@justerest

@clehene you're right. I was inattentive.

@clehene commented Feb 18, 2019

@justerest @sulliwane note that it's possible to use both protobuf.js and google/grpc-web by converting between one and the other at the binary level, through an equivalent of gpb.decode(pbjs_entity.encode). This is likely inefficient, but it may be a good tradeoff in order to use the desired tooling on one side until things work as intended on the other.
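
A rough sketch of that binary round-trip (the HelloRequest names are placeholders; "./helloworld_pb" stands in for whatever module protoc generated for google-protobuf/grpc-web):

import protobuf from "protobufjs";
// Hypothetical protoc-generated google-protobuf classes; the real path and
// names depend on your .proto files.
import { HelloRequest } from "./helloworld_pb";

async function roundTrip() {
    const root = await protobuf.load("greeter.proto");
    const PbjsHelloRequest = root.lookupType("HelloRequest");

    // Encode with protobuf.js, then decode the same wire bytes with the
    // google-protobuf generated class...
    const bytes = PbjsHelloRequest.encode(PbjsHelloRequest.create({ name: "you" })).finish();
    const gpbMsg = HelloRequest.deserializeBinary(bytes);

    // ...and the other way around.
    const backAgain = PbjsHelloRequest.decode(gpbMsg.serializeBinary());
    return backAgain;
}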

@stanley-cheung (Collaborator)

Closing this for now. We are unlikely to support dynamic protobuf in the near future.

loyalpartner pushed a commit to loyalpartner/grpc-web that referenced this issue Sep 4, 2020
* Fixup removed gRPC interface for metadata.

* Fixup change in ServerTransport interface in gRPC.

* fixup testserver for the metadata call
@rmv9490 commented Mar 10, 2024

(Quoting @justerest's working grpc-web + protobuf.js example from above.)

I'm getting "Failed to parse URL"; the browser is not able to parse the URL:

Fetch error: TypeError: Failed to execute 'fetch' on 'Window': Failed to parse URL from http://10.81.200.09:8000/loginService_package.LoginService/LoginMethod
at zone.js:1505:1
at proto. (zone.js:973:1)
at LoginService. (app.component.ts:41:31)
at Generator.next ()
at asyncGeneratorStep (asyncToGenerator.js:3:1)
at _next (asyncToGenerator.js:25:1)
at asyncToGenerator.js:32:1
at new ZoneAwarePromise (zone.js:1427:1)
at LoginService. (asyncToGenerator.js:21:1)
at LoginService.rpcImpl (app.component.ts:36:5)

Can someone please help me resolve this?
@justerest could you please take a look at this issue?
Many thanks in advance.

@murgatroid99 (Member) commented Mar 11, 2024

@rmv9490 Your error message says "Fetch error: TypeError: Failed to execute 'fetch' on 'Window': Failed to parse URL from http://10.81.200.09:8000/loginService_package.LoginService/LoginMethod", so the problem is in that URL. In that IP address, the fourth component is 09. The leading 0 causes the component to be interpreted as octal, and 9 is not a valid octal digit, which is why the URL fails to parse. If you remove that 0, it will work (or you will at least get a different error).
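
You can reproduce the parse failure directly in a browser console or Node (the address is the one from the error message):

// The leading 0 in "09" makes that component parse as octal, and 9 is not a
// valid octal digit, so the WHATWG URL parser rejects the whole URL.
new URL("http://10.81.200.09:8000/"); // throws a TypeError (invalid URL)
new URL("http://10.81.200.9:8000/");  // parses fine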
