
ngx-grpc's People

Contributors

adejewski-msol, damirsaifut, dependabot[bot], dprogm, fadelis, greegko, grubana, hellysonrp, jfyne, philippemorier, plauss, semantic-release-bot, smnbbrv, stocksr



ngx-grpc's Issues

protoc-gen-ng: Optional/Required fields when using proto2

Hi, I'm aiming to avoid all the TypeScript errors arising from "everything is optional" in proto3 by using proto2 (where you can mark fields optional and required).

It seems the plugin (protoc-gen-ng) isn't generating the typings for those cases correctly, and still emits a string | undefined signature.

Any ideas how to fix this?
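Until the generator emits stricter typings, one workaround (a sketch, not part of ngx-grpc; the helper name is made up) is a small runtime guard that narrows the generated `string | undefined` fields at the call site:

```typescript
// Workaround sketch: narrow a generated optional field to a definite value,
// failing loudly if the server omitted it.
function required<T>(value: T | undefined, field: string): T {
  if (value === undefined) {
    throw new Error(`missing required field: ${field}`);
  }
  return value;
}

// A generated proto3 message field is typed `string | undefined`:
const userName: string | undefined = 'Ada';
const safeName: string = required(userName, 'name'); // narrowed, no cast needed
console.log(safeName);
```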

Support for custom options

Protocol Buffers allow you to define your own options. Example:

import "google/protobuf/descriptor.proto";

extend google.protobuf.MessageOptions {
  optional string my_option = 51234;
}

message MyMessage {
  option (my_option) = "Hello world!";
}

See
https://developers.google.com/protocol-buffers/docs/proto#customoptions

This is possible on messages, message fields, enums, enum values, services, etc.

It would be great if the code generator stored this meta information as static values on the class.
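For illustration, the requested output could look like the following (this is a sketch of the feature request, not something protoc-gen-ng generates today; the property name `options` is an assumption):

```typescript
// Sketch: custom option values attached to the generated message class as
// static, strongly typed metadata.
class MyMessage {
  static readonly options = { myOption: 'Hello world!' } as const;
}

// Consumers could then read the option value without re-parsing the proto.
console.log(MyMessage.options.myOption);
```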

@ngx-grpc/worker: No exported member 'Error' in grpc-web

Hi,
I have an issue using @ngx-grpc/worker 2.4.0 together with grpc-web 1.3.0

Error: node_modules/@ngx-grpc/worker/lib/api.d.ts:1:10 - error TS2305: Module '"grpc-web"' has no exported member 'Error'.
1 import { Error, Metadata, Status } from 'grpc-web';

According to the GitHub history, the exported member "Error" was recently renamed to "RpcError". After manually changing it to "RpcError" in the api.d.ts file, everything works fine.

Stream stops receiving messages after 4 received

I have a grpc-js server setup with a rpc method called getUsers(). When making the call from the client, I only receive 4 of the messages back, despite there being 6 sent.

Here's the client method:

const client = new UsersClient(environment.rpcHost);
const params = new CMSQuery();
client.getUsers(params).on('data', (message: User) => {
  const obj = message.toObject();
  console.log(`${obj.name} received`);
  this.users.push(obj);
}).on('end', () => {
  console.log('End of request');
});

And here's the server method:

    async getUsers(call: grpc.ServerWritableStream<CMSQuery, User>): Promise<any> {
        const params = call.request.toObject();
        this.collection.find({}).forEach(user => {
            const message = protoFromObject(User, user);
            call.write(message);
            console.log(`${user.name} sent`);
        }).finally(() => {
            console.log('Closing request');
            call.end();
        });
    }

The server has the following console output:

User1 sent
User2 sent
User3 sent
User4 sent
User5 sent
User6 sent
Closing Request

And yet the client only has the following console output:

User1 received
User2 received
User3 received
User4 received
End of request

Are there any obvious things that would cause this? Some sort of timeout? Can anyone give me some pointers as to where I should be looking, as I'm really stuck at the moment.
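One thing worth ruling out (a sketch, not a confirmed diagnosis of this issue): grpc-js server streams follow Node stream semantics, where write() returns false when the internal buffer is full and a 'drain' event signals when it is safe to continue. A write loop that honors backpressure looks roughly like this (the WritableLike interface is a simplification of the real stream type):

```typescript
// Minimal model of a grpc-js-style writable stream.
interface WritableLike<T> {
  write(msg: T): boolean;
  once(event: 'drain', cb: () => void): void;
  end(): void;
}

// Write all messages, pausing whenever the transport buffer is full.
async function writeAll<T>(call: WritableLike<T>, messages: T[]): Promise<void> {
  for (const msg of messages) {
    if (!call.write(msg)) {
      // Wait until the buffer drains before sending the next message.
      await new Promise<void>(resolve => call.once('drain', resolve));
    }
  }
  call.end();
}
```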

Thanks

feature: Make GrpcMessage a generic type.

export interface GrpcMessage<T = unknown> {
  toObject(): T;
  toJSON(): T;
}

It would improve type safety, and it would also be a required step toward message interfaces.
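A small demonstration of the type-safety win (the interface shape is copied from the proposal above; UserObj/UserMessage are illustrative names, not ngx-grpc API):

```typescript
// The proposed generic interface.
interface GrpcMessage<T = unknown> {
  toObject(): T;
  toJSON(): T;
}

interface UserObj { id: string; name: string; }

class UserMessage implements GrpcMessage<UserObj> {
  constructor(private id: string, private userName: string) {}
  toObject(): UserObj { return { id: this.id, name: this.userName }; }
  toJSON(): UserObj { return this.toObject(); }
}

// toObject() is now typed as UserObj, so `.name` is a string without a cast.
const obj = new UserMessage('1', 'Ada').toObject();
console.log(obj.name);
```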

Conflicting message names from packages

Hi! Love the project thanks so much for the work.

I have been implementing it in our company's workflow to save us from using the default grpc-web implementations.

One issue I am coming across is conflicting messages in different proto packages. For example:

In, let's say, alpha/product.proto I have:

syntax = "proto3";

package alpha;

message Product {
    string id = 1;
}

And in beta/product.proto I have:

syntax = "proto3";

package beta;

message Product {
    string id = 1;
}

Now I am trying to create a library, publish it to npm, and then import it into our product sites. So to create the library I am creating a public-api.ts:

export * from './lib/alpha/product.pb';
export * from './lib/beta/product.pb';

Now this is a problem, since I will get an error

Module './lib/alpha/product.pb' has already exported a member 'Product'

So here are what I think the options are:

  1. Split my alpha and beta services into separate libraries. This is on the surface the best move, but this becomes a problem when imports get involved. (We have quite a lot of shared constants for example)
  2. Somehow namespace the message names with the package.

I have sort of tried 2 by doing this in my public-api.ts

import * as alpha from './lib/alpha/product.pb';
import * as beta from './lib/beta/product.pb';

export { alpha, beta }

However, doing this I end up with an error from Angular when I'm attempting to build an app that imports the library:

> [email protected] build /example-site
> ng build "--prod"


ERROR in app/app.module.ts(48,16): Error during template compile of 'AppModule'
  Unsupported bundled module reference in 'beta'
    'beta' contains the error.

This error appears to be to do with the import and export syntax I have used and the way that aot compilation handles it.

So really what I am asking for here is just a discussion on what you think would be the best approach to solve this, without having to rename all my proto messages to something unique.
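A single-file sketch of option 2, keeping both Product classes behind package-named containers so their names never collide in one public API. (With real generated files this would be the `export * as alpha from './lib/alpha/product.pb'` syntax, available since TypeScript 3.8; whether that satisfies the AOT compiler depends on the Angular version, so treat it as an untested assumption.)

```typescript
// Namespaces standing in for the two generated modules.
namespace alpha {
  export class Product { constructor(public id: string) {} }
}
namespace beta {
  export class Product { constructor(public id: string) {} }
}

// Both message types coexist, qualified by their proto package name.
const a = new alpha.Product('a-1');
const b = new beta.Product('b-1');
console.log(a.id, b.id);
```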

Thanks!

global reconnect / retry mechanism

Hi Semen,

is there a feature which allows global reconnects / retries for server-side streams?
The basic reconnect I am implementing is getting a bit tedious:

  • every stream subscription is assigned to an attribute.
  • on complete, an unsubscribe and a re-subscribe are performed.
  • of course, on network issues there should be a callback/test against navigator.onLine, or something like timeouts, so that the reconnect is only triggered once the device is online again or can ping the backend.

This manual approach works fine, but it would be great if there were a way to configure / use a global reconnect/retry mechanism.
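The manual pattern described above can be condensed into a reusable wrapper. This is a sketch without rxjs (in an Angular app you would likely express the same idea with rxjs retry operators instead); a real version would add backoff and the online/reachability check before resubscribing:

```typescript
// A "stream" is modeled as a function that calls onComplete when it ends.
type StreamFactory = (onComplete: () => void) => void;

// Resubscribe after each completion, up to maxAttempts; resolves with the
// number of subscriptions made.
function withReconnect(
  subscribe: StreamFactory,
  delayMs: number,
  maxAttempts: number
): Promise<number> {
  return new Promise(resolve => {
    let attempts = 0;
    const start = () => {
      attempts++;
      subscribe(() => {
        if (attempts >= maxAttempts) {
          resolve(attempts); // give up (or stop when intentionally closed)
          return;
        }
        setTimeout(start, delayMs); // resubscribe after a delay
      });
    };
    start();
  });
}

// Demo: a stream that completes immediately gets subscribed 3 times.
withReconnect(done => done(), 10, 3).then(n => console.log(`attempts: ${n}`));
```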

Thank you very much!

Package name with uppercase letters is not properly generated

Let's assume the following proto file:

syntax = "proto3";

package ArmoniK.api.grpc.v1.submitter;

service Submitter{
	rpc CancelSession (Session) returns (Empty);
}

message Session {
	string id = 1;
}

message Empty {}

The generator will not work because it will generate this:

Observable<GrpcEvent<thisProto.ArmoniK.api.grpc.v1.submitter.Empty>>

instead of this

Observable<GrpcEvent<thisProto.Empty>>

Error: Response closed without grpc-status (Headers only)

I have the following message and service definitions

/**
* User model
*/
message User {
  string user_id = 1; // MongoDB unique identifier
  string email = 2; // User's email address
  string password = 3;
  Role role = 4;
  int64 created = 5; // Creation timestamp
  string creator = 6; // Email address of the creator
  bool disabled = 7; // Manual lockout flag
  int32 auth_failures = 8; // Auth failure count
  bool locked = 9; // Auto lockout flag
  string token = 10; // JWT token for post login auth
  string name = 11;
  int64 updated = 12;
}
service Users {
  // Login and get JWT token
  rpc login(Login) returns (User) {}
.....

And the following service method on my Python server:

def login(self, request, context):
    if not request.email:
        context.set_code(grpc.StatusCode.INVALID_ARGUMENT)
        context.set_details('The username/password provided is incorrect')
        return messages_pb2.LoginResponse(error='The username/password provided is incorrect')

    user = self.db.users.find_one({'email': request.email})
    return messages_pb2.User(**user)

When calling from another client, such as one written in Python, the request is absolutely fine.

However, when calling from grpc-web with the ImprobableEngGrpcWebClient in Angular, I get the following error:

users.service.ts:70 Error: Response closed without grpc-status (Headers only)
    at Object.onEnd (messages_pb_service.js:69) [angular]
    at :4200/vendor.js:86815:24343 [angular]
    at :4200/vendor.js:86815:4892 [angular]
    at Array.forEach (<anonymous>) [angular]
    at e.rawOnEnd (grpc-web-client.umd.js:1) [angular]
    at e.onTransportEnd (grpc-web-client.umd.js:1) [angular]
    at :4200/vendor.js:86815:13311 [angular]
    at Object.onInvoke (core.js:28301) [angular]
    at :4200/polyfills.js:1165:40 [angular]
    at Object.onInvokeTask (core.js:28289) [angular]

It's definitely not a connection issue, as I get an HTTP 200 response, and the body of the response contains the expected data:

Request URL: http://localhost:8080/Protos.Users/login
Request Method: POST
Status Code: 200 OK
Remote Address: 127.0.0.1:8080
Referrer Policy: no-referrer-when-downgrade
accept-encoding: identity,gzip
access-control-allow-origin: http://localhost:4200
access-control-expose-headers: custom-header-1,grpc-status,grpc-message
content-type: application/grpc-web+proto
date: Sat, 20 Feb 2021 12:45:10 GMT
grpc-accept-encoding: identity,deflate,gzip
server: envoy
transfer-encoding: chunked
x-envoy-upstream-service-time: 3

And the response body:

�
�603006a4236cfd11f7bd0d5d��[email protected]�<$2b$10$.kLFgFLqz/2z8idQJweZZuLsr8e6T06WL5QuiUBf.ef3j0t1HLtI6 �(ó�À��ZSystem Admin`ó�À����grpc-status:0
grpc-message:

And the envoy debug log:

[2021-02-20 12:57:17.326][69423][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:190] [C1] using existing connection
[2021-02-20 12:57:17.326][69423][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:130] [C1] creating stream
[2021-02-20 12:57:17.326][69423][debug][router] [external/envoy/source/common/router/upstream_request.cc:354] [C0][S17761285570082037057] pool ready
[2021-02-20 12:57:17.326][69423][debug][http] [external/envoy/source/common/http/filter_manager.cc:755] [C0][S17761285570082037057] request end stream
[2021-02-20 12:57:17.330][69423][debug][router] [external/envoy/source/common/router/router.cc:1174] [C0][S17761285570082037057] upstream headers complete: end_stream=false
[2021-02-20 12:57:17.330][69423][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1484] [C0][S17761285570082037057] encoding headers via codec (end_stream=false):
':status', '200'
'content-type', 'application/grpc-web+proto'
'grpc-accept-encoding', 'identity,deflate,gzip'
'accept-encoding', 'identity,gzip'
'x-envoy-upstream-service-time', '4'
'access-control-allow-origin', 'http://localhost:4200'
'access-control-expose-headers', 'custom-header-1,grpc-status,grpc-message'
'date', 'Sat, 20 Feb 2021 12:57:17 GMT'
'server', 'envoy'

[2021-02-20 12:57:17.330][69423][debug][client] [external/envoy/source/common/http/codec_client.cc:112] [C1] response complete
[2021-02-20 12:57:17.330][69423][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:159] [C1] destroying stream: 0 remaining
[2021-02-20 12:57:17.331][69423][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1500] [C0][S17761285570082037057] encoding trailers via codec:

Anyone got any ideas what's causing this?

Provide option to run same command on unix and windows based system

At the moment this command:
./node_modules/.bin/grpc_tools_node_protoc --proto_path=../proto/public --proto_path=../proto --plugin=protoc-gen-ng=./node_modules/.bin/protoc-gen-ng --ng_out=./src/generated ../proto/public/**/**/**/*.proto
does not work on Windows-based machines, as it seems the plugin protoc-gen-ng only works on macOS or Unix-like OSes.

The only way to run it on Windows is to point to protoc-gen-ng.cmd (and of course to change the way the relative path is calculated, but this is not a problem: introducing ${PWD} instead of ./ also makes it compatible with other OSes).

Since we are working in a platform-agnostic team, we would like to have a single script for the different OSes we use. Would that be possible somehow?

Thanks
Valerio

Authentication Token

Hi, I'm trying to automatically send a token with every gRPC call after the client has logged in.
Is there any built-in way to do this? If not, would a custom interceptor that sets the metadata be enough for the job?
Maybe it's worth mentioning this usage in the docs?
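The core of the interceptor idea, in plain types (a sketch only: in ngx-grpc the real hook is an interceptor registered via its DI token, and the exact signature depends on the version; the `Authorization`/`Bearer` convention is an assumption about the backend):

```typescript
// Metadata modeled as a simple key/value map.
type Metadata = Map<string, string>;

// Attach the current token (if any) to an outgoing call's metadata.
function withAuth(metadata: Metadata, tokenProvider: () => string | null): Metadata {
  const token = tokenProvider();
  if (token !== null) {
    metadata.set('Authorization', `Bearer ${token}`);
  }
  return metadata;
}

const md = withAuth(new Map(), () => 'abc123');
console.log(md.get('Authorization'));
```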

Thanks

Generate GrpcMessagePool

I have set up logging as suggested and all was working well, until I used the "Any" type.

I found this comment in the logging setup, which solved my problem:
// please beware: if you use google.protobuf.Any you must pass the proper messagePool argument

Would it be possible to generate an array of all the message types, suitable for creating a MessagePool to pass to the logger?
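For context, a message pool is essentially a registry from message id to message class, so Any payloads can be unpacked by type name. A sketch of the idea (the names here are illustrative, not the @ngx-grpc/common API):

```typescript
// A message class carries a static id identifying its proto type.
type MessageCtor = (new () => object) & { id: string };

class MessagePool {
  private byId = new Map<string, MessageCtor>();
  constructor(types: MessageCtor[]) {
    for (const t of types) {
      this.byId.set(t.id, t);
    }
  }
  lookup(id: string): MessageCtor | undefined {
    return this.byId.get(id);
  }
}

class Foo {
  static id = 'example.Foo';
}

// A generated array of all message classes would make building this trivial.
const pool = new MessagePool([Foo]);
console.log(pool.lookup('example.Foo') === Foo);
```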

Params type for new request

I create a new request (e.g. const req = new TestRequest()) and try to find out the type of the message for this request by hovering the mouse over the request name, but the WebStorm IDE shows only:

TestRequest.constructor(     _value?: RecursivePartial<TestRequest>)
Message constructor. Initializes the properties and applies default Protobuf values if necessary
Params:
_value – initial values object or instance of TestRequest to deeply clone from

Is it possible to see the type of the request params by hovering over the function name in the IDE? Currently I can only find the type by going into the constructor of the request or by reading the gRPC service documentation.

How to create backend logic using python?

I want to have a Python backend serving the same client built with Angular. How are these echo proto files generated on the backend side? Can you please add the steps?

feature: add default client settings

In my project I have only one backend for all proto services, and it would be much more convenient to set up all services once, as is done in NSwag.

export const GRPC_SERVICE_DEFAULT_SETTINGS = new InjectionToken<GrpcClientSettings>('GRPC_SERVICE_DEFAULT_SETTINGS');

later on a particular service could be

export class EchoServiceClient {
  private client: GrpcClient;

  constructor(
    @Optional()
    @Inject(GRPC_ECHO_SERVICE_CLIENT_SETTINGS)
    clientSettings: GrpcClientSettings | undefined,
    @Optional()
    @Inject(GRPC_SERVICE_DEFAULT_SETTINGS)
    defaultSettings: GrpcClientSettings | undefined,
    @Inject(GRPC_CLIENT_FACTORY) clientFactory: GrpcClientFactory,
    private handler: GrpcHandler
  ) {
    if (defaultSettings === null && clientSettings === null) {
      throw new Error(
        'Either GRPC_SERVICE_DEFAULT_SETTINGS or GRPC_ECHO_SERVICE_CLIENT_SETTINGS or both should be provided'
      );
    }
    const settings = {
      ...(defaultSettings || {}),
      ...(clientSettings || {})
    } as GrpcClientSettings;
    this.client = clientFactory.createClient('EchoService', settings);
  } 
}
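The merge order proposed in the constructor above, shown in isolation: per-service settings win over the shared defaults (the host value is purely illustrative):

```typescript
// Minimal shape of the settings object for the demo.
interface GrpcClientSettings {
  host?: string;
  format?: string;
}

const defaultSettings: GrpcClientSettings = { host: 'https://api.example.com', format: 'binary' };
const clientSettings: GrpcClientSettings = { format: 'text' };

// Later spreads override earlier ones: the per-service format wins,
// the default host is kept.
const settings: GrpcClientSettings = {
  ...(defaultSettings || {}),
  ...(clientSettings || {})
};
console.log(settings.host, settings.format);
```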

update @improbable-eng/grpc-web dependency

Currently we have @improbable-eng/grpc-web@"^0.13.0", while the latest version is 0.15.0 (published in Nov 2021).

Can we have ^0.13.0 updated, or is there anything blocking right now? Thanks!

ngx GRPC error Failed parsing HTTP/2 expected 'P' (80) got 'O' (79) at byte 0

I have a gRPC server that works just fine (tested using BloomRPC), and a front-end built with Angular and ngx-grpc. I am getting the following error on the server side when sending a request from my gRPC client. Any idea what the problem is?

Complete BDP ping err={"created":"@1600003114.696000000","description":"Failed parsing HTTP/2","file":"d:\a\grpc-node\grpc-node\packages\grpc-native-core\deps\grpc\src\core\ext\transport\chttp2\transport\chttp2_transport.cc","file_line":2582,"referenced_errors":[{"created":"@1600003114.696000000","description":"Connect string mismatch: expected 'P' (80) got 'O' (79) at byte 0","file":"d:\a\grpc-node\grpc-node\packages\grpc-native-core\deps\grpc\src\core\ext\transport\chttp2\transport\parsing.cc","file_line":97}]}

Type errors on generation

Generated files have import and type errors (e.g. thisProto.HelloReply).

import * as thisProto from './greeter-service.pb';
...
sayHello: (
  requestData: thisProto.HelloRequest,
  requestMetadata = new GrpcMetadata()
): Observable<GrpcEvent<thisProto.HelloReply>> => {
  return this.handler.handle({
    type: GrpcCallType.unary,
    client: this.client,
    path: '/GreeterService/SayHello',
    requestData,
    requestMetadata,
    requestClass: thisProto.HelloRequest,
    responseClass: thisProto.HelloReply
  });
}

Proto:

syntax = "proto3";

message HelloReply {
   string Message = 1;
}
message HelloRequest {
   string Name = 1;
}
service GreeterService {
   rpc SayHello (HelloRequest) returns (HelloReply);
}

These are my dependencies, if it helps:

  "dependencies": {
    "@angular/animations": "~13.0.0",
    "@angular/common": "~13.0.0",
    "@angular/compiler": "~13.0.0",
    "@angular/core": "~13.0.0",
    "@angular/forms": "~13.0.0",
    "@angular/platform-browser": "~13.0.0",
    "@angular/platform-browser-dynamic": "~13.0.0",
    "@angular/router": "~13.0.0",
    "@ngx-grpc/common": "^2.4.1",
    "@ngx-grpc/core": "^2.4.1",
    "@ngx-grpc/grpc-web-client": "^2.4.1",
    "@ngx-grpc/well-known-types": "^2.4.1",
    "google-protobuf": "^3.19.1",
    "grpc-web": "^1.3.0",
    "rxjs": "^6.6.7",
    "tslib": "^2.3.0",
    "zone.js": "~0.11.4"
  },
  "devDependencies": {
    "@angular-devkit/build-angular": "~13.0.3",
    "@angular/cli": "~13.0.3",
    "@angular/compiler-cli": "~13.0.0",
    "@ngx-grpc/protoc-gen-ng": "^2.4.1",
    "@types/google-protobuf": "^3.15.5",
    "@types/jasmine": "~3.10.0",
    "@types/node": "^12.11.1",
    "grpc-tools": "^1.11.2",
    "jasmine-core": "~3.10.0",
    "karma": "~6.3.0",
    "karma-chrome-launcher": "~3.1.0",
    "karma-coverage": "~2.0.3",
    "karma-jasmine": "~4.0.0",
    "karma-jasmine-html-reporter": "~1.7.0",
    "typescript": "~4.4.3"
  }

Any support

The docs say Any support is not implemented. In which way is this the case: code generation or something else?

Bi-directional streaming with @ngx-grpc/grpc-web-client

Dear forum members,

we would like to do bi-directional audio streaming from an Angular 13 application to a backend Python gRPC server. Any idea how we can accomplish this with ngx-grpc?

Many thanks for any pointers, example code and your help on this topic!

Best, Andreas

Not compatible with Angular 13.2

I tried to add the library to an Angular 13.2.2 project:

npm i -S @ngx-grpc/common @ngx-grpc/core @ngx-grpc/grpc-web-client @ngx-grpc/well-known-types google-protobuf grpc-web

But that gave me the following error:

npm ERR! code ERESOLVE
npm ERR! ERESOLVE unable to resolve dependency tree
npm ERR! 
npm ERR! While resolving: [email protected]
npm ERR! Found: [email protected]
npm ERR! node_modules/rxjs
npm ERR!   rxjs@"~7.4.0" from the root project
npm ERR! 
npm ERR! Could not resolve dependency:
npm ERR! peer rxjs@"^6.0.0" from @ngx-grpc/[email protected]
npm ERR! node_modules/@ngx-grpc/common
npm ERR!   @ngx-grpc/common@"*" from the root project
npm ERR! 
npm ERR! Fix the upstream dependency conflict, or retry
npm ERR! this command with --force, or --legacy-peer-deps
npm ERR! to accept an incorrect (and potentially broken) dependency resolution.

Option to configure custom module paths

Hi there!

Is it currently possible to provide a custom module path to a protobuf package?

For example, when generating e.g. Go or C# files, it is possible to specify a custom module location/namespace using options:

syntax = "proto3";

package my.package;

option go_package = "github.com/my/package";
option csharp_namespace = "Protobuf.My.Package";

// and this is what I imagine for ngx-grpc:
// option ngx_module = "@my/package";
...

If this is already implemented, please tell me what the corresponding option is called; if it is not implemented, would you be open to doing so?

Best regards,
Vincent

grpc-web v1.2.0 appears to be broken

Hey,

It looks as though a recent release of grpc-web 1.2.0 broke something. This might be it: grpc/grpc-web#865

It might be worth changing the dependencies in this package.json to something more specific like 1.1.0 which I believe works.

Proto3 required fields are generated as optional

Hello,

My problem is that fields configured as required in the proto file are generated as optional fields.
I know there have already been issues on this topic, but I didn't really get anything out of them.
Is there any flag or option that can be set to get this kind of type check?

Repeated field doesn't work

Found an issue during migration:

Proto:

message Random {
  repeated int32 random_values = 1;
}

In the generated file we get the following, but during the parsing process readInt32 throws an error:

      switch (_reader.getFieldNumber()) {
        case 1:
          (_instance.random_values = _instance.random_values || []).push(
            _reader.readInt32()
          );
          break;

With the readPackedInt32 function it works like a charm:

      switch (_reader.getFieldNumber()) {
        case 1:
          _instance.random_values = _reader.readPackedInt32();
          break;

I noticed a similar difference in the generated code for enums (readEnum() instead of readPackedEnum()).

Not sure about the other fields.
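For background on why both read paths matter: proto3 encodes repeated scalar numeric fields packed by default, i.e. one length-delimited field containing concatenated varints, rather than one varint field per element, and a conforming parser must accept both encodings. A hand-rolled decoder for such a packed payload illustrates the wire format (this is a teaching sketch, not how google-protobuf should be patched):

```typescript
// Decode a buffer of concatenated base-128 varints (the payload of a packed
// repeated int32 field, after the tag and length have been consumed).
function readVarints(buf: Uint8Array): number[] {
  const out: number[] = [];
  let value = 0;
  let shift = 0;
  for (const byte of buf) {
    value |= (byte & 0x7f) << shift; // low 7 bits carry data
    if (byte & 0x80) {
      shift += 7; // continuation bit set: more bytes follow
    } else {
      out.push(value);
      value = 0;
      shift = 0;
    }
  }
  return out;
}

// Packed payload for random_values = [1, 2, 300]:
// 300 = 0b10_0101100 -> bytes 0xac (0101100 + continuation), 0x02.
console.log(readVarints(new Uint8Array([0x01, 0x02, 0xac, 0x02])));
```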

Generated PB files may have declared but never read parameters

Issue

When generating .pb files, some parameters may be declared but never read.

For google.protobuf.Empty:

/* tslint:disable */
/* eslint-disable */
//
// THIS IS A GENERATED FILE
// DO NOT MODIFY IT! YOUR CHANGES WILL BE LOST
import { GrpcMessage, RecursivePartial } from '@ngx-grpc/common';
import { BinaryReader, BinaryWriter, ByteSource } from 'google-protobuf';
export class Empty implements GrpcMessage {
  static toBinary(instance: Empty) {
    const writer = new BinaryWriter();
    Empty.toBinaryWriter(instance, writer);
    return writer.getResultBuffer();
  }

  static fromBinary(bytes: ByteSource) {
    const instance = new Empty();
    Empty.fromBinaryReader(instance, new BinaryReader(bytes));
    return instance;
  }

  static refineValues(instance: Empty) {}

  static fromBinaryReader(instance: Empty, reader: BinaryReader) {
    while (reader.nextField()) {
      if (reader.isEndGroup()) break;

      switch (reader.getFieldNumber()) {
        default:
          reader.skipField();
      }
    }

    Empty.refineValues(instance);
  }

  static toBinaryWriter(instance: Empty, writer: BinaryWriter) {}

  /**
   * Creates an object and applies default Protobuf values
   * @param Empty value
   */
  constructor(value?: RecursivePartial<Empty>) {
    value = value || {};
    Empty.refineValues(this);
  }
  toObject() {
    return {};
  }
  toJSON() {
    return this.toObject();
  }
}
export module Empty {}

There are these errors:

in refineValues:
TS6133: 'instance' is declared but its value is never read.

in toBinaryWriter:
TS6133: 'instance' is declared but its value is never read.
TS6133: 'writer' is declared but its value is never read.

Discussion

Should there be a check for unread values in refineValues and toBinaryWriter?
Or should they not be declared at all?
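One generator-side option worth noting (an observation, not a confirmed plan for protoc-gen-ng): TypeScript's noUnusedParameters check skips parameters whose names begin with an underscore, so emitting `_instance` / `_writer` silences TS6133 without changing behavior:

```typescript
// Underscore-prefixed parameters are exempt from noUnusedParameters,
// so this compiles cleanly even though neither argument is read.
function toBinaryWriter(_instance: object, _writer: object): void {
  // intentionally empty for google.protobuf.Empty: there are no fields to write
}

toBinaryWriter({}, {});
console.log('ok');
```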

protoc-gen-ng: Incorrect generation for AsObject interfaces in *.pb.ts files

Hi, thank you for this cool plugin protoc-gen-ng; we use it in our Angular app to abstract all the low-level gRPC stuff away from the business logic layers, and it works quite well!

Also, we use the AsObject interfaces to assert the correct types/shape of mock data by binding the request and response parameters to those types for our mock server (written in Node/TS using @grpc/grpc-js).

import { Server } from '@grpc/grpc-js';
import { handleUnaryCall } from '@grpc/grpc-js';
import { SnakeCasedPropertiesDeep } from 'type-fest';
import * as resource from './generated/resource.pb';

type handleSnakyUnaryCall<Request, Response> = handleUnaryCall<
  SnakeCasedPropertiesDeep<Request>,
  SnakeCasedPropertiesDeep<Response>
>;

const listResources: handleSnakyUnaryCall<resource.ListResourcesRequest.AsObject, resource.ListResourcesResponse.AsObject> = (
  requestData,
  callback
) => { /* ... */ }

const addResourceServices = (server: Server): void => {
  const protoFileName = 'resource.proto';
  const descriptor = (getProtoDescriptorByProtoPath(protoFileName) as MakeService<'Resource'>).resource;
  server.addService(descriptor.ResourceService.service, {
    listResources,
    /* ... */
  });
};

/* then it's added to the Server instance */

But I found an issue in the generation for proto maps. Let's look at the generated maps.pb.ts in this repository (around line 460 in the generated file):

export module MessageWithMap {
  /**
   * Standard JavaScript object representation for MessageWithMap
   */
  export interface AsObject {
    mapStringString?: { [prop: string]: string };
    mapStringMsg?: { [prop: string]: MapSubMessage };
    mapStringBytes?: { [prop: string]: Uint8Array };
  }

The type of mapStringMsg should be { [prop: string]: MapSubMessage.AsObject }, because MapSubMessage is the class itself. I am not sure if this would also be desirable for the AsProtobufJSON helper, but I think so. So basically, nested types are not resolved for these helpers.

To reproduce:

I just cloned the repo and, after npm install, I called:

$(pwd)/node_modules/.bin/grpc_tools_node_protoc --plugin=protoc-gen-ng=$(pwd)/dist/protoc-gen-ng/main.js --ng_out=config=packages/protoc-gen-ng/ngx-grpc.conf.js:packages/protoc-gen-ng/test/out -I packages/protoc-gen-ng/test/proto $(find ./packages/protoc-gen-ng/test/proto -iname "*.proto")

Plugin output is unparseable:

Tried both:
grpc_tools_node_protoc --plugin=protoc-gen-ng=.\node_modules\.bin\protoc-gen-ng.cmd --ng_out=src/app/proto -I src/app/proto src/app/proto/geo.proto
and
protoc --plugin=protoc-gen-ng=.\node_modules\.bin\protoc-gen-ng.cmd --ng_out=src/app/proto -I src/app/proto src/app/proto/geo.proto

Getting this error:
--ng_out: protoc-gen-ng: Plugin output is unparseable: Active code page: 437\r\nz\275\003\n\rgeo.pbconf.tsz\253\003/* tslint:disable */\n/* eslint-disable */\n//\n// THIS IS A GENERATED FILE\n// DO NOT MODIFY IT! YOUR CHANGES WILL BE LOST\nimport { InjectionToken } from '@angular/core';\n...

[The remainder of the dump is the escaped contents of the generated geo.pbconf.ts, geo.pbsc.ts and geo.pb.ts files, prefixed by the stray "Active code page: 437" line; truncated.]
    );\n return writer.getResultBuffer();\n }\n\n /
    \n * Cast message to standard JavaScript object (all non-primitive values are deeply cloned)\n */\n toObject(): GetLocationReq.AsObject {\n return {\n imei: this.imei,
    n from: this.from,\n to: this.to\n };\n }\n\n /\n * Convenience method to support JSON.stringify(message), replicates the structure of toObject()\n */\n toJSON() {\n return this.toObject();\n }\n\n /\n *
    Cast message to JSON using protobuf JSON notation: https://developers.google.com/protocol-buffers/docs/proto3#json\n * Attention: output differs from toObject() e.g. enums are represented as names and not as numbers, Timestamp is an
    ISO Date string format etc.\n * If the message itself or some of descendant messages is google.protobuf.Any, you MUST provide a message pool as options. If not, the messagePool is not required\n */\n toProtobufJSON(\n // @ts-ig
    nore\n options?: ToProtobufJSONOptions\n ): GetLocationReq.AsProtobufJSON {\n return {\n imei: this.imei,\n from: this.from,\n to: this.to\n };\n }\n}\nexport module GetLocationReq {\n /\n * Standard Java
    Script object representation for GetLocationReq\n */\n export interface AsObject {\n imei?: string;\n from?: string;\n to?: string;\n }\n\n /
    \n * Protobuf JSON representation for GetLocationReq\n */\n export interf
    ace AsProtobufJSON {\n imei?: string;\n from?: string;\n to?: string;\n }\n}\n\n/\n * Message implementation for geospatial.Location\n */\nexport class Location implements GrpcMessage {\n static id = 'geospatial.Location
    ';\n\n /
    \n * Deserialize binary data to message\n * @param instance message instance\n */\n static deserializeBinary(bytes: ByteSource) {\n const instance = new Location();\n Location.deserializeBinaryFromReader(instanc
    e, new BinaryReader(bytes));\n return instance;\n }\n\n /\n * Check all the properties and set default protobuf values if necessary\n * @param _instance message instance\n */\n static refineValues(_instance: Location) {\n
    _instance.imei = _instance.imei || '';\n _instance.lng = _instance.lng || 0;\n _instance.lat = _instance.lat || 0;\n _instance.ang = _instance.ang || 0;\n _instance.spd = _instance.spd || 0;\n _instance.ts = _insta
    nce.ts || '0';\n }\n\n /
    \n * Deserializes / reads binary message into message instance using provided binary reader\n * @param _instance message instance\n * @param _reader binary reader instance\n */\n static deserializ
    eBinaryFromReader(\n _instance: Location,\n _reader: BinaryReader\n ) {\n while (_reader.nextField()) {\n if (_reader.isEndGroup()) break;\n\n switch (_reader.getFieldNumber()) {\n case 1:\n _instanc
    e.imei = _reader.readString();\n break;\n case 2:\n _instance.lng = _reader.readDouble();\n break;\n case 3:\n _instance.lat = _reader.readDouble();\n break;\n case 4:\n
    _instance.ang = _reader.readInt32();\n break;\n case 5:\n _instance.spd = _reader.readInt32();\n break;\n case 6:\n _instance.ts = _reader.readInt64String();\n break;
    \n default:\n _reader.skipField();\n }\n }\n\n Location.refineValues(_instance);\n }\n\n /\n * Serializes a message to binary format using provided binary reader\n * @param _instance message instance
    n * @param _writer binary writer instance\n */\n static serializeBinaryToWriter(_instance: Location, _writer: BinaryWriter) {\n if (_instance.imei) {\n _writer.writeString(1, _instance.imei);\n }\n if (_instance.lng)
    {\n _writer.writeDouble(2, _instance.lng);\n }\n if (_instance.lat) {\n _writer.writeDouble(3, _instance.lat);\n }\n if (_instance.ang) {\n _writer.writeInt32(4, _instance.ang);\n }\n if (_instance.spd
    ) {\n _writer.writeInt32(5, _instance.spd);\n }\n if (_instance.ts) {\n _writer.writeInt64String(6, _instance.ts);\n }\n }\n\n private _imei?: string;\n private _lng?: number;\n private _lat?: number;\n private
    _ang?: number;\n private _spd?: number;\n private _ts?: string;\n\n /
    \n * Message constructor. Initializes the properties and applies default Protobuf values if necessary\n * @param _value initial values object or instance of
    Location to deeply clone from\n */\n constructor(_value?: RecursivePartial<Location.AsObject>) {\n _value = _value || {};\n this.imei = _value.imei;\n this.lng = _value.lng;\n this.lat = _value.lat;\n this.ang = _valu
    e.ang;\n this.spd = _value.spd;\n this.ts = _value.ts;\n Location.refineValues(this);\n }\n get imei(): string | undefined {\n return this._imei;\n }\n set imei(value: string | undefined) {\n this._imei = value;\n }
    \n get lng(): number | undefined {\n return this._lng;\n }\n set lng(value: number | undefined) {\n this._lng = value;\n }\n get lat(): number | undefined {\n return this._lat;\n }\n set lat(value: number | undefined) {
    \n this._lat = value;\n }\n get ang(): number | undefined {\n return this._ang;\n }\n set ang(value: number | undefined) {\n this._ang = value;\n }\n get spd(): number | undefined {\n return this._spd;\n }\n set spd
    (value: number | undefined) {\n this._spd = value;\n }\n get ts(): string | undefined {\n return this._ts;\n }\n set ts(value: string | undefined) {\n this._ts = value;\n }\n\n /\n * Serialize message to binary data
    n * @param instance message instance\n */\n serializeBinary() {\n const writer = new BinaryWriter();\n Location.serializeBinaryToWriter(this, writer);\n return writer.getResultBuffer();\n }\n\n /
    \n * Cast message to
    standard JavaScript object (all non-primitive values are deeply cloned)\n */\n toObject(): Location.AsObject {\n return {\n imei: this.imei,\n lng: this.lng,\n lat: this.lat,\n ang: this.ang,\n spd: this
    .spd,\n ts: this.ts\n };\n }\n\n /\n * Convenience method to support JSON.stringify(message), replicates the structure of toObject()\n */\n toJSON() {\n return this.toObject();\n }\n\n /\n * Cast message to JS
    ON using protobuf JSON notation: https://developers.google.com/protocol-buffers/docs/proto3#json\n * Attention: output differs from toObject() e.g. enums are represented as names and not as numbers, Timestamp is an ISO Date string fo
    rmat etc.\n * If the message itself or some of descendant messages is google.protobuf.Any, you MUST provide a message pool as options. If not, the messagePool is not required\n */\n toProtobufJSON(\n // @ts-ignore\n options?
    : ToProtobufJSONOptions\n ): Location.AsProtobufJSON {\n return {\n imei: this.imei,\n lng: this.lng,\n lat: this.lat,\n ang: this.ang,\n spd: this.spd,\n ts: this.ts\n };\n }\n}\nexport module Loc
    ation {\n /\n * Standard JavaScript object representation for Location\n */\n export interface AsObject {\n imei?: string;\n lng?: number;\n lat?: number;\n ang?: number;\n spd?: number;\n ts?: string;\n }\n
    n /
    \n * Protobuf JSON representation for Location\n */\n export interface AsProtobufJSON {\n imei?: string;\n lng?: number;\n lat?: number;\n ang?: number;\n spd?: number;\n ts?: string;\n }\n}\n\n/\n * Mes
    sage implementation for geospatial.Locations\n */\nexport class Locations implements GrpcMessage {\n static id = 'geospatial.Locations';\n\n /
    \n * Deserialize binary data to message\n * @param instance message instance\n */
    \n static deserializeBinary(bytes: ByteSource) {\n const instance = new Locations();\n Locations.deserializeBinaryFromReader(instance, new BinaryReader(bytes));\n return instance;\n }\n\n /\n * Check all the properties
    and set default protobuf values if necessary\n * @param _instance message instance\n */\n static refineValues(_instance: Locations) {\n _instance.locs = _instance.locs || [];\n }\n\n /
    \n * Deserializes / reads binary mess
    age into message instance using provided binary reader\n * @param _instance message instance\n * @param _reader binary reader instance\n */\n static deserializeBinaryFromReader(\n _instance: Locations,\n _reader: BinaryRea
    der\n ) {\n while (_reader.nextField()) {\n if (_reader.isEndGroup()) break;\n\n switch (_reader.getFieldNumber()) {\n case 1:\n const messageInitializer1 = new Location();\n _reader.readMessage(
    n messageInitializer1,\n Location.deserializeBinaryFromReader\n );\n (_instance.locs = _instance.locs || []).push(messageInitializer1);\n break;\n default:\n _reader.skip
    Field();\n }\n }\n\n Locations.refineValues(_instance);\n }\n\n /\n * Serializes a message to binary format using provided binary reader\n * @param _instance message instance\n * @param _writer binary writer instan
    ce\n */\n static serializeBinaryToWriter(_instance: Locations, _writer: BinaryWriter) {\n if (_instance.locs && _instance.locs.length) {\n _writer.writeRepeatedMessage(\n 1,\n _instance.locs as any,\n Lo
    cation.serializeBinaryToWriter\n );\n }\n }\n\n private _locs?: Location[];\n\n /
    \n * Message constructor. Initializes the properties and applies default Protobuf values if necessary\n * @param _value initial values ob
    ject or instance of Locations to deeply clone from\n */\n constructor(_value?: RecursivePartial<Locations.AsObject>) {\n _value = _value || {};\n this.locs = (_value.locs || []).map(m => new Location(m));\n Locations.refine
    Values(this);\n }\n get locs(): Location[] | undefined {\n return this._locs;\n }\n set locs(value: Location[] | undefined) {\n this._locs = value;\n }\n\n /\n * Serialize message to binary data\n * @param instance me
    ssage instance\n */\n serializeBinary() {\n const writer = new BinaryWriter();\n Locations.serializeBinaryToWriter(this, writer);\n return writer.getResultBuffer();\n }\n\n /
    \n * Cast message to standard JavaScript ob
    ject (all non-primitive values are deeply cloned)\n */\n toObject(): Locations.AsObject {\n return {\n locs: (this.locs || []).map(m => m.toObject())\n };\n }\n\n /\n * Convenience method to support JSON.stringify(m
    essage), replicates the structure of toObject()\n */\n toJSON() {\n return this.toObject();\n }\n\n /
    \n * Cast message to JSON using protobuf JSON notation: https://developers.google.com/protocol-buffers/docs/proto3#json\n
  • Attention: output differs from toObject() e.g. enums are represented as names and not as numbers, Timestamp is an ISO Date string format etc.\n * If the message itself or some of descendant messages is google.protobuf.Any, you MU
    ST provide a message pool as options. If not, the messagePool is not required\n */\n toProtobufJSON(\n // @ts-ignore\n options?: ToProtobufJSONOptions\n ): Locations.AsProtobufJSON {\n return {\n locs: (this.locs || [
    ]).map(m => m.toProtobufJSON(options))\n };\n }\n}\nexport module Locations {\n /*\n * Standard JavaScript object representation for Locations\n /\n export interface AsObject {\n locs?: Location.AsObject[];\n }\n\n /
    \n * Protobuf JSON representation for Locations\n */\n export interface AsProtobufJSON {\n locs?: Location.AsProtobufJSON[] | null;\n }\n}\n
    C:\Projects\Angular\factory-track-prototype-app\node_modules\grpc-tools\bin\protoc.js:41
    throw error;
    ^

Error: Command failed: C:\Projects\Angular\factory-track-prototype-app\node_modules\grpc-tools\bin\protoc.exe --plugin=protoc-gen-grpc=C:\Projects\Angular\factory-track-prototype-app\node_modules\grpc-tools\bin\grpc_node_plugin.exe --plugin=protoc-gen-ng=.\node_modules\.bin\protoc-gen-ng.cmd --ng_out=src/app/proto -I src/app/proto src/app/proto/geo.proto
--ng_out: protoc-gen-ng: Plugin output is unparseable: Active code page: 437\r\n
    [… the unparseable output: the "Active code page: 437" line followed by the generated contents of geo.pbconf.ts, geo.pbsc.ts and geo.pb.ts; full dump omitted …]

Combine web worker and ImprobableEngGrpcWebClientModule?

Hi,

thank you for this great project. Everything is working smooth.

The last point, web workers, is giving me a hard time.
How can I combine them with the ImprobableEngGrpcWebClientModule driver? It does not support the worker attribute, neither in forChildren nor in forRoot.

Repeated messages not read correctly

It seems that repeated messages are not handled correctly (or I am doing something wrong 😄).

Example code:

My .proto file:

service TodoService {
    rpc FindAll (TodoListRequest) returns (TodoList) {}
}

message TodoList {
    repeated Todo todos = 1;
}

I'm getting the correct response from the server (tested with BloomRPC):

{
    "todos": [
        {"id": 1, "name": "Buy Milk"},
        {"id": 2, "name": "Pay Bills"},
        {"id": 3, "name": "Play Games"}
    ]
}

However, I am getting this response from the TodoServiceClient:
Screenshot from 2019-11-20 14-49-19

It looks as though it adds (and overwrites) the properties of the items on the array itself, rather than appending the individual items.
Any idea what might be going wrong / how to fix this?
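The symptom described above can be reproduced in isolation. This sketch uses plain objects rather than the real ngx-grpc deserializer: if decoded fields are assigned onto the array object instead of being pushed as elements, each message overwrites the previous one and the array stays empty.

```typescript
// Illustration of the reported symptom (not ngx-grpc internals)
interface Todo { id?: number; name?: string; }

const items: Todo[] = [
  { id: 1, name: 'Buy Milk' },
  { id: 2, name: 'Pay Bills' },
  { id: 3, name: 'Play Games' }
];

// Buggy pattern: properties land on the array object, not in it
const buggy: Todo[] = [];
for (const item of items) {
  Object.assign(buggy, item); // overwrites buggy.id / buggy.name each time
}
// buggy.length stays 0; (buggy as any).id is 3 (last item wins)

// Correct pattern: each decoded message becomes a separate element
const correct: Todo[] = [];
for (const item of items) {
  correct.push({ ...item });
}
// correct.length is 3
```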

net::ERR_INCOMPLETE_CHUNKED_ENCODING 200 (OK)

Hi Semen,

on server-side streams with the grpc-web client I cannot get rid of the
net::ERR_INCOMPLETE_CHUNKED_ENCODING 200 (OK) error.
Envoy is set up correctly with stream_idle_timeout: 0s.
For testing purposes, a short 10s deadline is added to the GrpcMetadata.
Envoy picks up this deadline and ends the stream when it expires.
However, every time the stream ends, the error net::ERR_INCOMPLETE_CHUNKED_ENCODING 200 (OK), coming from a POST request of the grpc service client via https://github.com/ngx-grpc/ngx-grpc/blob/2d9861c1e993a3496cf763e2d4d37231860a5f9e/packages/grpc-web-client/src/lib/grpc-web-client.ts#L96,
shows up in the developer console.


  | scheduleTask | @ | zone-evergreen.js:2845
  | scheduleTask | @ | zone-evergreen.js:389
  | onScheduleTask | @ | zone-evergreen.js:279
  | scheduleTask | @ | zone-evergreen.js:382
  | scheduleTask | @ | zone-evergreen.js:217
  | scheduleMacroTask | @ | zone-evergreen.js:240
  | scheduleMacroTaskWithCurrentZone | @ | zone-evergreen.js:675
  | (anonymous) | @ | zone-evergreen.js:2878
  | proto.<computed> | @ | zone-evergreen.js:971
  | ic | @ | index.js:38
  | Bc | @ | index.js:59
  | (anonymous) | @ | index.js:55
  | push.TxjO.Z.R | @ | index.js:55
  | (anonymous) | @ | ngx-grpc-grpc-web-client.js:83

Line 83 refers to this JS:

79    serverStream(path, req, metadata, reqclss, resclss) {
80        const descriptor = new MethodDescriptor(path, 'server_streaming', reqclss, resclss, (request) => request.serializeBinary(), resclss.deserializeBinary);
81        return new Observable(obs => {
82            var _a;
83            const stream = this.client.serverStreaming(this.settings.host + path, req, (_a = metadata === null || metadata === void 0 ? void 0 : metadata.toObject()) !== null && _a !== void 0 ? _a : {}, descriptor);
84            stream.on('status', status => obs.next(new GrpcStatusEvent(status.code, status.details, new GrpcMetadata(status.metadata))));
85            stream.on('error', error => {
86                obs.next(new GrpcStatusEvent(error.code, error.message, error.metadata));
87                obs.complete();
88            });
89            stream.on('data', data => obs.next(new GrpcDataEvent(data)));
90            stream.on('end', () => obs.complete());
91            return () => stream.cancel();
92        });
93    }

Setting authentication token for request

What is the correct way to set a bearer token when using this package? I am running into an issue where the request is always failing due to being unauthenticated but works when I test it on a server client (.net grpc client) using the same token. Do note that in both cases, I have a proxy sitting in front of the server for handling the grpc-web request.

intercept<Q extends GrpcMessage, S extends GrpcMessage>(request: GrpcRequest<Q, S>, next: GrpcHandler): Observable<GrpcEvent<S>> {
        return this.authService.getTokenSilently$().pipe(
            mergeMap(token => {
                const newMetadata: Metadata = {
                    ...request.requestMetadata,
                    ['Authorization']: `Bearer ${token}`
                };

                const tokenReq = {
                    ...request,
                    requestMetadata: newMetadata
                };
                return next.handle(tokenReq);
            })
        );
    }

I am getting an access token and setting it via a new metadata object inside a grpc interceptor. I can confirm the interceptor is registered and triggered for each request. I'm not quite sure this is correct, because no matter what I try, the access token is refused when set this way. Is there anything I am missing?
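One possible cause, purely as an assumption: spreading a metadata class instance into a plain object literal drops its prototype methods, so the transport may no longer receive a usable metadata object. Below is a minimal, self-contained sketch of setting the header on the instance instead; GrpcMetadata and GrpcRequestLike here are stubs standing in for the @ngx-grpc/common types, not the real API.

```typescript
// Stub standing in for a metadata class (illustrative only)
class GrpcMetadata {
  private headers = new Map<string, string>();
  set(key: string, value: string): void { this.headers.set(key, value); }
  get(key: string): string | undefined { return this.headers.get(key); }
}

interface GrpcRequestLike { requestMetadata: GrpcMetadata; }

// Set the header on the metadata instance itself instead of spreading it
// into a plain object (which would lose the class methods)
function withBearerToken(request: GrpcRequestLike, token: string): GrpcRequestLike {
  request.requestMetadata.set('Authorization', `Bearer ${token}`);
  return request;
}

const req: GrpcRequestLike = { requestMetadata: new GrpcMetadata() };
withBearerToken(req, 'abc123');
// req.requestMetadata.get('Authorization') === 'Bearer abc123'
```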

import duplication in *.pb.ts

[email protected](@ngx-grpc/[email protected])
[email protected]
[email protected]

Duplicate import lines occur in some generated *.pb.ts files like this.

/* tslint:disable */
/* eslint-disable */
//
// THIS IS A GENERATED FILE
// DO NOT MODIFY IT! YOUR CHANGES WILL BE LOST
import { GrpcMessage, RecursivePartial } from '@ngx-grpc/common';
import { BinaryReader, BinaryWriter, ByteSource } from 'google-protobuf';
import * as apiV1000 from './-commons.pb';
import * as googleProtobuf001 from './google/protobuf/timestamp.pb';
import * as apiV1002 from './mst-disease.pb';
import * as apiV1003 from './department.pb';
import * as apiV1004 from './staff.pb';
import * as googleProtobuf001 from './google/protobuf/timestamp.pb';

googleProtobuf001 is duplicated; therefore, the production build fails.
Currently this Node script works around the problem:

const fs = require('fs');
const cwd = process.cwd();
const targetDir = cwd + '/src/proto_ng';
const filenames = fs.readdirSync(targetDir).filter(filename => filename.match(/\.pb\.ts$/));

filenames.forEach(filename => {
  const lines = fs.readFileSync(targetDir + '/' + filename, {encoding: 'utf8'}).split('\n');
  const importLines = lines.filter(line => line.startsWith('import'));

  const duplicatedImportLines = importLines.filter((line, index) => importLines.includes(line, index + 1))
  if (duplicatedImportLines.length > 0) {
    duplicatedImportLines.forEach(duplicatedImportLine => {
      const firstIndex = lines.indexOf(duplicatedImportLine);
      const secondIndex = lines.indexOf(duplicatedImportLine, firstIndex + 1);
      lines.splice(secondIndex, 1);
    });
    // const fixedImportLines = lines.filter(line => line.startsWith('import'));
    // console.log(fixedImportLines);

    fs.writeFileSync(targetDir + '/' + filename, lines.join('\n'));
    console.log('changed: ' + targetDir + '/' + filename);
  }
});
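Until this is fixed in the generator, the same cleanup also fits in a single Set-based pass that keeps only the first occurrence of each import line (the file contents below are illustrative):

```typescript
// Drop repeated import lines, keeping the first occurrence of each
function dedupeImportLines(source: string): string {
  const seen = new Set<string>();
  return source
    .split('\n')
    .filter(line => {
      if (!line.startsWith('import ')) {
        return true; // non-import lines pass through untouched
      }
      if (seen.has(line)) {
        return false; // duplicate import: drop it
      }
      seen.add(line);
      return true;
    })
    .join('\n');
}

const fixed = dedupeImportLines([
  "import * as a from './a.pb';",
  "import * as b from './b.pb';",
  "import * as a from './a.pb';",
  'export const x = 1;'
].join('\n'));
// fixed contains the './a.pb' import only once
```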

Strict mode: google.protobuf.Empty or messages without fields are not supported

With something like this:

service HelloService {
  rpc SayHello (google.protobuf.Empty) returns (SayHelloReply);
}

message SayHelloReply {
  string message = 1;
}

will result in the error Empty is not a constructor.

Another case:

service HelloService {
  rpc SayHello (SayHelloRequest) returns (SayHelloReply);
}

message SayHelloRequest {};
message SayHelloReply {
  string message = 1;
}

will result in the error SayHelloRequest is not a constructor.

I tried an empty reply as well, and that produces other errors.
Some of the services I'm dealing with use Empty because they don't need any fields.

Absorbing #8 into this one.

Problems with adding worker in Universal

Hey guys!

So, I'm having issues adding the worker to my Angular project as described in the readme.

My Angular app has Universal built into it, and I am unable to find a way to make this work. Can you please suggest anything?

worker: new Worker('./grpc.worker', { type: 'module' }),
                            ^

ReferenceError: Worker is not defined

Thanks!

Vukasin
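One way around the crash, sketched here with a stubbed constructor so the example runs outside the browser, is to create the worker only when a Worker constructor actually exists, i.e. skip it entirely during server-side rendering. The factory parameter stands in for globalThis to keep the sketch testable.

```typescript
type WorkerCtor = new (url: string, opts: { type: string }) => unknown;

// Create the worker only when the environment actually provides one;
// Node (where Universal renders) has no global Worker, so return undefined.
function createGrpcWorker(scope: { Worker?: WorkerCtor }): unknown {
  if (typeof scope.Worker !== 'function') {
    return undefined; // server side: run without a worker
  }
  return new scope.Worker('./grpc.worker', { type: 'module' });
}

// Fake constructor standing in for the browser's Worker in this sketch
class FakeWorker {
  constructor(public url: string, public opts: { type: string }) {}
}

const onServer = createGrpcWorker({});                      // undefined
const onBrowser = createGrpcWorker({ Worker: FakeWorker }); // FakeWorker instance
```

In the real app you would call the factory with globalThis and only pass the resulting worker to the module configuration when it is defined.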

Little typo in package name

Incorrect:

npm i -S @ngx-grpc/improbable-eng-grpc-web @improbable-eng/grpc-web

Correct:

npm i -S @ngx-grpc/improbable-eng-grpc-web-client @improbable-eng/grpc-web

protoc-gen-ng: implement support for optional fields in proto3

Hi, first of all thanks for this great library! It's easy to integrate and really works like a charm within angular. 👍
Since proto3 recently added support for optional fields (protobuf v3.15.0), it would be great to reflect this in the generated classes instead of declaring every field as optional. IMHO printAsObjectMapping and printAsJSONMapping in the field generators should only print the ?, if the field is explicitly declared optional in the proto3 definition.
Additionally, protoc currently throws this error when trying to compile proto files with optional fields: call.proto: is a proto3 file that contains optional fields, but code generator protoc-gen-ng hasn't been updated to support optional fields in proto3. Please ask the owner of this code generator to support proto3 optional.
It seems that protoc-plugin is causing the error, because it bundles an old protobuf compiler and has not been updated for quite a while:
https://github.com/konsumer/node-protoc-plugin/blob/331e33f49565be967007f3d817e99a5a94fb8a30/src/google/protobuf/compiler/plugin_pb.js
Is it still necessary to rely on this dependency, or would it make sense to use google-protobuf/google/protobuf/compiler/plugin_pb directly, like improbable-eng/ts-protoc-gen does?
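The presence semantics that field-level optional adds can be illustrated in plain TypeScript; the interfaces below are illustrative, not generated output:

```typescript
// Without field-level `optional`, a proto3 string field is always
// materialised with its default (''), so "unset" cannot be distinguished
// from "empty". With `optional`, absence maps naturally to undefined.
interface WithoutOptional { nickname: string; }   // '?' should NOT be printed
interface WithOptional { nickname?: string; }     // '?' should be printed

const plain: WithoutOptional = { nickname: '' };  // default value, maybe unset
const explicit: WithOptional = {};                // provably never set

const wasSet = explicit.nickname !== undefined;   // false
```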

Not-OK status from backend does not trigger error callback

We are trying to migrate to the ngx-grpc package for our angular app.
Before we were using the improbable-eng/grpc-web one. That one has worked fine for us, outside of the lack of interceptors and observables.
Your package seems to fix both those shortcomings, so we are extremely happy with it.

There is just one thing we are having trouble with at the moment.
We are using the grpc-status trailer to send validation errors to the frontend. However, this does not seem to work with ngx-grpc.

We expect the error callback to trigger for any grpc status code other than 0 OK.

Typescript service call:

this.myService
  .myRpcMethod({} as myRequest)
  .subscribe(
	response => {
	  // This will only trigger when I give a clean response without status code
	},
	error => {
	  // This will only trigger when there is some HTTP layer error (timeout, internal server error, etc)
          // We expect this to also trigger when grpc-status is not 0, otherwise it is not possible to get the status code/message at all
	},
	() => {
	  // This will trigger in all cases where response or error has triggered (as expected)
	}
  );

Our .NET backend implementation:

public override async Task<EmptyResponse> MyRpcMethod(MyRequest request, ServerCallContext context)
{
	context.Status = new Status(StatusCode.NotFound, "Not found");

	return new EmptyResponse
	{
		// Nothing in here
	};
}

When we throw a RpcException we get a similar result:

public override async Task<EmptyResponse> MyRpcMethod(MyRequest request, ServerCallContext context)
{
	var status = new Status(StatusCode.NotFound, "Not found");
        throw new RpcException(status);
}

When we throw any other exception, the error does trigger, but then the grpc-status is lost.

Our environment is as follows:

  • Angular 8 app
  • Ambassador Edge Stack 1.1 (which has Envoy 1.12.1 built-in)
  • .NET core 3.0 microservices
  • Kubernetes cluster
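The expectation can be sketched with stub types (these are not the real ngx-grpc classes): any status event with a non-zero code should surface through the error callback rather than being silently dropped.

```typescript
// Stub event types standing in for the gRPC event stream
type StatusEvent = { kind: 'status'; code: number; details: string };
type DataEvent<T> = { kind: 'data'; data: T };
type GrpcEventLike<T> = StatusEvent | DataEvent<T>;

// Return the response message, or throw when a non-OK status arrives
function unwrapResponse<T>(events: GrpcEventLike<T>[]): T {
  for (const ev of events) {
    if (ev.kind === 'status' && ev.code !== 0) {
      // a non-OK status should reach the subscriber's error callback
      throw new Error(`grpc-status ${ev.code}: ${ev.details}`);
    }
    if (ev.kind === 'data') {
      return ev.data;
    }
  }
  throw new Error('stream ended without a response message');
}

const ok = unwrapResponse<string>([
  { kind: 'status', code: 0, details: '' },
  { kind: 'data', data: 'response' }
]);
// ok === 'response'

let failed: string | undefined;
try {
  unwrapResponse([{ kind: 'status', code: 5, details: 'Not found' }]);
} catch (e) {
  failed = (e as Error).message;
}
// failed === 'grpc-status 5: Not found'
```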

Bug: proto with map of message calls toObject in the message constructor on a plain JavaScript object before instantiation

Hi,

I have a bug with the map of the message below:

message ViewGraphics {
  map<uint64, gxd5.rendering.GraphicElement> elements = 1;
}

When my client receives a grpc response, the message is deserialized and the following error is raised:

value.elements[k].toObject is not a function

corresponding to this part of the generated ViewGraphics constructor:

(this.elements = _value.elements
  ? Object.keys(_value.elements).reduce(
      (r, k) =>
        Object.assign(Object.assign({}, r), {
          [k]: _value.elements[k] ? _value.elements[k].toObject() : undefined
        }),
      {}
    )
  : {}),
ViewGraphics.refineValues(this);

In the ViewGraphics constructor the elements are not yet instantiated, so toObject isn't available: the elements are only instantiated after new ViewGraphics, which means toObject cannot be called on them inside the constructor.

best regards
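A hypothetical defensive variant of the generated mapping, sketched with stub types: call toObject() only when the value actually provides it, and otherwise keep the plain object that was passed into the constructor.

```typescript
// Stub standing in for a message-or-plain-object map value
interface MaybeMessage { toObject?(): object; [key: string]: unknown; }

// Map each entry to its plain-object form, tolerating values that are
// already plain objects (as they are inside the constructor)
function mapElementsToObject(
  elements: { [key: string]: MaybeMessage | undefined }
): { [key: string]: object | undefined } {
  return Object.keys(elements).reduce((acc, k) => {
    const v = elements[k];
    return {
      ...acc,
      [k]: v && typeof v.toObject === 'function' ? v.toObject() : v
    };
  }, {} as { [key: string]: object | undefined });
}

const mapped = mapElementsToObject({
  '1': { color: 'red' },                        // plain object: kept as-is
  '2': { toObject: () => ({ color: 'blue' }) }  // message-like: converted
});
// mapped['1'] is the plain object, mapped['2'] is { color: 'blue' }
```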

Error: inject() must be called from an injection context

Hi,
I created a simple Angular library which I then tracked on Bitbucket. I tried to install the library from the Bitbucket URL with npm install {git-repo-url} and use the dist folder in my application.

This is the error I get when I try to use the library installed from Bitbucket:
(screenshot)

This is the folder structure of my library:
(screenshot)

I also set "preserveSymlinks": true. It worked once, but then the same error came back.

Any idea what might be going wrong / how to fix this?
