
grpc-httpjson-transcoding's Introduction

grpc-httpjson-transcoding

grpc-httpjson-transcoding is a library that implements transcoding so that HTTP/JSON can be converted to gRPC.

It lets you provide your APIs in both gRPC and RESTful style at the same time. The code is used in Istio Proxy and Google Cloud Endpoints to provide an HTTP+JSON interface to gRPC services.
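As a sketch of what this enables, a gRPC method annotated with a `google.api.http` option can be called with a plain HTTP request. The service and message names below are illustrative, not from this repository:

```proto
syntax = "proto3";

import "google/api/annotations.proto";

service Shelves {
  // GET /v1/shelves/123 is transcoded to GetShelf(shelf_id: 123).
  rpc GetShelf(GetShelfRequest) returns (Shelf) {
    option (google.api.http) = {
      get: "/v1/shelves/{shelf_id}"
    };
  }
}

message GetShelfRequest {
  int64 shelf_id = 1;
}

message Shelf {
  int64 id = 1;
  string theme = 2;
}
```

With this annotation in place, a transcoder can map `GET /v1/shelves/123` to a `GetShelf` call with `shelf_id = 123`, and render the `Shelf` response back as JSON.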

CI Status

Fuzzing Status

Develop

Bazel is used for build and dependency management. The following commands build and test sources:

$ bazel build //...
$ bazel test //...

Use the following script to check and fix code format:

$ script/check-style

Toolchain

The Bazel build system defaults to using clang 10 to enable reproducible builds.

Continuous Integration

This repository is integrated with OSS Prow. Prow runs the presubmit script on each pull request to verify that tests pass. Note:

  • A PR can only be submitted if the presubmit job passes.
  • If you are an outside contributor, Prow may not run until a Googler LGTMs the PR.

Contribution

See CONTRIBUTING.md.

License

grpc-httpjson-transcoding is licensed under the Apache 2.0 license. See LICENSE for more details.

grpc-httpjson-transcoding's People

Contributors

benjaminp, haberman, justin-mp, jwfang, lizan, mangchiandjjoe, nalexpear, nareddyt, qiwzhang, rkpagadala, shuoyang2016, taoxuy, vadorovsky


grpc-httpjson-transcoding's Issues

Does this support "filling in" default protobuf fields?

Protobuf won't serialize fields that have default values. This is not a problem when both sides use the native protobuf codec, but for JSON it would be nice to have all fields present, whether or not they hold their default values.
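On the transcoder side this would amount to an option to emit defaults when printing JSON (protobuf's C++ JSON printer exposes a similar switch, `JsonPrintOptions::always_print_primitive_fields`). The idea can be sketched in Python with a hypothetical flat schema of field defaults:

```python
import json

# Hypothetical schema: field name -> proto3 default value.
SCHEMA = {"name": "", "age": 0, "active": False}

def fill_defaults(payload: dict, schema: dict) -> dict:
    """Return a copy of payload with missing fields set to their defaults."""
    filled = dict(schema)   # start from all defaults
    filled.update(payload)  # overlay the fields that were actually sent
    return filled

# A sender that omits default-valued fields produces sparse JSON...
sparse = json.loads('{"name": "alice"}')
# ...which the receiver can expand so every field is present.
print(fill_defaults(sparse, SCHEMA))
```

A real implementation would of course derive the defaults from the message descriptor rather than a hand-written schema.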

Supporting cmake

Is the project open to supporting CMake in addition to Bazel?

UrlUnescapeString irreversibly transforms data

UrlUnescapeString produces the same output for semantically different strings, for example:

UrlUnescapeString("%2523", false) == "%23"
UrlUnescapeString("%23", false) == "%23"

This makes it impossible to recover the user's original intent.
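The collision can be reproduced with a toy single-pass unescaper. This is a simplified sketch, not the library's actual implementation, and the reserved-character set here is an assumption:

```python
# Characters assumed to be "reserved" (left escaped when
# unescape_reserved is false).
RESERVED = set("#?/:@!$&'()*+,;=")

def url_unescape(s: str, unescape_reserved: bool = False) -> str:
    """Single-pass percent-decoding; reserved chars may stay escaped."""
    out, i = [], 0
    while i < len(s):
        if s[i] == "%" and i + 2 < len(s):
            ch = chr(int(s[i + 1:i + 3], 16))
            if unescape_reserved or ch not in RESERVED:
                out.append(ch)          # decode "%41" -> "A", "%25" -> "%"
            else:
                out.append(s[i:i + 3])  # keep e.g. "%23" escaped
            i += 3
        else:
            out.append(s[i])
            i += 1
    return "".join(out)

# Two semantically different inputs collapse to the same output:
print(url_unescape("%2523"))  # "%25" decodes to "%", leaving "%23"
print(url_unescape("%23"))    # reserved "#" is kept escaped: "%23"
```

Because both inputs yield `"%23"`, a consumer of the output cannot tell whether the user sent a literal `%23` or an escaped `#`.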

Verbs are not distinguished from other segments

Say I want to have resources such as:

/people (a collection of people)
/people/{personId} (an individual person)

And I want both of these resource types to accept a custom verb called 'getResourceType'.

If someone makes a request to /people:getResourceType, I want it to say 'person collection'.
If someone makes a request to /people/xyz:getResourceType, I want it to say 'person'.


So my .proto might look like this:

service API {
  rpc ListPeople(Empty) returns (ListPeopleResponse) {
    option (google.api.http) = {
      get: "/people"
    };
  }
   
  rpc GetPerson(GetPersonRequest) returns (GetPersonResponse) {
    option (google.api.http) = {
      get: "/{person_id=people/*}"
    };
  }

  rpc GetResourceType(GetResourceTypeRequest) returns (GetResourceTypeResponse) {
    option (google.api.http) = {
      get: "/{resource_name=**}:getResourceType"
    };
  }
}

message GetPersonRequest {
  string person_id = 1;
}
message GetResourceTypeRequest {
  string resource_name = 1;
}
...

Then I might set up grpc-httpjson-transcoding using, say, Envoy.

When: I make a call to this service at /people:getResourceType

Expected: Service receives a GetResourceType call with resource_name = "people" (since that is the only method that accepts the getResourceType verb)

Actual: Service receives a GetPerson call with person_id = "getResourceType"


In my understanding, this is due to the ':' character being replaced with a '/' without remembering that the last segment is a verb.
https://github.com/grpc-ecosystem/grpc-httpjson-transcoding/blob/master/src/include/grpc_transcoding/path_matcher.h#L376
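One possible shape of a fix is to split a trailing verb off before the path is normalized, so the verb survives matching. This is a hypothetical sketch, not the library's code:

```python
def split_verb(path: str):
    """Split a trailing custom verb (':verb' in the last segment) off a path.

    Remembering the verb, instead of folding ':' into the path, keeps
    '/people:getResourceType' from being mis-read as the segment
    'getResourceType' under '/people/'.
    """
    last_segment_start = path.rfind("/") + 1
    colon = path.find(":", last_segment_start)
    if colon == -1:
        return path, None
    return path[:colon], path[colon + 1:]

print(split_verb("/people:getResourceType"))      # ('/people', 'getResourceType')
print(split_verb("/people/xyz:getResourceType"))  # ('/people/xyz', 'getResourceType')
print(split_verb("/people/xyz"))                  # ('/people/xyz', None)
```

With the verb held separately, the matcher could require it to match literally and only then bind the remaining segments to `resource_name`.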

What's the usage scenario of this tool?

Hi All,

As the title suggests, I am wondering if this package can do the following:

  • Capture a stream of gRPC requests and responses through network packet capturing (assuming the captured packets are purely from one gRPC connection between one server and one client).
  • Feed the stream into this transcoder. Would the transcoder actually produce the corresponding JSON requests and responses directly?
  • Or do I need to extract the protocol buffer binary stream out of the gRPC stream and feed that to the transcoder, meaning that the transcoder does not handle the protocol metadata in gRPC?

Error in building packages

After cloning, I ran the Bazel command to build the project and ran into the following errors:
Starting local Bazel server and connecting to it...
ERROR: /home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/BUILD:597:1: Traceback (most recent call last):
File "/home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/BUILD", line 597
internal_gen_well_known_protos_java(srcs = WELL_KNOWN_PROTOS)
File "/home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/protobuf.bzl", line 266, in internal_gen_well_known_protos_java
Label(("%s//protobuf_java" % REPOSITOR...))
File "/home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/protobuf.bzl", line 266, in Label
REPOSITORY_NAME
builtin variable 'REPOSITORY_NAME' is referenced before assignment.
ERROR: /home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/BUILD:101:1: Target '@protobuf_git//:windows' contains an error and its package is in error and referenced by '@protobuf_git//:protobuf'
ERROR: /home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/BUILD:101:1: Target '@protobuf_git//:windows_msvc' contains an error and its package is in error and referenced by '@protobuf_git//:protobuf'
ERROR: /home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/BUILD:101:1: Target '@protobuf_git//:android' contains an error and its package is in error and referenced by '@protobuf_git//:protobuf'
ERROR: /home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/BUILD:101:1: Target '@protobuf_git//:windows' contains an error and its package is in error and referenced by '@protobuf_git//:protobuf'
ERROR: /home/basil/.cache/bazel/_bazel_basil/c7fbbb4ed84ea4e033d6d8801ef24b13/external/protobuf_git/BUILD:101:1: Target '@protobuf_git//:windows_msvc' contains an error and its package is in error and referenced by '@protobuf_git//:protobuf'
ERROR: /home/basil/Desktop/GSOC2019/grpc-httpjson-transcoding/repositories.bzl:47:9: Target '@protobuf_git//:protobuf' contains an error and its package is in error and referenced by '//external:protobuf'
ERROR: Analysis of target '//test:request_message_translator_test' failed; build aborted: Analysis failed
INFO: Elapsed time: 4.880s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (15 packages loaded, 87 targets configured)
Fetching @com_google_absl; fetching
Fetching @googleapis_git; fetching
Fetching @googletest_git; fetching

Question: Behavior when path and body in HTTP request map to same field

I have a proto definition, modified from the greeter sample, shown below.
Note that the URI path has a segment for "more.inner_name" and the body also maps to the field "more".

// The greeting service definition.
service Greeter {
  // Sends a greeting
  rpc SayHello (HelloRequest) returns (HelloReply) 
  {
      option (google.api.http) = 
      {
        get: "/api1/{name}/{more.inner_name}"
        body: "more"
      };
  }
}

message HelloInner {
    string inner_name = 1;
}

// The request message containing the user's name.
message HelloRequest {
  string name = 1;
  HelloInner more = 2;
}

I am issuing a request:
$ curl http://0.0.0.0:5000/api1/hh/printme -i -H "hello: brother" --raw --http1.1 --data '{"inner_name":"frombody"}' --request GET

On the gRPC server, what is the expected body content? I see that inner_name is set to "frombody". Is this behavior guaranteed to remain this way, or could inner_name become "printme" in the future?
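Whichever value wins comes down to the order in which the two bindings are written into the request message. A last-writer-wins sketch (the `bind` helper is hypothetical, field names are from the proto above) reproduces the observed result if the body is applied after the path binding:

```python
def bind(request: dict, path: str, value) -> None:
    """Set a dotted field path like 'more.inner_name' on a nested dict."""
    *parents, leaf = path.split(".")
    for p in parents:
        request = request.setdefault(p, {})
    request[leaf] = value

# Observed behavior matches applying the body after the path binding,
# so the body value wins; reversing the order would flip the answer.
req = {}
bind(req, "name", "hh")
bind(req, "more.inner_name", "printme")    # from the URI path
bind(req, "more.inner_name", "frombody")   # from the request body
print(req)  # {'name': 'hh', 'more': {'inner_name': 'frombody'}}
```

Note this is only a model of the observable behavior; the transcoder's actual application order is what the question is asking to have guaranteed.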

Integrating with OSS-Fuzz

Greetings grpc-httpjson-transcoding developers and contributors,

We’re reaching out because your project is an important part of the open source ecosystem, and we’d like to invite you to integrate with our fuzzing service, OSS-Fuzz. OSS-Fuzz is a free fuzzing infrastructure you can use to identify security vulnerabilities and stability bugs in your project. OSS-Fuzz will:

  • Continuously run at scale all the fuzzers you write.
  • Alert you when it finds issues.
  • Automatically close issues after they’ve been fixed by a commit.

Many widely used open source projects like OpenSSL, FFmpeg, LibreOffice, and ImageMagick are fuzzing via OSS-Fuzz, which helps them find and remediate critical issues.

Even though typical integrations can be done in < 100 LoC, we have a reward program in place which aims to recognize folks who are not just contributing to open source, but are also working hard to make it more secure.

We want to stress that anyone who meets the eligibility criteria and integrates a project with OSS-Fuzz is eligible for a reward.

If you're not interested in integrating with OSS-Fuzz, it would be helpful for us to understand why—lack of interest, lack of time, or something else—so we can better support projects like yours in the future.

If we’ve missed your question in our FAQ, feel free to reply or reach out to us at [email protected].

Thanks!

Tommy
OSS-Fuzz Team

Is a map in a query param supported?

In my use case my GetSomethingRequest includes a map<string, string>, and I didn't manage to pass the map as a query param. So I have to attach a body to the GET request, which is bad. This is how I define the messages and service:

message Key {
    map<string, string> data = 1;
    string field = 2;
}
message GetSomethingRequest {
    string parent = 1;
    Key key = 2;
}
service SomeService {
    rpc GetSomething(GetSomethingRequest) returns (Something) {
        option (google.api.http) = {
            get: "/v1/parent/{parent}/somethings"
        };
    }
}

I tried the following, and none of them work.

// also tried to replace [, ], {, }, :, = with percent encode.
// the first two are following https://github.com/grpc-ecosystem/grpc-gateway/pull/535
GET /v1/parent/parentname/somethings?field=aa&data[abc]=cba
GET /v1/parent/parentname/somethings?key.field=aa&key.data[abc]=cba
GET /v1/parent/parentname/somethings?key.field=aa&key.data={abc:cba}

So I wonder: is a map of primitives supported as a query param? If not, are there any plans to add support for this? Thanks!
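For reference, the bracket syntax from the linked grpc-gateway PR could be parsed along these lines. This is a hypothetical sketch, not something grpc-httpjson-transcoding is documented to support:

```python
from urllib.parse import parse_qsl

def parse_query(qs: str) -> dict:
    """Parse a query string, treating 'key.data[abc]=cba' as a map entry."""
    out = {}
    for key, value in parse_qsl(qs):
        if key.endswith("]") and "[" in key:
            # 'key.data[abc]' -> field 'key.data', map key 'abc'
            field, _, map_key = key[:-1].partition("[")
            out.setdefault(field, {})[map_key] = value
        else:
            out[key] = value
    return out

print(parse_query("key.field=aa&key.data[abc]=cba"))
```

A full implementation would additionally resolve the dotted prefixes (`key.field`, `key.data`) into nested message fields.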

Error in build

Trying to build on Windows but having issues:

PS E:\src\grpc-httpjson-transcoding> bazel build //...
Starting local Bazel server and connecting to it...
WARNING: --enable_bzlmod is set, but no MODULE.bazel file was found at the workspace root. Bazel will create an empty MODULE.bazel file. Please consider migrating your external dependencies from WORKSPACE to MODULE.bazel. For more details, please refer to https://github.com/bazelbuild/bazel/issues/18958.
ERROR: C:/users/joss/_bazel_joss/qwq6hdnq/external/io_bazel_rules_docker/platforms/BUILD:78:9: in constraint_values attribute of platform rule @@io_bazel_rules_docker//platforms:image_transition: '@@io_bazel_rules_docker//platforms:image_transition_cpu' does not have mandatory providers: 'ConstraintValueInfo'
ERROR: C:/users/joss/_bazel_joss/qwq6hdnq/external/io_bazel_rules_docker/platforms/BUILD:78:9: in constraint_values attribute of platform rule @@io_bazel_rules_docker//platforms:image_transition: '@@io_bazel_rules_docker//platforms:image_transition_os' does not have mandatory providers: 'ConstraintValueInfo'
ERROR: C:/users/joss/_bazel_joss/qwq6hdnq/external/io_bazel_rules_docker/platforms/BUILD:78:9: Analysis of target '@@io_bazel_rules_docker//platforms:image_transition' failed
ERROR: E:/src/grpc-httpjson-transcoding/perf_benchmark/BUILD:73:9: Target @@io_bazel_rules_docker//platforms:image_transition was referenced as a platform, but does not provide PlatformInfo
ERROR: Analysis of target '//perf_benchmark:benchmark_main_image' failed; build aborted
INFO: Elapsed time: 26.266s, Critical Path: 0.09s
INFO: 1 process: 1 internal.
ERROR: Build did NOT complete successfully
FAILED:
    Fetching repository @@go_sdk; starting
    Fetching repository @@fuzzing_py_deps; Extracting wheels
    Fetching repository @@local_jdk; starting
    Fetching repository @@bazel_skylib~; starting
    Fetching repository @@apple_support~; starting
    Fetching repository @@protobuf~; starting
    Fetching repository @@rules_cc~; starting

Migrate to use prow GitHub app instead of bot account

Part of GoogleCloudPlatform/oss-test-infra#1171

Why

OSS Prow currently uses a bot personal access token (PAT) for authentication with the GitHub APIs, which has a global rate limit of 5000 requests per hour. This does not scale well with many tenants. Switching over to a GitHub app grants a rate limit per installation, which would greatly alleviate the rate-limit exhaustion problem we have seen lately.

Action Item

Install the Google OSS Prow app on this repo by visiting https://github.com/apps/google-oss-prow

Add mapping of streaming APIs to newline-delimited JSON streams

This issue is a copy of cloudendpoints/esp#728

Hi!
Please consider adding support for newline-delimited JSON streams, as in https://github.com/grpc-ecosystem/grpc-gateway

Example exists in grpc-ecosystem/grpc-gateway#581

This would be useful when an API returns 100k+ lines of statistics on advertising objects, without loading everything into memory. =) I really want to use the protocol for exchanging such data, but right now, unfortunately, I can't. Of course, there may be performance problems with serialization and deserialization; I am ready to discuss this, and maybe the problem can be solved differently.

In general, I would like to have an API based on protocol buffers and use the authorization check at the ESP level. Some APIs need to return a large stream of data (in our case, usually statistics): I already receive a stream of 100k+ CSV rows per request from an external system (for example, Google Ads), and I want to redirect it straight to the output stream, possibly with minimal processing and JSON mapping. I don't want to save the data to cloud storage or something like that and hand out a link to it; I want to serialize the answer immediately. Surely this has already been solved somehow at Google; maybe you can tell me how to do it correctly from Google's point of view?
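The requested mapping is simple to sketch: each streamed message becomes one JSON object per line (NDJSON), so nothing has to be buffered. The field names below are made up for illustration:

```python
import io
import json

def write_ndjson(messages, out) -> None:
    """Stream each message as one JSON object per line (NDJSON),
    so the writer never holds the whole response in memory."""
    for msg in messages:
        out.write(json.dumps(msg, separators=(",", ":")) + "\n")

# Hypothetical streamed statistics rows:
buf = io.StringIO()
write_ndjson([{"ad_id": 1, "clicks": 10}, {"ad_id": 2, "clicks": 7}], buf)
print(buf.getvalue(), end="")
```

On the reading side, each line can be fed to `json.loads` independently, which is what makes the format suitable for server-streaming responses.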

Thank you in advance!

This project is using internal-only protobuf APIs that may soon change namespaces or disappear.

Hi there, I work on protobufs. I noticed your project is using APIs from google/protobuf/util/internal:

#include "google/protobuf/util/internal/json_stream_parser.h"
#include "google/protobuf/util/internal/object_writer.h"

https://github.com/grpc-ecosystem/grpc-httpjson-transcoding/blob/master/src/json_request_translator.cc#L23-L24

These APIs have "internal" in the include path, and are not for end-user consumption. We will likely be removing these soon, or at least changing their namespace to make it even clearer that these APIs are not for users.

This came to my attention because of this PR in our repo:

protocolbuffers/protobuf#5939 (comment)
