
cloud-bigtable-cbt-cli's Introduction

Google APIs

This repository contains the original interface definitions of public Google APIs that support both REST and gRPC protocols. Reading the original interface definitions can provide a better understanding of Google APIs and help you to utilize them more efficiently. You can also use these definitions with open source tools to generate client libraries, documentation, and other artifacts.

Building

Bazel

The recommended way to build the API client libraries is through Bazel >= 4.2.2.

First, install bazel.

To build all libraries:

bazel build //...

To test all libraries:

bazel test //...

To build one library in all languages:

bazel build //google/example/library/v1/...

To build the Java package for one library:

bazel build //google/example/library/v1:google-cloud-example-library-v1-java

Bazel packages exist in all the libraries for Java, Go, Python, Ruby, Node.js, PHP and C#.

Overview

Google APIs are typically deployed as API services that are hosted under different DNS names. One API service may implement multiple APIs and multiple versions of the same API.

Google APIs use Protocol Buffers version 3 (proto3) as their Interface Definition Language (IDL) to define the API interface and the structure of the payload messages. The same interface definition is used for both REST and RPC versions of the API, which can be accessed over different wire protocols.

There are several ways of accessing Google APIs:

  1. JSON over HTTP: You can access all Google APIs directly over JSON/HTTP, using a Google API client library or a third-party API client library.

  2. Protocol Buffers over gRPC: You can access Google APIs published in this repository through gRPC, a high-performance binary RPC protocol over HTTP/2. It offers many useful features, including request/response multiplexing and full-duplex streaming.

  3. Google Cloud Client Libraries: You can use these libraries to access Google Cloud APIs. They are based on gRPC for better performance and provide an idiomatic client surface for a better developer experience.

Discussions

This repo contains copies of Google API definitions and related files. For discussions or to raise issues about Google API client libraries, gRPC, or the Google Cloud Client Libraries, please refer to the repositories associated with each area.

Repository Structure

This repository uses a directory hierarchy that reflects the Google API product structure. In general, every API has its own root directory, and each major version of the API has its own subdirectory. The proto package names exactly match the directory: this makes it easy to locate the proto definitions and ensures that the generated client libraries have idiomatic namespaces in most programming languages. Alongside the API directories live the configuration files for the GAPIC toolkit.

NOTE: The major version of an API is used to indicate a breaking change to the API.

Generate gRPC Source Code

To generate gRPC source code for the Google APIs in this repository, first install both Protocol Buffers and gRPC on your local machine, then run make LANGUAGE=xxx all to generate the source code. You will need to integrate the generated source code into your application's build system.

NOTE: The Makefile is only intended to generate source code for the entire repository. It is not intended to produce a linkable client library for a specific API. Please see the other repositories under https://github.com/googleapis for generating linkable client libraries.

Go gRPC Source Code

It is difficult to generate Go gRPC source code from this repository, since Go has a different directory structure. Please use this repository instead.

cloud-bigtable-cbt-cli's People

Contributors

alexoneill, billyjacobson, brandtnewton, broady, ccalok, codyoss, cshaff0524, danielhultqvist, dsymonds, enocom, garye, google-cloud-policy-bot[bot], hegemonic, humbertowastaken, igorbernstein2, jba, jeanbza, jimfulton, justinuang, markduffett, mutianf, rameshdharan, renovate-bot, shweta345, sneakybueno, steveniemitz, telpirion, tritone, trollyxia, yogesh-desai


cloud-bigtable-cbt-cli's Issues

Clean up arg parsing

Each do*() method in the cbt tool parses its own arguments and constructs the appropriate call to the service. It would be nice to refactor these methods to take an argument struct that is composed by a separate function whose only job is to parse arguments (a sketch of one possible shape follows the list below).

Why do this:

  • In some cases, the current situation results in a lot of duplicated code (e.g. doLookup() and doRead()).
  • Each do*() method has multiple concerns (argument parsing and composing the calls to the service), which breaks the principle of orthogonality.
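
A minimal sketch of that split, with hypothetical names (readArgs, parseReadArgs) that are not part of the current cbt code:

package main

import (
	"fmt"
	"strings"
)

// readArgs is a hypothetical argument struct for doRead(): parsing fills it in,
// and the service call is composed from it afterwards.
type readArgs struct {
	table string
	start string
	limit string
}

// parseReadArgs only parses; it performs no Bigtable calls.
func parseReadArgs(args []string) (readArgs, error) {
	if len(args) == 0 {
		return readArgs{}, fmt.Errorf("usage: cbt read <table> [start=<row>] [limit=<row>]")
	}
	ra := readArgs{table: args[0]}
	for _, arg := range args[1:] {
		k, v, ok := strings.Cut(arg, "=")
		if !ok {
			return readArgs{}, fmt.Errorf("bad arg %q", arg)
		}
		switch k {
		case "start":
			ra.start = v
		case "limit":
			ra.limit = v
		default:
			return readArgs{}, fmt.Errorf("unknown arg %q", k)
		}
	}
	return ra, nil
}

func main() {
	fmt.Println(parseReadArgs([]string{"mytable", "start=r1", "limit=r9"}))
}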

feat: allow set value as BigEndian Int64

Is your feature request related to a problem? Please describe.

Starting with version 0.12.0, the cbt CLI can format certain complex types of data stored in table rows. When you use the cbt read or cbt lookup command, the cbt tool can "pretty print" values stored in the rows.
https://cloud.google.com/bigtable/docs/cbt-formatting

As of https://github.com/googleapis/cloud-bigtable-cbt-cli/pull/28/files, cbt read and cbt lookup are able to decode BigEndian Int64 with a format file.

BigEndian Int64 and UTF-8 string are two popular value types. However, cbt set is still only able to set values as UTF-8 strings.

Describe the solution you'd like

Similar to cbt read and cbt lookup, cbt set could be improved to support a format file and encode values accordingly.
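
For illustration, a minimal sketch of the encoding cbt set would need to produce; the helper name is hypothetical and not part of cbt:

package main

import (
	"encoding/binary"
	"fmt"
)

// encodeBigEndianInt64 returns the 8-byte big-endian representation of v,
// which is the layout the read-side format file already knows how to decode.
func encodeBigEndianInt64(v int64) []byte {
	buf := make([]byte, 8)
	binary.BigEndian.PutUint64(buf, uint64(v))
	return buf
}

func main() {
	fmt.Printf("% x\n", encodeBigEndianInt64(42)) // 00 00 00 00 00 00 00 2a
}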

Describe alternatives you've considered

If it is easy to generalize, it would be nice if cbt set could also support the full set of data types, e.g. hexadecimal, JSON, and Protobuf.

Additional context
cbt is quite useful for setting up instances, tables, and column families in the Bigtable emulator (used in development or test environments). I have seen a lot of use cases where the producer and consumer are different applications, often implemented in different languages.

Since cbt is not able to set BigEndian Int64 values, the current workaround is to use a client SDK to mock data in the consumer application's CI. It would be more convenient if cbt set could write these values directly.

Security Policy violation Branch Protection

Allstar has detected that this repository’s Branch Protection security policy is out of compliance. Status:
Branch Protection enforcement is configured in Allstar; however, Branch Protection is not available on this repository. Upgrade to GitHub Pro or make this repository public to enable this feature.
See: https://docs.github.com/en/repositories/configuring-branches-and-merges-in-your-repository/defining-the-mergeability-of-pull-requests/about-protected-branches for more information.
If this is not feasible, then disable Branch Protection policy enforcement for this repository in Allstar configuration.

This issue will auto resolve when the policy is in compliance.

Issue created by Allstar. See https://github.com/ossf/allstar/ for more information. For questions specific to the repository, please contact the owner or maintainer.

Security Policy violation SECURITY.md

Allstar has detected that this repository’s SECURITY.md security policy is out of compliance. Status:
Security policy not enabled.
A SECURITY.md file can give users information about what constitutes a vulnerability and how to report one securely so that information about a bug is not publicly visible. Examples of secure reporting methods include using an issue tracker with private issue support, or encrypted email with a published key.

To fix this, add a SECURITY.md file that explains how to handle vulnerabilities found in your repository. Go to https://github.com/googlestaging/cloud-bigtable-cli/security/policy to enable.

For more information, see https://docs.github.com/en/code-security/getting-started/adding-a-security-policy-to-your-repository.

This issue will auto resolve when the policy is in compliance.

Issue created by Allstar. See https://github.com/ossf/allstar/ for more information. For questions specific to the repository, please contact the owner or maintainer.

Security Policy violation Outside Collaborators

Allstar has detected that this repository’s Outside Collaborators security policy is out of compliance. Status:
Did not find any owners of this repository
This policy requires all repositories to have an organization member or team assigned as an administrator. Either there are no administrators, or all administrators are outside collaborators. A responsible party is required by organization policy to respond to security events and organization requests.

To add an administrator: from the main page of the repository, go to Settings -> Manage Access.
(For more information, see https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories)

Alternatively, if this repository does not have any maintainers, archive or delete it.

This issue will auto resolve when the policy is in compliance.

Issue created by Allstar. See https://github.com/ossf/allstar/ for more information. For questions specific to the repository, please contact the owner or maintainer.

cbt: TestPrintRow failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and I will stop commenting.


commit: a71b81d
buildURL: Build Status, Sponge
status: failed

Test output
    valueformatting_test.go:569: Formatting printed incorrectly:
        wanted
        ----------------------------------------
        r1
          f1:c1                                    @ 1969/12/31-16:00:00.000000
            "Hello!"
          f1:c2                                    @ 1969/12/31-16:00:00.000000
            "\x01\x02"
          f2:person                                @ 1969/12/31-16:00:00.000000
            "\n\x03Jim\x10*\x1a\[email protected]\"\f\n\b555-1212\x10\x01"
        ,
        got
        ----------------------------------------
        r1
          f1:c1                                    @ 1970/01/01-00:00:00.000000
            "Hello!"
          f1:c2                                    @ 1970/01/01-00:00:00.000000
            "\x01\x02"
          f2:person                                @ 1970/01/01-00:00:00.000000
            "\n\x03Jim\x10*\x1a\[email protected]\"\f\n\b555-1212\x10\x01"

Feature request: cbt read cells-per-row option

Is your feature request related to a problem? Please describe.

I was trying to test how Bigtable behaves when you send a zero cells-per-row row filter and reached for cbt read to construct the request, but it didn't have the option.

Describe the solution you'd like

cbt read cells-per-row=n

Describe alternatives you've considered

Hand-rolling the request myself using the SDK.

cbt: TestJSONAndYAML failed

This test failed!

To configure my behavior, see the Flaky Bot documentation.

If I'm commenting on this issue too often, add the flakybot: quiet label and I will stop commenting.


commit: 2be564b
buildURL: Build Status, Sponge
status: failed

Test output
    valueformatting_test.go:584: Formatting printed incorrectly: wanted
        ----------------------------------------
        r1
          f1:json
            age:   2.00
            name: "Brave"
        ,
        got
        ----------------------------------------
        r1
          f1:json
        age:     2.00
        name:   "Brave"
    

Choosing an encoding does not affect the display of a row key

Is your feature request related to a problem? Please describe.
I need to view data that is in bytes. cloud-bigtable-cbt-cli recently merged a PR (#28) that allowed a user to specify the encoding used when displaying byte data. This allows me to properly view the column values, but unfortunately the row key is still encoded in what I believe is UTF-8.

Describe the solution you'd like
I would love to be able to display the column values and the row key with the same encoding.
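
For illustration, a tiny sketch of one way a hex rendering of the row key could look; the helper name is hypothetical and not part of cbt:

package main

import (
	"encoding/hex"
	"fmt"
)

// formatRowKeyHex renders a binary row key as hex so that non-UTF-8 keys stay
// legible, mirroring what the format file already allows for column values.
func formatRowKeyHex(key string) string {
	return hex.EncodeToString([]byte(key))
}

func main() {
	fmt.Println(formatRowKeyHex("\x42\x01\x47\xff")) // 420147ff
}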

Describe alternatives you've considered
I've considered making a separate CLI tool that takes a row key and displays the data however I want, but the team loves CBT CLI for all of its other features and does not want to be switching back and forth between a custom CLI tool and CBT CLI.

Additional context

As you can see, I can correctly read the column data, but the row key (B?G???A?'?8???K?) isn't legible.

(Screenshot from the original issue: Screen Shot 2022-04-14 at 9 13 05 AM)

Add format support for avro

Some customers encode their cell values as Avro. It would be nice to support Avro alongside Protobuf.

See b/224602935
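
A minimal sketch of the decoding step a format-file entry for Avro would need to perform, assuming the github.com/linkedin/goavro/v2 library and a made-up schema; this is illustrative only, not part of cbt:

package main

import (
	"fmt"

	"github.com/linkedin/goavro/v2"
)

func main() {
	// Hypothetical Avro schema describing a cell value.
	codec, err := goavro.NewCodec(`{"type":"record","name":"Person","fields":[{"name":"name","type":"string"},{"name":"age","type":"long"}]}`)
	if err != nil {
		panic(err)
	}

	// cellValue stands in for the raw Avro-encoded bytes read from a Bigtable cell.
	native, _, _ := codec.NativeFromTextual([]byte(`{"name":"Jim","age":42}`))
	cellValue, _ := codec.BinaryFromNative(nil, native)

	// Decoding back to a readable value is the step cbt would perform on read.
	decoded, _, err := codec.NativeFromBinary(cellValue)
	fmt.Println(decoded, err)
}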

`main` download fails to install

Unzipping the main code and trying to install from that package fails with the following error:

Error: cbt.go:41:2: missing go.sum entry for module providing package cloud.google.com/go/bigtable (imported by cloud.google.com/go/cbt); to add:
	go get cloud.google.com/go/cbt

It seems like a dependency got updated in go.mod but not in go.sum?

Steps to reproduce

curl -LO https://github.com/googleapis/cloud-bigtable-cbt-cli/archive/refs/heads/main.zip
unzip main.zip
cd cloud-bigtable-cbt-cli-main && go build .
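
If that is the cause, regenerating the module checksums before building (for example with go mod tidy, or the go get command suggested in the error) should restore agreement between go.mod and go.sum; this is a guess based on the error text, not a verified fix.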

Unpacking `proto.Any`

Describe the solution you'd like

I'd like properties of type proto.Any to be recursively unpacked when formatting the cell values serialized in protobuf.

For example, I have a cell value serialized using the following schema:

import "google/protobuf/any.proto";
message Foo {
   string id = 1;
   google.protobuf.Any bar = 2;
}

message Bar {
   string baz = 1;
}

When writing to Bigtable, I set the value of bar with Any.pack on an instance of Bar. Then I try to inspect the content of the cell. I have all the proto files included in the format-file.

At the moment I get:

id: "123"
bar:
  type_url: "type.googleapis.com/Bar" value: "..."

I'd like cbt to unpack the value to the type specified under "type_url".
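
A minimal sketch of the requested behaviour using the official protobuf-go API, assuming the message types from the format file are registered in a type resolver (wrapperspb.String stands in for the user's Bar message); this is illustrative, not the cbt implementation:

package main

import (
	"fmt"

	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/reflect/protoregistry"
	"google.golang.org/protobuf/types/known/anypb"
	"google.golang.org/protobuf/types/known/wrapperspb"
)

// unpackAny resolves the concrete type named in type_url and returns it as a
// proto.Message, which a formatter could then render recursively.
func unpackAny(a *anypb.Any) (proto.Message, error) {
	return anypb.UnmarshalNew(a, proto.UnmarshalOptions{Resolver: protoregistry.GlobalTypes})
}

func main() {
	a, _ := anypb.New(wrapperspb.String("baz"))
	msg, err := unpackAny(a)
	fmt.Println(msg, err)
}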

Additional context
For comparison, grpcurl is able to identify and unpack proto.Any as long as the type is defined in one of the files added with -import-path and -proto.

Remove deprecated Snapshot feature

The Snapshot feature has been replaced with Backups. Access to Snapshots needs to be removed from cbt:

  1. Provide a warning message to the user when they access the feature.
  2. (After some time) Remove the feature from cbt.

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Edited/Blocked

These updates have been manually edited so Renovate will no longer make changes. To discard all commits and start over, click on a checkbox.

Open

These updates have all been created already. Click a checkbox below to force a retry/rebase of any.

Detected dependencies

gomod
go.mod
  • go 1.21
  • cloud.google.com/go/bigtable v1.23.0
  • github.com/google/go-cmp v0.6.0
  • github.com/jhump/protoreflect v1.16.0
  • golang.org/x/oauth2 v0.19.0
  • golang.org/x/sys v0.19.0
  • google.golang.org/api v0.177.0
  • google.golang.org/grpc v1.63.2
  • google.golang.org/protobuf v1.34.0
  • gopkg.in/yaml.v2 v2.4.0

  • Check this box to trigger a request for Renovate to run again on this repository

[Policy Bot] found one or more issues with this repository.

Policy Bot found one or more issues with this repository.

  • Default branch is 'main'
  • Branch protection is enabled
  • Renovate bot is enabled
  • Merge commits disabled
  • There is a CODEOWNERS file
  • There is a valid LICENSE.md
  • There is a CODE_OF_CONDUCT.md
  • There is a CONTRIBUTING.md
  • There is a SECURITY.md

Add mocks for unit tests

Right now the cbt tool doesn't have robust unit tests for individual CLI functions (doLookup(), doRead(), etc.). These functions should have some tests attached to them (a sketch of one possible shape follows the list below):

  • Encapsulate all of the functions inside of a struct that can be mocked.
  • Add a dependency on the gomock framework.
  • Add a go:generate statement that updates the cbt mock structs.
  • Add tests for each do*() method in cbt, using the mock framework.
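
A minimal sketch of how the wrapper and the go:generate directive might look, with hypothetical names and file paths (this is not the actual cbt structure):

package main

//go:generate mockgen -source=cbt_client.go -destination=cbt_client_mock_test.go -package=main

import (
	"context"

	"cloud.google.com/go/bigtable"
)

// tableReader wraps the Bigtable calls that doRead()/doLookup() make, so tests
// can substitute a gomock-generated fake for a live service.
type tableReader interface {
	ReadRow(ctx context.Context, table, row string, opts ...bigtable.ReadOption) (bigtable.Row, error)
}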

the read option 'keys-only' is misnamed

The documentation for the keys-only option for cbt read says:

keys-only=<true|false>              Whether to print only row keys

However, this isn't what it does: it prints all the columns, too. Really, this option should be named strip-values, to mirror what the underlying Bigtable API request does.

`cbt ls` error message is obtuse when using a bad instance ID

If I run cbt -instance=abc_123 ls, I get an error message that does not help me diagnose my problem:

$ cbt  -instance=abc_123 ls
2023/05/02 15:18:54 -creds flag unset, will use gcloud credential
2023/05/02 15:18:54 -project flag unset, will use gcloud active project
2023/05/02 15:18:54 gcloud active project is "autonomous-mote-782"
2023/05/02 15:18:54 Getting list of tables: rpc error: code = InvalidArgument desc = When parsing 'projects/autonomous-mote-782/instances/abc_123' : Instance name expected in the form 'projects/<project_id>/instances/<instance_id>'.
error details: name = DebugInfo detail = [ORIGINAL ERROR] generic::invalid_argument: When parsing 'projects/autonomous-mote-782/instances/abc_123' : Instance name expected in the form 'projects/<project_id>/instances/<instance_id>'. [google.rpc.error_details_ext] { message: "When parsing \'projects/autonomous-mote-782/instances/abc_123\' : Instance name expected in the form \'projects/<project_id>/instances/<instance_id>\'." } stack =

"Instance name expected in the form 'projects/<project_id>/instances/<instance_id>'" is very confusing since my instance name seemingly does match this pattern.

Conversely, if I do cbt -instance=abc ls, I get a better error message explaining what is wrong:

$ cbt  -instance=abc ls
2023/05/02 15:18:46 -creds flag unset, will use gcloud credential
2023/05/02 15:18:46 -project flag unset, will use gcloud active project
2023/05/02 15:18:47 gcloud active project is "autonomous-mote-782"
2023/05/02 15:18:47 Getting list of tables: rpc error: code = InvalidArgument desc = When parsing 'projects/autonomous-mote-782/instances/abc' : Invalid id for collection instances : Length should be between [6,33], but found 3 'abc'
error details: name = DebugInfo detail = [ORIGINAL ERROR] generic::invalid_argument: When parsing 'projects/autonomous-mote-782/instances/abc' : Invalid id for collection instances : Length should be between [6,33], but found 3 'abc' [google.rpc.error_details_ext] { message: "When parsing \'projects/autonomous-mote-782/instances/abc\' : Invalid id for collection instances : Length should be between [6,33], but found 3 \'abc\'" } stack =

It would be nice if the CLI told me why the instance name was unparseable, or at least showed me the pattern my instance ID needs to match.
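
For reference, Bigtable instance IDs are limited to lowercase letters, digits, and hyphens (6-33 characters), so abc_123 most likely fails on the underscore rather than on its length; surfacing that rule in the error message would address this.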

Support UTF-8 strings for `read` and `lookup` outputs while using `ProtocolBuffer` encoding

Thanks for stopping by to let us know something could be better!

PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.

Is your feature request related to a problem? Please describe.
When I use ProtocolBuffer encoding, I'm frustrated by string values being printed as unreadable bytes.
For example, a Korean string "경동나비엔" is printed as "\352\262\275\353\217\231\353\202\230\353\271\204\354\227\224".

Describe the solution you'd like
I would like cbt to support UTF-8 strings in read and lookup output.

Describe alternatives you've considered
I found that the unreadable byte sequence comes from message.MarshalTextIndent(). If we use message.MarshalJSONIndent() instead, a UTF-8 string is printed correctly, e.g. "경동나비엔". So it would also be good if cbt allowed users to choose prototext or protojson as the output format. Then prototext would still output bytes in octal, but I could choose protojson to see UTF-8 strings.
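
A minimal sketch of the difference, using the official protobuf-go JSON marshaler on a stand-in message (this is not the cbt code path, which formats dynamic messages via jhump/protoreflect):

package main

import (
	"fmt"

	"google.golang.org/protobuf/encoding/protojson"
	"google.golang.org/protobuf/types/known/structpb"
)

func main() {
	// structpb stands in for the dynamically decoded cell value in this sketch.
	msg, err := structpb.NewStruct(map[string]interface{}{"name": "경동나비엔"})
	if err != nil {
		panic(err)
	}

	// A JSON-based marshaler keeps string fields as readable UTF-8 rather than
	// octal-escaped bytes.
	out, err := protojson.MarshalOptions{Multiline: true, Indent: "  "}.Marshal(msg)
	fmt.Println(string(out), err)
}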

Additional context
