
nodejs-bigtable's Introduction



Cloud Bigtable Client Library for Node.js

A comprehensive list of changes in each version may be found in the CHANGELOG.

Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.

Table of contents:

Quickstart
Samples
Supported Node.js Versions
Versioning
Contributing
License

Before you begin

  1. Select or create a Cloud Platform project.
  2. Enable billing for your project.
  3. Enable the Cloud Bigtable API.
  4. Set up authentication with a service account so you can access the API from your local workstation.

Installing the client library

npm install @google-cloud/bigtable

Using the client library

// Imports the Google Cloud client library
const {Bigtable} = require('@google-cloud/bigtable');

const bigtable = new Bigtable();

// Values referenced in the comments below:
const INSTANCE_ID = 'my-bigtable-instance';
const TABLE_ID = 'my-table';
async function quickstart() {
  // Connect to an existing instance: my-bigtable-instance
  const instance = bigtable.instance(INSTANCE_ID);

  // Connect to an existing table: my-table
  const table = instance.table(TABLE_ID);

  // Read a row from my-table using a row key
  const [singleRow] = await table.row('r1').get();

  // Print the row key and data (column value, labels, timestamp)
  const rowData = JSON.stringify(singleRow.data, null, 4);
  console.log(`Row key: ${singleRow.id}\nData: ${rowData}`);
}
quickstart();

Samples

Samples are in the samples/ directory. Each sample's README.md has instructions for running its sample.

Sample | Source Code | Try it
Delete Snippets | source code | Open in Cloud Shell
Filter Snippets | source code | Open in Cloud Shell
Instances | source code | Open in Cloud Shell
Quickstart | source code | Open in Cloud Shell
Read Snippets | source code | Open in Cloud Shell
Tableadmin | source code | Open in Cloud Shell
Write Batch | source code | Open in Cloud Shell
Write Conditionally | source code | Open in Cloud Shell
Write Increment | source code | Open in Cloud Shell
Simple Insert | source code | Open in Cloud Shell

The Cloud Bigtable Node.js Client API Reference documentation also contains samples.

Supported Node.js Versions

Our client libraries follow the Node.js release schedule. Libraries are compatible with all current active and maintenance versions of Node.js. If you are using an end-of-life version of Node.js, we recommend that you update as soon as possible to an actively supported LTS version.

Google's client libraries support legacy versions of Node.js runtimes on a best-efforts basis with the following warnings:

  • Legacy versions are not tested in continuous integration.
  • Some security patches and features cannot be backported.
  • Dependencies cannot be kept up-to-date.

Client libraries targeting some end-of-life versions of Node.js are available, and can be installed through npm dist-tags. The dist-tags follow the naming convention legacy-(version). For example, npm install @google-cloud/bigtable@legacy-8 installs client libraries for versions compatible with Node.js 8.

Versioning

This library follows Semantic Versioning.

This library is considered to be stable. The code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against stable libraries are addressed with the highest priority.

More Information: Google Cloud Platform Launch Stages

Contributing

Contributions welcome! See the Contributing Guide.

Please note that this README.md, the samples/README.md, and a variety of configuration files in this repository (including .nycrc and tsconfig.json) are generated from a central template. To edit one of these files, make an edit to its template in the central templates directory.

License

Apache Version 2.0

See LICENSE

nodejs-bigtable's People

Contributors

ajaaym, alexander-fenster, avaksman, bcoe, billyjacobson, callmehiphop, crwilcox, danieljbruce, dpebot, fhinkel, gcf-owl-bot[bot], greenkeeper[bot], hegemonic, igorbernstein2, jkwlui, justinbeckwith, kolea2, kolodny, laljikanjareeya, muraliqlogic, mutianf, release-please[bot], renovate-bot, renovate[bot], sduskis, sofisl, stephenplusplus, summer-ji-eng, vijay-qlogic, yoshi-automation

nodejs-bigtable's Issues

Auto create family does not work

The autoCreate option does not work for column families.

Environment details

OS: macos 10.13.3
Node.js version: 8.10.0
npm version: 5.6.0
yarn version: 1.3.2
google-cloud-node version: master branch (0.13)

Steps to reproduce

const table = bt.table('TestAutoCreate');
await table.get({ autoCreate: true });

const fam = table.family('AutoCreateFamily');
await fam.get({ autoCreate: true });
 FamilyError: Column family not found: projects/teo-dev-de/instances/teo-dev-de/tables/TestAutoCreate/columnFamilies/AutoCreateFamily.
    at /Users/moander/dev/uzi/api/node_modules/@google-cloud/bigtable/src/family.js:355:17
    at /Users/moander/dev/uzi/api/node_modules/@google-cloud/bigtable/src/table.js:814:5
    at Immediate._onImmediate (/Users/moander/dev/uzi/api/node_modules/@google-cloud/bigtable/src/table.js:871:16)

getTables returns empty metadata.columnFamilies object

table.get() returns the column families, but the metadata.columnFamilies object is empty when the table comes from getTables().

t1 { AutoCreateFamily: { gcRule: null } }

t2 {}

Environment details

OS: macos 10.13.3
Node.js version: 8.10.0
npm version: 5.6.0
yarn version: 1.3.2
@google-cloud/bigtable version: master branch (0.13)

Steps to reproduce

    let [t1] = await table.get();
    console.log('t1', t1.metadata.columnFamilies);

    let [tables] = await bt.getTables();
    let t2 = tables.find(t => t.name === 'TestAutoCreate');
    console.log('t2', t2.metadata.columnFamilies);

Convert to ES classes

We have code like this:

if (!(this instanceof ChunkTransformer)) {
  return new ChunkTransformer(options);
}

That won't work with TypeScript :)
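
For reference, a minimal sketch of the class-based equivalent (assuming ChunkTransformer keeps extending stream.Transform, as the current source does):

const {Transform} = require('stream');

class ChunkTransformer extends Transform {
  constructor(options = {}) {
    // ES classes throw when called without `new`, so the instanceof
    // guard simply goes away.
    super({objectMode: true});
    this.options = options;
  }
}

const transformer = new ChunkTransformer({decode: false});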

Incorrect hyperlinks in documentation

The following hyperlinks in the parameter descriptions are incorrect:

Page Address | Nonworking Hyperlinks
createInstance(request, options, callback) returns Promise | Hyperlink to Instance is 404
getInstance(request, options, callback) returns Promise | Hyperlink to Instance is 404
listInstances(request, options, callback) returns Promise | Hyperlink to ListInstancesResponse is 404
updateInstance(request, options, callback) returns Promise | Hyperlink to Type, State and Instance is 404
partialUpdateInstance(request, options, callback) returns Promise | Hyperlink to Instance and FieldMask is 404
createCluster(request, options, callback) returns Promise | Hyperlink to Cluster is 404
getCluster(request, options, callback) returns Promise | Hyperlink to Cluster is 404
listClusters(request, options, callback) returns Promise | Hyperlink to ListInstancesResponse is 404
updateCluster(request, options, callback) returns Promise | Hyperlink to State is 404
createAppProfile(request, options, callback) returns Promise | Hyperlink to AppProfile is 404
getAppProfile(request, options, callback) returns Promise | Hyperlink to AppProfile is 404
listAppProfiles(request, options, callback) returns Promise | Hyperlink to AppProfile and ListAppProfilesResponse is 404
listAppProfilesStream(request, options) returns Stream | Hyperlink to AppProfile is 404
updateAppProfile(request, options, callback) returns Promise | Hyperlink to AppProfile and FieldMask is 404
getIamPolicy(request, options, callback) returns Promise | Hyperlink to Policy is 404
setIamPolicy(request, options, callback) returns Promise | Hyperlink to Policy is 404
createTable(request, options, callback) returns Promise | Hyperlink to Table and Split is 404
listTables(request, options, callback) returns Promise | Hyperlink to View, Table and ListTableResponse is 404
listTablesStream(request, options) returns Stream | Hyperlink to View and Table is 404
getTable(request, options, callback) returns Promise | Hyperlink to View and Table is 404
modifyColumnFamilies(request, options, callback) returns Promise | Hyperlink to Modification and Table is 404
generateConsistencyToken(request, options, callback) returns Promise | Hyperlink to GenerateConsistencyTokenResponse is 404
checkConsistency(request, options, callback) returns Promise | Hyperlink to CheckConsistencyResponse is 404
snapshotTable(request, options, callback) returns Promise | Hyperlink to Duration is 404
getSnapshot(request, options, callback) returns Promise | Hyperlink to Snapshot is 404
listSnapshots(request, options, callback) returns Promise | Hyperlink to Snapshot and ListSnapshotsResponse is 404
listSnapshotsStream(request, options) returns Stream | Hyperlink to Snapshot is 404
readRows(request, options) returns Stream | Hyperlink to RowSet, RowFilter and ReadRowsResponse is 404
sampleRowKeys(request, options) returns Stream | Hyperlink to SampleRowKeysResponse is 404
mutateRow(request, options, callback) returns Promise | Hyperlink to Mutation and MutateRowResponse is 404
mutateRows(request, options) returns Stream | Hyperlink to MutateRowsResponse is 404
checkAndMutateRow(request, options, callback) returns Promise | Hyperlink to RowFilter, Mutation and CheckAndMutateRowResponse is 404
readModifyWriteRow(request, options, callback) returns Promise | Hyperlink to ReadModifyWriteRule and ReadModifyWriteRowResponse is 404

Increment overflow

I tried to increment a value by Number.MAX_SAFE_INTEGER

Incrementing beyond the int64 range gives you a string in return:

      "hulahoi": [
        {
          "value": "\u0000 \u0000\u0000\u0000\u0000\u0000\u0001",
          "labels": [],
          "timestamp": "1524579337814000"
        }
      ],

Environment details

OS: macos 10.13.3
Node.js version: 8.10.0
npm version: 5.6.0
yarn version: 1.3.2
@google-cloud/bigtable version: master branch (0.13)

Steps to reproduce

    await table.row('gwashington').increment('fam1:hulahoi');
    await table.row('gwashington').increment('fam1:hulahoi', Number.MAX_SAFE_INTEGER);
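
For context, the gap here is between the server's int64 cells and JavaScript's number type, which is only exact up to 2^53 - 1:

// int64 holds values up to 2^63 - 1, but Number.MAX_SAFE_INTEGER
// (2^53 - 1) is the ceiling for exact JS numbers, so a counter past
// that point cannot be decoded back into a plain number:
console.log(Number.MAX_SAFE_INTEGER);       // 9007199254740991
console.log(Number.isSafeInteger(2 ** 53)); // false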

Convert API to use GAPIC

Continuing from "Bigtable: Convert grpc APIs to use GAPIC", Bigtable is the only API left to complete these steps:

  • Bigtable
    • Generated files are in repo
    • Handwritten API is using generated API
    • Passing through gaxOptions is available on all requests
    • Automatic projectId insertion works

Switch to int64-buffer

We currently use node-int64 which is slower and less correct than int64-buffer:

const Int64BE = require('int64-buffer').Int64BE;
const Int64 = require('node-int64');

const test = (number) => {
  const int64 = new Int64(number);
  const int64BE = new Int64BE(number);
  console.log('int64', int64.toNumber())
  console.log('int64BE', int64BE.toNumber())
  console.log({
    MAX_SAFE_INTEGER: Number.MAX_SAFE_INTEGER,
    isNumberTooBig:  int64BE.toNumber() > Number.MAX_SAFE_INTEGER,
  })
  console.log('int64', int64.toString())
  console.log('int64 Matches', int64.toString() === number)
  console.log('int64BE', int64BE.toString())
  console.log('int64BE Matches', int64BE.toString() === number)
  console.log('equals', int64.toBuffer().equals(int64BE.toBuffer()))
}

const a = number => new Int64(number).toBuffer();
const b = number => (new Int64BE(number)).toBuffer();

console.time('Int64');
for (var i = 0; i < 100000; i++) {
  a(123)
}
console.timeEnd('Int64');

console.time('Int64BE');
for (var i = 0; i < 100000; i++) {
  b(123)
}
console.timeEnd('Int64BE');

test('1234567890123456789')
test('9007199254740991')

outputs:

Int64: 162.114ms
Int64BE: 20.978ms
int64 Infinity
int64BE 1234567890123456800
{ MAX_SAFE_INTEGER: 9007199254740991, isNumberTooBig: true }
int64 Infinity
int64 Matches false
int64BE 1234567890123456789
int64BE Matches true
equals false
int64 -Infinity
int64BE 9007199254740991
{ MAX_SAFE_INTEGER: 9007199254740991, isNumberTooBig: false }
int64 -Infinity
int64 Matches false
int64BE 9007199254740991
int64BE Matches true
equals false

Bigtable.Filter is undefined

I am able to successfully create tables and families and add rows to my table. However, when I need to use the Filter command, Filter comes back as undefined. I can get all rows in the table, but if I need to filter out some rows using Filter, I am not able to.

Environment details

  • OS: Mac High Sierra
  • Node.js version: 8.11.1
  • npm version: 5.6.0
  • @google-cloud/bigtable version: 0.13.0

Steps to reproduce

  1. Ran NPM install and installed the package
  2. Ran the following code:
const Bigtable = require('@google-cloud/bigtable');
console.log(Bigtable); // Returns below output

/*
{ [Function: Bigtable]
  Cluster: { [Function: Cluster] getLocation_: [Function], getStorageType_: [Function] },
  Instance: [Function: Instance],
  v2: 
   { BigtableClient: [Function: BigtableClient],
     BigtableInstanceAdminClient: [Function: BigtableInstanceAdminClient],
     BigtableTableAdminClient: [Function: BigtableTableAdminClient] } }
*/
console.log(Bigtable.Filter); // Returns Undefined

Any idea if anything I am doing is incorrect?
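
For what it's worth, row filtering in this version is exposed as a filter option on the read calls rather than as a public Filter class; a sketch (the option shape matches what other issues in this list use):

table.createReadStream({
  filter: [
    {family: 'my-family'},    // keep only this column family
    {column: {cellLimit: 1}}, // keep only the newest cell per column
  ],
})
  .on('data', row => console.log(row.id))
  .on('error', console.error);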

Table Row mutation use Admin API?

Today I was playing around with this library and noticed that it ate up all my Bigtable Admin API quota while I was inserting rows into a pre-created table.
Shouldn't any row mutation be performed through the Bigtable data API, which doesn't have a quota?
I am running a development Bigtable instance at the moment.

Environment details

  • OS: Alpine
  • Node.js version: 8.9
  • npm version: 5.5
  • @google-cloud/bigtable version: 0.11.1

Steps to reproduce

const Bigtable = require('@google-cloud/bigtable');

const bigtable = new Bigtable({
  projectId: process.env.GCLOUD_PROJECT,
  credentials: JSON.parse(process.env.BIGTABLE_SERVICE_ACCOUNT) // Bigtable User service account
})
const instance = bigtable.instance('my-instance')
const table = instance.table('my-table')

for (var i = 0; i < 1000; i++) { // 7000 is the daily quota limit on Admin API Table Writes
  table.insert([{
    key: 'my-key-' + i,
    data: {
      family: {
        column: i
      }
    }
  }])
}
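
Incidentally, table.insert accepts an array of entries, so the loop above can be collapsed into a single batched request (a sketch; this cuts the request count regardless of which quota the writes are charged against):

const entries = [];
for (var i = 0; i < 1000; i++) {
  entries.push({
    key: 'my-key-' + i,
    data: {
      family: {
        column: i
      }
    }
  });
}

// One request carrying 1000 mutations instead of 1000 requests.
table.insert(entries).catch(console.error);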
  

An in-range update of @google-cloud/nodejs-repo-tools is breaking the build 🚨

Version 2.1.4 of @google-cloud/nodejs-repo-tools was just published.

Branch Build failing 🚨
Dependency @google-cloud/nodejs-repo-tools
Current Version 2.1.3
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

@google-cloud/nodejs-repo-tools is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • โŒ continuous-integration/appveyor/branch Waiting for AppVeyor build to complete Details
  • โŒ ci/circleci: node4 CircleCI is running your tests Details
  • โŒ ci/circleci: node7 CircleCI is running your tests Details
  • โœ… ci/circleci: node9 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node6 Your tests passed on CircleCI! Details
  • โŒ ci/circleci: node8 Your tests failed on CircleCI Details

Commits

The new version differs by 2 commits.

  • 2728c15 2.1.4
  • f0007ee Use correct name for bigquery data transfer (#95)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

An in-range update of mocha is breaking the build 🚨

โ˜๏ธ Greenkeeperโ€™s updated Terms of Service will come into effect on April 6th, 2018.

Version 5.0.3 of mocha was just published.

Branch Build failing 🚨
Dependency mocha
Current Version 5.0.2
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

mocha is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • โŒ ci/circleci: node8 Your tests failed on CircleCI Details
  • โœ… ci/circleci: node9 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node6 Your tests passed on CircleCI! Details
  • โœ… continuous-integration/appveyor/branch AppVeyor build succeeded Details
  • โœ… ci/circleci: node4 Your tests passed on CircleCI! Details

Release Notes v5.0.3

5.0.3 / 2018-03-06

This patch features a fix to address a potential "low severity" ReDoS vulnerability in the diff package (a dependency of Mocha).

🔒 Security Fixes

🔩 Other

Commits

The new version differs by 6 commits.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Some numbers not decoded

We just upgraded to 0.13 from 0.11 and are seeing a regression with decode: true.

One of our numeric values is being decoded while in a ROW_IN_PROGRESS state in the ChunkTransformer here. Passing isPossibleNumber: true fixes the problem, but I'm not sure what other side effects that might have.

Happy to outline more of what we're doing if that helps narrow it down. Thanks!

row.get() ignores columns array when filters are applied

Environment details

OS: macos 10.13.3
Node.js version: 8.10.0
npm version: 5.6.0
yarn version: 1.3.2
@google-cloud/bigtable version: master branch (0.13)

Steps to reproduce

    const columns = ['fam:col'];
    const options = {
        filter: [
            {
                column: {
                    cellLimit: 1
                }
            },
        ]
    };

    await table.row('key').get(columns); // Returns single column
    await table.row('key').get(columns, options); // Returns all columns

An in-range update of concat-stream is breaking the build 🚨

โ˜๏ธ Greenkeeperโ€™s updated Terms of Service will come into effect on April 6th, 2018.

Version 1.6.1 of concat-stream was just published.

Branch Build failing 🚨
Dependency concat-stream
Current Version 1.6.0
Type dependency

This version is covered by your current version range and after updating it in your project the build failed.

concat-stream is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • ✅ ci/circleci: node8 Your tests passed on CircleCI! Details
  • ✅ ci/circleci: node9 Your tests passed on CircleCI! Details
  • ✅ ci/circleci: node6 Your tests passed on CircleCI! Details
  • ✅ ci/circleci: node4 Your tests passed on CircleCI! Details
  • ✅ continuous-integration/appveyor/branch AppVeyor build succeeded Details
  • ✅ ci/circleci: lint Your tests passed on CircleCI! Details
  • ❌ ci/circleci: docs Your tests failed on CircleCI Details

Commits

The new version differs by 1 commit.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Pass rules to family.get with autoCreate

This is just a placeholder issue to apply this patch and write the necessary tests.

diff --git a/src/family.js b/src/family.js
index 365f3bc..86a89df 100644
--- a/src/family.js
+++ b/src/family.js
@@ -297,7 +297,7 @@ Family.prototype.get = function(options, callback) {
   this.getMetadata(gaxOptions, function(err, metadata) {
     if (err) {
       if (err instanceof FamilyError && autoCreate) {
-        self.create({gaxOptions}, callback);
+        self.create({gaxOptions, rule: options.rule}, callback);
         return;
       }
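
With that patch applied, the call site could pass a rule alongside autoCreate (a sketch; the rule shape is assumed to mirror what Family#create accepts):

const family = table.family('follows');

family.get({
  autoCreate: true,
  rule: {versions: 1}, // assumed gc rule shape: keep only the latest version
}, err => {
  if (err) throw err;
  console.log('family ready');
});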

Default credentials error while using with a Bigtable emulator

I'm trying to use this library with a Bigtable emulator. Everything I try to do ends with an error:
Error: Unexpected error while acquiring application default credentials: Could not load the default credentials. Browse to https://developers.google.com/accounts/docs/application-default-credentials for more information.

Environment details

  • OS: Ubuntu 16.04
  • Node.js version: v6.14.1
  • npm version: 3.10.10
  • @google-cloud/bigtable version: 0.13.0

Steps to reproduce

  1. Download, install, and run the Bigtable emulator.
  2. Export the environment variables using: $(gcloud beta emulators bigtable env-init)
  3. Run an example: node example.js
const Bigtable = require('@google-cloud/bigtable');

var client = new Bigtable.v2.BigtableTableAdminClient();

client.createTable("test").then().catch(function (err) {
    console.log(err)
})
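
A sketch of the same flow through the high-level client instead of the raw v2 admin client, assuming the installed version honors the BIGTABLE_EMULATOR_HOST variable that env-init exports:

const Bigtable = require('@google-cloud/bigtable');

// BIGTABLE_EMULATOR_HOST is set by `$(gcloud beta emulators bigtable env-init)`;
// when it is present, no application default credentials should be needed.
const bigtable = new Bigtable({projectId: 'fake-project'});
const instance = bigtable.instance('fake-instance');

instance.createTable('test')
    .then(() => console.log('table created'))
    .catch(console.error);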

Respect decode: false option when cell value is split between chunks

There appears to be a bug in chunktransformer where, if a cell is in progress, the += operator is used. This causes two buffers to stringify: Buffer('test') + Buffer('ing') === String('testing')

I have the fix below, however I gave up trying to get the coverage to stay at 100%. @ajaaym can you take a look and try creating a PR to include tests? Thanks

diff --git a/src/chunktransformer.js b/src/chunktransformer.js
index 34b8163..eb64e9b 100644
--- a/src/chunktransformer.js
+++ b/src/chunktransformer.js
@@ -337,7 +337,15 @@ ChunkTransformer.prototype.processCellInProgress = function(chunk) {
   if (chunk.resetRow) {
     return this.reset();
   }
-  this.qualifier.value += Mutation.convertFromBytes(chunk.value, this.options);
+  const chunkQualifierValue =
+      Mutation.convertFromBytes(chunk.value, this.options);
+  if (chunkQualifierValue instanceof Buffer &&
+      this.qualifier.value instanceof Buffer) {
+    this.qualifier.value =
+        Buffer.concat([this.qualifier.value, chunkQualifierValue])
+  } else {
+    this.qualifier.value += chunkQualifierValue;
+  }
   this.moveToNextState(chunk);
 };
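
A quick demonstration of the coercion the patch guards against:

const a = Buffer.from('test');
const b = Buffer.from('ing');

console.log(a + b);                 // 'testing': += coerces both buffers to strings
console.log(Buffer.concat([a, b])); // <Buffer 74 65 73 74 69 6e 67>: stays binary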

Patch & ProtoBuf issues

Hey @kolodny, can you take a look at these issues?

From https://circleci.com/gh/googleapis/nodejs-bigtable/1109:

> @google-cloud/[email protected] presystem-test /home/node/project
> git apply patches/patch-for-v4.patch || git apply patches/patch-for-v6-and-up.patch || true

error: node_modules/through2/node_modules/readable-stream/node_modules/process-nextick-args/index.js: No such file or directory
error: patch failed: node_modules/process-nextick-args/index.js:5
error: node_modules/process-nextick-args/index.js: patch does not apply
/home/node/project/system-test/read-rows-acceptance-tests.js:27
const builder = ProtoBuf.loadProtoFile({
                         ^

TypeError: ProtoBuf.loadProtoFile is not a function

An in-range update of sinon is breaking the build 🚨

Version 4.2.3 of sinon was just published.

Branch Build failing 🚨
Dependency sinon
Current Version 4.2.2
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

sinon is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • โŒ ci/circleci: node9 Your tests failed on CircleCI Details
  • โœ… ci/circleci: node8 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node7 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node6 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node4 Your tests passed on CircleCI! Details
  • โœ… continuous-integration/appveyor/branch AppVeyor build succeeded Details

Commits

The new version differs by 7 commits.

  • b5968ab Update docs/changelog.md and set new release id in docs/_config.yml
  • 9cbf3f2 Add release documentation for v4.2.3
  • 45cf330 4.2.3
  • 8f54d73 Update History.md and AUTHORS for new release
  • a401b34 Update package-lock.json
  • a21e4f7 Replace formatio with @sinonjs/formatio
  • f4e44ac Use comments in pull request template to get better descriptions with less template text

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Confused about row(key).exists()

From @briangruber on October 27, 2017 16:4

I'm confused about using the exists() method on a row(key), hoping to get clarification.
If I want to check if a row exists I thought I would do this

table.row(key).exists().then(result => {
  let exists = result[0]
})

If the row does exist then I do step into the .then and exists will be true. But if it doesn't exist I actually get an exception of Unknown row: key. So what is the point of exists() if it throws an error when the row doesn't exist? Is result[0] expected to always be true, and if the row doesn't exist I'm supposed to use the exception? Or am I missing something?

Thanks!

Copied from original issue: googleapis/google-cloud-node#2702

Create an Instance Admin Sample

There needs to be a sample in the samples/ directory that manipulates instances. Here's the list of commands:

  • createSsdInstance [projectId] [instanceId]
    • SSD prod 3 nodes in us-central1-f
  • createReplicatedCluster [projectId] [instanceId]
    • SSD prod 3 nodes, one in us-central1-f, one in us-central1-c
  • createDevInstance [projectId] [instanceId]
    • HDD dev in us-central1-f
  • listInstances [projectId]
  • getInstance [projectId] [instanceId]
  • getClusters [projectId] [instanceId]
  • deleteInstance [projectId] [instanceId]
  • createCluster [projectId] [instanceId]
    • SSD prod 3 nodes in us-central1-c
  • deleteCluster [projectId] [instanceId]

TODO: add AppProfileId CRUD once it exists.
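
A minimal sketch of the first command, assuming the createInstance options shape this library already uses (cluster name, location, node count, storage type):

const Bigtable = require('@google-cloud/bigtable');

// createSsdInstance [projectId] [instanceId]: SSD prod, 3 nodes in us-central1-f
async function createSsdInstance(projectId, instanceId) {
  const bigtable = new Bigtable({projectId});
  const [, operation] = await bigtable.createInstance(instanceId, {
    clusters: [{
      name: `${instanceId}-cluster`, // placeholder cluster name
      location: 'us-central1-f',
      nodes: 3,
      storage: 'ssd',
    }],
  });
  await operation.promise(); // wait for the long-running operation
  console.log(`Created instance ${instanceId}`);
}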

Bigtable error silently ignored

From @kolodny on November 5, 2017 17:59

Steps to reproduce

const bigtable = require('@google-cloud/bigtable');

const bigtableClient = bigtable();
const instance = bigtableClient.instance(process.env.INSTANCE_ID);
const table = instance.table('doesnotexist');

Promise.resolve()
  .then(() => table.getRows())
  .then(([rows]) => console.log(`Read ${rows.length}`))
  .catch(error => console.error(`caught ${error}`))

I chased this bug down to https://github.com/GoogleCloudPlatform/google-cloud-node/blob/a64ad81517045196cf5a3f468ea15aad1e2c25da/packages/common-grpc/src/service.js#L376-L385

This "fake" response call causes streamResponseHandled in retry-request to be set to true on the fake response, that has the consequence of never firing the error callback. https://github.com/stephenplusplus/retry-request/blob/4181eec8187c3603d4e4e68db1ee6ac27725afa3/index.js#L113-L133

I tried reverting the code in #1444 to see if I could replicate the bug referenced at #1443 and find a different solution that would avoid this nasty regression, but I wasn't able to repro (I always got a response). I suspect that the code can be safely reverted. Reverting that bit of code did fix the bug of silently ignoring errors!

Thanks!

Copied from original issue: googleapis/google-cloud-node#2724

An in-range update of @google-cloud/nodejs-repo-tools is breaking the build 🚨

โ˜๏ธ Greenkeeperโ€™s updated Terms of Service will come into effect on April 6th, 2018.

Version 2.2.3 of @google-cloud/nodejs-repo-tools was just published.

Branch Build failing 🚨
Dependency @google-cloud/nodejs-repo-tools
Current Version 2.2.2
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

@google-cloud/nodejs-repo-tools is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • โŒ continuous-integration/appveyor/branch Waiting for AppVeyor build to complete Details
  • โŒ ci/circleci: node4 Your tests failed on CircleCI Details
  • โŒ ci/circleci: node9 Your tests failed on CircleCI Details
  • โŒ ci/circleci: node8 Your tests failed on CircleCI Details
  • โŒ ci/circleci: node6 Your tests failed on CircleCI Details

Commits

The new version differs by 1 commit.

  • 7b3af41 Fix link to open in cloud shell button image.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Problem with boolean and decimal values

I'm seeing this result: bool and bool_t should be true, bool_f should be false, and dec should be 1.23.

{
  "id": "gwashington",
  "data": {
    "fam1": {
      "bin": [
        {
          "value": "abc",
          "labels": [],
          "timestamp": "1523953458013000"
        }
      ],
      "bool": [
        {
          "value": "",
          "labels": [],
          "timestamp": "1523953458013000"
        }
      ],
      "bool_f": [
        {
          "value": "",
          "labels": [],
          "timestamp": "1523953458013000"
        }
      ],
      "bool_t": [
        {
          "value": "",
          "labels": [],
          "timestamp": "1523953458013000"
        }
      ],
      "dec": [
        {
          "value": 1,
          "labels": [],
          "timestamp": "1523953458013000"
        }
      ],
      "jadams": [
        {
          "value": 1,
          "labels": [],
          "timestamp": "1523953458013000"
        }
      ],
      "str": [
        {
          "value": "hello",
          "labels": [],
          "timestamp": "1523953458013000"
        }
      ]
    }
  }
}

Environment details

  • OS: macos 10.13.3
  • Node.js version: 8.10.0
  • npm version: 5.6.0
  • yarn version: 1.3.2
  • google-cloud-node version: master branch (0.13)

Steps to reproduce

    const entries = [
        {
          method: 'insert',
          key: 'gwashington',
          data: {
            fam1: {
                jadams: 1,
                bool: true,
                bool_t: true,
                bool_f: false,
                dec: 1.23,
                str: "hello",
                bin: Buffer.from('abc'),
            }
          }
        }
      ];

    await table.mutate(entries);

Row's ID always encoded as a string when using table.createReadStream

I'm using @google-cloud/[email protected] with node v8.9.3 on linux.

I have a table where the row keys contain binary data, e.g. <Buffer db ee fe e8 fe 9f 65 33 dd 11 1f 2e ec d4 00 00> (2+7+6P6fZTPdER8u7NQAAA== in base64).

If I retrieve the row using row.get(), the id correctly comes back as a Buffer.

const rowKey = Buffer.from('2+7+6P6fZTPdER8u7NQAAA==', 'base64');
let row = await table.row(rowKey).get();
console.log(row[0].id) // prints <Buffer db ee fe e8 fe 9f 65 33 dd 11 1f 2e ec d4 00 00> 

However, if I perform a scan using createReadStream, I get a different behavior. The id comes back as a string, which in my case is unusable.

const rowKey = Buffer.from('2+7+6P6fZTPdER8u7NQAAA==', 'base64');
table.createReadStream({
  start: rowKey,
  end: rowKey,
  decode: false
}).on('data',  row => {
  console.log(row.id); // prints ������e3�.��
})

I would expect the id to come back as a Buffer there as well.
Note I did try setting decode=false in the options, but that didn't affect the encoding of the row key at all.

An in-range update of eslint-plugin-prettier is breaking the build 🚨

Version 2.6.0 of eslint-plugin-prettier was just published.

Branch Build failing 🚨
Dependency eslint-plugin-prettier
Current Version 2.5.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

eslint-plugin-prettier is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • โŒ continuous-integration/appveyor/branch Waiting for AppVeyor build to complete Details
  • โŒ ci/circleci: node6 CircleCI is running your tests Details
  • โŒ ci/circleci: node4 CircleCI is running your tests Details
  • โœ… ci/circleci: node9 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node7 Your tests passed on CircleCI! Details
  • โŒ ci/circleci: node8 Your tests failed on CircleCI Details

Commits

The new version differs by 4 commits.

  • d772dfa Build: update package.json and changelog for v2.6.0
  • 9e0fb48 Update: Add option to skip loading prettierrc (#83)
  • e5b5fa7 Build: add Node 8 and 9 to Travis
  • 1ab43fd Chore: add test for vue parsing

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Investigate: Throwing on lossy insertion

Currently if we add a boolean or double we lose information on the bigtable side since we can't properly encode the data. Some exploration needs to happen regarding how to go about resolving this issue.

Some ideas

  1. Require the user to provide an encoder/decoder when instantiating Bigtable if they use an "unsafe" value:
    const bigtable = new Bigtable({
      bufferConverter: new ThingWithEncodeAndDecodeMethod(),
    });
    ....
    table.insert({
      key: `someKey`,
      data: {
        [COLUMN_FAMILY_NAME]: {
          foo: 1.23,
          bar: true
        },
      },
    })
  2. Only allow int64, buffers, (and strings?), and require the user to convert and deconvert to those types:
    const coder = new ThingWithEncodeAndDecodeMethod()
    table.insert({
      key: `someKey`,
      data: {
        [COLUMN_FAMILY_NAME]: {
          foo: coder.encode(1.23),
          bar: coder.encode(true)
        },
      },
    });
    
    const [rows] = await table.getRows();
    for (const row of rows) {
      const foo = coder.decodeDouble(row.data.foo)
      const bar = coder.decodeBoolean(row.data.bar)
    }
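
For idea 2, the helper could be a small pair of Buffer conversions (a hypothetical sketch; ThingWithEncodeAndDecodeMethod is the placeholder name from the proposal):

class ThingWithEncodeAndDecodeMethod {
  encode(value) {
    if (typeof value === 'number') {
      const buf = Buffer.alloc(8);
      buf.writeDoubleBE(value, 0); // doubles round-trip exactly through 8 bytes
      return buf;
    }
    if (typeof value === 'boolean') {
      return Buffer.from([value ? 1 : 0]);
    }
    return value; // buffers and strings pass through unchanged
  }
  decodeDouble(buf) {
    return buf.readDoubleBE(0);
  }
  decodeBoolean(buf) {
    return buf[0] === 1;
  }
}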

Releasing 0.12.0 Stalled

The 0.12.0 release failed 16 days ago due to an error: https://circleci.com/gh/googleapis/nodejs-bigtable/1109

I'm trying to get the release through now, but we're stalled on a new error.

@kolodny could you please take a look?

https://circleci.com/gh/googleapis/nodejs-bigtable/1408:

  1) Bigtable/Table
       mutate()
         valid mutation:
     Error: Aborting after running 1000 timers, assuming an infinite loop!
      at Object.runAll (node_modules/lolex/src/lolex-src.js:671:15)
      at Context.done (system-test/mutate-rows.js:132:15)

  2) Bigtable/Table
       mutate()
         retries the failed mutations:
     Error: Aborting after running 1000 timers, assuming an infinite loop!
      at Object.runAll (node_modules/lolex/src/lolex-src.js:671:15)
      at Context.done (system-test/mutate-rows.js:132:15)

  3) Bigtable/Table
       mutate()
         has a `PartialFailureError` error when an entry fails after the retries:
     Error: Aborting after running 1000 timers, assuming an infinite loop!
      at Object.runAll (node_modules/lolex/src/lolex-src.js:671:15)
      at Context.done (system-test/mutate-rows.js:132:15)

  4) Bigtable/Table
       mutate()
         does not retry unretryable mutations:
     Error: Aborting after running 1000 timers, assuming an infinite loop!
      at Object.runAll (node_modules/lolex/src/lolex-src.js:671:15)
      at Context.done (system-test/mutate-rows.js:132:15)

  5) Bigtable/Table
       mutate()
         considers network errors towards the retry count:
     Error: Aborting after running 1000 timers, assuming an infinite loop!
      at Object.runAll (node_modules/lolex/src/lolex-src.js:671:15)
      at Context.done (system-test/mutate-rows.js:132:15)

  6) Bigtable/Table
       createReadStream
         simple read:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

  7) Bigtable/Table
       createReadStream
         retries a failed read:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

  8) Bigtable/Table
       createReadStream
         resets the retry counter after a successful read:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

  9) Bigtable/Table
       createReadStream
         moves the start point of a range being consumed:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

  10) Bigtable/Table
       createReadStream
         removes ranges already consumed:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

  11) Bigtable/Table
       createReadStream
         removes keys already read:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

  12) Bigtable/Table
       createReadStream
         adjust the limit based on the number of rows read:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

  13) Bigtable/Table
       createReadStream
         does the previous 5 things in one giant test case:

      AssertionError [ERR_ASSERTION]: .on('end') shoud have been invoked
      + expected - actual

      -false
      +true
      
      at Context.it (system-test/read-rows.js:138:11)

Comment missing or uses incorrect param

The following functions are either missing a @param or use an incorrect one:

Page Address | Comment Discrepancy
create(options, callback) | callback not in comment param
get(gaxOptions, callback) | callback not in comment param
create(options, callback) | callback not in comment param
create(options, callback) | callback not in comment param
save(entry, gaxOptions, callback) | comment param is written as key instead of entry
create(options, callback) | callback not in comment param
_flush(cb) | callback function name is cb but the comment has it as callback
destroy(err) | parameter is err but the comment mentions it as error
all(pass) | comment doesn't have details of param
column(column) | comment doesn't have details of param
condition(condition) | comment doesn't have details of param
family(family) | comment doesn't have details of param
interleave(filters) | comment doesn't have details of param
label(label) | comment doesn't have details of param
row(row) | comment doesn't have details of param
sink(sink) | comment doesn't have details of param
time(time) | comment doesn't have details of param
value(value) | comment doesn't have details of param
convertFromBytes(bytes, options) | comment doesn't have options in param description
parse(mutation) | comment uses param as entry instead of mutation

Unexpected \u0000 in row version values

I'm getting an unexpected value for every version except the latest one.

Actual result:

{
 "id": "wmckinley",
 "data": {
  "fam1": {
   "tjefferson": [
    {
     "value": 3,
     "labels": [],
     "timestamp": "1523529421404000"
    },
    {
     "value": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0002",
     "labels": [],
     "timestamp": "1523529391871000"
    },
    {
     "value": "\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0001",
     "labels": [],
     "timestamp": "1523527967168000"
    }
   ]
  }
 }
}

Expected result:

{
 "id": "wmckinley",
 "data": {
  "fam1": {
   "tjefferson": [
    {
     "value": 3,
     "labels": [],
     "timestamp": "1523529421404000"
    },
    {
     "value": 2,
     "labels": [],
     "timestamp": "1523529391871000"
    },
    {
     "value": 1,
     "labels": [],
     "timestamp": "1523527967168000"
    }
   ]
  }
 }
}

Environment details

  • OS: macos 10.13.3
  • Node.js version: 8.10.0
  • npm version: 5.6.0
  • yarn version: 1.3.2
  • @google-cloud/bigtable version: 0.13

Steps to reproduce

(async () => {

    await bt.createTable('jau1', {
        families: [
            'fam1'
        ]
    }).catch(swallowCode(6));


    let [tables] = await bt.getTables();
    tables.forEach(t => {
        delete t.instance;
        delete t.bigtable;
        console.log(t);
    })


    const table = bt.table('jau1');
    await table.createFamily('fam2').catch(swallowCode(6));

    let rows = [
        {
            key: 'wmckinley',
            data: {
                fam1: {
                    tjefferson: 3
                }
            }
        }
    ];

    await table.insert(rows);

    [rows] = await table.getRows();
    console.log(JSON.stringify(rows,null,1))

    //-
    // <h4>Retrieving Rows</h4>
    //
    // If you're anticipating a large number of rows to be returned, we suggest
    // using the {@link Table#getRows} streaming API.
    //-
    table.createReadStream()
        .on('error', console.error)
        .on('data', function (row) {
            delete row.bigtable;
            delete row.instance;
            delete row.table;
            console.log('got row', JSON.stringify(row,null,1));
            // `row` is a Row object.
        });
})().catch(err => {
    console.warn(err);
})//.then(() => process.exit());

function swallowCode(code) {
    return err => {
        if (err.code !== code) {
            throw err;
        }
    }
}

`DEADLINE_EXCEEDED` Error when using `table.dropRows()`

When using the table.dropRows() method to clear a table, I receive the following error:

{ Error: 4 DEADLINE_EXCEEDED: Insufficient deadline for DropRowRange. Please try again with a longer request deadline.
    at new createStatusError (/temp-bt/node_modules/google-gax/node_modules/grpc/src/client.js:64:15)
    at /temp-bt/node_modules/google-gax/node_modules/grpc/src/client.js:583:15
  code: 4,
  metadata:
   Metadata {
     _internal_repr:
      { 'google.rpc.debuginfo-bin': [Array],
        'grpc-status-details-bin': [Array] } },
  details: 'Insufficient deadline for DropRowRange. Please try again with a longer request deadline.'

The associated code is:

const Bigtable = require('@google-cloud/bigtable');

const bigtable = new Bigtable({
  projectId: 'some-project-id'
});

const INSTANCE_NAME = 'some-instance';

async function main() {
  await bigtable.createInstance(INSTANCE_NAME, {
    clusters: [{
      name: 'some-cluster',
      location: 'us-central1-c',
      nodes: 3
    }]
  });
  const instance = bigtable.instance(INSTANCE_NAME);
  const table = instance.table('someTable');
  await table.create({
    families: ['someFamily']
  });
  await table.insert({
    key: 'some-key',
    data: {
      someFamily: {
        someData: 'some-data'
      }
    }
  });
  await table.deleteRows(); // error thrown
}

main().catch(console.error);
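
If gaxOptions pass-through were available on this call (see the GAPIC conversion issue above), a longer deadline could be requested. A hypothetical sketch, not a documented signature:

// Hypothetical: ask for a 60-second deadline instead of the default.
await table.deleteRows({
  gaxOptions: {timeout: 60 * 1000},
});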

Support labels on instances, and use PartialUpdateInstance

Instances support a notion of "labels" (key/value pairs meaningful to users, like "project" = "myProject" or "env" = "staging"). Create and update instance should have a labels option, which is basically a map of {string: string}.

Also, instance.setMetadata should use "PartialUpdateInstance" instead of "UpdateInstance"
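
A sketch of what the proposed option could look like (hypothetical; the labels key does not exist yet):

bigtable.createInstance('my-instance', {
  clusters: [{name: 'my-cluster', location: 'us-central1-b', nodes: 3}],
  // Proposed: free-form key/value pairs attached to the instance.
  labels: {
    project: 'myProject',
    env: 'staging',
  },
});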

An in-range update of eslint is breaking the build 🚨

โ˜๏ธ Greenkeeperโ€™s updated Terms of Service will come into effect on April 6th, 2018.

Version 4.19.0 of eslint was just published.

Branch Build failing 🚨
Dependency eslint
Current Version 4.18.2
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

eslint is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • ✅ ci/circleci: node9 Your tests passed on CircleCI! Details
  • ✅ ci/circleci: node8 Your tests passed on CircleCI! Details
  • ✅ ci/circleci: node6 Your tests passed on CircleCI! Details
  • ✅ ci/circleci: node4 Your tests passed on CircleCI! Details
  • ❌ ci/circleci: docs Your tests failed on CircleCI Details
  • ✅ ci/circleci: lint Your tests passed on CircleCI! Details
  • ✅ continuous-integration/appveyor/branch AppVeyor build succeeded Details

Release Notes v4.19.0
  • 55a1593 Update: consecutive option for one-var (fixes #4680) (#9994) (薛定谔的猫)
  • 8d3814e Fix: false positive about ES2018 RegExp enhancements (fixes #9893) (#10062) (Toru Nagashima)
  • 935f4e4 Docs: Clarify default ignoring of node_modules (#10092) (Matijs Brinkhuis)
  • 72ed3db Docs: Wrap Buffer() in backticks in no-buffer-constructor rule description (#10084) (Stephen Edgar)
  • 3aded2f Docs: Fix lodash typos, make spacing consistent (#10073) (Josh Smith)
  • e33bb64 Chore: enable no-param-reassign on ESLint codebase (#10065) (Teddy Katz)
  • 66a1e9a Docs: fix possible typo (#10060) (Vse Mozhet Byt)
  • 2e68be6 Update: give a node at least the indentation of its parent (fixes #9995) (#10054) (Teddy Katz)
  • 72ca5b3 Update: Correctly indent JSXText with trailing linebreaks (fixes #9878) (#10055) (Teddy Katz)
  • 2a4c838 Docs: Update ECMAScript versions in FAQ (#10047) (alberto)
Commits

The new version differs by 12 commits.

  • 4f595e8 4.19.0
  • 16fc59e Build: changelog update for 4.19.0
  • 55a1593 Update: consecutive option for one-var (fixes #4680) (#9994)
  • 8d3814e Fix: false positive about ES2018 RegExp enhancements (fixes #9893) (#10062)
  • 935f4e4 Docs: Clarify default ignoring of node_modules (#10092)
  • 72ed3db Docs: Wrap Buffer() in backticks in no-buffer-constructor rule description (#10084)
  • 3aded2f Docs: Fix lodash typos, make spacing consistent (#10073)
  • e33bb64 Chore: enable no-param-reassign on ESLint codebase (#10065)
  • 66a1e9a Docs: fix possible typo (#10060)
  • 2e68be6 Update: give a node at least the indentation of its parent (fixes #9995) (#10054)
  • 72ca5b3 Update: Correctly indent JSXText with trailing linebreaks (fixes #9878) (#10055)
  • 2a4c838 Docs: Update ECMAScript versions in FAQ (#10047)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Can not read large row from Bigtable Stream

Hello,

I am having an issue reading Bigtable rows from a stream.
My rows are constantly growing while I receive new data.
Now the size of some rows have exceeded the maximum authorized and I am getting the following error message:

Error: 8 RESOURCE_EXHAUSTED: Received message larger than max (4411510 vs. 4194304)

Is there a way to stream such messages? I am using hashed keys, and since I cannot receive the message, I cannot see which keys are failing to read.

const bigtableStream = table.createReadStream()
  .on('data', row => {
    // Do Stuff
  })
  .on('error', err => console.log(err));

Environment:
NodeJs: 8.9.4
google-cloud/bigtable: 0.13.1
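
One possible mitigation (a sketch, assuming the row filter's cellLimit option): have the server trim each row before sending it, so individual responses stay under the 4 MiB gRPC message cap.

table.createReadStream({
  filter: [
    {row: {cellLimit: 100}}, // at most 100 cells per row (assumed option shape)
  ],
})
  .on('data', row => {
    // Do Stuff
  })
  .on('error', err => console.log(err));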

AppProfileID CRUD

The instance admin API allows the creation of AppProfileIds. We need to be able to access that functionality from instance.js.

DEADLINE_EXCEEDED while inserting rows

Hello,

I am replicating a Bigtable instance. My script inserts new rows after making some changes to the data from the old version to the new one.
The problem is that after a lot of insertions, I am getting the following error:

  { Error: 4 DEADLINE_EXCEEDED: Deadline Exceeded
    at createStatusError (/node_modules/grpc/src/client.js:64:15)
    at ClientReadableStream._emitStatusIfDone (/node_modules/grpc/src/client.js:270:19)
    at ClientReadableStream._receiveStatus (/node_modules/grpc/src/client.js:248:8)
    at /node_modules/grpc/src/client.js:749:12
  code: 4,
  metadata: Metadata { _internal_repr: {} },
  details: 'Deadline Exceeded' }

Am I inserting too many rows at the same time? Do I need to set a timeout after a certain number of insertions?

Environment:
NodeJs: 8.9.4
google-cloud/bigtable: 0.13.1

Filter on binary data

I can write the buffer and retrieve it again using decode: false, but I cannot figure out how to filter on the value.

const buf = Buffer.from('a468c3a669', 'hex');

// Throws Can't convert to RegExp String from unknown type
{
   value: buf
}

// Returns zero rows instead of throwing
{
  value: [
    buf
  ]
}

// Using binary string also returns zero rows
{
   value: buf.toString('binary')
}

Environment details

  • @google-cloud/bigtable version: 0.13.1

Instance.js createCluster method sets location incorrectly

The location property is not formatted as required in the createCluster method.

Throws following error:

Error: 3 INVALID_ARGUMENT: Error in field 'cluster' : Error in field 'location' : When parsing 'projects/projects/grass-clump-479/locations/us-central1-b' : Location name expected in the form 'projects/<project_id>/locations/<zone_id>'.

Create a Table Admin Sample

There needs to be a sample in the samples/ directory that manipulates tables. Here's the list of commands:

  • createTable [projectId] [instanceId]
    • Creates a table named "my_table" with one column family, "simpleFamily", that has a max of 1 version
  • checkTableExistence [projectId] [instanceId]
    • Does a list tables, and displays true if the table is in the list.
  • getTable [projectId] [instanceId]
    • Gets "my_table", and displays the column families and their gcPolicy
  • listTables [projectId] [instanceId]
    • // lists all of the tables
  • deleteTable [projectId] [instanceId]
    • Deletes "my_table"
  • addComplexColumnFamily [projectId] [instanceId]
    • Creates a column family called "complex" on "my_table" that has the following GCRule, which means "keep at least 2 versions, but no more than 10; any additional versions past 2 should have a TTL of 30d":
   union: [
       {max_versions: 10}
       {intersection: [
         {max_versions: 2}
         {max_age: 30d}
       ]}
   ]
  • deleteComplexColumnFamily [projectId] [instanceId]
    • Deletes the "complex" column family
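
For reference, that union rule translated into the GcRule proto shape the admin API expects (a sketch; 30d written out in seconds):

const gcRule = {
  union: {
    rules: [
      {maxNumVersions: 10},
      {
        intersection: {
          rules: [
            {maxNumVersions: 2},
            {maxAge: {seconds: 30 * 24 * 60 * 60}}, // 30 days
          ],
        },
      },
    ],
  },
};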

New patch release

Is there any chance you could release a 0.13.1 that includes the changes from #77? A release was alluded to in #83 but doesn't seem to have happened yet. Thank you!

An in-range update of uuid is breaking the build 🚨

Version 3.2.1 of uuid was just published.

Branch Build failing 🚨
Dependency uuid
Current Version 3.2.0
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

uuid is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • โŒ ci/circleci: node9 Your tests failed on CircleCI Details
  • โœ… ci/circleci: node8 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node7 Your tests passed on CircleCI! Details
  • โœ… ci/circleci: node4 Your tests passed on CircleCI! Details
  • โœ… continuous-integration/appveyor/branch AppVeyor build succeeded Details
  • โœ… ci/circleci: node6 Your tests passed on CircleCI! Details

Commits

The new version differs by 2 commits.

  • ce7d317 chore(release): 3.2.1
  • 262a8ea Fix illegal invocation of getRandomValues (#257)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Lowercase 't' in Bigtable in repo description (and everywhere else)

The description for this GitHub project currently is:

Node.js client for Google Cloud BigTable: Google's NoSQL Big Data database service. https://cloud.google.com/bigtable/

which capitalizes the "T" in Bigtable incorrectly. As this repo gets cloned, the same capitalization will keep getting copied (with no way to push changes to those repos), which will make it hard to tell that the "t" is intended to be lowercase.

Unfortunately, it's not possible to submit a PR or another change to the subject; it just requires admin-level permissions to edit it (there's no history tracking for that feature, AFAICT).

Can someone with appropriate permissions please fix this? Thanks!
