
encoding-down's Introduction

encoding-down

An abstract-leveldown implementation that wraps another store to encode keys and values.

📌 This module will soon be deprecated, because its functionality is included in abstract-level.


Introduction

Stores like leveldown can only store strings and Buffers. Other types, though accepted, are serialized before storage, which is an irreversible type conversion. For a richer set of data types you can wrap such a store with encoding-down. It allows you to specify an encoding to use for keys and values independently. This not only widens the range of input types, but also limits the range of output types. The encoding is applied to all read and write operations: it encodes writes and decodes reads.

Many encodings are built-in courtesy of level-codec. The default encoding is utf8, which ensures you'll always get back a string. You can also provide an external encoding like bytewise, or write your own.

Usage

Without any options, encoding-down defaults to the utf8 encoding.

const levelup = require('levelup')
const leveldown = require('leveldown')
const encode = require('encoding-down')

const db = levelup(encode(leveldown('./db1')))

db.put('example', Buffer.from('encoding-down'), function (err) {
  db.get('example', function (err, value) {
    console.log(typeof value, value) // 'string encoding-down'
  })
})

Can we store objects? Yes!

const db = levelup(encode(leveldown('./db2'), { valueEncoding: 'json' }))

db.put('example', { awesome: true }, function (err) {
  db.get('example', function (err, value) {
    console.log(value) // { awesome: true }
    console.log(typeof value) // 'object'
  })
})

How about storing Buffers, but getting back a hex-encoded string?

const db = levelup(encode(leveldown('./db3'), { valueEncoding: 'hex' }))

db.put('example', Buffer.from([0, 255]), function (err) {
  db.get('example', function (err, value) {
    console.log(typeof value, value) // 'string 00ff'
  })
})

What if we previously stored binary data?

const db = levelup(encode(leveldown('./db4'), { valueEncoding: 'binary' }))

db.put('example', Buffer.from([0, 255]), function (err) {
  db.get('example', function (err, value) {
    console.log(typeof value, value) // 'object <Buffer 00 ff>'
  })

  // Override the encoding for this operation
  db.get('example', { valueEncoding: 'base64' }, function (err, value) {
    console.log(typeof value, value) // 'string AP8='
  })
})

And what about keys?

const db = levelup(encode(leveldown('./db5'), { keyEncoding: 'json' }))

db.put({ awesome: true }, 'example', function (err) {
  db.get({ awesome: true }, function (err, value) {
    console.log(value) // 'example'
  })
})

Binary keys work too, and the key encoding can be overridden per operation:

const db = levelup(encode(leveldown('./db6'), { keyEncoding: 'binary' }))

db.put(Buffer.from([0, 255]), 'example', function (err) {
  db.get('00ff', { keyEncoding: 'hex' }, function (err, value) {
    console.log(value) // 'example'
  })
})

Usage with level

The level module conveniently bundles encoding-down and passes its options along to it. This means you can simply do:

const level = require('level')
const db = level('./db7', { valueEncoding: 'json' })

db.put('example', 42, function (err) {
  db.get('example', function (err, value) {
    console.log(value) // 42
    console.log(typeof value) // 'number'
  })
})
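
The encoding applies to batch operations as well. A brief sketch, continuing the example above:

// Batch writes go through the same 'json' valueEncoding
db.batch([
  { type: 'put', key: 'a', value: { n: 1 } },
  { type: 'put', key: 'b', value: [1, 2, 3] }
], function (err) {
  if (err) throw err

  db.get('b', function (err, value) {
    if (err) throw err
    console.log(Array.isArray(value), value) // true [ 1, 2, 3 ]
  })
})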

API

db = require('encoding-down')(db[, options])

  • db must be an abstract-leveldown compliant store
  • options are passed to level-codec:
    • keyEncoding: encoding to use for keys
    • valueEncoding: encoding to use for values

Both encodings default to 'utf8'. They can be a string (builtin level-codec encoding) or an object (custom encoding).
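
A brief sketch of the two forms, assuming memdown as the underlying store (any abstract-leveldown store works) and a made-up hex encoding purely for illustration:

const encode = require('encoding-down')
const memdown = require('memdown')

// Builtin level-codec encodings, referenced by name
const db1 = encode(memdown(), { keyEncoding: 'utf8', valueEncoding: 'json' })

// A custom encoding is an object; the format is described in the next section
const hexValues = {
  type: 'hex-example',
  encode: (value) => Buffer.from(String(value)).toString('hex'),
  decode: (stored) => Buffer.from(stored, 'hex').toString(),
  buffer: false
}
const db2 = encode(memdown(), { valueEncoding: hexValues })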

Custom encodings

Please refer to level-codec documentation for a precise description of the format. Here's a quick example with level and async/await just for fun:

const level = require('level')
const lexint = require('lexicographic-integer')

async function main () {
  const db = level('./db8', {
    keyEncoding: {
      type: 'lexicographic-integer',
      encode: (n) => lexint.pack(n, 'hex'),
      decode: lexint.unpack,
      buffer: false
    }
  })

  await db.put(2, 'example')
  await db.put(10, 'example')

  // Without our encoding, the keys would sort as 10, 2.
  db.createKeyStream().on('data', console.log) // 2, 10
}

main()

With an npm-installed encoding (modularity ftw!) we can reduce the above to:

const level = require('level')
const lexint = require('lexicographic-integer-encoding')('hex')

const db = level('./db8', {
  keyEncoding: lexint
})

Contributing

Level/encoding-down is an OPEN Open Source Project. This means that:

Individuals making significant and valuable contributions are given commit-access to the project to contribute as they see fit. This project is more like an open wiki than a standard guarded open source project.

See the Contribution Guide for more details.

Donate

Support us with a monthly donation on Open Collective and help us continue our work.

License

MIT

encoding-down's People

Contributors

dependabot[bot], greenkeeper[bot], huan, juliangruber, marcuslyons, meirionhughes, ralphtheninja, vweevers


encoding-down's Issues

An in-range update of remark-github is breaking the build 🚨

Version 7.0.4 of remark-github was just published.

Branch Build failing 🚨
Dependency remark-github
Current Version 7.0.3
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

remark-github is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • coverage/coveralls: Coverage pending from Coveralls.io (Details).
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes 7.0.4
  • b9e7e2d Fix support for usernames ending in dashes
  • 1689ec6 Refactor code-style
Commits

The new version differs by 12 commits.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Action required: Greenkeeper could not be activated 🚨

🚨 You need to enable Continuous Integration on all branches of this repository. 🚨

To enable Greenkeeper, you need to make sure that a commit status is reported on all branches. This is required by Greenkeeper because we are using your CI build statuses to figure out when to notify you about breaking changes.

Since we did not receive a CI status on the greenkeeper/initial branch, we assume that you still need to configure it.

If you have already set up a CI for this repository, you might need to check your configuration. Make sure it will run on all new branches. If you don’t want it to run on every branch, you can whitelist branches starting with greenkeeper/.

We recommend using Travis CI, but Greenkeeper will work with every other CI service as well.

Once you have installed CI on this repository, you’ll need to re-trigger Greenkeeper’s initial Pull Request. To do this, please delete the greenkeeper/initial branch in this repository, and then remove and re-add this repository to the Greenkeeper integration’s whitelist on GitHub. You'll find this list on your repo or organization’s settings page, under Installed GitHub Apps.

Decode values for 'put' event?

Not sure if this might be bad for performance, but I noticed the decoders don't apply to db events.

For my project I've implemented it like so:

const Levelup = require('levelup');
const onProto = Levelup.prototype.on; // reference to the original emitter method

class Db extends Levelup {
  on(ev, cb) {
    const { decode } = this.codecs.valueEncoding;
    if (ev === 'put' && decode) {
      return onProto.call(
        this,
        'put',
        (k, v) => cb(k, decode(v))
      );
    }
    return onProto.call(this, ev, cb);
  }
}

Maybe we could add event interception as an option?
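
As a stopgap that avoids patching Levelup#on, one could decode in the listener itself, reusing a level-codec instance created with the same options. A rough sketch only (whether the emitted value is still in encoded form depends on the levelup version in use):

const levelup = require('levelup')
const leveldown = require('leveldown')
const encode = require('encoding-down')
const Codec = require('level-codec')

const options = { valueEncoding: 'json' }
const codec = new Codec(options)
const db = levelup(encode(leveldown('./db-events'), options))

db.on('put', function (key, value) {
  // Decode only if the emitted value is still the encoded (string) form
  const decoded = typeof value === 'string' ? codec.decodeValue(value) : value
  console.log('put', key, decoded)
})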

Tweak readme

  • Rewrite short description, the current only describes history
  • Explain what encodings do, perhaps with an example
  • Explain how it relates to serialization and storage
  • const -> var
  • const db = .. -> db = .. in api header

See Level/levelup#456 (comment)

Version 10 of node.js has been released

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml
  • The new Node.js version is in-range for the engines in 1 of your package.json files, so that was left alone

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖


FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

module tries to decode an empty string and fails when using createKeyStream and JSON

const levelup = require('levelup');
const leveldown = require('leveldown');
const encode = require('encoding-down');

var enc = encode(leveldown('./mydb'), {
	valueEncoding: 'json',
	keyEncoding: 'json',
});
var lv = levelup(enc);

(async function() {
	// adding some random data
	await lv.put('asdf.1', 'aaaaaa');
	await lv.put('asdf.2', 'bbbbbb');
	await lv.put('asdf.3', 'cccccc');

	lv.createKeyStream()
		.on('data', function(data) {
			console.log('data');
			console.log(data);
		})
		.on('error', function(err) {
			console.log('error');
			console.log(err);
		})
		.on('end', function() {
			console.log('end');
		});
})();

output:

error
EncodingError: Unexpected end of JSON input
    at (...)/node_modules/encoding-down/index.js:105:17
    at (...)/node_modules/leveldown/node_modules/abstract-leveldown/abstract-iterator.js:24:16
    at (...)/node_modules/leveldown/iterator.js:43:7
    at _combinedTickCallback (internal/process/next_tick.js:131:7)
    at process._tickCallback (internal/process/next_tick.js:180:9)
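
For context, the json encoding's decoder is essentially JSON.parse, and a keys-only iterator appears to hand it an empty value. The failure then reduces to the following minimal sketch (assuming that is indeed the cause):

try {
  JSON.parse('')
} catch (err) {
  console.log(err.message) // 'Unexpected end of JSON input'
}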

write Options ignored in chained batch

with a custom encoder:

let fooEncoding = {
  encode: (x) => x.message,
  decode: (x) => { return { message: String(x) }; },
  type: "foo",
  buffer: false
}

when you batch with the option to use the encoder:

await new Promise(done =>
  db.batch()
    .put({ message: "john" }, "adams")
    .put({ message: "james" }, "kirk")
    .write({ keyEncoding: fooEncoding }, done));

then iterate:

let it = db.iterator({ keyEncoding: fooEncoding });

while (true) {
  let item = await new Promise((r, x) => {
    it.next((err, key, value) => {
      if (err != null) return x(err);
      if (key === undefined && value === undefined) return r();
      r({ key: key, value: value });
    });
  });

  if (item == undefined) break;

  console.log(item);
}

The keys are stored as "[object Object]" for both.

If you go through put directly:

  await new Promise((r, x) =>
    db.put(
      { message: "john" }, "adams",
      { keyEncoding: fooEncoding },
      (err) => { if (err) x(err); r() }
    ));

  await new Promise((r, x) =>
    db.put(
      { message: "james" }, "kirk",
      { keyEncoding: fooEncoding },
      (err) => { if (err) x(err); r() }
    ));

OR, use the array batch version:

  await new Promise((r, x) =>
    db.batch([
      { type: "put", key: { message: "john" }, value: "adams" },
      { type: "put", key: { message: "james" }, value: "kirk" }],
      { keyEncoding: fooEncoding },
      (e) => ((e) ? x(e) : r())));

You get the expected result:

{ key: { message: 'james' }, value: 'kirk' }
{ key: { message: 'john' }, value: 'adams' }

Add test for numbers

To assert that the default utf8 encoding stringifies them. Override _serialize* if necessary, to assert that it's the encoding that does the stringification.

This will fill the gap left by Level/abstract-leveldown#140. Well, sort of -- that one is about serialization, this is about encoding. But historically, putting a number into a level store stringifies it. That's what I intend to add coverage for.

Because we gave stores the freedom to serialize how they see fit (and are increasing that freedom), it's now on encoding-down to provide universal/isomorphic behavior.
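
A rough sketch of what such a test could look like, using tape and memdown here purely for illustration (the repo's actual test setup may differ):

const test = require('tape')
const levelup = require('levelup')
const memdown = require('memdown')
const encode = require('encoding-down')

test('default utf8 encoding stringifies numbers', function (t) {
  const db = levelup(encode(memdown()))

  db.put(1, 2, function (err) {
    t.ifError(err, 'no put error')

    db.get(1, function (err, value) {
      t.ifError(err, 'no get error')
      t.is(typeof value, 'string', 'value came back as a string')
      t.is(value, '2', 'number was stringified by the encoding')
      t.end()
    })
  })
})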

Promise/async support for custom encoder?

Is Promise/async supported?

I want to use 3rd-party libraries to modify values on read and write. The library exposes its functions as promises. Can I use this with encoding-down?

Like

{
  encode: async function (data) {
    return jose.JWE.createEncrypt({ format: 'compact' }, key).update(data).final();
  },
  decode: async function (data) {
    return jose.JWE.createDecrypt(key).decrypt(data)
  },
  buffer: Boolean,
  type: 'example'
}

Or are old-style callbacks planned?

{
  encode: function (data, cb) {
    return jose.JWE.createEncrypt({ format: 'compact' }, key).update(data).final().then(function(result) {
      cb(result)
    });
  },
  decode: function (data, cb) {
    return jose.JWE.createDecrypt(key).decrypt(data).then(function(result) {
      cb(result)
    });
  },
  buffer: Boolean,
  type: 'example'
}
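
As far as I know, level-codec calls encode and decode synchronously, so an encoding that returns a Promise would not be awaited. For transforms that can be done synchronously, a rough sketch of the shape (the rot13 transform is just a stand-in):

const rot13 = (s) => s.replace(/[a-z]/gi, (c) =>
  String.fromCharCode(c.charCodeAt(0) + (c.toLowerCase() < 'n' ? 13 : -13)))

const obfuscated = {
  type: 'rot13-example',
  encode: (value) => rot13(String(value)),
  decode: (stored) => rot13(String(stored)),
  buffer: false
}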

An in-range update of memdown is breaking the build 🚨

Version 1.2.5 of memdown just got published.

Branch Build failing 🚨
Dependency memdown
Current Version 1.2.4
Type devDependency

This version is covered by your current version range and after updating it in your project the build failed.

As memdown is “only” a devDependency of this project it might not break production or downstream projects, but “only” your build or test tools – preventing new deploys or publishes.

I recommend you give this issue a high priority. I’m sure you can resolve this 💪

Status Details
  • continuous-integration/travis-ci/push The Travis CI build failed Details

Commits

The new version differs by 8 commits.

  • 1abc47b 1.2.5
  • 90e51b1 Merge pull request #69 from Level/greenkeeper/initial
  • f0b4b76 split up zuul on multiple matrix entries for each browser + test on latest only
  • 2610905 :arrow_up: bump abstract-leveldown
  • b9eb981 Update README.md
  • 9131113 docs(readme): add Greenkeeper badge
  • fb4a9cb chore(travis): whitelist greenkeeper branches
  • fe61dcd chore(package): update dependencies

See the full diff

Not sure how things should work exactly?

There is a collection of frequently asked questions and of course you may always ask my humans.


Your Greenkeeper Bot 🌴

breaks old level range api when null is significant (as in bytewise)

By shifting encoding inside a *-down, level no longer supports the range API correctly.

https://github.com/Level/abstract-leveldown/blob/master/abstract-leveldown.js#L201-L220

strips out nullish ranges, which is reasonable for leveldown but was supported in levelup, and null, undefined and '' are significant values in some encodings (bytewise, charwise). In level@3, encodings are managed via encoding-down which, being an abstract-leveldown itself, hits this check and has the range removed. This breaks some code that was using bytewise to describe ranges, in particular:

flumedb/flumeview-level#11

I think the simplest fix here is for encoding-down to override AbstractLevelDOWN#iterator so that cleanRangeOptions is not called

https://github.com/Level/abstract-leveldown/blob/master/abstract-leveldown.js#L228
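
An illustrative sketch of the kind of range that gets lost, using charwise as an example of an encoding where null and undefined are real key values (paths and names here are hypothetical):

const levelup = require('levelup')
const leveldown = require('leveldown')
const encode = require('encoding-down')
const charwise = require('charwise')

const db = levelup(encode(leveldown('./db-ranges'), { keyEncoding: charwise }))

// null and undefined are meaningful bounds for this encoding, so they are
// meant to be encoded; the nullish check in abstract-leveldown strips them
// before the key encoding ever runs.
db.createReadStream({ gt: null, lt: undefined })
  .on('data', console.log)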
