
pako's Introduction

pako


zlib port to javascript, very fast!

Why pako is cool:

  • Results are byte-for-byte identical to the well-known zlib (currently a port of zlib v1.2.8).
  • Almost as fast in modern JS engines as the C implementation (see benchmarks).
  • Works in browsers; you can browserify any separate component.

This project was created to find out how fast JS can be, and whether it is really necessary to develop native C modules for CPU-intensive tasks. Enjoy the result!

Benchmarks:

node v12.16.3 (zlib 1.2.9), 1mb input sample:

deflate-imaya x 4.75 ops/sec ±4.93% (15 runs sampled)
deflate-pako x 10.38 ops/sec ±0.37% (29 runs sampled)
deflate-zlib x 17.74 ops/sec ±0.77% (46 runs sampled)
gzip-pako x 8.86 ops/sec ±1.41% (29 runs sampled)
inflate-imaya x 107 ops/sec ±0.69% (77 runs sampled)
inflate-pako x 131 ops/sec ±1.74% (82 runs sampled)
inflate-zlib x 258 ops/sec ±0.66% (88 runs sampled)
ungzip-pako x 115 ops/sec ±1.92% (80 runs sampled)

node v14.15.0 (google's zlib), 1mb input sample:

deflate-imaya x 4.93 ops/sec ±3.09% (16 runs sampled)
deflate-pako x 10.22 ops/sec ±0.33% (29 runs sampled)
deflate-zlib x 18.48 ops/sec ±0.24% (48 runs sampled)
gzip-pako x 10.16 ops/sec ±0.25% (28 runs sampled)
inflate-imaya x 110 ops/sec ±0.41% (77 runs sampled)
inflate-pako x 134 ops/sec ±0.66% (83 runs sampled)
inflate-zlib x 402 ops/sec ±0.74% (87 runs sampled)
ungzip-pako x 113 ops/sec ±0.62% (80 runs sampled)

zlib's results are partially affected by marshalling overhead (this matters for inflate only). To investigate the details, change the deflate level to 0 in the benchmark source. For deflate level 6 the results can be considered accurate.

Install:

npm install pako

Examples / API

Full docs - http://nodeca.github.io/pako/

const pako = require('pako');

// Deflate
//
const input = new Uint8Array();
//... fill input data here
const output = pako.deflate(input);

// Inflate (simple wrapper can throw exception on broken stream)
//
const compressed = new Uint8Array();
//... fill data to uncompress here
try {
  const result = pako.inflate(compressed);
  // ... continue processing
} catch (err) {
  console.log(err);
}

//
// Alternate interface for chunking & without exceptions
//

const deflator = new pako.Deflate();

deflator.push(chunk1, false);
deflator.push(chunk2); // second param is false by default.
...
deflator.push(chunk_last, true); // `true` says this chunk is last

if (deflator.err) {
  console.log(deflator.msg);
}

const output = deflator.result;


const inflator = new pako.Inflate();

inflator.push(chunk1);
inflator.push(chunk2);
...
inflator.push(chunk_last); // no second param because end is auto-detected

if (inflator.err) {
  console.log(inflator.msg);
}

const output = inflator.result;

Sometimes you may wish to work with strings, for example to send stringified objects to a server. Pako's deflate detects the input data type and automatically recodes strings to UTF-8 before compressing. Inflate has a special option to declare that the compressed data is UTF-8 encoded and should be recoded to JavaScript's UTF-16.

const pako = require('pako');

const test = { my: 'super', puper: [456, 567], awesome: 'pako' };

const compressed = pako.deflate(JSON.stringify(test));

const restored = JSON.parse(pako.inflate(compressed, { to: 'string' }));

Notes

Pako does not contain some specific zlib functions:

  • deflate - methods deflateCopy, deflateBound, deflateParams, deflatePending, deflatePrime, deflateTune.
  • inflate - methods inflateCopy, inflateMark, inflatePrime, inflateGetDictionary, inflateSync, inflateSyncPoint, inflateUndermine.
  • High level inflate/deflate wrappers (classes) may not support some flush modes.

pako for enterprise

Available as part of the Tidelift Subscription

The maintainers of pako and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. Learn more.

Authors

Personal thanks to:

  • Vyacheslav Egorov (@mraleph) for his awesome tutorials about optimising JS code for v8, for the IRHydra tool, and for his advice.
  • David Duponchel (@dduponchel) for help with testing.

Original implementation (in C):

  • zlib by Jean-loup Gailly and Mark Adler.

License

  • MIT - all files, except /lib/zlib folder
  • ZLIB - /lib/zlib content


pako's Issues

Not recognized as a valid gzip in Java (java.util.zip.GZIPInputStream)

Hi,

I tried to gzip with pako on the client side, send the result to my server (Java), and ungzip it there using GZIPInputStream, but it complains that the data is not a valid gzip archive. Can it be that data gzipped with pako can't be ungzipped by other gzip implementations (for example Java's GZIPInputStream)?
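
For what it's worth, the usual cause of errors like this is sending the gzip output as a text string (e.g. with the to: 'string' option), which lets the transport layer re-encode bytes >= 0x80 as UTF-8. A minimal sketch of sending the raw bytes instead (the endpoint URL here is a placeholder):

var compressed = pako.gzip(JSON.stringify({ hello: 'world' })); // Uint8Array of valid gzip bytes

var xhr = new XMLHttpRequest();
xhr.open('POST', '/upload'); // hypothetical endpoint
xhr.setRequestHeader('Content-Type', 'application/octet-stream');
xhr.send(new Blob([compressed])); // send bytes, not a JS string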

Pako won't work in IE8

When I try to deflate a simple object (copied from the example) it returns an empty string. When I try it in newer browsers it works just fine. These are the lines:

var test = { my: 'super', puper: [456, 567], awesome: 'pako' };

var binaryString = pako.deflate(JSON.stringify(test), { to: 'string' });

There is a difference between pako and 7zip

I have a unit test that compares zip contents. The animated.zip was generated by 7zip.

"jszip": "2.4.0",
"pako": "0.2.5",

The result is different:

"PK\u0003\u0004\n\u0000\u0000\u0000\b\u0000\u0000\u00001E??\u0000A§\f\u0000\u0000O\u000f\u0000\u0000\f\u0000\u0000\u0000animated
"PK\u0003\u0004\n\u0000\u0000\u0000\b\u0000\u0000\u0000yC??\u0000A-\f\u0000\u0000O\u000f\u0000\u0000\f\u0000\u0000\u0000animated

This is the code:

var jsZIP = require('jszip');
var pako = require("pako");

jsZIP.compressions.DEFLATE.compress = function (input) {
  return pako.deflateRaw(input);
};

describe('Animated GIF compression', function() {
  it('Compression animated.gif', function() {
    var zip = new jsZIP();
    var fileContent = fs.readFileSync(
      path.join(__dirname, 'animated.gif'), {
        encoding: 'base64'
      }
    );
    zip.file('animated.gif', fileContent, {
      base64: true,
      date: new Date("Sep 17, 2014")
    });
    var data = zip.generate({
      type: 'nodebuffer',
      compression: 'DEFLATE'
    });
    var zipContent = fs.readFileSync(
      path.join(__dirname, 'animated.zip')
    );
    assert.strictEqual(data.toString('binary'), zipContent.toString('binary'));
  });
});

Segfault in linux in js-git

With the following JS file, I get a segfault using latest stable node and [email protected] from npm.

var inflate = require('pako').inflate;

var input = "eJy9VU1vGzcQve+vGDiH2Khk3XoIUh/ipomKpIc4rVEEBUQtR1paXA5Lcq3o3/cNKclqf0B9MFbkfLx582ZY2LoyNzF2P53+uu6rHpLLZGiLj7XJbMnyM3uJI4dCHJ5dkqDft0SPAwdaUjaHC/MljWwCHWTqrITX8NGYG+c5kwSyLu/g+qdM7UIdrSlGnTNuE/fFH2Dxh8uT8f5AMPAiu9yNUz+QdzsGvJIMvJ0E42scSdTLGD0Xpr0rQ00IK4ZxsCeTaAIj9O/BcurKwDSI2JliJZOY1kn22YUt6dUlMNomE4caqU9sitoE3lMQC9B63E3RtnP1TSKFEm84cegBCDyBxFQTjaYW0A8mbPkWnMM+TikK0siGzi0o0nKp8WULojdlI2lEIlNqtEwxCRCOI/J3v5pn89AnF8G8yYeKbhRUZ/qeczZrrxQsgW8KGYUhhfblfkgy8jslurpkkaAUJeXTe1orjkFyQYv3vCaUk0F/Q1FD4ccoa6W98oh7PSqasGQUugy5oKEEe+WoJaRvjwj2gObwX9dDKTG/WSz6enW7FdkCLPq6QMasNgvLxTi/KFW9F7TMj8pciPDOuqd1WK+f7OZJeuM2Pjwh1hDWfru56bpXr+ij6XfaLACshHfdg9TuWO6dRYn6vTegG20Y2EeImtaT8xbYdT72nBV9EfFg8ytIeH0kE5X+4lIuM7qYpsx+M1f2NKkKFTQlWpx0uTh3tQaDi85QrkMUGHBaKGBpGOpPmH4Q2nKjs3FGlaVzc9ymFjKYZ8YkdsYDoD0AYRsuVOh3BY7TdnhDb8/0e5nsFCvx/f3n3z4+2fv9j3/fdR3NadlCttHWwG1XDNMa8HbYB9C1PcxetKtzIoG17ecMzaEmyFyUlLyo3nkB8zvN88lMAeOupX2LiedaeDkXdhqU/0kzdSKm82roJRT+juHjMNF1ctuhzHvv+t0NOZ0I7oiIx1gOdeNo39TN8waqgBSw4ujqkwN3n2WCxj40/r5wlCut/T3CJ1q1VSPBfT9ix7Ze0bXUPZIIctmdOlyluk+unMb8hAR7EFrEf4ZstIpoMMItQOvX0c7j/FaTf6nl3Gs5dX5xt2rktvwb8dieJzft7nnL/qu099+jpKrZn7Hyr2rsh2ZggCLpLjvGul6C3LbsLcw574rERjr0YyBr1FrTFZNU8HCo78Cq8rK6qcEfTTkKJqssHbpEOQLnftClhPnhCikfedEHCejqg1T99TWDrnlGEvm0EdXkbasfKkPXIVL0JN/N8BjqbjstZ9Qx4jmYNdjW1iwtIwb4WKnR5UpTiNg/OD2Hu/2P5Ld4MJJRxwvFl+PUL/G0QOBQn23PyIsuPXhSS10hGUbhJUCT7+oe2k3if3i3qrJkLGS4jmTQEt1dXfcPivzbTg==";
inflate(new Uint8Array(new Buffer(input, "base64")));
tim@localhost:~/Downloads/test$ node -v
v0.10.31
tim@localhost:~/Downloads/test$ node problem.js 
Segmentation fault (core dumped)

I narrowed the issue down to this exact version of node. The segfault doesn't happen nearly as much in v0.10.30.

It's most likely a bug in v8 that you happen to be triggering, but I thought you should know.

I get the segfault on Linux, but OS X is also segfaulting on some other, similar sample data.

adler32 optimization

I've tested a few ways to optimize the 65521 mod operation. As evidenced by http://jsperf.com/mod-65521 replacing the mod with a bit-based approximation is significantly faster on safari, comparable in firefox and chrome, and about 10% faster in node 0.10.29.

Also:

// Set limit ~ twice less than 5552, to keep
// s2 in 31-bits, because we force signed ints.
// in other case %= will fail.

How did you arrive at that logic? As per my calculation, the upper threshold can be boosted to 3854 without overflowing a 31-bit integer:

(* Run this in Mathematica *)
F[n_] := Reduce[x*(x + 1)*n/2 + (x + 1)*(65521) < (2^31 - 1) && x > 0, x, Integers]
F[255] (*  x \[Element] Integers && 1 <= x <= 3854 *)
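
A quick cross-check of that bound in JavaScript (a sketch mirroring the inequality in F[255] above, not pako code):

function maxRun() {
  for (var x = 1; ; x++) {
    // worst case: every byte is 255; s2 grows by s1 at each step
    var s2 = x * (x + 1) * 255 / 2 + (x + 1) * 65521;
    if (s2 >= 0x7FFFFFFF) return x - 1; // first x that overflows 31 bits
  }
}
console.log(maxRun()); // 3854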

Terrible performance on IE11

Deflating in IE11 has terrible performance: IE11 takes about 40 times longer than the other popular browsers. I compressed the same block of 25MB of highly compressible data and measured the results:

Windows v8.1 (Dell Latitude M4500)

  • Chrome 38.0.2125.101m - 1037 ms
  • Internet Explorer 11 - 39783 ms
  • Firefox 32.0.3 - 930 ms
  • Opera 24.0.1558.64 - 1162 ms

OS X v10.9.5 Mavericks (Apple MacBook Pro, late 2011, 2.2 GHz)

  • Chrome 38.0.2125.101m - 1128 ms
  • Safari 7.1 (9537.85.10.17.1) - 827 ms
  • Firefox 32.0.3 - 1248 ms
  • Opera 24.0.1558.64 - 940 ms

Node stream support?

This library looks like it supports Node streams, but it doesn't actually. It is also hard to build stream support on top of it, because the last call to push has to carry the last chunk, while Node streams usually end after emitting their last chunk.

This could be simplified by separating .push(chunk, false) and .push(chunk, true) into .push(chunk) and .end(opt_chunk). Then it would be relatively easy to write a stream wrapper for this library.

Unless I'm just doing things wrong? Or is there a way to write an empty value to .push?

I would ideally want this to work:

var binaryCSV = require('binary-csv')
var createReadStream = require('filereader-stream');
createReadStream({ some gzipped html5 file object })
  .pipe(pako.inflator())
  .pipe(binaryCSV())
  .on('data', function(line) { console.log('line', line) });
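
A rough sketch of such a wrapper as it would look today, using Node's Transform streams and pako's documented push/onData API (this is not part of pako; the inflator name is illustrative):

var Transform = require('stream').Transform;
var pako = require('pako');

function inflator() {
  var inf = new pako.Inflate();
  var stream = new Transform({
    transform: function (chunk, encoding, callback) {
      inf.push(chunk, false); // feed pako without ending the zlib stream
      callback();
    },
    flush: function (callback) {
      inf.push(new Uint8Array(0), true); // empty final chunk forces Z_FINISH
      callback(inf.err ? new Error(inf.msg) : null);
    }
  });
  inf.onData = function (chunk) { stream.push(Buffer.from(chunk)); };
  return stream;
}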

BOM is not removed when converting inflated output to string

It looks like pako ignores the BOM and leaves it in place when converting the inflated buffer to a string. I think it would be better if the BOM were either removed automatically, or at least there were an option to do this, because it's a source of hard-to-catch bugs, where things sometimes look different from what they really are. Please run the following code snippet to see what I mean.

// deflate UTF-8 encoded "hello" with BOM mark
var buf = pako.gzip([0xEF, 0xBB, 0xBF, 0x68, 0x65, 0x6C, 0x6C, 0x6F]);

// inflate it again
var str = pako.inflate(new Uint8Array(buf), {to: 'string'});

console.log('The result string looks like this: "' + str + '"' );
console.log('The first character is: 0x' + str.charCodeAt(0).toString(16) +
  ' and not 0x' + "h".charCodeAt(0).toString(16) + '("h") as you would expect');
// Produced output:
//
// The result string looks like this: "hello"
// The first character is: 0xfeff and not 0x68("h") as you would expect
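
Until pako handles this itself, a one-line workaround (a sketch): strip a leading U+FEFF from the decoded string yourself.

var str = pako.inflate(new Uint8Array(buf), { to: 'string' });
if (str.charCodeAt(0) === 0xFEFF) str = str.slice(1); // drop the BOM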

Inflate implementation

Hey -- I have a hand-written inflate implementation that I wrote for js-git over here. It might not match your style exactly, but I was wondering if it would be of use to you?

readme

Missing closing '}' for try block

try {
var result = pako.inflate(compressed);
catch (err) {
console.log(err);
}
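
For reference, the corrected block from the README's inflate example (closing brace added before catch):

try {
  var result = pako.inflate(compressed);
} catch (err) {
  console.log(err);
}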

inflateRaw on nodejs : RangeError: Offset/length out of range

See Stuk/jszip#126 for the original file.

On nodejs v0.10.26, with pako 0.1.1 and pako's master, I get an exception when using inflateRaw on compressed content.

The compressed content is here : https://github.com/dduponchel/jszip/blob/issue126-file/sheet4.data
Or, with JSZip, you can get it with :

var zip = new JSZip(fs.readFileSync('./apachepoi_46535.xlsx'));
zip.files['xl/worksheets/sheet4.xml']._data.getCompressedContent();

I get the following stack trace :

> pako.inflateRaw(content);
RangeError: Offset/length out of range.
    at Object.fnTyped.arraySet (/home/dduponchel/projects/pako/lib/utils/common.js:42:12)
    at Object.inflate (/home/dduponchel/projects/pako/lib/zlib/inflate.js:910:13)
    at Inflate.push (/home/dduponchel/projects/pako/lib/inflate.js:198:27)
    at inflate (/home/dduponchel/projects/pako/lib/inflate.js:327:12)
    at Object.inflateRaw (/home/dduponchel/projects/pako/lib/inflate.js:347:10)
    at repl:1:7
    at REPLServer.self.eval (repl.js:110:21)
    at Interface.<anonymous> (repl.js:239:12)
    at Interface.EventEmitter.emit (events.js:95:17)
    at Interface._onLine (readline.js:202:10)

Improve test coverage

Check coverage reports & make sure all code paths/branches are covered.

Right now some important conditions are missed (probably chunk-related).

consider changing var names

next_out, pending_out and next_in are not pointers in this port, and that can confuse people about their nature. I think it would be better to use something like:

  • input / in
  • output / out
  • pending_out_index

  • consider renaming
  • check other vars with strange names
  • update comments for variable definitions

gzip function makes browser unresponsive

I am calling the gzip function and passing it plain old JSON. When I do, the browser freezes. Looking in Activity Monitor, the Chrome Helper process is using 100% CPU.

node.js 0.11 - some inflate benchmarks crash on level 0

Seems to NOT be a pako bug; it's caused by the previous test plus a v8 GC bug in typed arrays.

  1. It happens only on level 0 and only in the benchmark with a typed array.
  2. If the imaya-inflate test is deleted, the pako-inflate tests run OK.

Investigate what happens.

decompression to binary string

Here is how I compress binary string (char codes above 255) with pako:

var charData = xhr.responseText.split('').map(function(x){return x.charCodeAt(0);});
var binData = new Uint8Array(charData);
var data = pako.deflate(binData, {level:"9"});

Here is how I decompress data back:

var data2 = pako.inflate(xhr.responseText);

Now, how can I get the original string in JavaScript from this object?
I tried methods like these:

A.
pako.inflate(xhr.responseText, {to:"string"});
B.
String.fromCharCode.apply(null, data2);
C.
for (var i = 0, l = data2.length; i < l; i++) { result += String.fromCharCode(parseInt(array[i], 2)); }

All of these methods yield different data as a JavaScript string than the original.
When I save the decompressed pako.inflate(xhr.responseText) to a file (using functions with a.download), the decompressed file has exactly the same bytes as the original (so compression and decompression work correctly, without any byte modification).
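
The mangling happens because xhr.responseText decodes the response as text. A sketch of the usual fix: request raw bytes instead, then rebuild a binary string from the inflated output in chunks (to stay under apply() argument limits). The url variable is a placeholder:

var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.responseType = 'arraybuffer'; // get untouched bytes, not decoded text
xhr.onload = function () {
  var data = pako.inflate(new Uint8Array(xhr.response));
  var result = '';
  for (var i = 0; i < data.length; i += 0x8000) {
    result += String.fromCharCode.apply(null, data.subarray(i, i + 0x8000));
  }
  // result is a binary string: each char code equals the original byte
};
xhr.send();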

deflate fails on provided sample

when the provided sample is placed into fixtures, multiple tests fail.


I tried pako on this file (a git index of the vim plugin YouCompleteMe) with the following script :

var fs = require("fs");
var pako = require("pako");

var content = fs.readFileSync("index");

var u8 = new Uint8Array(content.length);
for(var i = 0; i < content.length; i++) {
  u8[i] = content[i];
}

var result = pako.inflateRaw(
  pako.deflateRaw(u8)
);

pako.inflateRaw doesn't seem to finish. It might come from deflateRaw: the result doesn't match the output of zlib.deflateRaw.

Error decoding content

I am trying to upload some gzipped JSON content to S3 and keep running into an ERR_CONTENT_DECODING_FAILED error, which suggests something is awry with the encoding. Hopefully there is something straightforward that I've missed.

Here is what I'm using on the client:

      $upload.http({
        method: 'PUT',
        headers: {'Content-Type': 'application/json', 'Content-Encoding': 'gzip'},
        url: response.s3URL,
        data: pako.gzip(JSON.stringify(networkForUpload), {to: 'string'})
      }).success(function(data) {
       //Do more stuff
      });

And the server to get the signed url:

      s3.getSignedUrl('putObject', {
        Bucket: config.bucketNames.uploadedNetworks,
        Expires: 300,
        Key: network._id.toString(),
        ContentType: 'application/json',
        ContentEncoding: 'gzip',
        ACL: 'public-read'
      }, function(err, data) {
      //Do more stuff
      });
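
A sketch of a likely fix: keep the gzip output as raw bytes and send a Blob, so the HTTP layer can't re-encode the binary payload as UTF-8 text ($upload, networkForUpload and response.s3URL are from the snippet above):

      $upload.http({
        method: 'PUT',
        headers: {'Content-Type': 'application/json', 'Content-Encoding': 'gzip'},
        url: response.s3URL,
        data: new Blob([pako.gzip(JSON.stringify(networkForUpload))])
      }).success(function(data) {
       //Do more stuff
      });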

Better buffer management on small chunks in wrappers

Right now onData is emitted whenever any output data is available, even if the buffer is not full. That's not critical, but it could be done better: create a new buffer only when the previous one is full, on flushes, and on finish.

Note that inflate needs additional checks, because inflate_fast requires at least 6 bytes available in the output buffer.

Release 0.2 planning

  • check coverage report, add the rest of tests if necessary (errors, especially in deflate)
  • gzip custom headers write
  • cleanup structures (useless fields in gzip header and so on)
  • review all commented-out code lines, and add an explanation of why they are not needed.
  • rename variables (see #19)
  • surrogates support for strings
  • cleanup strange API methods like deflatePrime and others

Release 0.1.0 planning

deflate:

  • Z_FIXED strategy (postponed)
  • gzip crc

Inflate:

  • implement
  • fix bugs / skipped tests (ungzip)
  • try to optimize code tables - replace code class stubs with Int32Array magic

Other:

  • review zstream.message, try to unify message get/use between inflate/deflate
  • docs
    • inflate
    • inline examples
  • check if it's worth moving rarely used functions to a separate place, to minimize browser lib size (postponed, unclear for now).
  • bower (browser versions in dist folder)
  • ??? move constants definitions locally (can reduce browserified size & speed)

String/Buffer support?

Not sure if that's really needed. It could be convenient for packing/unpacking JSON, for example. Possible cases:

  • deflate input:
    • ucs2 string (for example, to pack json) - can be converted to utf8 binary string
    • "binary" string (each char is [0...255])
  • deflate output:
    • (?) binary string
    • (?) base64 string
  • inflate input:
    • (?) binary string
  • inflate output:
    • binary string
    • ucs2 string (for example, to unpack json)
    • (?) base64 string

If anyone is strong in client-side programming and can explain why string conversions would be good in this lib - share your knowledge and examples in this ticket.

Revisit supported modes for inflate/deflate wrappers

Inspired by #34. The current docs for the external API are not precise, and can make users believe that pako's high-level interface supports all the modes that can be used via zlib directly. That's not true.

We should review the docs and decide how to fix them, and provide correct recommendations for when it's better to use the low-level zlib API (./lib/zlib) directly with a custom wrapper:

  1. Add the missing modes, if that's simple and won't make the wrapper code completely mad.
  2. Fix the docs to state clearly which modes are supported and which are not.

Progress bar (feature request)

I found no way to make pako report its progress. That's an issue, because the browser lags for several seconds while the page is being loaded. Could you add some callbacks, please?

utf-8 Chinese string

Hi,
I ran into a problem when using gzip to decompress strings: when I decompress a UTF-8 Chinese string, the result is wrong.
I use ajax to exchange data, and xhr.responseText is gzipped by the server, so I need to decompress the data in the browser.
Can you give me some advice?
Thanks :)
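
One way to do this (a sketch): request the body as an ArrayBuffer instead of reading xhr.responseText, then let pako decode gzip and UTF-8 in one step with the to: 'string' option. The url variable is a placeholder:

var xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.responseType = 'arraybuffer'; // avoid text decoding of gzip bytes
xhr.onload = function () {
  var text = pako.ungzip(new Uint8Array(xhr.response), { to: 'string' });
  // text is now a proper UTF-16 JS string, Chinese characters intact
};
xhr.send();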

Certain patterns disable v8 optimization

First off: very awesome library!

FYI: to see these de-optimizations, in node you can run a command with the --trace_opt and --trace_deopt flags.

In trying to chase down performance problems in https://github.com/SheetJS/js-xlsx, I noticed one bad pattern that can easily be improved:

[disabled optimization for inflate, reason: SwitchStatement: non-literal switch label]

Taking a gander at the relevant file, it seems like you could replace the state names with their corresponding values plus comments. For example, the head state in the switch block would look like:

    switch (state.mode) {
    case 1: /* STATE_HEAD */

That way, you can satiate v8 and simplify grepping for the state values.

Using "window" as a local variable inside a closure

Hey ya,

I've noticed that you declare "window" and use it throughout the library, but this interferes with the external global "window" when instrumenting code for coverage stats. I replaced "window" in my clone of the repo with the variable name "sWindow" (though obviously any name would do) and the library still worked, without interfering with code coverage.

FYI, I'm using the library via browserify to compile all the project JS and modules into a single JS file for browser consumption. After that, a qunit process running in grunt goes over the whole thing and checks it against unit tests and code coverage.

causes ReferenceError in chrome 38

using as part of browserify-zlib.

inftrees.js L170 Uncaught ReferenceError: Invalid left-hand side expression in prefix operation,

Reproduce by simply creating an index.js containing:

var zlib = require('zlib')

running:

browserify -d index.js > bundle.js

and dropping the script tag into an HTML page and loading it.

fix docs

The API docs for inflate and inflateRaw currently say that data is “input data to compress”.

Aren't they decompressing?

Please provide (minimal) client-side examples

In my case, I want to unzip some gzipped strings.
What are the minimum steps I need to do this client side / browser?

I don't understand which files I need.
I would expect just dist/pako_deflate.min.js, but it requires files not in the dist directory.

Please provide a minimal example. Cooler still would be examples for decoding too in the README.md - not just for me, but for the other people who stumble across this interesting module.

PS: I am using RequireJS, if that makes things easier.

Care about custom GZIP headers

Currently, custom gzip headers can not be written or loaded. That doesn't seem to be needed much in practice, but it can be done - especially for inflate.

Request for simple crc32 API

Some use cases require crc32 calculation separately of the Deflate algorithm (for example when writing PNG image files).

Since pako includes crc32 functionality, it would be nice to have an API to access it. One simple function, crc32(crc, buf, pos, len) (I would change the parameter order from the current internal function, putting pos before len), would be sufficient. The size of the minified pako files should only be very slightly affected.

Of course, implementing this functionality is quite trivial, but it would be more convenient if Pako's built-in function could be used.
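
Until such an API is exposed, a self-contained, table-driven CRC-32 with the proposed parameter order (a sketch, independent of pako's internal implementation):

var crcTable = (function () {
  var table = new Int32Array(256);
  for (var n = 0; n < 256; n++) {
    var c = n;
    for (var k = 0; k < 8; k++) {
      c = (c & 1) ? (0xEDB88320 ^ (c >>> 1)) : (c >>> 1);
    }
    table[n] = c;
  }
  return table;
})();

// zlib-style: pass 0 as crc for a fresh checksum, or the previous
// return value to continue across chunks.
function crc32(crc, buf, pos, len) {
  crc = crc ^ -1;
  for (var i = pos; i < pos + len; i++) {
    crc = (crc >>> 8) ^ crcTable[(crc ^ buf[i]) & 0xFF];
  }
  return (crc ^ -1) >>> 0;
}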

Pako is great!

Z_SYNC_FLUSH not working

I have a use case where I want to build a TCP binary protocol for internal use. The protocol has a lot of repeated values eating up bandwidth, so it would be great if I could apply a generic deflate/inflate over the stream.

I will be sending many smallish messages (< 1k) and I need the data to flush right away. But it is a stream, and I want to reuse symbols from earlier in the stream. Looking at the options, Z_SYNC_FLUSH appears to be exactly what I need, but it's not working. I can't get Deflate to emit onData at all unless I send Z_FINISH - but then the stream is dead and I can't send later messages.

var Deflate = require('pako').Deflate;
var Z_SYNC_FLUSH = require('pako').Z_SYNC_FLUSH;

var d = new Deflate();
d.onData = function (chunk) {
  console.log("onData", chunk);
};
d.push("some data", Z_SYNC_FLUSH);
d.push("some more data", Z_SYNC_FLUSH);

If you run this code in node, the onData handler never fires. It only outputs data if I push with Z_FINISH.

Segmentation fault on inflator.push

Running the provided test suite causes a segmentation fault. Tested on OS X 10.10 GM Candidate with node v0.10.31.


  Small input chunks
    ✓ deflate 100b by 1b chunk 
    ✓ deflate 20000b by 10b chunk (98ms)
    ✓ inflate 100b result by 1b chunk 
    ✓ inflate 20000b result by 10b chunk (99ms)

  Dummy push (force end)
    ✓ deflate end 
[1]    36807 segmentation fault  mocha

The line that causes it is inflator.push(data);, inside Dummy push (force end) > inflate end.

Inside the lib, this line seems to cause the error: status = zlib_inflate.inflate(strm, c.Z_NO_FLUSH);.

I have been debugging with console.log instead of firing up a real debugger, and I can't make sense of what I found in inffast:

do {
  console.log(1);
  //...
  console.log(2);
} while (_in < last && _out < end);
console.log(3);

This prints 2 and then dies.

Pako fails inflate on concatenated gzip files

Use inflate.push to feed data to a pako inflate stream from a gzip file created by concatenating several gzip files together. Pako fails when you push data past the end of the first gz member, even though such files inflate fine in zlib, etc. Inflating the file outside of pako and re-deflating it produces a file that pako can inflate correctly.

To reproduce:

  1. Create two different gzip files and then concatenate them together.
  2. Try to inflate in Pako using inflate.push with file slices smaller than the total files size.
  3. When you try to push another slice onto the inflate stream PAST the first concatenated gz member, the push method returns false with no error code.

JSFiddle of example code that fails on concatenated gzip files:
http://jsfiddle.net/jlovell19/d3km6db6/5/

Per the RFC:
"A gzip file consists of a series of "members" (compressed data sets). [...] The members simply appear one after another in the file, with no additional information before, between, or after them."

Advanced usage of gzip explaining concatenated gzip files:
http://www.gnu.org/software/gzip/manual/gzip.html#Advanced-usage

I can provide a sample file if needed.

Don't force/allow uint32 when not needed

  1. We use >>> for right shifts for safety, but that's required only for 32-bit values. In other places it can cause unnecessary boxing.
  2. (?) += for bit operations can cause overflow. Maybe it should be replaced with |=, or |0 forced on the result for 32-bit vars.
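
A tiny illustration of point 1 (an assumed example, not pako code):

var x = -1;
x >>> 0; // 4294967295 - outside the int32 range, may be boxed as a heap number
x | 0;   // -1         - stays a 32-bit signed small integer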

RangeError: Size is too large (or is negative)

I am getting this on BB10 and in some cases on iOS Safari (when the app is launched from the Home Screen). It seems like there is some limit on how many buffers can be processed at the same time.

Any idea how to fix it?

Download of the latest release failed.

Hi,
I ran into an issue where I could not install pako using npm.
The first error I got was:

  • error shasum check failed for C:\Users\mgaert\AppData\Local\Temp\npm-8108-77644fd9\registry.npmjs.org\pako-\pako-0.2.5.tgz
  • error Expected: 36df19467a3879152e9adcc44784f07d0a80c525
  • error Actual: c469fd3ea0829492caf628d208b6094e78063f05

Second, I could not download it manually from git either.

Info:
I am using npm v2.1.18 and node.js v0.10.35

Regards
Michael

Binary gzip string not in correct gzip format

var output = pako.gzip("{test:'hello world'}", { to: 'string' });

fs.writeFile("./gzip-test.txt", output, function(err) {
  if(err) {
    return console.log(err);
  }

  console.log("The file was saved!");
});

Produces this output in a file:

1F C2 8B 08 00 00 00 00 00 00 03 C2 AB 2E 49 2D 2E C2 B1 52 C3 8F 48 C3 8D C3 89 C3 89 57 28 C3 8F 2F C3 8A 49 51 C2 AF 05 00 37 C2 90 12 C3 89 14 00 00 00

According to this:

http://en.wikipedia.org/wiki/Gzip

The output should start with 1F 8B, but it has a C2 in there; otherwise it looks good. I am trying to gzip a big block of JSON in the browser and send it to the backend. The Java backend throws an error saying that this is not in gzip format.
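
A sketch of the usual fix: skip to: 'string' and write the bytes directly, so fs doesn't UTF-8-encode chars >= 0x80 (that's where the stray C2 bytes come from):

var output = pako.gzip("{test:'hello world'}"); // Uint8Array

fs.writeFile("./gzip-test.txt", Buffer.from(output), function(err) {
  if(err) {
    return console.log(err);
  }

  console.log("The file was saved!");
});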

Error on Safari iOS 5.1

Hello,

First of all, thank you for your great work.
I have an issue on iOS 5.1; the error is:
"[object Uint8Array]" is not a valid argument for "Function.prototype.apply"

I can resolve it if STR_APPLY_OK is set to false, so I've replaced:
try { String.fromCharCode.apply(null, [0]); } catch( __ ) { STR_APPLY_OK = false; }
with
try { var test = new Uint8Array(); String.fromCharCode.apply(null, test); } catch (__) { STR_APPLY_OK = false; }
and now it's OK (STR_APPLY_OK is still set to true in modern browsers).

Hope it can help.

Thank you

Guillaume

Reduced build for raw deflate/inflate?

I don't know if it's really worth spending time on shrinking the lib size. But if someone is interested:

  1. Let me know about your use case.
  2. Manually cut out the unused code and report how many kilobytes were saved in the minified+gzipped js.
