evanoxfeld / node-unzip
node.js cross-platform unzip using streams
License: MIT License
Error: invalid signature: 0xcb633d21
at /var/www/xxx/unzip/lib/parse.js:63:13
at process._tickCallback (node.js:415:13)
Feature request: perhaps emit a bomb or error event when encountering a zip bomb.
Hey,
I am using your library and tracked this error all the way through your library to Node's zlib. The actual error was:
throw er; // Unhandled 'error' event
^
Error: invalid code lengths set
at Zlib._binding.onerror (zlib.js:295:17)
Process finished with exit code 8
It makes it through the first 3 of the 5 files in the zip, but then I keep getting this invalid code length error on a file named "WIDMIR - FUTURE HVAC FUME HOOD AND SILENCER LOCATIONS - 2nd Floor Addendum 4.pdf".
It's a long name, but no crazy characters. I am not sure if the problem is in the file body or elsewhere. The actual error has no stack trace, but investigating took me here:
https://github.com/uber/nodejs/blob/master/deps/zlib/inflate.c#L869
I can see that it is somehow getting past the following:
https://github.com/EvanOxfeld/node-unzip/blob/master/lib/parse.js#L136
It doesn't look like many other people are experiencing this issue, and I'm not great with the C language. Any help would be much appreciated.
Thanks!
Using the example in the README, I get undefined for the value of entry.size.
Is that a known issue? Couldn't find anything in any other issue.
RT
If the file name contains Chinese characters, the file name is broken after extraction.
I'm piping a request stream into the zip stream, and getting duplicate entry logs:
dist/
dist/css/
dist/css/bootstrap-theme.css
dist/css/bootstrap-theme.min.css
dist/css/bootstrap.css
dist/css/bootstrap.css
dist/css/bootstrap.min.css
dist/css/bootstrap.min.css
dist/fonts/
dist/fonts/glyphicons-halflings-regular.eot
dist/fonts/
dist/fonts/glyphicons-halflings-regular.eot
dist/fonts/glyphicons-halflings-regular.svg
dist/fonts/glyphicons-halflings-regular.svg
dist/fonts/glyphicons-halflings-regular.ttf
dist/fonts/glyphicons-halflings-regular.woff
dist/js/
dist/js/bootstrap.js
dist/js/bootstrap.min.js
var fileCnt = 0;
var fileComplete = function () {
  fileCnt--;
  if (fileCnt === -1) done();
};

outPipe = unzip.Parse()
  .on('entry', function (entry) {
    if (entry.type !== 'File') return entry.autodrain();
    console.log(entry.path);
    fileCnt++;
    var outPath = path.resolve(outDir, entry.path);
    mkdirp(path.dirname(outPath), function (err) {
      entry.pipe(
        fs.createWriteStream(outPath)
          .on('finish', fileComplete)
      );
    });
  })
  .on('close', fileComplete);

request({
  uri: 'https://github.com/twbs/bootstrap/releases/download/v3.0.0/bootstrap-3.0.0-dist.zip',
  headers: { 'accept': 'application/octet-stream' },
  strictSSL: false
})
  .on('response', function (res) {
    res.pause();
    res.pipe(outPipe);
    res.on('error', err);
    res.resume();
  })
  .on('error', err);
The finish event never fires for each file, resulting in the operation timing out. I also tried using the direct unzip stream, and it stalls too.
This module depends on fstream, which depends on graceful-fs, which monkey-patches fs. The decision to use graceful-fs should be left to the application, not to a single module it depends on.
It looks like this project has been abandoned, as there have been no updates for almost a year.
Since 0.1.6 there have been a lot of errors like this. One of the packages that has this issue is:
http://reality.sgiweb.org/maxw/tmp/bower/Open_Sans.zip
With 0.1.5 it works with the above zip file. It also works with common OS tools for extracting zip files. Unfortunately, 0.1.5 is not compatible with node 0.10.x.
JetBrains TeamCity can pack artifacts as zip files. I'm trying to unzip a file from my build artifacts with unzip. It's failing with this error:
Error: invalid signature: 0x47d68d75
at path\to\unzip\unzip\lib\parse.js:63:13
at process._tickCallback (node.js:415:13)
BTW, "bower install" fails on such zip files too, as it uses unzip.
The Zip file itself is OK. It can be unpacked with WinRAR and official PKZIP from PKWARE without any errors.
When I created new zip files with WinRAR and with PKWARE's PKZIP, unzip fails on them in another way:
(node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
RangeError: Maximum call stack size exceeded
What ZIP format does the module support? It seems to be incompatible with official PKZIP.
Hi there,
I'm trying to extract files one by one through the entry callback and build up an array of files as each one is extracted.
Once the last entry has been extracted, I want to call my callback and pass the entries to it. 'close' seems to fire before I finish extracting my files.
Is there a callback that fires once all of the entries have finished extracting, or do I need to implement something myself?
The code looks like the below:
fs.createReadStream(Card.getSrcOrigin())
  .pipe(unzip.Parse())
  .on('entry', function (entry) { _self._extractEntry(entry, Card); })
  .on('close', oncomplete)
  .on('error', function () {
    console.log(arguments);
  });

/**
 * Extracts an entry from an unzip callback entry
 * @param {entry} entry - A file within the zip file
 * @param {Card} Card - The current Card that is being extracted
 */
this._extractEntry = function (entry, Card) {
  if (!this._validate.entry(entry)) {
    return;
  }
  var targetFile = Card.getCode() + '-' + uuid.v4() + path.extname(entry.path);
  targetFile = path.join(destination, targetFile);
  entry.pipe(fs.createWriteStream(targetFile))
    .on('close', function () {
      Card.addFile(targetFile);
    });
};
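For the question above, one common approach is a small pending counter: increment when an entry starts piping, decrement when its write stream closes, and fire the final callback only once the parser has closed and the count is zero. A sketch under those assumptions (makeTracker and its method names are hypothetical, not part of node-unzip):

```javascript
// Tracks in-flight entry extractions so a final callback can run only
// after the parser has closed AND every write stream has finished.
function makeTracker(oncomplete) {
  var pending = 0;
  var closed = false;
  function check() {
    if (closed && pending === 0) oncomplete();
  }
  return {
    start: function () { pending++; },            // call in the 'entry' handler
    finish: function () { pending--; check(); },  // call on the write stream's 'close'
    close: function () { closed = true; check(); } // call on the parser's 'close'
  };
}
```

Wired into the code above, _extractEntry would call tracker.start() before piping and tracker.finish() in the write stream's 'close' handler, while the parser's 'close' handler calls tracker.close().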
I downloaded the zip file from: http://geolite.maxmind.com/download/geoip/database/GeoLite2-City-CSV.zip
I wrote this script to extract it:
#!/usr/bin/env node
var fs = require('fs');
var unzip = require('unzip');
var reader = fs.createReadStream('GeoLite2-City-CSV.zip');
var parser = unzip.Parse();
reader.on('error', function (error) {
throw error;
});
parser.on('error', function (error) {
throw error;
});
parser.on('entry', function (entry) {
console.log(entry.type, entry.path);
entry.autodrain();
});
parser.on('close', function () {
console.log('done');
});
reader.pipe(parser);
// vim:set sw=4 et:
Which yields these strange errors:
File GeoLite2-City-CSV_20140401/GeoLite2-City-Locations.csv
File GeoLite2-City-CSV_20140401/COPYRIGHT.txt
File GeoLite2-City-CSV_20140401/LICENSE.txt
File GeoLite2-City-CSV_20140401/LICENSE.txt
File GeoLite2-City-CSV_20140401/LICENSE.txt
File GeoLite2-City-CSV_20140401/LICENSE.txt
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
File GeoLite2-City-CSV_20140401/GeoLite2-City-Blocks.csv
/Users/herzi/Documents/Workspace/geoip/unzip.js:14
throw error;
^
Error: too many length or distance symbols
at Zlib._binding.onerror (zlib.js:295:17)
It looks like parsing this file fails. OS X 10.9's unzip on the command line works fine.
Piping a zip stream directly from archiver results in a series of invalid signature errors. It seems as if all bytes in the stream are checked for a signature. On the other hand, the same zip is extracted correctly when going through file streams first.
See @ctalkington's comments in archiverjs/node-archiver#91. Also, here's a test case.
Try to unzip the file http://www.feedbooks.com/userbook/14338.epub.
It extracts only some files, with partial content.
I've got a script that downloads zips from github and unzips them into a temporary directory.
On machines with 4 GB of RAM or less, it crashes on any file over 2 MB.
https://github.com/gomobile/sample-counting-beads - that is a sample that I'm getting the zip from.
Below is the message I am seeing. The signature changes each time, but it happens on Mac, Linux, and Windows. On my MacBook Pro with 8 GB of RAM it works fine (I've pulled down 20 MB files).
Error: invalid signature: 0x7d70fadf
at C:\my_project\node_modules\unzip\lib\parse.js:1:1127
at process._tickCallback (node.js:351:13)
require('request')('https://github.com/yui/pure/releases/download/v0.3.0/pure-0.3.0.zip')
  .pipe(unzip.Parse())
  .on('entry', function (entry) {
    if (entry.type === 'Directory') {
      console.log('directory found!');
    }
  });
There's no Directory entry found.
The parser starts reading from the beginning of the file, and assumes that the necessary information will be contained within the Local File Header records found starting there. This isn't a valid assumption, since many zip files store information such as the compressed and uncompressed sizes in the Central Directory File Header instead. Since the parser doesn't check for a zero compressed size, it tries to decompress zero bytes, and then assumes that the next Local File Header starts at the current position, resulting in an "invalid signature" error.
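For reference, the check that produces these "invalid signature" messages is essentially a four-byte magic comparison. A sketch of that check, assuming little-endian reads; 0x04034b50 is the Local File Header signature ("PK\x03\x04") from the zip specification, and checkLocalHeader is a hypothetical name:

```javascript
// The documented magic number that opens every Local File Header.
var LOCAL_FILE_HEADER = 0x04034b50;

// Read four bytes at the given offset and verify they are a Local File
// Header signature; a parser that lands at the wrong offset reads
// unrelated bytes here and reports them as an "invalid signature".
function checkLocalHeader(buf, offset) {
  var signature = buf.readUInt32LE(offset);
  if (signature !== LOCAL_FILE_HEADER) {
    throw new Error('invalid signature: 0x' + signature.toString(16));
  }
  return true;
}
```

This is why the reported signatures (0xcb633d21 and friends) vary from file to file: they are whatever bytes happen to sit at the mispredicted offset.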
Should the readme have a var fstream = require('fstream') in it?
For this zip file, unzip.Extract never fires error nor close, which causes my process to hang. Setting verbose: true gives us the following output:
Archive: /home/ozten/apk-factory-service/tmp/application.zip
inflating: META-INF/zigbert.rsa
inflating: 404.html
Using the unzip command line on Ubuntu successfully decompresses this archive and there are many more entries.
When trying to unzip a large archive (about 45 MB that should expand to about 100 MB), it fails with this output:
events.js:72
throw er; // Unhandled 'error' event
^
Error: invalid distance too far back
at Zlib._binding.onerror (zlib.js:295:17)
Any idea what this means?
I got this error even though my zip is definitely valid.
Referencing issue nodejs/node-v0.x-archive#6384 here, as it started with me trying to use unzip with a GitHub repo release zipfile.
Summary: any release zipfile from a GitHub repository will not work correctly by just piping the stream to unzip, e.g. as per the example fs.createReadStream('path/to/archive.zip').pipe(unzip.Extract({ path: 'output/path' }));
As tjfontaine demonstrated in his response to that issue, the signature has to be read from the stream first, before piping anything to unzip.
More background on the signature: https://users.cs.jmu.edu/buchhofp/forensics/formats/pkzip.html
Here's an example: http://jqueryui.com/resources/download/jquery-ui-1.10.0.custom.zip
CPU is at 100%.
Please note that in older versions it was giving me a signature error; after upgrading, I'm getting this behaviour instead.
This bug has been reported in twitter/bower: bower/bower#225
I don't know if this is related to #8; it probably is.
Please try this file:
https://www.dropbox.com/s/b9u4b431q0rpc4k/test.zip
It will throw this error:
Fatal error: invalid block type
Right now, node-unzip depends on node-pullstream 0.0.2, whereas the latest version is 0.0.4. With 0.0.2, node-unzip crashes with a stack overflow on large archives. It would be great if you could update the dependency and publish an updated package to npm.
{ [Error: invalid distance code] errno: -3, code: 'Z_DATA_ERROR' }
Currently this library is unusable for me because it reports entry.size as undefined and silently ignores files (the 'entry' event is not emitted) that are larger than a few kilobytes. I'm using the second piece of example code to do this. Calling unzip on the command line works until this is fixed.
Running npm test
using node-0.11.13-linux-x64 causes the tests to fail due to timeouts, during which one of the cores is pegged at 100%.
npm test
> [email protected] test /path/to/node-unzip
> tap ./test/*.js
not ok ./test/compressed.js ............................. 0/1
Command: "node" "compressed.js"
TAP version 13
not ok 1 ./test/compressed.js
---
exit: ~
timedOut: true
signal: SIGTERM
command: "node" "compressed.js"
...
1..1
# tests 1
# fail 1
not ok ./test/fileSizeUnknownFlag.js .................... 0/1
Command: "node" "fileSizeUnknownFlag.js"
TAP version 13
not ok 1 ./test/fileSizeUnknownFlag.js
---
exit: ~
timedOut: true
signal: SIGTERM
command: "node" "fileSizeUnknownFlag.js"
...
1..1
# tests 1
# fail 1
not ok ./test/pipeSingleEntry.js ........................ 1/2
Command: "node" "pipeSingleEntry.js"
TAP version 13
ok 1 should be equal
not ok 2 ./test/pipeSingleEntry.js
---
exit: ~
timedOut: true
signal: SIGTERM
command: "node" "pipeSingleEntry.js"
...
1..2
# tests 2
# pass 1
# fail 1
not ok ./test/uncompressed.js ........................... 0/1
Command: "node" "uncompressed.js"
TAP version 13
not ok 1 ./test/uncompressed.js
---
exit: ~
timedOut: true
signal: SIGTERM
command: "node" "uncompressed.js"
...
1..1
# tests 1
# fail 1
total ................................................... 1/5
not ok
npm ERR! Test failed. See above for more details.
npm ERR! not ok code 0
One of the files in my zip is a JSON file. I would like to stream in the zip and, when I get to this file, pull the contents directly into a string rather than writing to a file and then reading it back.
Is that sort of thing possible?
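This should be possible: each entry handed to the 'entry' handler is a readable stream, so its chunks can be buffered in memory instead of piped to disk. A sketch, with collectString as a hypothetical helper:

```javascript
// Buffer a readable stream's chunks and deliver them as one UTF-8 string.
// Suitable for small files such as a JSON manifest inside a zip.
function collectString(readable, callback) {
  var chunks = [];
  readable.on('data', function (chunk) { chunks.push(chunk); });
  readable.on('error', callback);
  readable.on('end', function () {
    callback(null, Buffer.concat(chunks).toString('utf8'));
  });
}
```

Inside the 'entry' handler one could test entry.path for a .json extension, call collectString(entry, cb) on a match, and entry.autodrain() everything else.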
There are some issues when I unzip a zip file (including two docx files) on Mac OS.
The error looks like:
events.js:71
throw arguments[1]; // Unhandled 'error' event
^
TypeError: Invalid non-string/buffer chunk
at chunkInvalid (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_readable.js:341:10)
at readableAddChunk (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_readable.js:128:12)
at Match.Readable.push (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_readable.js:115:10)
at Match.Transform.push (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_transform.js:141:32)
at Match._matchFn (/Users/twer/project/opensource/vital-signs/node_modules/unzip/lib/parse.js:161:31)
at Match._transform (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/match-stream/match.js:39:10)
at Match.Transform._read (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_transform.js:180:10)
at Match.Transform._write (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_transform.js:168:12)
at doWrite (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_writable.js:211:10)
at writeOrBuffer (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_writable.js:201:5)
at Match.Writable.write (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_writable.js:172:11)
at write (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_readable.js:550:24)
at flow (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_readable.js:559:7)
at Readable.pipe (/Users/twer/project/opensource/vital-signs/node_modules/unzip/node_modules/readable-stream/lib/_stream_readable.js:527:7)
at process.startup.processNextTick.process._tickCallback (node.js:244:9)
If any file had exec permission before compression, it will lose that permission on extraction.
Environment:
OS X 10.8.2
Node 0.8.12
NPM 1.1.63
Steps:
git clone https://github.com/nearinfinity/node-unzip.git
cd node-unzip
npm install
touch blah
chmod +x blah
ls -la blah
echo "blah has exec permission."
zip blah.zip blah
rm blah
node -e "require('fs').createReadStream('./blah.zip').pipe(require('./unzip').Extract({ path: '.' }));"
ls -la blah
echo "Now, blah was supposed to have exec permission but it doesn't."
(node) warning: Recursive process.nextTick detected. This will break in the next version of node. Please use setImmediate for recursive deferral.
The message kept pumping out during extraction, then it finally threw 'Maximum call stack size exceeded' and the program died.
My unzip is version 0.1.9 with node v0.10.23, and my zip file is only 3.53 MB.
For some files I get a slew of messages that look like this:
(node) warning: Recursive process.nextTick detected. This will break in the next
version of node. Please use setImmediate for recursive deferral.
Followed by this:
RangeError: Maximum call stack size exceeded
My node version is:
v0.10.5
My unzip version is:
0.1.8
My code looks like this:
var parse_archive = function (path, callback) {
  var unzipParser = unzip.Parse();
  var entries = [];
  unzipParser.on('close', function () {
    console.log("Closing");
    callback(null, entries);
  });
  unzipParser.on('error', function (err, dirPath) {
    console.log("Error unzipping " + dirPath);
    callback(err);
  });
  console.log("parse: " + path);
  fs.createReadStream(path)
    .pipe(unzipParser)
    .on('entry', function (entry) {
      if (entry.type === "File") {
        entries.push(entry.path);
        entry.autodrain();
      }
    });
};
And because of that, it takes forever to install.
The offending file is package/examples/eclipse-jee-juno-SR1-macosx-cocoa-x86_64.tar.gz.zip in the archive (https://registry.npmjs.org/unzip/-/unzip-0.1.3.tgz).
Getting an error after updating to Node 0.10.0:
TypeError: undefined is not a function
at Until.PullStream._flush (/Users/n23618/PROJECTS/buddy-dependencies/node_modules/unzip/node_modules/pullstream/pullstream.js:112:5)
at Until.<anonymous> (_stream_transform.js:131:12)
at Until.g (events.js:175:14)
at Until.EventEmitter.emit (events.js:117:20)
at finishMaybe (_stream_writable.js:332:12)
at endWritable (_stream_writable.js:339:3)
at Until.Writable.end (_stream_writable.js:326:5)
at /Users/n23618/PROJECTS/buddy-dependencies/node_modules/unzip/lib/parse.js:277:36
at process._tickCallback (node.js:415:13)
Code:
request.get(this.url)
  .pipe(fs.createWriteStream(filename))
  // Error downloading
  .on('error', function (err) {
    fn('fetching ' + self.url + ' failed: ' + err);
  })
  .on('close', function () {
    extractor = unzip.Extract({ path: self.temp });
    writer = fstream.Writer(self.temp);
    writer.on('close', fn);
    // Unzip
    fs.createReadStream(filename)
      .pipe(unzip.Parse())
      // Store path to unzipped package
      .on('entry', function (entry) {
        if (!self.location && entry.type === 'Directory') {
          self.location = path.resolve(self.temp, entry.path);
        }
      })
      // Error unzipping
      .on('error', function () {
        fn('unzipping archive: ' + filename);
      })
      .pipe(writer);
  });
This has an error:
module.exports = function zipHandler(request, response, next) {
  request.pipe(unzip.Parse());
};
error is:
Error: invalid signature: 0x8074b50
at node_modules/unzip/lib/parse.js:63:13
at process._tickCallback (node.js:415:13)
This works; the file is written to http.zip and I can unzip it in my terminal.
module.exports = function zipHandler(request, response, next) {
  request.pipe(require('fs').createWriteStream('./http.zip'));
};
It looks like the data coming over the wire is correct, but piping the request to unzip.Parse() doesn't work. What should I be doing?
When I try to unzip a file on Heroku, I get a Maximum call stack size exceeded error if the zip file contains a file over ~110 KB. If all files are smaller than that, it doesn't seem to matter how big the zip file itself is; it just chokes on files inside the zip that are larger than ~110 KB.
I am using pipe:
fs.createReadStream('test.zip').pipe(unzip.Extract({ path: 'test/' }))
Here is the Heroku crash log:
domain_thrown: true,
POST /config { [RangeError: Maximum call stack size exceeded]
domain:
{ domain: null,
_events: { error: [Function] },
_maxListeners: 10,
members: [ [Object], [Object] ] } }
Move the emit('entry') from just below unzip.js:111, where hasEntryListener is initialized, to line 166, below self._processDataDescriptor.
That is when the size data is available.
I continue to be floored by the response to this project. While the original implementation was written on a lark, I carried on as a means to better understand streams, arguably the killer feature of NodeJS. Since that time, I've changed companies, ceased personally using this library, and experienced a number of major life events, particularly becoming a father.
I also learned that the zip format is problematic and cannot be perfectly streamed -- see this Apache doc. That said, if there's continued interest in this project despite the major hangups, and despite quality non-streaming NodeJS modules such as decompress, I'll work to take it to a stable 1.0.0, merge outstanding pull requests, and, most importantly, seek collaborators who use this library in production.
Thank you and let me know your thoughts below.
Unzip stops prematurely in Node 0.10.0 without an error when inflating compressed files that are larger than the zlib stream's highWaterMark (16 kb). I believe the issue is the line self._untilStream.pipe(vars.compressedSize, inflater).pipe(entry) - the inflater stream never emits 'end' or 'finish'.
Code:
if (fileSizeKnown) {
  entry.size = vars.uncompressedSize;
  self._untilStream.pipe(vars.compressedSize, inflater).pipe(entry);
}
The error surfaced while I was investigating #19.
Hello,
I'm packaging atom [1] for Debian [2]. Your software is included there.
I found some issues in your software:
Regards,
Jörg Frings-Fürst
[1] http://atom.io
[2] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=747824
[3]
// Copyright 20xx-20xx your name
// License: [your license eg. MIT / GPL-2 ....]
Create an empty file:
$ touch a.zip
Try and unzip using this module:
var fs = require('fs')
, unzip = require('unzip')
, os = require('os')
, zipFile = fs.createReadStream(__dirname + '/a.zip')
, unzipper = unzip.Extract({ path: os.tmpdir() + '/test', verbose: true })
zipFile.on('error', function (err) {
console.error('Zip file error:', err)
})
unzipper.on('error', function (err) {
console.error('Unzipper error:', err)
})
unzipper.on('close', function () {
console.log('Unzipper closed')
})
zipFile.pipe(unzipper)
It hangs forever, outputting only the following verbose log:
/t/node-unzip-test ❯❯❯ node test.js ⏎
Archive: /private/tmp/node-unzip-test/a.zip
I don't know too much about zip files in general. Is there any way to detect corruption and emit an error somewhere along the line when things aren't as they should be?
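Until the library reliably emits 'error' in cases like this, one stopgap is a watchdog timer around the extractor: if neither 'close' nor 'error' arrives within some window, assume the archive is corrupt or truncated. A sketch under that assumption (withTimeout is a hypothetical helper, and the timeout value would need tuning per workload):

```javascript
// Wrap any stream-like emitter: if it neither closes nor errors within
// ms milliseconds, invoke onTimeout with a synthetic error.
function withTimeout(extractor, ms, onTimeout) {
  var timer = setTimeout(function () {
    onTimeout(new Error('extraction made no progress for ' + ms + 'ms'));
  }, ms);
  function clear() { clearTimeout(timer); }
  extractor.on('close', clear);
  extractor.on('error', clear);
  return extractor;
}
```

For the empty-zip case above, zipFile.pipe(withTimeout(unzipper, 5000, handleStall)) would at least surface the hang instead of waiting forever.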
I created a test scenario with a full-disk condition, and I couldn't catch the error in my application using node-unzip.
Is this an issue, or did I forget something?
Code:
var fs = require('fs');
var unzip = require('unzip');
try {
  var zip = unzip.Extract({ 'path': '/media/ramdrive', verbose: true });
  zip.on('error', function (error) {
    console.log('zip.on: ', error);
  });
  var file = fs.createReadStream('file.zip').pipe(zip);
  file.on('error', function (error) {
    console.log('file.on: ', error);
  });
}
catch (err) {
  console.log('error: ', err);
}
Output:
events.js:72
throw er; // Unhandled 'error' event
^
Error: ENOSPC, write
I have a ZIP file containing a single 600 MB file. Extracting it by piping to unzip.Extract results in a 0 byte file.
I get a stack overflow in the following example:
var unzip = require('unzip'),
fs = require('fs');
unzip.Parse().end(fs.readFileSync('test.zip'));
Error:
node_modules/unzip/node_modules/over/overload.js:18
var args = Array.prototype.slice.call(arguments);
^
RangeError: Maximum call stack size exceeded
Looks like the problem is in the inflateServiceRequest() function (after commit 64199f4).
The zip is here: https://docs.google.com/file/d/0BwYD4dygLavbY3VBblM5QTBZVUE/edit?usp=sharing
When I create an archive using OS X's Finder and then try to process the archive using unzip, it hangs on the first "entry" event. Here's an example zip file that hangs.
https://dl.dropbox.com/u/47777/Archive.zip
As a point of reference, I've been able to get this error when slowly streaming a zip file to the parser:
/node_modules/unzip/node_modules/readable-stream/transform.js:183
readcb(null, chunk);
^
RangeError: Maximum call stack size exceeded
I try to strip off CRX headers before piping the data to unzip.
Although it's a valid zip file after transforming, node-unzip throws an invalid signature error.
I've extracted a simple test which can be found here:
https://github.com/berstend/unzip-bug
My unzip routine has stopped working this week because the 'end' event isn't firing.
A code sample that doesn't work is:
function unzip_archive(archive, callback) {
  console.log("unzip_archive", archive);
  fs.createReadStream(archive)
    .pipe(unzip.Parse())
    .on('entry', function (entry) {
      console.log("unzip", entry.path);
      entry.pipe(fs.createWriteStream(path.join('temp', entry.path)));
    })
    .on('end', function (err) {
      console.log("unzip");
      callback(err);
    })
    .on('error', function (err) {
      console.log("unzip error", err);
    })
    .on('finish', function (err) {
      console.log("unzip finish", err);
    });
}
I'm running node v0.8.19 and my npm list:
┬ [email protected]
├─┬ [email protected]
│ └─┬ [email protected]
│ └── [email protected]
├── [email protected]
├─┬ [email protected]
│ ├── [email protected]
│ ├── [email protected]
│ ├── [email protected]
│ └── [email protected]
├── [email protected]
├─┬ [email protected]
│ └── [email protected]
├── [email protected]
└── [email protected]
I tried node v0.10, but several libraries are still updating (including unzip), so I've switched back to v0.8; I'm still not seeing the end event.