koajs / compress
Compress middleware for koa
License: MIT License
Hi, could you re-release v2 of this module and make it available on npm? Many thanks!
After upgrading to Node v8 I got this error in the console:
Error: stream.push() after EOF
    at readableAddChunk (_stream_readable.js:227:30)
    at Gzip.Readable.push (_stream_readable.js:195:10)
    at Gzip.Transform.push (_stream_transform.js:151:32)
    at Zlib.callback (zlib.js:430:16)
I see that my app runs with a much higher CPU load while producing only 20% of the previous throughput.
3.1.0 with no options: fast (119 MB/s)
4.0.0 / 4.0.1 with no options: slow (20 MB/s)
4.0.x with { br: false }: slow
4.0.x with { br: false, gzip: false, deflate: false }: fast again
Same machine, same Node.js version (12.16.3).
3.1.0 produces compressed output (I checked).
Trying to trace it further.
for each test, we'd need to tear down via server = app.listen(); server.close()
TypeError: Cannot read property 'write' of null
at Zlib.callback (zlib.js:455:34)
Maybe you have some ideas on this, but I'd definitely like it if we didn't have to shadow the default stuff in Koa for these kinds of middleware; it'll definitely become error-prone (and just not fun to work with). Thankfully I don't think too many middleware will sit "under" Koa like this one does. It was a decent PITA with connect as well, but maybe we can work on improving that.
Branch | Build failing 🚨 |
---|---|
Dependency | koa |
Current Version | 2.3.0 |
Type | devDependency |
This version is covered by your current version range and after updating it in your project the build failed.
koa is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.
The new version differs by 27 commits.
418bb06
2.4.0 - added missing 2.3.0 changelog
c68a696
2.3.0
687b732
travis: test node@9
53a4446
expose the Application::handleRequest method (#950)
85ff544
deps: update min engines (#1040)
6029064
HTTP/2 has no status message (#1048) (#1049)
18e4faf
Update fresh to ^0.5.2 to close vulnerability (#1086)
e8a024c
docs: add Chinese docs link for v2.x (#1092)
1e81ea3
docs: update babel setup (#1077)
43a1df8
test: Remove --forceExit flag for Jest (#1071)
0168fd8
docs: Update middleware.gif (#1052)
e1e030c
docs: command is wrong for running tests (#1065)
77ca429
test: replace request(app.listen()) with request(app.callback())
7f577af
meta: update AUTHORS (#1067)
f3ede44
docs: fix dead link to logo image (#1069)
There are 27 commits in total.
See the full diff
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
this is my middleware code
app.use(compress({
  filter: function (content_type) { return /text/i.test(content_type) },
  threshold: 1,
  flush: require('zlib').Z_SYNC_FLUSH
}))
And this is my response code
ctx.body = 'Hello world'
ctx.compress = true
ctx.set('Content-Type', 'text/plain')
ctx.set('content-encoding', 'gzip')
Now when I hit the URL localhost:3000/test, I get the error
ERR_CONTENT_DECODING_FAILED
But when I hit the same URL using curl, I get the text
Hello World
I guess it's apparent that the data is not getting compressed; otherwise curl wouldn't have shown plain text. And I think the Chrome error is due to Content-Encoding being gzip while the actual data is plain text. Maybe I'm wrong, but there is surely some problem in my code; otherwise Chrome would have shown Hello World without an error, wouldn't it?
Bummer man. How is the client supposed to keep track of load progress?
people are going to complain if we always set vary
The format for the brotli options is wrong, so the brotli quality level still remains at 11.
Since brotli is the default, it took me forever to find the reason why requests took so long. Since the actual compression is native, koa-compress doesn't show up in the profiler or when timing the middleware.
There is already an open PR #165
Something like this behavior here for express's compression module: expressjs/compression#53
In __tests__/index.js there is a section:
it('should support Z_SYNC_FLUSH', (done) => {
  const app = new Koa()
  app.use(compress({
    flush: zlib.constants.Z_SYNC_FLUSH
  }))
  // and so on
})
Is flush really a top-level option? Or should it be:
app.use(compress({
  gzip: { // or deflate?
    flush: zlib.constants.Z_SYNC_FLUSH
  }
}))
The test succeeds either way: with the existing code, with gzip added, and even when the flush line is commented out.
I don't know what effects should be tested, but it looks like they are not observed. I think this should be clarified and the test updated.
Would be great to have, I think! Nice touch with the manual override :D
Right now, I have to choose between compressing my responses and logging them.
I tried to go down the road of unzipping/inflating them afterwards, but decided that maybe the library could help out a bit.
I propose adding a new prop onto the response object called pre_compressed_body or something similar, so anyone who wants to log the response can.
It would be nice if this added a response.originalLength property to make it easy to send trace information about the "real" response length. This would mimic the existing request.originalUrl that Koa maintains.
Branch | Build failing 🚨 |
---|---|
Dependency | supertest |
Current Version | 3.2.0 |
Type | devDependency |
This version is covered by your current version range and after updating it in your project the build failed.
supertest is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.
#509 - Fix #486, bug in _assertBody, switch to deepStrictEqual
(thanks @mikelax)
#510 - Refactor test files to use const/let (thanks @rimiti)
The new version differs by 10 commits.
e910e85
chore: Prepare for v3.3.0 release.
bd864de
Merge pull request #511 from visionmedia/bugfix-486-equal
101fbf5
Merge branch 'master' into bugfix-486-equal
04230bb
Merge pull request #510 from visionmedia/refact-const-let
510a7ae
bugfix: 486 Change method to use deepStrictEqual. (#509)
913150d
chore(.editorconfig) [*.md] block removed
82e0828
refact(test/supertest.js) vars replaced by const and let
5443136
chore(.editorconfig) configuration file created
7233ba6
chore(.eslintrc) parserOptions option added to use es6
322ebf6
bugfix: 486 Change method to use deepStrictEqual.
See the full diff
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
By sending Accept-Encoding: gzip, deflate, the response should be encoded with either gzip or deflate.
If any previous requests included other encodings, the response may be encoded with an unexpected algorithm, because encodingWeights is shared by multiple requests:

Request 1: Accept-Encoding: gzip, deflate, br -> Content-Encoding: br
Request 2: Accept-Encoding: gzip, deflate -> Content-Encoding: br (unexpected)
const Koa = require('koa');
const compress = require('koa-compress');
const axios = require('axios');

const app = new Koa();
app.use(compress());
app.use((ctx) => {
  ctx.body = {...}; // long json
});
app.listen(3000);

async function test() {
  const res1 = await axios.post('http://localhost:3000', {}, {
    headers: { 'Accept-Encoding': 'gzip, deflate, br' },
  });
  console.log(res1.headers['content-encoding']); // br

  const res2 = await axios.post('http://localhost:3000', {}, {
    headers: { 'Accept-Encoding': 'gzip, deflate' },
  });
  console.log(res2.headers['content-encoding']); // br
}

setTimeout(test, 1000);
If the encoding doesn't use one of the expected algorithms, it will break the clients
Here's an interesting link: https://hacks.mozilla.org/2015/11/better-than-gzip-compression-with-brotli/
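A per-request sketch of the expected negotiation (the function and preference list are illustrative, not koa-compress internals): each request's own Accept-Encoding header alone should determine the result, so no state can leak between requests.

```javascript
// Illustrative sketch: pick an encoding from this request's header only.
function chooseEncoding(acceptEncoding, preferred = ['br', 'gzip', 'deflate']) {
  const offered = acceptEncoding
    .split(',')
    .map(part => part.trim().split(';')[0].toLowerCase())
  // first supported encoding in our order of preference
  return preferred.find(enc => offered.includes(enc)) || 'identity'
}

console.log(chooseEncoding('gzip, deflate, br')) // 'br'
console.log(chooseEncoding('gzip, deflate'))     // 'gzip', never 'br'
```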
Hello, while working on a REST API I noticed that JSON bodies are always compressed, with no threshold check at all.
I found it very confusing when I used curl to hit my endpoint and got a compressed response dumped to my console. It seems that if curl doesn't send the Accept-Encoding header, it does not decompress a compressed response it receives.
Nevertheless, I think a client that does not send Accept-Encoding should not be assumed to support gzip or any other compression scheme; by default it should not get any compressed responses.
This error has been showing up on my production web server for ABA Collection. It appears to be due to the way compress interacts with HTTP/2, or at least when using http2.
Based on all of my debugging, it seems it is returning after the response has already been sent.
I'll work on a PR and at least get a test up and running that reproduces the error.
When sending Accept-Encoding: gzip or Accept-Encoding: identity, if the response size is more than the threshold, the response gets encoded with Brotli anyway.
I used the default settings: app.use(compress());
I went back to 3.1.0, which works perfectly.
Are both headers and body compressed, or just the body?
Also, what happens if I have already compressed the body myself (using brotli, for example)? Will this cause an issue? (Re-compressing is not a good idea.) What happens in this case?
Although the docs say
You can always enable compression by setting this.compress = true
Is this a mistake in the docs or in the code? :)
I've recently upgraded from 3.1.0 to 5.0.1 and, without any other change, I noticed a huge slowdown in my application.
Requests that previously took about 50ms suddenly took over 1s to complete. Downgrading to 3.1.0 solved the issue.
I wonder if this has something to do with the new Brotli support? But since Node 10 doesn't actually support that natively, maybe it should be disabled there?
The default for options.defaultEncoding is documented to be idenity. Is this a typo? The word "idenity" doesn't seem to be the name of anything I can find. If it's meant to be "identity", what does that even mean in this context?
Edit: I've come across other people referring to "no encoding" as identity encoding. I guess this jargon is like the identity function in FP.
The option:
threshold: Minimum response size in bytes to compress. Default 1024 bytes or 1kb.
can't be set to 0 (compress always), because the code reads var threshold = !options.threshold ? 1024 : .... When it is 0, the check fails. It should be var threshold = options.threshold === undefined ? 1024 : ... (or even != null).
Would you accept a PR for this, or do you think the minimum threshold should be 1, since there is no point in compressing a 0-byte response? (But not being able to use 0 to indicate "always" is kind of counter-intuitive.)
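A minimal sketch of the proposed fix (resolveThreshold is an illustrative name, not the module's code): fall back to 1024 only when the option is actually absent, so 0 keeps meaning "always compress".

```javascript
// Only substitute the default when threshold is missing;
// `== null` catches both undefined and null but lets 0 through.
function resolveThreshold(options = {}) {
  return options.threshold == null ? 1024 : options.threshold
}

console.log(resolveThreshold({}))               // 1024
console.log(resolveThreshold({ threshold: 0 })) // 0 (compress always)
```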
I am using the koa-compress module to compress responses.
I have tried to use it in the following ways.
import compress from "koa-compress";
import { constants } from "zlib";
Method 1).
app.use(compress());
Method 2).
app.use(compress({
  filter: function (content_type) { return (/json/i.test(content_type) || /text/i.test(content_type)) },
  threshold: 1024,
  gzip: {
    flush: constants.Z_NO_FLUSH,
    level: constants.Z_BEST_COMPRESSION
  }
}));
The response JSON object I receive is not compressed.
I have tried setting the "Accept-Encoding" header both ways, application/gzip and gzip, but with no success.
These are the versions I am currently using:
"@types/koa-compress": "^4.0.0",
"koa-compress": "^4.0.1",
We are seeing issues with koa-compress v2.1.0. The issue is that the Node v8 upgrade was mixed into a minor release of the module, so there are references to async/await keywords in koa-compress@2.1.0.
See the code below from index.js of the published version.
return async (ctx, next) => {
  ctx.vary('Accept-Encoding')
  await next()
NOTE: we do not have to support flush(). However, we could attach the compression stream to ctx in case someone wanted access to it.
Can this be used without koa, with just an http server? If so, how?
I know it's a little bit outside the Koa architecture. But say we want to handle the response ourselves: koa-compress will always log an "ERR_HTTP_HEADERS_SENT".
Sure, I can delete the body after it's sent or do some other weird hack, but it seems like koa-compress should check ctx.headerSent or ctx.respond.
A use case for this is a Next.js custom server; I'm making a middleware for Next.js to use in Strapi. Here's the middleware code:
const router = new Router();
const handle = nextApp.getRequestHandler();

router.get('*', async (ctx, next) => {
  await next();
  if (ctx.response.status === 404) {
    ctx.respond = false;
    await handle(ctx.req, ctx.res);
  }
});

strapi.app.use(router.routes());
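A sketch of the guard being suggested (illustrative, not the module's code): bail out before touching the body when Koa's response handling has been bypassed or headers have already gone out.

```javascript
// Illustrative guard: compression should be a no-op in these cases.
function shouldSkipCompression(ctx) {
  return ctx.respond === false || ctx.headerSent
}

console.log(shouldSkipCompression({ respond: false, headerSent: false })) // true
console.log(shouldSkipCompression({ respond: true, headerSent: true }))   // true
console.log(shouldSkipCompression({ respond: true, headerSent: false }))  // false
```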
Branch | Build failing 🚨 |
---|---|
Dependency | eslint-plugin-flowtype |
Current Version | 2.46.3 |
Type | devDependency |
This version is covered by your current version range and after updating it in your project the build failed.
eslint-plugin-flowtype is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.
The new version differs by 10 commits.
45e86d8
Merge branch 'pnevyk-feat/array-style'
1d664d7
docs: correct documentation
a916617
Merge branch 'master' into feat/array-style
8f86c4b
docs: add eslint-config-flowtype-essential (#328)
6320bee
feat: Refactor array-style-... rules
b95dd31
feat: Improve error messages
687f82b
feat: Change default array notation for simple types to "verbose"
4a6f03d
feat: Implement fixation in array style rules
afd4210
feat: Implement array style rules
1232069
docs: Add documentation for array style rules
See the full diff
There is a collection of frequently asked questions. If those don't help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
Hi,
v2.1.0 was published two days ago. However, it started using async/await instead of Promises. I think this should have been a major version bump, as it's a breaking change.
Hi, occasionally I'm getting the following error:
Error: stream.push() after EOF
Sometimes I'm also seeing:
TypeError: Cannot read property 'write' of null
File "zlib.js", line 469, col 32, in Zlib.callback
The only package we use that references zlib is koa-compress, with the following configuration:
app.use(compress({
  flush: require('zlib').Z_SYNC_FLUSH
}))
We're using Node 8. Any idea why this happens and how to fix it?
This middleware looks useful, but I'm not sure how to use it. Perhaps more examples with explanations would help?
I'd like to use brotli if the client accepts it, and fall back to gzip if not. In the readme, your example disables brotli and later states,
Brotli compression is supported in node v11.7.0+, which includes it natively.
Is brotli disabled because Node includes it natively, or is that just the example?
The current encodings are, in order of preference: br, gzip, deflate
Does the order of the encodings in the options object matter, or is the "preference" always the same order? Does one encoding fall back to another if not supported?
I'm unable to help contribute these changes because I don't understand how it works, but I think answering these questions in the readme would help others understand. Or maybe it's just me.
@uhop curious about your opinion on this: I had to do the following to make compression work fast for https://forwardemail.net:
https://github.com/ladjs/web/blob/6c2c09e5f78f403194883d169a4823dec73da743/index.js#L152-L159
we can just do res = await request(server) now