Comments (14)
Can you post code I can use to reproduce? I tried it out and it works just fine with Express 4.12.3
from compression.
Most likely your response is not greater than 1kb (please read the docs: responses smaller than 1kb are not compressed by default, since compressing them is a waste and can actually make them larger. See https://github.com/expressjs/compression#threshold for how to alter this). When you use `res.send()`, the `Content-Length` header is sent; when you use `res.write()` + `res.end()` and don't write out a `Content-Length` header yourself, this module does not know whether your response is going to be over 1kb and will simply always compress.
Yeah, that makes sense. But it does mean there's a bit of a difference between the two usages. It would be great if that was documented.
Please feel free to submit a PR with the documentation you desire :)
But TL;DR if you didn't already know there was a 1kb limit from the documentation, I'm not sure how you would have read the documentation regarding this behavior before asking, either... I'm always happy to answer questions, but the longer the documentation becomes, the less likely people are to read it.
Is it not possible to always calculate the `Content-Length` even if I'm using `res.write(data); res.end()`?

Right now I have observed that `res.send(data)` has an automatic content length, and `res.end(data)` has an automatic content length. The only one that doesn't is `res.write(data); res.end()`. Isn't there some way of accumulating the write chunks and incrementally counting them?

I'm currently writing a middleware that faces this same non-determinism depending on how the user decides to send out content (`res.send` vs `res.end` vs `res.write` + `res.end`).
There isn't, unless we buffer up the data in memory. We have to know whether the response is going to be compressed before we can write even the first chunk to the client. `res.write` is the "I know what I'm doing and want chunks" API. This is just how Node.js works, and there is nothing that can change here. You can always create a fork of this module that buffers chunked responses in memory, though it'll use a lot of memory and mean people can no longer stream large files from disk.

Just think about how you're supposed to send a 4GB file: you call `res.write` for each chunk so that you don't have to read 4GB of data into memory all at once and crash your process. If this module were to buffer everything to calculate the size, then people's servers would crash. There is an absolutely standard way to indicate the size to this module: simply set the `Content-Length` response header...
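Setting the header yourself before streaming is straightforward. The helper below is just an illustration (`writeWithLength` is a made-up name, not an Express or compression API); it works with any object exposing the usual `setHeader`/`write`/`end` response methods:

```javascript
// Illustration: announce the body size before streaming it, so middleware
// like compression can apply its threshold check up front.
function writeWithLength(res, body) {
  // Byte length, not string length: multibyte UTF-8 characters differ.
  res.setHeader('Content-Length', Buffer.byteLength(body));
  res.write(body);
  res.end();
}

// Tiny stand-in for a response object, just to show the calls made:
const headers = {};
const res = {
  setHeader: (name, value) => { headers[name] = value; },
  write: () => {},
  end: () => {},
};
writeWithLength(res, 'naïve');          // 5 characters, 6 bytes in UTF-8
console.log(headers['Content-Length']); // 6
```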
This is, in fact, literally the reason Express offers the `res.send` method: Node.js gives you very low-level primitives, and you need to do a lot of work yourself to get things right. If all you want to do is write out a string, `res.send` from Express will do all the "boring" work for you, like setting the status code and the `ETag`, `Content-Length`, and `Content-Type` headers, and more. If you're not using that method, you pretty much need to implement all of that yourself. You can always file a bug report with Node.js itself asking for automatic `Content-Length` calculation, and then it would be there for this module to read, but it's not something that will ever be added to this module.
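As a rough sketch of that "boring" work (this is not the Express source, just the shape of the bookkeeping described above; `sendSketch` is a made-up name):

```javascript
// Approximation of the housekeeping res.send() performs for a string body
// before handing off to res.end(). Real Express also computes an ETag,
// handles HEAD requests and 304s, negotiates charsets, and more.
function sendSketch(res, body) {
  if (!res.getHeader('Content-Type')) {
    res.setHeader('Content-Type', 'text/html; charset=utf-8');
  }
  res.setHeader('Content-Length', Buffer.byteLength(body));
  res.end(body);
}

// Minimal stand-in response object, to show which headers get set:
const headers = {};
const res = {
  getHeader: (name) => headers[name],
  setHeader: (name, value) => { headers[name] = value; },
  end: () => {},
};
sendSketch(res, '<p>hi</p>');
console.log(headers); // Content-Type and Content-Length are now both set
```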
Just a question regarding the `Content-Length`: is it the length of the gzipped data OR is it the length of the data before gzipping?
You set it to the length of the content you are writing to `res`; this module will manipulate the header for you. That's what is known as "separation of concerns" in computer science :) When you add the `Content-Length` header, you are only concerned with what you know: the length of the content you are writing. Something else (like this module) that transforms the response will handle updating any appropriate headers.
Yes, but I want to know whether my `Content-Length` is going to be changed by your middleware. Can you give me a direct answer? I specifically want to know about the `Content-Length` that will be in the HTTP response header.
We remove the `Content-Length` header and send your response with `Transfer-Encoding: chunked` instead.
Ok thank you.
> Can you give me a direct answer?

I literally directly answered your question, and you threw in a second question. You asked:

> Is it the length of the gzipped data OR is it the length of the data before gzipping?

I answered:

> When you add the `Content-Length` header, you are only concerned with what you know: the length of the content you are writing.

This thread is locked, I'm sorry.