
compression's Issues

threshold doesn't behave as expected if res.writeHead is used

var compression = require('compression')
var express = require('express')

var app = express()

// compress all requests
app.use(compression())

app.get('/', function (req, res) {
        res.writeHead(200, {"Content-Type" : "text/plain"});
        res.end('Hello World!');
})

app.listen(8000);

Curl call: curl -H "Accept-Encoding: gzip" -vvv http://hostname:8000/

I found there is a slight error in res.end: it doesn't check the threshold when _header is already set, which is what causes this error (my guess).

If I do something like this in res.end, it fixes the error:

      if (!this._header) {
        len = Number(this.getHeader('Content-Length')) || len
        checkthreshold(len)
        this._implicitHeader()
      } else {
        // check the threshold regardless
        checkthreshold(len)
      }
      if (!stream || !compress) { // also bail out when not compressing
        return end.call(res, chunk, encoding)
      }

But I am not sure if it has consequences: in res.write, the threshold is only checked when there is a Content-Length header.

So I guess the proper fix might be a little more complicated than this.
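
In the meantime, a minimal workaround sketch (assuming the threshold check only runs on the implicit-header path) is to set the status and headers individually instead of calling res.writeHead, so the middleware can apply its threshold before the headers go out:

var compression = require('compression')
var express = require('express')

var app = express()
app.use(compression())

app.get('/', function (req, res) {
  // set status and headers without writeHead
  res.statusCode = 200
  res.setHeader('Content-Type', 'text/plain')
  res.end('Hello World!') // 12 bytes, below the default 1kb threshold
})

app.listen(8000)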

Hello, need help with the compression module

Hi,

I just upgraded expressjs from 3.3.4 to 4.6.1.

I installed the compression 1.0.9 module, but found something weird.

It seems gzip compression only works with res.write().

Here is my test code:

var express = require('express')
var compression = require('compression');

var app = express();
app.use(compression());

app.get('/', function (req, res, next) {
    res.setHeader('Content-Type', 'application/json');
    var msg = JSON.stringify({'text':'hello'});
    res.write(msg);
    res.end();
});

app.listen(3000);

And the response headers are:

Connection: keep-alive
Content-Encoding: gzip
Content-Type: application/json
Date: Thu, 14 Aug 2014 09:27:02 GMT
Transfer-Encoding: chunked
Vary: Accept-Encoding
X-Powered-By: Express

It looks good. But then I changed res.write(msg); res.end(); to res.send(msg).

Then the response headers change to:

Connection: keep-alive
Content-Length: 16
Content-Type: application/json; charset=utf-8
Date: Thu, 14 Aug 2014 09:30:58 GMT
Etag: W/"10-2312940092"
Vary: Accept-Encoding
X-Powered-By: Express

Does this mean the response is not gzip compressed?

res.render(...) is not working either. I didn't have this issue with expressjs 3.3.4.

Am I using the compression module improperly? Any help will be appreciated. Thanks very much.
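
One likely explanation (worth verifying): res.send sets a Content-Length (16 bytes here), which is below the middleware's default threshold of 1kb, so compression is skipped by design rather than broken. A quick way to test that assumption is to lower the threshold:

var express = require('express')
var compression = require('compression')

var app = express()

// threshold: 0 compresses every response, regardless of size
app.use(compression({ threshold: 0 }))

app.get('/', function (req, res) {
  res.setHeader('Content-Type', 'application/json')
  res.send(JSON.stringify({ text: 'hello' }))
})

app.listen(3000)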

Does not work with Server-Sent Events

So I'm working on a small application which uses Server-Sent Events. Everything works perfectly in my local environment, but the server refuses to push out any messages when the app runs on Heroku.

I spent quite a few hours trying to debug this. Then I found out it's because I'm using the compression middleware in the production environment; everything works again as soon as I remove it.

I'm using a basic SSE implementation, not any third-party library for it. You should be able to reproduce this issue by adding this middleware to any expressjs + SSE code example (like this one: https://github.com/TravelingTechGuy/express-eventsource).

I'm fairly new to expressjs. It'd be great if this could be solved (or if anyone can provide a workaround).

Thanks!
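
A common workaround (a sketch; the underlying cause is that zlib buffers output until it has enough data): call the res.flush() helper that this middleware adds after writing each event, so the buffered compressed data is pushed to the client immediately:

var express = require('express')
var compression = require('compression')

var app = express()
app.use(compression())

app.get('/events', function (req, res) {
  res.setHeader('Content-Type', 'text/event-stream')
  res.setHeader('Cache-Control', 'no-cache')

  var timer = setInterval(function () {
    res.write('data: ' + Date.now() + '\n\n')
    res.flush() // added by compression; forces the gzip stream to emit its buffer
  }, 1000)

  req.on('close', function () {
    clearInterval(timer)
  })
})

app.listen(process.env.PORT || 3000)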

option to disable DEFLATE encoding

I ran into a problem serving compressed responses to Internet Explorer because of the zlib wrapper included with DEFLATE'd responses, which, while correct according to the RFC, is not supported by IE: https://connect.microsoft.com/IE/feedback/details/1007412/interop-wininet-does-not-support-content-encoding-deflate-properly. In lieu of doing some user-agent detection or always removing the wrapper, it would be nice to have an option to disable DEFLATE usage in favor of gzip. I'm completely willing to submit a PR, but wanted to open an issue first for discussion.

regression introduced in 1.0.4?

Some requests for static files seem to hang when using 1.0.4; 1.0.3 works fine.

Here's a piece of code to reproduce (at least with Chrome 35.0.1916.114 and Safari 7.0.4 (9537.76.4)):

var express = require('express');
var compress = require('compression');
var app = express();
var server = app.listen(1234, "127.0.0.1", function() {
  console.log('Server listening on 127.0.0.1:1234', __dirname);
});

app.use(compress());
app.use("/", express.static(__dirname, { maxAge: 3 * 24 * 60 * 60 }));

To start, copy the above to index.js, then run:

npm install express
npm install compression

Tested with express 4.2.0 and compression 1.0.2.

Then go to http://localhost:1234/bootstrap.min.css; the request will hang (pending state in the Chrome network tab).

Do not compress for Cache-Control: no-transform by default

There have been a few questions raised about how a software project can opt out of compression transparently, without requiring a user to configure this middleware to avoid the other project's output or having the other project be aware of compression. The current idea is that compression can, when determining whether to compress, check for a Cache-Control response header and, if it has no-transform declared, skip compression.
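
Until that lands, the proposed behavior can be approximated with the existing filter option (a sketch, assuming Cache-Control is set before the response goes out):

var express = require('express')
var compression = require('compression')

var app = express()

app.use(compression({
  filter: function (req, res) {
    var cacheControl = res.getHeader('Cache-Control')
    // skip compression when the response declares no-transform
    if (cacheControl && /(?:^|,)\s*no-transform\s*(?:,|$)/.test(cacheControl)) {
      return false
    }
    return compression.filter(req, res) // fall back to the default filter
  }
}))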

Request with "HEAD" method will error

Error log:

Error: write after end
  at ServerResponse.OutgoingMessage.write (_http_outgoing.js:409:15)
  at ServerResponse.res.write (/home/xxx/app/node_modules/compression/index.js:95:17)
  at ServerResponse.res.end (/home/xxx/app/node_modules/compression/index.js:112:14)
  at ServerResponse.send (/home/xxx/app/node_modules/express/lib/response.js:191:8)
  at fn (/home/dan/app/node_modules/express/lib/response.js:900:10)
  at Immediate._onImmediate (/home/xxx/app/node_modules/express-handlebars/lib/utils.js:34:13)

Allow users to pass in their own compression-stream function

It looks like FF now accepts brotli compression. I was going to take a stab at implementing a native addon that links libbrotli. Does that seem worthwhile?

We could use this: https://www.npmjs.com/package/brotli, which uses an emscriptenized version of brotli.

Does anyone know of any other brotli modules that could be used? Otherwise, I don't mind starting the C++ addon.

I've been taking a look at how the gzip stream is used here, so I think I can make a nice library that implements the same interface.
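
For reference, the interface in question is essentially a zlib Transform stream with a flush method. As an aside (and assuming a newer runtime than existed when this issue was filed), Node.js 11.7+ ships brotli in core zlib with exactly that shape:

var zlib = require('zlib')

// core brotli stream (Node.js 11.7+); the same Transform-with-flush
// interface that the gzip/deflate streams expose
var brotli = zlib.createBrotliCompress({
  params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 5 }
})

process.stdin.pipe(brotli).pipe(process.stdout)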

net::ERR_EMPTY_RESPONSE with browsers

Hi, I get a net::ERR_EMPTY_RESPONSE as soon as responses are above the threshold. Here is an example:

var app = require('express')();
var compression = require('compression');

app.use(compression({ threshold: 512 }));
app.use(function (req, res, next) {
    res.send(Array(513).join("."));
    return next();
});

app.listen(9000);

I tested this on Mac OS X Yosemite, Windows 7, and Ubuntu.

What is strange is that it works with a curl http://localhost:9000 command.

Do you have any idea how to fix this?

Compression runs but response is not encoded

Hi,
I created a very basic example on my machine (Windows 10, Node.js 4.2.1, Express 4.13.3), but I can't make the compression module run properly.

I'm making a simple request with Accept-Encoding: gzip.
With DEBUG=compression I can see that the compression module performs compression (or at least logs that it does):

Example app listening at http://:::3000
  compression gzip compression +0ms

But the content is not compressed!
This is the raw output of my request. It is missing the Content-Encoding: gzip header, and the body is not gzipped.

HTTP/1.1 200 OK
X-Powered-By: Express
Content-Type: text/plain
ETag: "472456355"
Vary: Accept-Encoding
Date: Sun, 18 Oct 2015 13:27:45 GMT
Connection: keep-alive
Transfer-Encoding: chunked

c
Hello World!
0

Below is the code I used.

var compression = require('compression')
var express     = require('express')

var app = express()

app.use(compression({
    threshold : 0
}))

app.get('/', function (req, res) {
    res.setHeader('Content-Type', 'text/plain')
    res.send('Hello World!');
});

var server = app.listen(3000, function () {

    var host = server.address().address;
    var port = server.address().port;

    console.log('Example app listening at http://%s:%s', host, port)
});

I also tried removing node_modules and running

npm cache clean
npm install

Any suggestions? I can't believe I am the only one with this type of issue!

Many thanks,
Emanuele

Document zlib options

Based on the zlib options, it seems there are some extra properties that can be used (level, memLevel, and strategy), but they aren't very well documented there either. Does it make sense to at least show which ones are accepted?
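
For instance (a sketch; these options are forwarded to the underlying zlib stream):

var zlib = require('zlib')
var express = require('express')
var compression = require('compression')

var app = express()

app.use(compression({
  level: 6,                         // 0 (no compression) through 9 (best compression)
  memLevel: 8,                      // memory allocated for internal compression state, 1-9
  strategy: zlib.Z_DEFAULT_STRATEGY // tuning for the compression algorithm
}))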

Does not work with Firefox over SPDY

On Chrome, all is fine, but on Firefox it never compresses anything.
Firefox's headers differ from Chrome's only by having Connection: keep-alive added.
With DEBUG=compression set, it keeps logging the following:
compression no compression: not acceptable
What could be the problem?
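
That debug message is logged when no acceptable encoding can be negotiated from the request's Accept-Encoding header. One thing worth checking (a diagnostic sketch, registered before compression() on an existing express app) is what that header looks like by the time it reaches the middleware, since a SPDY terminator or proxy may be rewriting it:

// log the Accept-Encoding header the middleware will see
app.use(function (req, res, next) {
  console.log('accept-encoding:', req.headers['accept-encoding'])
  next()
})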

ember-cli doesn't compress

Hi, first things first: thank you for your work!

I have an ember-cli app using express 4.1.1. After the following changeset:

diff --git a/package.json b/package.json
index 76016d6..eae7373 100644
--- a/package.json
+++ b/package.json
@@ -25,6 +25,7 @@
     "broccoli-coffee": "^0.1.0",
     "broccoli-ember-hbs-template-compiler": "^1.5.0",
     "express": "^4.1.1",
+    "compression" : "^1.0.8",
     "body-parser": "^1.2.0",
     "glob": "^3.2.9",
     "ember-cli-ic-ajax": "0.1.0",
@@ -42,6 +43,7 @@
     "broccoli-coffee": "^0.1.0",
     "broccoli-ember-hbs-template-compiler": "^1.5.0",
     "express": "^4.1.1",
+    "compression" : "^1.0.8",
     "body-parser": "^1.2.0",
     "glob": "^3.2.9",
     "ember-cli-ic-ajax": "0.1.0",
diff --git a/server/index.js b/server/index.js
index 5621428..ff5cb2d 100644
--- a/server/index.js
+++ b/server/index.js
@@ -9,12 +9,16 @@

 var newrelic   = require('newrelic');
 var express    = require('express');
+var compression = require('compression');
 var bodyParser = require('body-parser');
 var globSync   = require('glob').sync;
 var routes     = globSync('./routes/*.js', { cwd: __dirname }).map(require);

 module.exports = function(emberCLIMiddleware) {
   var app = express();
+
+  app.use(compression())
+
   app.use(bodyParser());

   routes.forEach(function(route) { route(app); });

I don't see encoded responses when I expect them, e.g.:

$ curl -I -H 'Accept-Encoding: gzip,deflate' localhost:4200/assets/vendor.js
HTTP/1.1 200 OK
X-Powered-By: Express
Last-Modified: Sun, 06 Jul 2014 03:19:28 GMT
Cache-Control: private, max-age=0, must-revalidate
Content-Length: 2436528
Content-Type: application/javascript
Date: Sun, 06 Jul 2014 03:21:48 GMT
Connection: keep-alive

Why could that be? Thank you again!

Zlib partial flush

I replaced my uses of res.flush with compression({ flush: zlib.Z_PARTIAL_FLUSH }). Should this be documented as an automatic alternative?
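
For context, this works because the option is handed to the underlying zlib stream (a sketch):

var zlib = require('zlib')
var express = require('express')
var compression = require('compression')

var app = express()

// every write is flushed with Z_PARTIAL_FLUSH, so no manual
// res.flush() calls are needed
app.use(compression({ flush: zlib.Z_PARTIAL_FLUSH }))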

error on install

0 info it worked if it ends with ok
1 verbose cli [ 'C:\\Program Files\\nodejs\\\\node.exe',
1 verbose cli   'C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js',
1 verbose cli   'install',
1 verbose cli   '[email protected]' ]
2 info using npm@1.4.9
3 info using node@v0.10.28
4 verbose node symlink C:\Program Files\nodejs\\node.exe
5 error Error: Invalid version: "0.3"
5 error     at Object.module.exports.fixVersionField (C:\Program Files\nodejs\node_modules\npm\node_modules\read-package-json\node_modules\normalize-package-data\lib\fixer.js:183:13)
5 error     at C:\Program Files\nodejs\node_modules\npm\node_modules\read-package-json\node_modules\normalize-package-data\lib\normalize.js:30:38
5 error     at Array.forEach (native)
5 error     at normalize (C:\Program Files\nodejs\node_modules\npm\node_modules\read-package-json\node_modules\normalize-package-data\lib\normalize.js:29:15)
5 error     at final (C:\Program Files\nodejs\node_modules\npm\node_modules\read-package-json\read-json.js:342:33)
5 error     at then (C:\Program Files\nodejs\node_modules\npm\node_modules\read-package-json\read-json.js:126:33)
5 error     at C:\Program Files\nodejs\node_modules\npm\node_modules\read-package-json\read-json.js:266:40
5 error     at fs.js:266:14
5 error     at C:\Program Files\nodejs\node_modules\npm\node_modules\graceful-fs\graceful-fs.js:105:5
5 error     at Object.oncomplete (fs.js:107:15)
6 error If you need help, you may report this *entire* log,
6 error including the npm and node versions, at:
6 error     <http://github.com/npm/npm/issues>
7 error System Windows_NT 6.2.9200
8 error command "C:\\Program Files\\nodejs\\\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "install" "[email protected]"
9 error cwd G:\www\nodejs-and-angularjs-basic-skeleton
10 error node -v v0.10.28
11 error npm -v 1.4.9
12 verbose exit [ 1, true ]

callback parameter is missing from 'write' and 'end' methods

The 'write' and 'end' methods don't accept a 'callback' parameter, although the underlying http response stream does. This causes different runtime behavior when sending data off in chunks (waiting for the response to be written): with compression in place, the callback is never called and the program hangs.

res.write not sending chunks until res.end() is called

I'm using node v0.12.0 with express (v4.4.1) and compression (v1.6.0).

I'm sending back about 80MB of dynamically generated data (not from the FS or a DB) in multiple res.write() calls. When I add the compression middleware (with no options passed), I don't see any traffic (using Wireshark) from the Node server until res.end() is called. When res.end() is called, there is a sudden burst of chunked data.

But when the compression module is not used, I see chunked responses going out on the wire. The size of each of these chunked fragments seems to be between 10K and 16K (based on Wireshark).

The only header information I am setting happens before the res.write() calls and is:

res.setHeader('Content-Type', 'application/json');

Any reason why the data is getting buffered before the res.end() call?
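
As with the SSE issue above, the buffering happens inside the zlib stream. A sketch of the same res.flush() workaround applied to chunked writes:

var express = require('express')
var compression = require('compression')

var app = express()
app.use(compression())

app.get('/data', function (req, res) {
  res.setHeader('Content-Type', 'application/json')
  for (var i = 0; i < 1000; i++) {
    res.write(JSON.stringify({ n: Math.random() }) + '\n')
    res.flush() // push the compressed bytes for this chunk onto the wire
  }
  res.end()
})

app.listen(3000)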

content-length for gzipped data?

Is there a way to get the Content-Length value for data that was run through compression, so that even in chunked/varied transfer mode the final content length can be sent along, for browsers that rely on that value to properly terminate the transfer?

GZIP components before Express server starts

Originally this question was on a pre-existing issue thread, but thought I would open up a new question to keep it separated.

There was a question about caching the results of compression, but I think it may be better to gzip assets before the server starts; that seems less messy and even more efficient.

So, if this is possible, it would probably require streaming the results of the compression to the filesystem using the command line or a build system like Gulp or Grunt.

In my case, I am only working with one deployment file, so this would be very straightforward. I am looking for a way to configure this module so that it can be used with a build system like Grunt or Gulp, or maybe someone has a better idea; this is more of a discussion, I guess.
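
This doesn't need the middleware at all; core zlib can pre-compress files at build time (a sketch, with hypothetical file names):

var fs = require('fs')
var zlib = require('zlib')

// write dist/app.js.gz next to the original at build time
fs.createReadStream('dist/app.js')
  .pipe(zlib.createGzip({ level: 9 })) // max compression; speed is irrelevant at build time
  .pipe(fs.createWriteStream('dist/app.js.gz'))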

Using compression in spdy server push

Is there any way to do this? To somehow use this middleware while pushing cached data through spdy push? I might be able to do the gzipping of the data myself, but if it's large, chunking becomes a problem, and I really don't want to deal with the headers manually.

Cache compressed output

Hi

I wrote a module to cache dynamic content with a relatively short TTL, because I have a lot of requests for data that clients can accept being a couple of seconds outdated.

I'm using this module to compress the output, but I realized that even though I send a cached result, it's compressed again each time. Is there a way I can cache the compressed output and thus avoid compressing it on each request?

Why not export the default filter to the user?

I want to set a custom filter for specific requests, but use the default filter for the others.

But module.exports overrides the exports.filter function, so I can't call it from my own filter function.
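
For what it's worth, newer versions of the module do expose the default filter as compression.filter, so a custom filter can delegate to it (a sketch; x-no-compression is just an example header):

var express = require('express')
var compression = require('compression')

var app = express()

app.use(compression({
  filter: function (req, res) {
    // custom rule first
    if (req.headers['x-no-compression']) return false
    // then defer to the default filter for everything else
    return compression.filter(req, res)
  }
}))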

Not working with static

I have this simple server:

var express = require('express');
var app = express();
var compression = require('compression');

app.use(express.static(__dirname + '/dist'));
app.use(compression({ threshold: 0 }));

app.all("/*", function(req, res, next) {
  res.sendFile('index.html', { root: __dirname + '/dist' });
});

var server = app.listen(3000, function () {

  var host = server.address().address;
  var port = server.address().port;

  console.log('Example app listening at http://%s:%s', host, port)

});

If I set DEBUG=compression and then run node app.js, nothing shows up.
My response headers don't have Content-Encoding: gzip.

I tried using the compression middleware before the static one, but had no success either.

Any ideas?
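
For what it's worth, middleware runs in registration order, so as written the static handler responds before compression is ever installed. A sketch with the order swapped (which the poster says they also tried) would look like:

var express = require('express')
var compression = require('compression')

var app = express()

app.use(compression({ threshold: 0 })) // must come before anything that sends responses
app.use(express.static(__dirname + '/dist'))

app.all('/*', function (req, res) {
  res.sendFile('index.html', { root: __dirname + '/dist' })
})

app.listen(3000)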

caching compressed static content

When using compression on static content served with express.static, is there a way to cache the compressed content somewhere, so that we can avoid re-compressing the same files over and over?
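
The middleware itself doesn't cache, but pre-compressed files can be served directly (a sketch, assuming .gz siblings were generated at build time as discussed in an earlier issue):

var fs = require('fs')
var path = require('path')
var express = require('express')

var app = express()

// serve foo.js.gz when it exists and the client accepts gzip
app.get('*.js', function (req, res, next) {
  var gzPath = path.join(__dirname, 'public', req.path + '.gz')
  var accepts = req.headers['accept-encoding'] || ''
  if (accepts.indexOf('gzip') !== -1 && fs.existsSync(gzPath)) {
    res.set('Content-Encoding', 'gzip')
    res.set('Content-Type', 'application/javascript')
    res.set('Vary', 'Accept-Encoding')
    return res.sendFile(gzPath)
  }
  next()
})

app.use(express.static(path.join(__dirname, 'public')))

app.listen(3000)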

Compression returns (invalid) Content-Length which hangs HTTP connections.

The middleware does not process or modify headers that are passed as arguments to the writeHead() function. If Content-Length is passed to writeHead(), it is returned unaltered, with a value that is invalid after compression.

A breaking example:

var connect = require('connect');
var compress = require('compression');
var http = require('http');

var app = connect()
  .use(compress({
    threshold: 0
  }))
  .use(function(req, res) {
    res.setHeader('Content-Type', 'text/plain')
    res.writeHead(200, {'Content-Length': 3})
    res.end('foo');
  });

http.createServer(app).listen(5000);

This returns the following headers:

Connection: keep-alive
Content-Encoding: gzip
Content-Length: 3
Content-Type: text/plain
Date: Wed, 26 Mar 2014 19:16:26 GMT
Vary: Accept-Encoding

A failing unit test

There are other related issues caused by the same bug. The following headers are incorrectly processed if they are passed to the writeHead function: Vary, Content-Type, and Content-Encoding.
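
Until this is fixed, a workaround sketch is to set entity headers with setHeader, which the middleware does intercept (it strips Content-Length when it decides to compress):

var connect = require('connect')
var compress = require('compression')
var http = require('http')

var app = connect()
  .use(compress({ threshold: 0 }))
  .use(function (req, res) {
    // setHeader (rather than writeHead) lets the middleware see and
    // remove Content-Length before the compressed body goes out
    res.setHeader('Content-Type', 'text/plain')
    res.setHeader('Content-Length', 3)
    res.end('foo')
  })

http.createServer(app).listen(5000)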

defaults for ARM

The default gzip compression level is 6, which is a bit high compared to, for example, hardware load balancers that usually default to 1.

That's not a big problem on, say, a modern Intel/AMD CPU, but it causes a major slowdown when the compression module is installed on a site that runs on something like a Raspberry Pi.

So here's a feature request.

if (process.config.variables.host_arch == 'arm') level = 1;

In other words, if Node.js was compiled for and is running on ARM (either v6l or v7), set a default compression level of 1 to avoid extremely sluggish sites after installing the compression module.
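
This can also be done in user land today (a sketch using os.arch() rather than process.config):

var os = require('os')
var express = require('express')
var compression = require('compression')

var app = express()

// cheap compression on ARM, the usual default elsewhere
var level = os.arch() === 'arm' ? 1 : 6

app.use(compression({ level: level }))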

prefer deflate only as last resort

Conforming DEFLATE responses are not accepted by any version of IE (IE 11 is current as of this writing); see some discussion in #25. DEFLATE is still useful for clients that only support it, and there's no real reason to drop it, but we may as well send gzip even if the client says it prefers deflate, as long as the client lists gzip as acceptable.

Can't render headers after they are sent to the client.

var express = require('express'),
    app = express(),
    server = require('http').createServer(app),
    io = require('socket.io').listen(server);

app.use(require('compression')());
app.use('/public', express.static(__dirname + '/public'));
app.use(require('errorhandler')());
io.set('transports', ['websocket']);
io.set('log level', 1);
io.enable('browser client minification');
io.enable('browser client etag');
io.enable('browser client gzip');
server.listen(8080);

http.js:733
    throw new Error('Can\'t render headers after they are sent to the client.');
    ^
Error: Can't render headers after they are sent to the client.
at ServerResponse.OutgoingMessage._renderHeaders (http.js:733:11)
at ServerResponse.writeHead (http.js:1174:20)
at ServerResponse.writeHead (./node_modules/express/lib/patch.js:51:20)
at ServerResponse.res.writeHead (./node_modules/compression/index.js:130:47)
at ServerResponse._implicitHeader (http.js:1131:8)
at ServerResponse.res.write (./node_modules/compression/index.js:108:14)
at write (_stream_readable.js:583:24)
at flow (_stream_readable.js:592:7)
at ReadStream.pipeOnReadable (_stream_readable.js:624:5)
at ReadStream.EventEmitter.emit (events.js:92:17)

Bring styling inline with newer modules.

I think we should change function() { to function () { and remove all unnecessary semicolons, as that is the style used in all the other "newer" modules.

Ping @jonathanong

Unfortunately you already released 1.0.0 to npm. :S

Not that it will make a difference API-wise, but still.

Use Vary on 304

A 304 response won't have content, so the onHeaders listener returns (at least with the default filter function) before setting the Vary header. However, the Vary header should be set, even on a 304, for resources that would otherwise be compressible. This comes straight from RFC 7232, which says:

The server generating a 304 response MUST generate any of the following header fields that would have been sent in a 200 (OK) response to the same request: Cache-Control, Content-Location, Date, ETag, Expires, and Vary.
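
In user land, the header can be forced with the same small modules this middleware uses internally (a sketch; app is an existing express/connect app):

var vary = require('vary')
var onHeaders = require('on-headers')

// ensure Vary: Accept-Encoding goes out on every response, including 304s
app.use(function (req, res, next) {
  onHeaders(res, function () {
    vary(res, 'Accept-Encoding')
  })
  next()
})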

Content Encoding Error

In Chrome I get a blank error page, in Firefox I get this:

Content Encoding Error

The page you are trying to view cannot be shown because it uses an invalid or unsupported form of compression.

    Please contact the website owners to inform them of this problem.

curl -v -H "Accept-Encoding: gzip, deflate" --compressed http://localhost:9000/ works correctly.

I'm using compression via connect 2.19.6 via grunt-contrib-connect, and I'm not sure how best to debug the issue. Our code was working at least until last week, but stopped working today (maybe around July 17), so I'm wondering if a new dependency broke something.

Unfortunately our code isn't open source, so I can't share it, but we're including compression via the connect.compress() middleware, which precedes the other middleware.

More advanced compression "filter"

Some compression libraries will lower the compression level, or even stop compressing, if memory consumption gets too high, and then start compressing again once the system has calmed down. Is something like that available here?
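
Nothing like that appears to be built in, but it can be approximated by routing between two differently configured instances (a sketch; the 512MB threshold is arbitrary):

var express = require('express')
var compression = require('compression')

var app = express()

var light = compression({ level: 1 })
var heavy = compression({ level: 6 })

// back off to cheap compression under memory pressure
app.use(function (req, res, next) {
  var rss = process.memoryUsage().rss
  var middleware = rss > 512 * 1024 * 1024 ? light : heavy
  middleware(req, res, next)
})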

Chrome Not Showing Any Compression Occurring

I'm having trouble getting compression to work properly. I've tried reading a lot of resources and your issue pages, but I'm still stuck. Oddly, when I use services like http://www.whatsmyip.org/http-compression-test/, they show that the service is compressed. But when I use Chrome to simply try to go to the service, I don't see any compression happening. When I turn on DEBUG=compression I see the message

compression gzip compression +8m

So I'd like to believe it is working, but the response in the browser shows no compression. Any advice would be greatly appreciated! Thanks!

I have a seemingly simple test server/service:

var compression = require('compression');
var express = require('express');

var app = express();

//turn on compression
app.use(compression({ threshold: 0 }));

// simple get service to get 10000 random numbers
app.get('/events', function (req, res) {
    console.log(req.headers);

    var obj = [];

    for (var i = 0; i < 10000; i++) {
        obj.push({
            number: Math.random()
        });
    }

    res.send(obj);
});

//start the server
var server = app.listen(80, function () {
    console.log('Address', server.address());
    console.log('Listening on port %d', server.address().port);
});

The headers in my request are:

{ via: '1.1 DETHQTMG109',
  'user-agent': 'Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36',
  host: 'usaievents.cloudapp.net',
  'cache-control': 'max-age=0',
  accept: 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
  'accept-language': 'en-US,en;q=0.8',
  connection: 'Keep-Alive',
  'if-modified-since': 'Tue, 20 Jan 2015 15:21:29 GMT',
  'if-none-match': 'W/"YxqKyY8MDi7MUoNr2htsjg=="',
  'accept-encoding': 'gzip' }

Cannot make compression work with static files

Please help. Compression does not work; there is no Content-Encoding header in the responses.

var express = require('express');
var compression  = require('compression');
var app = express();

app.set('port', (process.env.PORT || 5000));
app.use(compression());
app.use(express.static(__dirname + '/public'));

app.listen(app.get('port'), function() {
  console.log("Node app is running at localhost:" + app.get('port'));
});

event emitter leak in compression

Here is my keep-alive function.

var ka = setInterval(function() {
    res.write(": keep-alive\n");
    res.flush();
}, 15000);
res.on('close', function() {
    clearInterval(ka);
});

This is used on top of an HTTP response that delivers a stream of JSON updates, formatted as text/event-stream (hence the flush).

The keep-alive is useful for SSE streams that only update once in a while, such as transmitting new data points on graphs that have only one data point per minute.

Usually there are a (network) router or two among the hops between client and server with a TCP state-table timeout of 30 seconds, so the above helps keep the channel open until the next event is ready for the client.

The expressjs/compression module, when enabled, has this side effect:

(node) warning: possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
    at Gzip.addListener (events.js:239:17)
    at Gzip.Readable.on (_stream_readable.js:665:33)
    at Gzip.once (events.js:265:8)
    at Gzip.Zlib.flush (zlib.js:448:10)
    at ServerResponse.flush (compression/index.js:70:16)
    at null._repeat (sample.js:2:8)
    at wrapper [as _onTimeout] (timers.js:264:19)
    at Timer.listOnTimeout (timers.js:92:15)

I'm not adding any event listeners myself, however; I'm just flushing the output stream after writing.

Hmm, thinking about it, the keep-alive function is irrelevant. When using text/event-stream, the first big chunk of data will usually be pushed with just a single flush at the end, but once the stream gets "up to date", every complete JSON object written is followed by a flush().

It was probably just random chance that this happened to me inside the keep-alive function...

Anyway, the error disappears if the compression module is disabled.

Can't say it's working.

I am using this module to compress output from a FastCGI bridge. The problem is that no compression is done at all. The code snippet:

    BIRDmain.use(compression({
        treshold: 1, // note: misspelled as written; the actual option name is "threshold"
        filter: function(req, res) {
            console.log("Compression...");
            return compression.filter(req, res);
        }
    }));
    BIRDmain.use(php({
        root: config.base,
        index: "app.php",
        serverName: config.BIRD3.url,
        serverHost: config.BIRD3.host,
        serverPort: config.BIRD3.http_port
    }));

Using Chrome's dev tools to look at the request shows that Transfer-Encoding is chunked. Vary is set, but I don't see any other indications that the module is running. I did use a custom filter function to see if it is called at all, which it is.

What could be the reasons for it not working? Here's an example:

[email protected] ~ $ curl -v http://localhost:8080 --header "Accept-encoding: gzip,*"
* Rebuilt URL to: http://localhost:8080/
* Hostname was NOT found in DNS cache
*   Trying 127.0.0.1...
* Connected to localhost (127.0.0.1) port 8080 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.38.0
> Host: localhost:8080
> Accept: */*
> Accept-encoding: gzip,*
>
< HTTP/1.1 200 OK
< X-Powered-By: PHP/5.5.14
< Set-Cookie: PHPSESSID=16ann5b3vk6fa28tdrnod60g23; path=/
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Content-type: text/html
< Vary: Accept-Encoding
< X-Response-Time: 68.581ms
< Date: Wed, 14 Jan 2015 20:04:17 GMT
< Connection: keep-alive
< Transfer-Encoding: chunked
