
Comments (29)

kartikk221 commented on May 25, 2024

The bug with the discarded access errors when streaming files under high load has been resolved.

Response.send() now calls uWS.HttpResponse.endWithoutBody() under the hood when you have written a custom Content-Length header and call Response.send() with no body. I was surprised to find that uWebsockets.js has very little documentation or event typing for this method, which is why it wasn't implemented previously.

You may update to the recent v6.2.1 and see if your problems above are resolved.

from hyper-express.

kartikk221 commented on May 25, 2024

Sounds good! In terms of optimizations, for the most part everything is optimized up to par. The only thing HyperExpress currently struggles with is that local benchmarks yield considerably lower performance than over-the-network benchmarks. The cause seems to be that when you benchmark locally there is almost zero network delay or low-level processing; all of the data is piped directly to and from uWebsockets.js/HyperExpress, and the Node internals just can't keep up. If you can find ways to reduce callbacks or heavy calls made from the Server._handle_uws_request() method onwards in the lifecycle of a request, that would be a huge optimization win. I've already profiled the CPU while benchmarking and the actual CPU usage is extremely low across all components, so the culprit is likely the Node internal queues/loops, which throttle performance in local benchmarks.

piliugin-anton commented on May 25, 2024

After testing with wrk I found out that my solution is not fast enough yet (I suspect there is a bug, or some local behavior that makes autocannon report such high results. Warmup? A bug?).
Hyper-Express

> wrk -t4 -c2500 -d60s http://127.0.0.1:5002

Running 1m test @ http://127.0.0.1:5002
  4 threads and 2500 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    66.06ms   32.61ms   1.45s    99.82%
    Req/Sec     9.65k     1.50k   12.63k    69.27%
  2300181 requests in 1.00m, 146.97MB read
Requests/sec:  38274.17
Transfer/sec:      2.45MB

My solution

> wrk -t4 -c2500 -d60s http://127.0.0.1:5000

Running 1m test @ http://127.0.0.1:5000
  4 threads and 2500 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    56.60ms   36.62ms   1.12s    99.67%
    Req/Sec    11.42k     1.28k   15.76k    85.60%
  2723551 requests in 1.00m, 174.02MB read
Requests/sec:  45316.63
Transfer/sec:      2.90MB

uWebSockets.js

> wrk -t4 -c2500 -d60s http://127.0.0.1:5001

Running 1m test @ http://127.0.0.1:5001
  4 threads and 2500 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    21.58ms    9.57ms 454.37ms   99.68%
    Req/Sec    29.46k     3.03k   62.31k    73.99%
  7021639 requests in 1.00m, 455.35MB read
Requests/sec: 116849.68
Transfer/sec:      7.58MB

I'm not sure how I can make a PR, because I partially reworked your code, but I'll try once I get a real performance boost.

piliugin-anton commented on May 25, 2024

By the way, I partially tested my npm package and haven't noticed any bugs so far, so you can install it to see the difference. Unfortunately I can't test it on a VPS, because I'm from a country that is under economic sanctions.

kartikk221 commented on May 25, 2024

Try benchmarking uQuick and uWebsockets.js on a cloud provider instance and see if benchmarking over the network brings the two results much closer. I have found benchmarks over the network to a different machine to be far more representative of production than local benchmarks against 127.0.0.1/localhost.

piliugin-anton commented on May 25, 2024

Middleware StaticFiles.js

const fs = require('fs')
const path = require('path') // used below by resolveFile and StaticFiles
const zlib = require('zlib')

const mimeTypes = require('mime-types')
const accepts = require('accepts')
const compressible = require('compressible')

const resolveFile = (file, indexFile = '') => {
  return fs.promises.stat(file, {
    bigint: false
  }).then((stats) => {
    if (stats.isDirectory()) {
      return resolveFile(path.join(file, indexFile))
    } else if (stats.isFile()) {
      return [file, stats]
    } else {
      throw new Error(file + ' is not a file')
    }
  })
}

const destroy = (dataTransfer) => {
  if (dataTransfer.readable && !dataTransfer.readable.destroyed) dataTransfer.readable.destroy()
  if (dataTransfer.writable && !dataTransfer.writable.destroyed) dataTransfer.writable.destroy()
}

const StaticFiles = (options = {}) => {
  const opts = {
    root: path.resolve('www'),
    indexFile: 'index.html',
    compress: true,
    compressionThreshold: 1024,
    ...options
  }

  return async (req, res) => {
    const dataTransfer = {
      readable: null,
      writable: null
    }
    res.on('abort', () => destroy(dataTransfer))

    if (req.method !== 'GET' && req.method !== 'HEAD') {
      return res.status(405).header('Allow', 'GET, HEAD').vary('Accept-Encoding').send()
    }

    try {
      const [file, stats] = await resolveFile(req.path === '/' ? path.join(opts.root, opts.indexFile) : path.normalize(path.join(opts.root, req.path)), opts.indexFile)
      const mimeType = mimeTypes.lookup(path.extname(file))

      if (req.method === 'HEAD') {
        return res.status(200)
          .header('Content-Type', mimeType)
          .header('Content-Length', stats.size.toString())
          .header('Last-Modified', stats.mtime.toUTCString())
          .vary('Accept-Encoding')
          .send(undefined, undefined, true)
      }

      dataTransfer.readable = fs.createReadStream(file)
      dataTransfer.readable.once('end', () => !dataTransfer.readable.destroyed && dataTransfer.readable.destroy())

      // Compression
      let compression = null
      // Compression candidate?
      if (compressible(mimeType) && stats.size >= opts.compressionThreshold) {
        const accept = accepts(req)
        let method = accept.encoding(['gzip', 'deflate', 'identity'])

        // we really don't prefer deflate
        if (method === 'deflate' && accept.encoding(['gzip'])) {
          method = accept.encoding(['gzip', 'identity'])
        }

        // compression possible
        if (method && method !== 'identity') {
          compression = method
        }
      }

      res.status(200)
        .header('Content-Type', mimeType)
        .header('Last-Modified', stats.mtime.toUTCString())
        .vary('Accept-Encoding')

      if (compression) {
        res.header('Content-Encoding', compression)

        // Use the compressor that matches the negotiated encoding
        dataTransfer.writable = compression === 'gzip' ? zlib.createGzip() : zlib.createDeflate()

        dataTransfer.writable.once('end', () => !dataTransfer.writable.destroyed && dataTransfer.writable.destroy())

        return dataTransfer.readable
          .pipe(dataTransfer.writable)
          .pipe(res)
      } else {
        return res.stream(dataTransfer.readable, stats.size)
      }
    } catch (ex) {
      console.log(ex)
      return !res.completed && res.status(404).send('File not found: ' + req.path)
    }
  }
}

module.exports = StaticFiles

piliugin-anton commented on May 25, 2024

Response.js changes (notice the withoutBody parameter; it's needed for the HEAD method, otherwise uWebSockets will send a Content-Length: 0 header):

send (body, closeConnection, withoutBody) {
    // Ensure response connection is still active
    if (!this.#completed) {
      // Initiate response to write status code and headers
      this._initiate_response()

      // Stop downloading further body chunks as we are done with the response
      this.#wrapped_request._stop_streaming()

      // Mark request as completed and end request using uWS.Response.end()
      const sent = !withoutBody ? this.#raw_response.end(body, closeConnection) : this.#raw_response.endWithoutBody()

      // Emit the 'finish' event to signify that the response has been sent without streaming
      if (!this.#streaming) this.emit('finish', this.#wrapped_request, this)

      // Call any bound hooks for type 'complete' if no backpressure was built up
      if (sent && !this.#completed) {
        // Mark request as completed if we were able to send response properly
        this.#completed = true

        // Emit the 'close' event to signify that the response has been completed
        this.emit('close', this.#wrapped_request, this)
      }

      return sent
    }

    return false
  }
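The withoutBody decision above can be captured as a tiny predicate. This is an illustrative sketch with hypothetical names (not the actual hyper-express API), showing when a response should skip the body:

```javascript
// Illustrative sketch (names mine, not the hyper-express API): the branch
// above boils down to a predicate deciding when a response must be ended
// without a body. A plain end() with no body writes "Content-Length: 0",
// which is wrong for HEAD responses advertising the real length.
function shouldEndWithoutBody(method, hasCustomContentLength, body) {
  // HEAD responses must never carry a body
  if (method === 'HEAD') return true;
  // A custom Content-Length with an empty body means the caller intends
  // the body to be omitted rather than sent as zero-length
  return hasCustomContentLength && (body === undefined || body === null);
}

console.log(shouldEndWithoutBody('HEAD', false, undefined)); // true
console.log(shouldEndWithoutBody('GET', true, undefined));   // true
console.log(shouldEndWithoutBody('GET', false, 'hello'));    // false
```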

piliugin-anton commented on May 25, 2024

How I use the middleware:

Server.use(StaticFiles())

Server.get('/*', () => {})
Server.head('/*', () => {})

piliugin-anton commented on May 25, 2024

@kartikk221 Thanks for the quick response. Is this variable unused at the moment?

let last_offset = this.write_offset;

I got another error when testing your solution:

        this.drain(() => {
             ^

TypeError: this.drain is not a function

Btw, autocannon stopped working for me (probably because of encoding, not sure), so I tested with wrk -t12 -c400 -d30s http://HOST:PORT

kartikk221 commented on May 25, 2024

The last_offset variable is most likely going to be removed; I just kept it there temporarily because there may have been a potential need for it with backpressure handling.

Regarding the error with this.drain not being a function: I'm not sure why you are seeing it, because Response.drain is part of the documentation and used internally. Would you be able to provide the full stack trace of that error?

piliugin-anton commented on May 25, 2024

@kartikk221 Sure, here is the full error message:

/something/Response.js:429
        this.drain(() => {
             ^

TypeError: this.drain is not a function
    at Response._stream_chunk (/something/Response.js:429:14)
    at ReadStream.<anonymous> (/something/Response.js:467:43)
    at ReadStream.emit (events.js:315:20)
    at addChunk (_stream_readable.js:309:12)
    at readableAddChunk (_stream_readable.js:284:9)
    at ReadStream.Readable.push (_stream_readable.js:223:10)
    at internal/fs/streams.js:226:14
    at FSReqCallback.wrapper [as oncomplete] (fs.js:539:5)
npm ERR! code ELIFECYCLE
npm ERR! errno 1

Don't go by the line numbers, because I modified the original file: this line is in the _stream_chunk() method.

piliugin-anton commented on May 25, 2024

This error appears only under load testing.

kartikk221 commented on May 25, 2024

Can you try adding a try...catch around the this.drain call in your file and console.log the this property on error to see if the scoping is correct? I don't see how the drain method could not exist.
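The suggested wrapper might look like this minimal sketch; the class and method names mirror the discussion, but the code is illustrative rather than actual hyper-express source. It reproduces the error by deliberately omitting drain, then dumps this on failure:

```javascript
// Illustrative sketch of the suggested try...catch debugging wrapper.
// drain is intentionally missing here to reproduce the reported error.
class Response {
  _stream_chunk() {
    try {
      this.drain(() => {});
    } catch (err) {
      console.error(err.message); // "this.drain is not a function"
      console.log(this);          // inspect scoping and available methods
      return false;
    }
    return true;
  }
}

console.log(new Response()._stream_chunk()); // false
```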

piliugin-anton commented on May 25, 2024
Response {
  _writableState: WritableState {
    objectMode: false,
    highWaterMark: 16384,
    finalCalled: false,
    needDrain: false,
    ending: false,
    ended: false,
    finished: false,
    destroyed: false,
    decodeStrings: true,
    defaultEncoding: 'utf8',
    length: 0,
    writing: false,
    corked: 0,
    sync: true,
    bufferProcessing: false,
    onwrite: [Function: bound onwrite],
    writecb: null,
    writelen: 0,
    afterWriteTickInfo: null,
    buffered: [],
    bufferedIndex: 0,
    allBuffers: true,
    allNoop: true,
    pendingcb: 0,
    prefinished: false,
    errorEmitted: false,
    emitClose: true,
    autoDestroy: true,
    errored: null,
    closed: false
  },
  _events: [Object: null prototype] {
    finish: [Function: bound onceWrapper] { listener: [Function (anonymous)] },
    abort: [ [Function (anonymous)], [Function (anonymous)] ]
  },
  _eventsCount: 2,
  _maxListeners: undefined,
  locals: {},
  [Symbol(kCapture)]: false
}

kartikk221 commented on May 25, 2024

Are you able to see the Response.drain method's source code in your Response.js file? It is on line 314 in the original source code, if you need to search around in your modified file.

piliugin-anton commented on May 25, 2024

Oh, my bad. I thought it was something from uWebSockets (WebSockets) and removed it. Now everything seems to be working, thanks! Btw, I benchmarked parseInt() vs Number() on my Node v14.15.0 and I see a slight difference where Number() is faster. I'm talking about this line:

if (name.toLowerCase() === 'content-length') this.#custom_content_length = parseInt(value);

I will try to find more places where I can improve performance, and I'll let you know when I make my repository publicly available.
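A quick way to compare the two conversions is a small microbenchmark; this is an illustrative sketch, and the relative timings vary across Node versions and V8 optimizations, so treat any difference as indicative only:

```javascript
// Microbenchmark of Number() vs parseInt() for numeric header values.
// Timings vary by Node version; the only hard guarantee checked here is
// that both produce the same parsed value.
const value = '1048576';
const N = 1e6;

let t = process.hrtime.bigint();
let a = 0;
for (let i = 0; i < N; i++) a += Number(value);
const numberNs = process.hrtime.bigint() - t;

t = process.hrtime.bigint();
let b = 0;
for (let i = 0; i < N; i++) b += parseInt(value, 10);
const parseIntNs = process.hrtime.bigint() - t;

console.log(`Number():   ${numberNs / 1000n} us`);
console.log(`parseInt(): ${parseIntNs / 1000n} us`);
console.log(a === b); // true: both parse the same value
```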

piliugin-anton commented on May 25, 2024

Okay, I will look into it

piliugin-anton commented on May 25, 2024

@kartikk221 Do you get the same (https://github.com/kartikk221/hyper-express/blob/master/docs/Benchmarks.md) or higher/lower numbers when benchmarking the latest version of Hyper-Express?
From my benchmarks (Hello World, autocannon --renderStatusCodes --debug --warmup [ -c 2500 -d 30 ] -c 2500 -d 30 -p 4 http://HOST:PORT) on localhost, for Hyper-Express I get:

┌─────────┬────────┬────────┬────────┬────────┬───────────┬──────────┬────────┐
│ Stat    │ 2.5%   │ 50%    │ 97.5%  │ 99%    │ Avg       │ Stdev    │ Max    │
├─────────┼────────┼────────┼────────┼────────┼───────────┼──────────┼────────┤
│ Latency │ 146 ms │ 159 ms │ 182 ms │ 189 ms │ 160.19 ms │ 22.04 ms │ 409 ms │
└─────────┴────────┴────────┴────────┴────────┴───────────┴──────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬──────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg      │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Req/Sec   │ 62207   │ 62207   │ 72639   │ 211967  │ 89134.55 │ 40229.35 │ 62188   │
├───────────┼─────────┼─────────┼─────────┼─────────┼──────────┼──────────┼─────────┤
│ Bytes/Sec │ 4.17 MB │ 4.17 MB │ 4.87 MB │ 14.2 MB │ 5.97 MB  │ 2.7 MB   │ 4.17 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴──────────┴──────────┴─────────┘
┌──────┬─────────┐
│ Code │ Count   │
├──────┼─────────┤
│ 200  │ 1961029 │
└──────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 22

1971k requests in 31.77s, 131 MB read

uWebSockets.js

┌─────────┬────────┬────────┬────────┬────────┬───────────┬─────────┬────────┐
│ Stat    │ 2.5%   │ 50%    │ 97.5%  │ 99%    │ Avg       │ Stdev   │ Max    │
├─────────┼────────┼────────┼────────┼────────┼───────────┼─────────┼────────┤
│ Latency │ 188 ms │ 204 ms │ 243 ms │ 255 ms │ 206.39 ms │ 22.2 ms │ 521 ms │
└─────────┴────────┴────────┴────────┴────────┴───────────┴─────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬─────────┬───────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg     │ Stdev │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼───────┼─────────┤
│ Req/Sec   │ 196607  │ 196607  │ 196607  │ 196607  │ 196544  │ 0     │ 196528  │
├───────────┼─────────┼─────────┼─────────┼─────────┼─────────┼───────┼─────────┤
│ Bytes/Sec │ 13.4 MB │ 13.4 MB │ 13.4 MB │ 13.4 MB │ 13.4 MB │ 0 B   │ 13.4 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴─────────┴───────┴─────────┘
┌──────┬─────────┐
│ Code │ Count   │
├──────┼─────────┤
│ 200  │ 1768768 │
└──────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 9

1779k requests in 36.88s, 120 MB read

My slightly reworked Hyper-Express:

┌─────────┬────────┬────────┬────────┬────────┬───────────┬──────────┬────────┐
│ Stat    │ 2.5%   │ 50%    │ 97.5%  │ 99%    │ Avg       │ Stdev    │ Max    │
├─────────┼────────┼────────┼────────┼────────┼───────────┼──────────┼────────┤
│ Latency │ 114 ms │ 148 ms │ 176 ms │ 212 ms │ 150.64 ms │ 22.96 ms │ 485 ms │
└─────────┴────────┴────────┴────────┴────────┴───────────┴──────────┴────────┘
┌───────────┬─────────┬─────────┬─────────┬─────────┬───────────┬──────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%     │ 97.5%   │ Avg       │ Stdev    │ Min     │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼──────────┼─────────┤
│ Req/Sec   │ 69503   │ 69503   │ 195967  │ 204415  │ 171121.24 │ 44119.97 │ 69460   │
├───────────┼─────────┼─────────┼─────────┼─────────┼───────────┼──────────┼─────────┤
│ Bytes/Sec │ 4.66 MB │ 4.66 MB │ 13.1 MB │ 13.7 MB │ 11.5 MB   │ 2.96 MB  │ 4.65 MB │
└───────────┴─────────┴─────────┴─────────┴─────────┴───────────┴──────────┴─────────┘
┌──────┬─────────┐
│ Code │ Count   │
├──────┼─────────┤
│ 200  │ 2224433 │
└──────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 13

2234k requests in 33.87s, 149 MB read

piliugin-anton commented on May 25, 2024

Btw, in my dependencies I have "uWebSockets.js": "github:uNetworking/uWebSockets.js#binaries" instead of "uWebSockets.js": "github:uNetworking/uWebSockets.js#v20.10.0", because I found that the binaries branch is updated more often for Ubuntu.

kartikk221 commented on May 25, 2024

Hey, the results above look phenomenal!
Feel free to open a PR; I'd love to see how you were able to increase the performance on localhost.

Regarding the version specification in dependencies: I mainly pin the version number because uWebsockets.js receives quite a few updates/changes that aren't always properly documented. So I usually prefer to manually update, test and include newer versions in HyperExpress as they are released by the maintainer.

piliugin-anton commented on May 25, 2024

@kartikk221 Here are the current results of my solution (hello world).
I will try to improve them within a week and then publish it.

> wrk -t4 -c2500 -d60s http://127.0.0.1:5000

Running 1m test @ http://127.0.0.1:5000
  4 threads and 2500 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    39.71ms   20.13ms 929.36ms   99.48%
    Req/Sec    16.08k     1.62k   21.00k    70.22%
  3832203 requests in 1.00m, 244.86MB read
Requests/sec:  63776.71
Transfer/sec:      4.08MB

piliugin-anton commented on May 25, 2024

@kartikk221 Here is a repo: https://github.com/piliugin-anton/uQuik

Docs and usage will be added soon, after I finish testing my npm package.

piliugin-anton commented on May 25, 2024

What I've done:

  • Copied all dependencies that I think will not be updated soon
  • Updated busboy to the latest version and added a fix for pausing
  • Added a fix for the endWithoutBody condition
  • Removed private properties
  • Made smart (memoized) getters for some properties in order to speed up the framework
  • Optimized every possible function call
  • Changed error handling
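The "memoized getter" idea can be sketched as follows; the Request class and query property here are hypothetical illustrations of the pattern, not the actual uQuik code. The first access computes the value and then shadows the prototype getter with a plain data property, so subsequent accesses avoid the getter call entirely:

```javascript
// Illustrative sketch of the memoized-getter pattern (hypothetical names).
class Request {
  constructor(rawQuery) {
    this._rawQuery = rawQuery;
  }

  get query() {
    // Parse once, then cache by defining an own data property that
    // shadows this prototype getter on subsequent accesses.
    const parsed = Object.fromEntries(new URLSearchParams(this._rawQuery));
    Object.defineProperty(this, 'query', { value: parsed });
    return parsed;
  }
}

const req = new Request('a=1&b=2');
console.log(req.query.a);             // '1'
console.log(req.query === req.query); // true: same cached object
```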

piliugin-anton commented on May 25, 2024

What I've found:

  • Static files should be served by a performant server such as nginx
  • Same for compression

This server should be used for backends only, as noted in @alexhultman's repo (https://github.com/uNetworking/uWebSockets.js).
I will add a simple nginx configuration file and a pm2 file for production use soon.

kartikk221 commented on May 25, 2024

100%, since Node.js has to process the data through its streams implementation. For that reason, pure static serving is best suited to NGINX or another web server tailored for that purpose.

piliugin-anton commented on May 25, 2024

Yes, I guess the only remaining bottleneck to getting closer to uWebSockets performance is stream initialization.

kartikk221 commented on May 25, 2024

Yeah, the actual initialization of the streams isn't heavy; it's just that when you benchmark locally, requests come in faster than the Node internal loops can process logic, hence the higher delays. This is why the HyperExpress benchmarks are run over the network on a 1 vCPU instance that mimics real-world deployments.

piliugin-anton commented on May 25, 2024

In theoretical production use with PM2 (cluster mode, 8 cores on my laptop) I get:
uquik

> wrk -t4 -c2500 -d60s http://127.0.0.1:5000

Running 1m test @ http://127.0.0.1:5000
  4 threads and 2500 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    13.97ms    9.40ms  93.48ms   68.42%
    Req/Sec    40.00k     9.60k   78.02k    73.93%
  9514328 requests in 1.00m, 617.00MB read
Requests/sec: 158387.79
Transfer/sec:     10.27MB

uWebSockets.js

> wrk -t4 -c2500 -d60s http://127.0.0.1:5000

Running 1m test @ http://127.0.0.1:5000
  4 threads and 2500 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     5.83ms    3.18ms  49.95ms   84.54%
    Req/Sec    69.75k     4.14k   80.36k    78.35%
  16599746 requests in 1.00m, 1.05GB read
Requests/sec: 276517.64
Transfer/sec:     17.93MB

piliugin-anton commented on May 25, 2024

Okay, I will try when I have a chance

