
fast-proxy's Introduction

fast-proxy


Node.js framework-agnostic library that enables you to forward an HTTP request to another HTTP server. Supported protocols: HTTP, HTTPS, HTTP2.

This library was initially forked from fastify-reply-from: https://github.com/fastify/fastify-reply-from

fast-proxy powers: https://www.npmjs.com/package/fast-gateway 🚀

Install

npm i fast-proxy

Usage

The following examples describe how to use fast-proxy with restana:

Gateway:

const { proxy, close } = require('fast-proxy')({
  base: 'http://127.0.0.1:3000'
  // options
})
const gateway = require('restana')()

gateway.all('/service/*', function (req, res) {
  proxy(req, res, req.url, {})
})

gateway.start(8080)

Remote service:

const service = require('restana')()
service.get('/service/hi', (req, res) => res.send('Hello World!'))

service.start(3000)

Using imports:

import fastProxy from 'fast-proxy'

const { proxy, close } = fastProxy({
  base: 'http://127.0.0.1:3000'
})

Benchmarks

Please see: https://github.com/jkyberneees/nodejs-proxy-benchmarks

API

Options

base

Sets the base URL for all forwarded requests. Required if http2 is set to true. Note that the path portion of the base URL is discarded.

queryString

Sets the query string parser and stringifier. By default, fast-proxy uses the fast-querystring module. The configuration can be changed like so (here qs refers to an external query string module such as qs):

{
  base: 'http://localhost:3001/',
  queryString: {
    parse: (value) => qs.parse(value),
    stringify: (value) => qs.stringify(value),
  }
}

http2

Set to true if the target server is HTTP2-enabled.
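
For example, a minimal sketch (the base URL and port are illustrative):

const { proxy, close } = require('fast-proxy')({
  base: 'http://127.0.0.1:3000', // required when http2 is true
  http2: true
})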

undici

Set to true to use undici instead of require('node:http'). Enabling this flag typically yields 20-50% more throughput.

This option also accepts an object that controls the settings of the undici client, like so:

...
  base: 'http://localhost:3001/',
  undici: {
    connections: 100,
    pipelining: 10
  }
...

See undici demo at: demos/gateway-undici.js

cacheURLs

The number of parsed URLs that will be cached. Default: 100.

Use the value 0 to disable the caching mechanism.
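
For example, a sketch that disables URL caching (the base URL is illustrative):

const { proxy, close } = require('fast-proxy')({
  base: 'http://127.0.0.1:3000',
  cacheURLs: 0 // disable the parsed-URL cache
})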

requests.http and requests.https

Optionally overrides the internal http and https client implementations. Defaults: the built-in http and https modules.

For example, this could be used to add support for following redirects, like so:

...
  requests: {
    http: require('follow-redirects/http'),
    https: require('follow-redirects/https')
  }
...

If using undici or http2, these settings are ignored!

keepAliveMsecs

Defaults to 1 minute, passed down to http.Agent and https.Agent instances.

maxSockets

Defaults to 2048 sockets, passed down to http.Agent and https.Agent instances.

rejectUnauthorized

Defaults to true, passed down to https.Agent instances. This needs to be set to false to reply from HTTPS servers with self-signed certificates.
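
A combined sketch of these agent-related options (the HTTPS target is illustrative):

const { proxy, close } = require('fast-proxy')({
  base: 'https://127.0.0.1:3443', // illustrative self-signed HTTPS target
  keepAliveMsecs: 60 * 1000,      // passed down to the agents (default: 1 minute)
  maxSockets: 2048,               // passed down to the agents (default: 2048)
  rejectUnauthorized: false       // accept self-signed certificates
})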

Extended configurations

Other supported configurations in https://nodejs.org/api/http.html#http_new_agent_options can also be part of the opts object.

close

Optional "on close resource release" strategy. You can link this to your application shutdown hook as an example.

proxy(originReq, originRes, source, [opts])

Enables you to forward an HTTP request to another HTTP server.

proxy(
  originReq,                          // http.IncomingMessage 
  originRes,                          // http.ServerResponse
  req.url,                            // String -> remote URL + path or path if base was set
  {}                                  // Options described below
)

opts

base

Optionally indicates the base URL for the current proxied request. When used, the global base config is overridden.

This configuration value is ignored when using HTTP2.
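
For example, building on the gateway example above (the override URL is illustrative):

gateway.all('/legacy/*', (req, res) => {
  // overrides the global base for this request only (ignored when using HTTP2)
  proxy(req, res, req.url, { base: 'http://127.0.0.1:4000' })
})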

onResponse(req, res, stream)

Called when an HTTP response is received from the source. The default behavior is pump(stream, res), which is disabled if this option is specified.
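
For example, a sketch that takes over the response piping, building on the gateway example above (the extra header is illustrative):

const pump = require('pump')

gateway.all('/service/*', (req, res) => {
  proxy(req, res, req.url, {
    onResponse (req, res, stream) {
      // the default pump(stream, res) is disabled when onResponse is provided,
      // so forward the upstream stream to the client ourselves
      res.setHeader('x-proxied', '1') // illustrative extra header
      pump(stream, res)
    }
  })
})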

rewriteRequestHeaders(req, headers)

Called to rewrite the headers of the request before they are sent to the downstream server. It must return the new headers object.
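
For example, building on the gateway example above (the added header is illustrative):

gateway.all('/service/*', (req, res) => {
  proxy(req, res, req.url, {
    rewriteRequestHeaders (req, headers) {
      // must return the new headers object
      return { ...headers, 'x-forwarded-by': 'fast-proxy' }
    }
  })
})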

rewriteHeaders(headers)

Called to rewrite the headers of the response before they are copied over to the outer response. It must return the new headers object.
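
For example, building on the gateway example above (the removed header is illustrative):

gateway.all('/service/*', (req, res) => {
  proxy(req, res, req.url, {
    rewriteHeaders (headers) {
      delete headers['x-powered-by'] // illustrative header removal
      return headers // must return the new headers object
    }
  })
})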

request

Extended options supported by the http[s].request method (https://nodejs.org/api/http.html#http_http_request_options_callback). The following options are dynamically assigned: method, port, path, hostname, headers, agent.

http2 options are limited to timeout only, while undici supports none.
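
For example, building on the gateway example above (the timeout value is illustrative):

gateway.all('/service/*', (req, res) => {
  proxy(req, res, req.url, {
    request: {
      timeout: 5000 // ms; with http2 only timeout is honored, undici supports none of these
    }
  })
})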

queryString

Replaces the original querystring of the request with what is specified. This will get passed to querystring.stringify.
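
For example, building on the gateway example above (the parameter values are illustrative):

gateway.all('/service/*', (req, res) => {
  proxy(req, res, req.url, {
    queryString: { page: 2, limit: 10 } // stringified and used as the forwarded query string
  })
})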

Related topics

Contributions

Special thanks to the fastify-reply-from developers for creating a production-ready library from which we could initially fork.

License

MIT

fast-proxy's People

Contributors

anonrig, cemremengu, delvedor, dependabot[bot], eomm, fdawgs, greenkeeper[bot], jcbain, jkyberneees, mcollina, nickfla1, nileshmali, q0rban, sebinsua, thejones, trygve-lie, uzlopak, vincent178, yohayg


fast-proxy's Issues

Undici crashes server

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

?

Plugin version

No response

Node.js version

14.17.6

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

Description

Proxy crashes server when proxying multiple similar requests.

 err: {
      "type": "TypeError",
      "message": "Cannot read property 'upgrade' of undefined",
      "stack":
          TypeError: Cannot read property 'upgrade' of undefined
              at Parser.2 (/[email protected]/node_modules/undici/lib/core/client.js:546:17)
    }

There is a bug in undici which seems to be fixed in later versions: nodejs/undici#756

Confirmed on my machine that the error does not occur on undici=4.9.5

Steps to Reproduce

nodejs/undici#756 has a comment on how to reproduce:

nodejs/undici#756 (comment)

Expected Behavior

No response

Response being written after being sent because res.sent is undefined

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.14.1

Plugin version

2.1.0

Node.js version

16.x

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

NA

Description

When the request encounters an error and that error is passed to the callback, it is possible to get into a scenario where the response attempts to write after it has already been sent, because res.sent is undefined. This results in an uncaught exception, [ERR_STREAM_WRITE_AFTER_END]: write after end, being thrown if a response has already been sent.

Looking at the type of the res argument (http.ServerResponse), there isn't an attribute called sent, so it seems like we will always get into this conditional.

Steps to Reproduce

Unsure of how to replicate this.

Expected Behavior

A check to see if all writable data has been flushed before entering this conditional, perhaps by checking response.writableFinished.

Bump version?

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

After the 2.1.0 release, the undici dependency was updated in 4a6ca91.
It solved a blocking issue happening with the latest version available on npm, #49 (which I also encountered).

Isn't that significant enough to bump the version and publish the npm package?

undici Pool not updated when using "base" option under Proxy function

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.24.1

Plugin version

2.1.0

Node.js version

16.8.0

Operating system

Windows

Operating system version (i.e. 20.04, 11.3, 10)

Windows 10 21H1

Description

If you set a base option on the proxy function, the "Location" header will not be rewritten because the undici Pool is not updated.

At the start of the plugin we provide a base option; this option provides the base URL for the undici Pool creation under the "buildRequest" function in request.js.

So if you have "http://google.com" as your default base option and under the Proxy function you have "http://127.0.0.1:3001" the location header will be "https://google.com/3001", so the request will "fail".

Steps to Reproduce

const { proxy } = require('../index')({
  base: 'https://google.com',
  undici: { connections: 100, pipelining: 10 }
})
const service = require('restana')()
service.all('/*', (req, res) => proxy(req, res, req.url, {
  base: 'http://127.0.0.1:3001',
  rewriteRequestHeaders (req, headers) {
    delete headers.connection
    return headers
  }
}))
service.start(3000)

Expected Behavior

The request should go through with the Location header set by the base option under the proxy function.

Extend support for post with application/x-www-form-urlencoded

🚀 Feature Proposal

Currently, for POST actions, fast-proxy supports text/plain and application/json.

Motivation

Adding support for application/x-www-form-urlencoded as well.

Example

curl -i -H "content-type: application/x-www-form-urlencoded" -X POST -d 'a=1' http://localhost:4000/api/v1/service

The body is currently handled like this:

          if (req.body instanceof Stream) {
            body = req.body
          } else if (typeof req.body === 'string') {
            body = req.body
            populateHeaders(headers, body, 'text/plain')
          } else {
            body = JSON.stringify(req.body)
            populateHeaders(headers, body, 'application/json')
          }

It should be:

const qs = require('querystring')

          if (req.body instanceof Stream) {
            body = req.body
          } else if (typeof req.body === 'string') {
            body = req.body
            populateHeaders(headers, body, 'text/plain')
          } else if (headers['content-type'] === 'application/x-www-form-urlencoded') {
            body = qs.stringify(req.body)
            populateHeaders(headers, body, 'application/x-www-form-urlencoded')
          } else {
            body = JSON.stringify(req.body)
            populateHeaders(headers, body, 'application/json')
          }

Should support undici timeout err.code

🐛 Bug Report

The undici timeout err.code=UND_ERR_REQUEST_TIMEOUT is not handled properly; therefore, instead of getting a 504 error I get a 500 error.

We need to add the undici timeout error code as well, in index.js line 88:

} else if (err.code === 'ECONNRESET' || err.code === 'UND_ERR_REQUEST_TIMEOUT' ) {

To Reproduce

Steps to reproduce the behavior:
In undici, set the timeout to a low number:

    undiciOptions: {
      connections: 4000,
      pipelining: 20,
      proxyCacheSize: 50,
      proxyCacheTtl: 1000 * 60 * 60 * 24,
      requestTimeout: 1000
    }

In the test server, open a route with a sleep, for example:

  .get('/api/v1/my-route/timeout', async (req, res) => {
    await sleep(3000);
    res.end();
  })

Expected behavior

Should handle timeout code from undici

Note: I think the following error codes should also be added: ESOCKETTIMEDOUT and ETIMEDOUT

Your Environment

  • node version: 12
  • fast-proxy version: 1.6.3
  • os: Mac

lib/util.js security feature issue

I want to proxy on a per-route basis. I tried, but I keep getting an error when running in Kubernetes containers behind an nginx ingress.

Have you already researched for similar issues?

// issue ref: https://github.com/fastify/fast-proxy/issues/42
function buildURL (source, reqBase) {
  const dest = new URL(source, reqBase)

  // if base is specified, source url should not override it
  if (reqBase) {
    if (!reqBase.endsWith('/') && dest.href.length > reqBase.length) {
      reqBase = reqBase + '/'
    }

    if (!dest.href.startsWith(reqBase)) {
      throw new Error('source must be a relative path string')
    }
  }

  return dest
}

What are you trying to achieve, or the steps to reproduce?

this works:

{
  upstream: 'https://example.com',
  prefix: '/example',
  rewritePrefix: 'https://example.com/',
  http2: false,
  type: 'JWT',
  beforeHandler: [Function: beforeHandler] // auth works this way
}

this does not work when I put it into Kubernetes, but it seems to work when I access the instance directly on my local machine:

{
  upstream: 'https://examplek8-service',
  prefix: '/example',
  rewritePrefix: 'http://examplek8-service:1111/health/',
  http2: false,
  type: 'JWT',
  beforeHandler: [Function: beforeHandler] // auth works this way
}

What was the result you received?

The error I get is:

{"statusCode":500,"error":"Internal Server Error","message":"source must be a relative path string"}

What did you expect?

{success:true} // health endpoint on kubernetes.

Context

  • node version: 12
  • fastify version: >=0.37.0
  • os: mac, alpine container
  • any other relevant information: I'd like a way to turn this security feature off in a config, or find some way to proxy. I am coming from express and koa to fastify; maybe there is a simple request-get-as-proxy command I'm missing?


When using onResponse the remote http statusCode is not being passed to fn and therefore all status are 200

🐛 Bug Report

When using onResponse, the remote HTTP statusCode is not being used and therefore all statuses are 200.

To Reproduce

In opts.test.js, extend the service and add an additional route that returns a status code other than 200.
Add a test that calls this route and expects to get the different status.

line 113 in index.js

onResponse(req, res, stream)

Expected behavior

onResponse should also receive the response object:

onResponse(req, res, stream, response)

Your Environment

  • node version: 12
  • fastify version: latest
  • os: Mac

Your benchmarks are way off

🐛 Bug Report

I reran the benchmarks, my results are way lower than yours.

To Reproduce

Rerun the benchmarks, with the following bugfix:

-router.on(['GET', 'POST', 'PUT', 'PATCH', 'OPTIONS', 'DELETE'], '/service/*', (req, res) => {
+router.get('/service/*', (req, res) => {
Then run:

wrk -t8 -c50 -d20s http://127.0.0.1:8080/service/hi

Expected behavior

Somewhat similar benchmarks.

My benchmarks:

fast-proxy-undici/0http: Requests/sec 10259.59 (HTTP pipelining = 10)
fast-proxy/0http: Requests/sec 6773.80
fast-proxy/restana: Requests/sec 6460.21
fast-proxy-undici/0http: Requests/sec 9448.67 (HTTP pipelining = 1)
fastify-reply-from: Requests/sec 5635.55
http-proxy: Requests/sec 3105.40

As you can see I'm getting max 3.3x performance gain instead of your 46.6x.
Without the above mentioned bugfix, the first test clocks in at 39305.96 Requests/sec (12x faster than http-proxy). Even then it is WAY slower compared to your benchmarks.

I don't know what is exactly going on, but I think it's fair to say that your benchmarks are wrong and misleading.

Your Environment

  • node version: 12
  • os: Linux

Multiple Remotes / Base URLs

If I get the chance to PR this I will, but did want to mention the use case and see if there is perhaps something obvious I'm missing. Also wanted to add a workaround which may or may not work (just started writing some basic tests).

Presently you can only specify a single base URL, which is required when using undici or http2. I see that for undici this is a constraint of using a Pool. Perhaps then, instead of a shared pool, a per-base pool could be instantiated.

As a workaround I've decached the module and initialized it for each target.

const proxyAgents = {}
const getProxy = (base) => {
  if (proxyAgents[base]) return proxyAgents[base].proxy
  delete require.cache[require.resolve('fast-proxy')]
  proxyAgents[base] = require('fast-proxy')(Object.assign({ base }, proxyOpts))
  return proxyAgents[base].proxy
}

Do not use async in errorbacks

In this code:

request({ method: req.method, url, qs, headers, body, request: reqOpts }, async (err, response) => {
an async function is used as an errorback. Doing so exposes us to a risk of memory leaks and DoS attacks in case an unplanned error (bug) occurs, which would be extremely hard to track.

Remove FUNDING.yml

As this module is now maintained by the Fastify organisation members, we should remove the .github folder, including the FUNDING.yml configuration.

Getting 500 `invalid connection header` using fast-proxy with undici

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure it has not already been reported

fast-proxy version

2.0.0

Node.js version

12.20

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

10.15

Description

When I try to use fast-proxy with undici as the client on a Next.js API route to proxy browser requests, I get a 500 invalid connection header error.

Steps to Reproduce

  • create next.js app
  • create pages/v1/[...args].js (route to proxy)
  • add the proxy code
import { NextApiRequest, NextApiResponse } from 'next';
const { proxy } = require('fast-proxy')({
  base: process.env.EXTERNAL_API_URL,
  undici: {
     connections: 100,
     pipelining: 0,
   },
});

export default function api(req: NextApiRequest, res: NextApiResponse) {
  return runMiddleware(req, res, () => proxy(req, res, req.url, {}));
}

export const config = {
  api: {
    bodyParser: false,
  },
};

//@ts-expect-error
function runMiddleware(req, res, fn) {
  return new Promise((resolve, reject) => {
    //@ts-expect-error
    fn(req, res, (result) => {
      if (result instanceof Error) {
        return reject(result);
      }

      return resolve(result);
    });
  });
}
  • when performing a request, it gets captured and passed to the proxy; the proxy returns 500. This doesn't happen if I remove the undici options.

Expected Behavior

When I perform a proxy request with undici activated, it should not return 500

Port is lost from host header

I am not 100% sure what the correct behavior should be, but the host header of a proxied request loses its port.

Proxy server:

const { proxy } = require('fast-proxy')({
    base: 'http://localhost:9600'
})
const gateway = require('restana')()

gateway.all('/', (req, res) => {
    proxy(req, res, req.url, {})
})

gateway.start(8080)

Target server:

const server = require('restana')();
server.get('/', (req, res) => {
    console.log(req.headers.host);
    res.send('Hello from simple http service');
});
server.start(9600);

When the proxy server is requested and proxied to the target server, the target server will print just localhost. I expect it to print localhost:8080.

Is current behavior correct or should the port be forwarded too?

It would also be nice to have a way to override the upstream http headers from the proxy to the target.

I'm happy to provide a PR to fix the missing port or add a solution to be able to set http headers on the upstream request.

Someone can use a req.url like '//10.10.10.10' to perform an SSRF attack, because the resulting host in the headers is not the expected one.

When someone uses fast-proxy like:

const base = 'http://somewhere.com'
const { proxy, close } = require('fast-proxy')({
      base
})

and the req.url is "//10.10.10.10/", the result of the getReqUrl method is:

URL {
  href: 'http://10.10.10.10/',
  origin: 'http://10.10.10.10',
  protocol: 'http:',
  username: '',
  password: '',
  host: '10.10.10.10',
  hostname: '10.10.10.10',
  port: '',
  pathname: '/',
  search: '',
  searchParams: URLSearchParams {},
  hash: '' }

but the result we expected is:

URL {
  href: 'http://somewhere.com/',
  origin: 'http://somewhere.com',
  protocol: 'http:',
  username: '',
  password: '',
  host: 'somewhere.com',
  hostname: 'somewhere.com',
  port: '',
  pathname: '/',
  search: '',
  searchParams: URLSearchParams {},
  hash: '' }

It can be used to perform an SSRF attack, so I suggest patching the getReqUrl method: when we create the new URL instance, we need to exclude sources that begin with '//' by normalizing them to '/'.
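
A minimal sketch of the suggested normalization (hypothetical helper, not the library's actual code):

// Hypothetical illustration: collapse leading slashes before resolving against the base,
// so a protocol-relative source cannot override the base host
function normalizeSource (source) {
  return source.replace(/^\/+/, '/')
}

new URL(normalizeSource('//10.10.10.10/'), 'http://somewhere.com').href
// -> 'http://somewhere.com/10.10.10.10/'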

Archiving this module / Do not use / New maintainer needed

I think we should archive this module, because it was born out of the will of @jkyberneees and myself to merge it and fastify-reply-from, creating a generic module for proxying in Node.js.
However, those goals never materialized.

I would like to transfer this module back to @jkyberneees, or archive it if he is not interested in maintaining it anymore.


For some strange reason I cannot add other people to do releases on this module, and I have no time to do them.

Action required: Greenkeeper could not be activated 🚨

🚨 You need to enable Continuous Integration on Greenkeeper branches of this repository. 🚨

To enable Greenkeeper, you need to make sure that a commit status is reported on all branches. This is required by Greenkeeper because it uses your CI build statuses to figure out when to notify you about breaking changes.

Since we didn’t receive a CI status on the greenkeeper/initial branch, it’s possible that you don’t have CI set up yet.
We recommend using:

If you have already set up a CI for this repository, you might need to check how it’s configured. Make sure it is set to run on all new branches. If you don’t want it to run on absolutely every branch, you can whitelist branches starting with greenkeeper/.

Once you have installed and configured CI on this repository correctly, you’ll need to re-trigger Greenkeeper’s initial pull request. To do this, please click the 'fix repo' button on account.greenkeeper.io.

feat: support websocket proxies

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Are there any plans to support proxying of websockets? Or perhaps this is already possible and just needs documentation?

Motivation

We are currently using node-http-proxy to proxy http and ws requests. We would like to refactor to use fast-proxy instead, but it does not appear that it's possible to proxy websockets with fast-proxy.

Example

No response

Server crash when aborting wrk when using fast-proxy with 0http with low server

🐛 Bug Report

The server crashes when aborting wrk while using fast-proxy with 0http and the low server.

To Reproduce

I have the following code (it also happens without undici):

    const { proxy } = fastProxy({
      base: 'http://localhost:4000',
      undiciOptions: {
        connections: 4000,
        pipelining: 20,
        requestTimeout: 5000,
      },
    });
    const { router, server } = cero({
      server: low(),
    });

    router.all('/*', (req, res) => proxy(req, res, req.url, {}));

    server.listen(3000, (socket) => {
      if (socket) {
        console.log('HTTP server ready!');
      }
    });

With an additional restana server listening on port 4000
Then run
wrk -t8 -c50 -d20s http://localhost:3000/api/v1/test
hit CTRL-C to stop wrk.

Result:
The process aborts with the following error:

Invalid access of discarded (invalid, deleted) uWS.HttpResponse/SSLHttpResponse.
There was an uncaught exception. I'm now going to commit suicide; farewell all.

When I'm using 0http without low, it doesn't happen.

Expected behavior

The process should not abort if I stop wrk

Your Environment

  • node version: 12
  • fast-proxy version: latest
  • os: Mac

add an optional flag not to use caching

🚀 Feature Proposal

add an optional flag not to use caching

Motivation

In my application, 99% of the URLs are dynamic routes, e.g. /users/:userId/orders/:orderId, and they also use query params; therefore the cache is not required.

Example

const { proxy, close } = fastProxy({ base: null, useCache: false });

Thanks,

Yohay

[undici] Preflight requests from Safari

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

A thing to be aware of when undici is used.
In a CORS situation, preflight requests from Safari lead to a 500 error.

Indeed, Safari sends an OPTIONS request with a content-length: 0 header, which is not well handled by undici in that situation.

It can be fixed by removing that content-length header, if present (Chrome doesn't send it), before calling the proxy function:

if(req.method === 'OPTIONS' && req.headers.hasOwnProperty('content-length')){
  delete req.headers['content-length'];
}
proxy(...

allow for custom error handling

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Currently, if an error is thrown when the proxied request is made, the error is handled by fast-proxy. It would be nice for an optional callback to be passed so that the error, req, and res can be exposed to the user and handled as they see fit.

Motivation

No response

Example

function handleErrors (err, req, res) {
  if (res.writableFinished === false) {
    if (err.code === 'SOMETHING') {
      // some logic here
    }
  }
}
proxy(req, res, source, { onError: handleErrors })

Enabling different base per request


🚀 Feature Proposal

Enabling a different base per request would allow dynamic routing without the need to preconfigure proxies for different targets.

Example

proxy(req, res, req.url, {base: route})

Code change:

change
url = new URL(source, base)
to
url = new URL(source, opts.base)

How can I test this package locally?

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

How can I test this package locally? It includes lots of great integration tests, but the documentation does not include anything about it.

The only reference to testing is in .github/workflows/ci.yml, which refers to fastify/workflows/.github/workflows/plugins-ci.yml@v3, but for a newbie it would be super hard to trace this in order to contribute to this package (and test it locally).

Is there an interface to use proxy in koa.js?

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

We would like to use fast-proxy in our koa application.
Is there a proxy call option that is compatible with koa?

Type for undici option does not support boolean values

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

The type for the option undici can only be satisfied by undefined or Undici.Pool.Options, in contrast to what the documentation says:

Set to true to use undici instead of require('http').
And what the code does:

const isUndici = !!opts.undici

if (typeof opts.undici !== 'object') {
  opts.undici = {}
}

My proposal is to update the type as such:

export interface FastProxyOptions {
  base?: string;
  http2?: boolean;
- undici?: Undici.Pool.Options;
+ undici?: boolean | Undici.Pool.Options;
  cacheURLs?: number;
  requests?: {
    http?: Http.Agent,
    https?: Https.Agent
  };
  keepAliveMsecs?: number;
  maxSockets?: number;
  rejectUnauthorized?: boolean;
  queryString?: QueryStringModule;
}

Only match beginning of URL when stripping additional forward slashes

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

2.1.0

Plugin version

No response

Node.js version

16.x

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

N/A

Description

In #42 a scenario was given where a user might be able to perform an SSRF attack by using a specially crafted URL with two leading forward slashes. The implemented fix has the side effect of stripping all double slashes from the entire request URL.

For example, given the following URL:

https://chicken.com/duck?redirect_uri=//quack-a-doodle-doo.com

The proxy will then strip the additional / off of the URL there.

The slash in a query parameter does not need to be url encoded.

The characters slash ("/") and question mark ("?") may represent data within the query component.

If I am understanding the vulnerability originally reported in #42 , it only exists if the // is at the beginning of the URL.

What I would like to propose is to only convert // if they are at the beginning of the URL.
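
A minimal sketch of the proposed behavior (hypothetical helper, not the implemented fix):

// Hypothetical illustration: only collapse duplicate slashes at the start of the path,
// leaving slashes inside the query string untouched
function stripLeadingDoubleSlashes (url) {
  return url.replace(/^\/{2,}/, '/')
}

stripLeadingDoubleSlashes('//10.10.10.10/x')
// -> '/10.10.10.10/x'
stripLeadingDoubleSlashes('/duck?redirect_uri=//quack-a-doodle-doo.com')
// -> '/duck?redirect_uri=//quack-a-doodle-doo.com' (unchanged)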

Steps to Reproduce

Pass a request to fast-proxy with double slashes in the URL, such as /duck?redirect_uri=//quack-a-doodle-doo.com.

Expected Behavior

No response
