
aws-lambda-fastify's Introduction

Introduction


Inspired by the awslabs aws-serverless-express library, tailor-made for the Fastify web framework.

No internal sockets are used; it relies on Fastify's inject function instead.

Seems faster (as the name implies) than aws-serverless-express and aws-serverless-fastify 😉

👨🏻‍💻Installation

$ npm i @fastify/aws-lambda

Options

@fastify/aws-lambda accepts options as a second argument: awsLambdaFastify(app, options)

| property | description | default value |
| --- | --- | --- |
| binaryMimeTypes | Array of binary MIME types to handle | [] |
| enforceBase64 | Function that receives the response and returns a boolean indicating whether the response content is binary and should be base64-encoded | undefined |
| serializeLambdaArguments | Activate the serialization of the Lambda event and context into the HTTP headers x-apigateway-event and x-apigateway-context | false (was true for <v2.0.0) |
| decorateRequest | Decorates the Fastify request with the Lambda event and context: request.awsLambda.event and request.awsLambda.context | true |
| decorationPropertyName | The property name used for the request decoration | awsLambda |
| callbackWaitsForEmptyEventLoop | See: Official Documentation | undefined |
| retainStage | Retain the stage part of the API Gateway URL | false |
| pathParameterUsedAsPath | Use a defined path parameter as path (e.g. 'proxy') | false |
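For example, a few of these options can be combined when creating the proxy; a minimal sketch with example values (the values here are illustrative assumptions, not defaults):

const awsLambdaFastify = require('@fastify/aws-lambda')
const app = require('./app')

const proxy = awsLambdaFastify(app, {
  binaryMimeTypes: ['application/octet-stream', 'image/png'], // base64-encode these response types
  callbackWaitsForEmptyEventLoop: false, // return as soon as the response is ready
  retainStage: true // keep the API Gateway stage prefix in the request URL
})

exports.handler = proxy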

📖Example

lambda.js

const awsLambdaFastify = require('@fastify/aws-lambda')
const app = require('./app')

const proxy = awsLambdaFastify(app)
// or
// const proxy = awsLambdaFastify(app, { binaryMimeTypes: ['application/octet-stream'], serializeLambdaArguments: true /* default is false since v2.0.0 */ })

exports.handler = proxy
// or
// exports.handler = (event, context, callback) => proxy(event, context, callback)
// or
// exports.handler = (event, context) => proxy(event, context)
// or
// exports.handler = async (event, context) => proxy(event, context)

app.js

const fastify = require('fastify')

const app = fastify()
app.get('/', (request, reply) => reply.send({ hello: 'world' }))

if (require.main === module) {
  // called directly i.e. "node app"
  app.listen({ port: 3000 }, (err) => {
    if (err) console.error(err)
    console.log('server listening on 3000')
  })
} else {
  // required as a module => executed on aws lambda
  module.exports = app
}

When executed inside your Lambda function, the app does not need to listen on a specific port, so we simply export the app in this case. The lambda.js file will use this export.

When you run your Fastify application as usual, i.e. node app.js (detectable via require.main === module), it can listen on its port as normal, so you can still run your Fastify app locally.

📣Hint

Lambda arguments

The original Lambda event and context are attached to the Fastify request and can be used like this:

app.get('/', (request, reply) => {
  const event = request.awsLambda.event
  const context = request.awsLambda.context
  // ...
})

If you do not like it, you can disable this by setting the decorateRequest option to false.

Alternatively, if the serializeLambdaArguments option is set to true, the original Lambda event and context are passed via headers and can be used like this:

app.get('/', (request, reply) => {
  const event = JSON.parse(decodeURIComponent(request.headers['x-apigateway-event']))
  const context = JSON.parse(decodeURIComponent(request.headers['x-apigateway-context']))
  // ...
})

Lower cold start latency

Since AWS Lambda supports ECMAScript (ES) modules in Node.js 14+ runtimes, you can lower cold start latency when using Provisioned Concurrency, thanks to top-level await.

We can use this by calling the fastify.ready() function outside of the Lambda handler function, like this:

import awsLambdaFastify from '@fastify/aws-lambda'
import app from './app.js'
export const handler = awsLambdaFastify(app)
await app.ready() // needs to be placed after awsLambdaFastify call because of the decoration: https://github.com/fastify/aws-lambda-fastify/blob/master/index.js#L9

Here you can find the appropriate issue discussing this feature.

⚡️Some basic performance metrics

@fastify/aws-lambda (decorateRequest : false) x 56,892 ops/sec ±3.73% (79 runs sampled)

@fastify/aws-lambda x 56,571 ops/sec ±3.52% (82 runs sampled)

@fastify/aws-lambda (serializeLambdaArguments : true) x 56,499 ops/sec ±3.56% (76 runs sampled)

serverless-http x 45,867 ops/sec ±4.42% (83 runs sampled)

aws-serverless-fastify x 17,937 ops/sec ±1.83% (86 runs sampled)

aws-serverless-express x 16,647 ops/sec ±2.88% (87 runs sampled)

Fastest is @fastify/aws-lambda (decorateRequest : false), @fastify/aws-lambda

⚠️Considerations

  • For apps that may not see traffic for several minutes at a time, you could see cold starts
  • Stateless only
  • API Gateway has a timeout of 29 seconds, and Lambda has a maximum execution time of 15 minutes. (An Application Load Balancer has no such timeout limit, so only the Lambda maximum execution time applies.)
  • If you are using another web framework besides Fastify (e.g. Connect, Express, Koa, Restana, Sails, Hapi, Restify) or want to use a more generic serverless proxy framework, have a look at: serverless-http or serverless-adapter

🎖Who is using it?

locize is using @fastify/aws-lambda

The logos displayed on this page are property of the respective organisations and are not distributed under the same license as @fastify/aws-lambda (MIT).

aws-lambda-fastify's People

Contributors

adrai, andrew0, anthonyringoet, brunozoric, danielsan, dependabot[bot], fdawgs, frikille, garygsc, greenkeeper[bot], h4ad, hakimio, mcollina, melchor629, metcoder95, rafaelgss, uzlopak, zekth


aws-lambda-fastify's Issues

Support binary option for multipart/form-data type

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Hello! I'm using this module with @fastify/multipart. Recently I found that when the request type is multipart/form-data, the body must be decoded with the binary encoding, e.g. Buffer.from(data, 'binary'). However, it seems this module does not currently support a binary option. Could a binary option be added?

Motivation

When i send this request

curl --location --request POST 'http://localhost:3000/upload/templates' \
--form 'thumbnail=@"/Users/junwoopark/Downloads/24939410.png"' \
--form 'template="{}"' \
--form 'ratio="1"' \
--form 'categories=""'

lambda handler is (test with serverless-offline)

const awsLambdaFastify = require('@fastify/aws-lambda');
const ffy = require('fastify');
const fs = require('fs');
const fastify = ffy();

const opts = {
  attachFieldsToBody: true,
};

fastify.register(require('@fastify/multipart'), opts)

fastify.post('/upload/templates', async function (req, reply) {
  const data = await req.body;
  console.log(data);
  fs.writeFileSync('./image.png', data.thumbnail._buf);
  reply.send('done');
})

exports.main = awsLambdaFastify(fastify);

The expected buffer result is:

 thumbnail: {
    fieldname: 'thumbnail',
    filename: '24939410.png',
    encoding: '7bit',
    mimetype: 'image/png',
    file: FileStream {
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 5,
      _maxListeners: undefined,
      bytesRead: 5062,
      truncated: false,
      _read: [Function (anonymous)],
      [Symbol(kCapture)]: false
    },
    fields: [Circular *1],
    _buf: <Buffer 89 50 4e 47 0d 0a 1a 0a 00 00 00 0d 49 48 44 52 00 00 00 c8 00 00 00 c8 08 02 00 00 00 22 3a 39 c9 00 00 13 8d 49 44 41 54 78 9c ec dd 79 50 53 e7 fe ... 5012 more bytes>,
    toBuffer: [AsyncFunction: toBuffer]
  },

but the actual result with this version is:

thumbnail: {
    fieldname: 'thumbnail',
    filename: '24939410.png',
    encoding: '7bit',
    mimetype: 'image/png',
    file: FileStream {
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 5,
      _maxListeners: undefined,
      bytesRead: 7603,
      truncated: false,
      _read: [Function (anonymous)],
      [Symbol(kCapture)]: false
    },
    fields: [Circular *1],
    _buf: <Buffer c2 89 50 4e 47 0d 0a 1a 0a 00 00 00 0d 49 48 44 52 00 00 00 c3 88 00 00 00 c3 88 08 02 00 00 00 22 3a 39 c3 89 00 00 13 c2 8d 49 44 41 54 78 c2 9c c3 ... 7553 more bytes>,
    toBuffer: [AsyncFunction: toBuffer]
  },

Example

I think a new option enforceBinaryOption should be added:

New property: enforceBinaryOption (default value: false)

module.exports = (app, options) => {
  options.enforceBinaryOption = options.enforceBinaryOption ?? false
  // ...

  // multipart/form-data only works if the encoding is binary
  // const payload = Buffer.from(event.body, 'binary')
  // const payload = Buffer.from(event.body, event.isBase64Encoded ? 'base64' : 'utf8')
  const payload = Buffer.from(event.body, options.enforceBinaryOption ? 'binary' : (event.isBase64Encoded ? 'base64' : 'utf8'))
}

Clean up function like onClose hook

🚀 Feature Proposal

Hello, I am currently working with aws-lambda-fastify.
I think it would be good if it had a cleanup function for database disconnects, etc.

Motivation

I created a plugin for connecting to a database and registered it.

As a result, I reached a timeout error because I didn't disconnect from the database server,
so I tried an onClose hook, but it did not work :(

I thought it would be good if aws-lambda-fastify had an option or a hook, similar to the onClose hook, for disconnecting.

Example

I don't know if it's possible, but my idea is something like the code below:

const proxy = serverlessFastify(app, {
  closeup: (fastify, done) => {
    fastify.connection.close();
    done();
  }
});

Support of Set-Cookie in HTTP API V2 Response

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

To return Set-Cookie headers using the HTTP API (Lambda payload format 2.0), the response must have a 'cookies' array. See:

statusCode: res.statusCode,

To customize the response, your Lambda function should return a response with the following format.

{
  "cookies" : ["cookie1", "cookie2"],
  "isBase64Encoded": true|false,
  "statusCode": httpStatusCode,
  "headers": { "headername": "headervalue", ... },
  "body": "Hello from Lambda!"
}
So just return cookies: ['name=value', 'name=value']

Source: https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-develop-integrations-lambda.html#http-api-develop-integrations-lambda.v2

Motivation

Support HTTP API V2 Lambda 2.0

Example

No response

Support for Lambda@Edge

🚀 Feature Proposal

Add support for Lambda@Edge payload, serverless-express supports it.

This is pretty much just a matter of understanding its input/output; I believe no major changes would be required, as it's somewhat similar to the regular integration.

Motivation

To have a unified interface between API Gateway and Lambda@Edge.

Example

Same as above.

Does not install

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

v3.29.0

Plugin version

master

Node.js version

16.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

latest

Description

If I clone this repository and run npm i I get:

npm ERR! code ERESOLVE
npm ERR! ERESOLVE unable to resolve dependency tree
npm ERR!
npm ERR! While resolving: [email protected]
npm ERR! Found: [email protected]
npm ERR! node_modules/fastify
npm ERR!   dev fastify@"^3.28.0" from the root project
npm ERR!
npm ERR! Could not resolve dependency:
npm ERR! peer fastify@"^2.6.0" from [email protected]
npm ERR! node_modules/aws-serverless-fastify
npm ERR!   dev aws-serverless-fastify@"^1.0.28" from the root project
npm ERR!
npm ERR! Fix the upstream dependency conflict, or retry
npm ERR! this command with --force, or --legacy-peer-deps
npm ERR! to accept an incorrect (and potentially broken) dependency resolution.

I'm on npm v8.5.5.

Steps to Reproduce

see above

Expected Behavior

No response

URL query is not decoded when using ALB

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

all

Plugin version

2.0.2

Node.js version

14

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

not applicable

Description

When using Lambda as a target for an ALB, the queryStringParameters in the Lambda event is not decoded, unlike with API Gateway. This isn't really well documented, but there's a small section in the docs that mentions this (not really sure why they put it under the "multi-value headers" section, but it applies to the queryStringParameters and multiValueQueryStringParameters properties):

If the query parameters are URL-encoded, the load balancer does not decode them. You must decode them in your Lambda function.

Steps to Reproduce

This test fails:

test('GET with encoded query values', async (t) => {
  t.plan(1)

  const app = fastify()
  app.get('/test', async (request, reply) => {
    reply.send(request.query)
  })
  const proxy = awsLambdaFastify(app)

  const ret = await proxy({
    requestContext: {
      elb: {
        targetGroupArn: 'xxx'
      }
    },
    httpMethod: 'GET',
    path: '/test',
    queryStringParameters: {
      q: 'foo%3Fbar'
    }
  })
  t.equal(ret.body, '{"q":"foo?bar"}')
})

Expected Behavior

When the Lambda is invoked by an ALB (detectable by the presence of the requestContext.elb object on the event), the values for queryStringParameters and multiValueQueryStringParameters should be decoded before being passed to Fastify.
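Until this is addressed in the library, a possible per-handler workaround sketch (assuming the request decoration is enabled, which is the default, and that the query values are plain strings):

app.get('/test', async (request, reply) => {
  // When invoked through an ALB, query values arrive still URL-encoded,
  // so decode them manually using the decorated Lambda event.
  const event = request.awsLambda && request.awsLambda.event
  const fromAlb = Boolean(event && event.requestContext && event.requestContext.elb)
  const query = {}
  for (const [key, value] of Object.entries(request.query)) {
    query[key] = fromAlb ? decodeURIComponent(value) : value
  }
  reply.send(query)
})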

Add easier support for callbackWaitsForEmptyEventLoop

🚀 Feature Proposal

In order to make AWS Lambda work properly with mixed callback/async invocation, one might need to modify context.callbackWaitsForEmptyEventLoop.

Nowadays, it is possible with a code like this:

const proxy = awsLambdaFastify(app)

exports.handler = function(event, context, callback) {
  context.callbackWaitsForEmptyEventLoop = false
  return proxy(event, context, callback)
}

But it would be great if it could be provided directly as an option. See below.

Note that in the implementation, the presence of the property should be checked with options.hasOwnProperty or similar, since both truthy and falsy values are acceptable.

Motivation

To make a typical use case easier to manage, without boilerplate code.

Example

exports.handler = awsLambdaFastify(app, {callbackWaitsForEmptyEventLoop: false})

Feature: Optimized mode

🚀 Feature Proposal

Those 2 instructions are slowing down the processing of queries: https://github.com/fastify/aws-lambda-fastify/blob/master/index.js#L31-L32

What about using https://github.com/fastify/fast-json-stringify based on this definition: https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-develop-integrations-lambda.html

I'd add a flag optimized (or something similar) with a default value of false to avoid side effects.

Motivation

Speeding up the API

Example

const awsLambdaFastify = require('aws-lambda-fastify')
const app = require('./app')

const proxy = awsLambdaFastify(app, {optimized:true})

exports.handler = proxy

Do you suggest this wrapper over serverless-http?

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

Hey folks,

thanks for maintaining Fastify and all its packages! Awesome work!

I've realised that we're often using https://github.com/dougmoscrop/serverless-http when running Fastify inside a Lambda function.

I was wondering, are there any downsides to this approach? Do you suggest this package, since it's in the same fastify namespace?

Thank you!

Maintainer access to the Plugins Team

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

This is a core plugin and the Plugins Team doesn't have access to it

CC @Uzlopak if you can change it

Notes on Decorating Request when reusing Fastify instance

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

Not really a bug, but just might be worth adding a note in the readme.

The new feature to decorate the request with event and context data introduces errors if you are caching the fastify instance in order to save time bootstrapping.

For example, I have a NestJS application that I run on Lambda; to avoid having to build the app on every request, it is kept in memory until the Lambda instance goes down.

let cachedNestApp: NestApp;

export const handler: Handler = async (
  event: APIGatewayProxyEvent,
  context: Context,
): Promise<APIGatewayProxyResult> => {
  if (!cachedNestApp) {
    cachedNestApp = await bootstrap();
  }

  const proxy = awsLambdaFastify(cachedNestApp.instance);
  return proxy(event, context);
};

I upgraded from 1.7.1 to the latest version of aws-lambda-fastify, and with decoration now enabled by default, the first request works fine but every subsequent request errors:

{
    "errorType": "FastifyError",
    "errorMessage": "The decorator 'awsLambda' has been added after start!",
    "code": "FST_ERR_DEC_AFTER_START",
    "name": "FastifyError",
    "message": "The decorator 'awsLambda' has been added after start!",
    "statusCode": 500,
    "stack": [
        "FastifyError: The decorator 'awsLambda' has been added after start!",
        "    at assertNotStarted (/var/task/node_modules/fastify/lib/decorate.js:127:11)",
        "    at Object.decorateRequest (/var/task/node_modules/fastify/lib/decorate.js:119:3)",
        "    at module.exports (/var/task/node_modules/aws-lambda-fastify/index.js:9:9)",
        "    at Runtime.handler (/var/task/src/aws-lambda-fastify.js:54:19)",
        "    at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"
    ]
}

I think this is expected behaviour, but it might come as a surprise to those who upgrade and run into the same issue, so I thought it might be worth noting in the readme.
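One way to avoid the error is to create the proxy only once and cache it together with the app; a minimal sketch in plain JavaScript, where bootstrap() refers to the function from the snippet above:

const awsLambdaFastify = require('@fastify/aws-lambda')

let cachedNestApp
let cachedProxy

exports.handler = async (event, context) => {
  if (!cachedProxy) {
    cachedNestApp = await bootstrap() // bootstrap() builds the NestJS app as in the snippet above
    cachedProxy = awsLambdaFastify(cachedNestApp.instance) // decoration happens only once here
  }
  return cachedProxy(event, context)
}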

Make API Gateway authorizer context available

🚀 Feature Proposal

AWS API Gateway custom authorizers offer the ability to pass context data downstream to the handling Lambda functions.
Currently there is no way to access this data from within Fastify handlers.

Motivation

Access authorization context data from within fastify handlers.

Example

Looking at the existing implementation, I suggest something like the below would work:

const authContextHeader = request.headers["x-apigateway-auth-context"];
const authContext = authContextHeader ? JSON.parse(authContextHeader) : null;
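Since request decoration is now enabled by default, another option is to read the authorizer output straight from the decorated event; a sketch assuming a REST API (v1) payload, where the custom authorizer context is exposed on requestContext.authorizer:

app.get('/', (request, reply) => {
  const event = request.awsLambda.event
  // authorizer context as set by the custom authorizer, if any
  const authContext = (event.requestContext && event.requestContext.authorizer) || null
  reply.send({ authContext })
})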

url encode/decode problem

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.21.0

Plugin version

3.3.0

Node.js version

18.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

12.6.7

Description

The fastify app running locally and the same app deployed to aws lambda handle escaped url paths differently.

For example, for http://localhost:3000/a%2Fb, request.url would be /a%2Fb. However, for
https://xxxxx.amazonaws.com/a%2Fb, request.url would be /a/b.

Steps to Reproduce

Here is a minimal working example: https://github.com/andyli/test-fastify-aws-lambda-url

Expected Behavior

The same app running locally and in aws lambda would give the same request.url.

Use event.pathParameters.proxy instead of event.path

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.3.1

Plugin version

No response

Node.js version

18.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

13.5

Description

While migrating from serverless-express, I noticed that paths are processed differently by aws-lambda-fastify. It seems that serverless-express uses event.pathParameters.proxy instead of event.path, which means that if API Gateway is configured with 'v2/{proxy+}' and I access /v2/token, it calls the Express router with the path /token instead of /v2/token.

I wonder if an additional flag can be added to replicate that behaviour?

Steps to Reproduce

I'm using this with NestJS:

      const expressApp = express();
      const adapter = new ExpressAdapter(expressApp);
      await initApp(adapter, appModule);
      const server = configure({
        app: expressApp,
        binarySettings: { contentTypes: ['application/pdf', 'application/x-gzip'] },
      });
      return server;

vs

      const adapter = new FastifyAdapter();
      await initApp(adapter, appModule);
      const server = awsLambdaFastify(adapter.getInstance() as FastifyInstance, {
        binaryMimeTypes: ['application/pdf', 'application/x-gzip'],
        decorateRequest: true,
      });
      return server;

and NestJS with express gives me req.route.path = '/token' while NestJS with fastify gives me req.routerPath = '/v2/token'.

Expected Behavior

Matching serverless-express or providing a flag to toggle that behaviour.
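The pathParameterUsedAsPath option listed in the Options table above looks like such a flag; a sketch assuming 'proxy' as the path parameter name from the 'v2/{proxy+}' mapping, with app standing in for the Fastify instance (e.g. adapter.getInstance()):

const awsLambdaFastify = require('@fastify/aws-lambda')

const proxy = awsLambdaFastify(app, {
  binaryMimeTypes: ['application/pdf', 'application/x-gzip'],
  pathParameterUsedAsPath: 'proxy' // use event.pathParameters.proxy instead of event.path
})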

Running Fastify on AWS Lambda: Dynamic require of "events" is not supported

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

This is not so much a bug in aws-lambda-fastify as an issue with compiling with esbuild and running on AWS, but given how many people use esbuild, I think the README could make the DX much easier.

Background: I spent many hours this week trying to get my Fastify API deployed to AWS Lambda. I'm running all the latest versions and used the boilerplate from this readme, but kept encountering this issue, even when I reduced it to the most minimal Fastify instance possible:

{
  "errorType": "Error",
  "errorMessage": "Dynamic require of \"events\" is not supported",
  "trace": [
    "Error: Dynamic require of \"events\" is not supported",
    "    at file:///var/task/dist/lambda.mjs:12:9",
    "    at node_modules/avvio/boot.js (file:///var/task/dist/lambda.mjs:1482:14)",
    "    at __require2 (file:///var/task/dist/lambda.mjs:15:50)",
    "    at node_modules/fastify/fastify.js (file:///var/task/dist/lambda.mjs:44355:17)",
    "    at __require2 (file:///var/task/dist/lambda.mjs:15:50)",
    "    at file:///var/task/dist/lambda.mjs:72691:30",
    "    at ModuleJob.run (node:internal/modules/esm/module_job:217:25)",
    "    at async ModuleLoader.import (node:internal/modules/esm/loader:316:24)",
    "    at async _tryAwaitImport (file:///var/runtime/index.mjs:1008:16)",
    "    at async _tryRequire (file:///var/runtime/index.mjs:1057:86)"
  ]
}

At various points, I tried switching to commonjs as a work-around, but that was not possible due to this error:

✘ [ERROR] Top-level await is currently not supported with the "cjs" output format

    src/lambda.ts:10:32:
      10 │ export default awsLambdaFastify(await buildFastifyApi());
         ╵                                 ~~~~~

To get my Fastify API building as ESM with esbuild, I ultimately just had to add one line to my esbuild script adding a banner, but I did not see this mentioned in any of the docs or tutorials I tried to follow:

esbuild
  .build({
    entryPoints: ['src/lambda.ts'],
    banner: { js: "import { createRequire } from 'module';const require = createRequire(import.meta.url);" },
    bundle: true,
    format: 'esm',
    platform: 'node',
    target: 'node20',
    outfile: 'dist/lambda.mjs'
  })
  .catch(() => process.exit(1));

Adding a disclaimer to the README about needing to use a banner if using esbuild would have saved me many hours this week.

original event body is not available on req.awsLambda.event

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.27.2

Plugin version

2.1.0

Node.js version

14.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

12.2.1

Description

Hello! 👋

The original event's body is being removed from the decorated req.awsLambda.event.body. This seems like the most likely cause: https://github.com/fastify/aws-lambda-fastify/blob/master/index.js#L62

Since currentAwsArguments.event is assigned directly to the event from the handler, setting event.body to undefined would also impact the reference held by currentAwsArguments.

I believe this should either assign a deep copy of the event to currentAwsArguments.event, or at the very least not set event.body to undefined (though it's not clear to me why this is being done).

Steps to Reproduce

  1. issue a POST request with a body included
  2. Observe that the body is provided in req.body but req.awsLambda.event.body is undefined

Expected Behavior

req.awsLambda.event.body contains the original post body of the event

Doesn't work on a deployed lambda function

While it seems to work fine when testing locally with sls offline, when running it on a deployed Lambda function behind API Gateway I get the following error:

{
    "errorType": "TypeError",
    "errorMessage": "Cannot convert undefined or null to object",
    "stack": [
        "TypeError: Cannot convert undefined or null to object",
        "    at Function.keys (<anonymous>)",
        "    at getName (/var/task/_optimize/dev-main/src/lambda.js:17374:23)",
        "    at new Plugin (/var/task/_optimize/dev-main/src/lambda.js:17398:15)",
        "    at new Boot (/var/task/_optimize/dev-main/src/lambda.js:16259:16)",
        "    at Boot (/var/task/_optimize/dev-main/src/lambda.js:16211:22)",
        "    at build (/var/task/_optimize/dev-main/src/lambda.js:43616:17)",
        "    at bootstrapServer (/var/task/_optimize/dev-main/src/lambda.js:11:343)",
        "    at Runtime.exports.handler (/var/task/_optimize/dev-main/src/lambda.js:11:617)",
        "    at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)",
        "    at process._tickCallback (internal/process/next_tick.js:68:7)"
    ]
}

To Reproduce

Here is the lambda.ts file used to bootstrap the app:

import {NestFactory} from '@nestjs/core';
import {FastifyAdapter, NestFastifyApplication} from '@nestjs/platform-fastify';
import {AppModule} from './app.module';
import * as fastify from 'fastify';
import * as awsLambdaFastify from 'aws-lambda-fastify';
import {
    Context,
    APIGatewayProxyEvent,
    APIGatewayProxyResult
} from 'aws-lambda';
import { Logger } from '@nestjs/common';

interface NestApp {
    app: NestFastifyApplication;
    instance: fastify.FastifyInstance;
}

let cachedNestApp: NestApp;

async function bootstrapServer(): Promise<NestApp> {
    const serverOptions: fastify.ServerOptionsAsHttp = {
        logger: true,
    };
    const instance: fastify.FastifyInstance = fastify(serverOptions);
    const app = await NestFactory.create<NestFastifyApplication>(
        AppModule,
        new FastifyAdapter(instance),
        {
            logger: !process.env.AWS_EXECUTION_ENV ? new Logger() : console
        }
    );
    app.setGlobalPrefix(process.env.API_PREFIX);
    await app.init();

    return {
        app,
        instance
    };
}

export const handler = async (
    event: APIGatewayProxyEvent,
    context: Context,
): Promise<APIGatewayProxyResult> => {
    if (!cachedNestApp) {
        cachedNestApp = await bootstrapServer();
    }
    const proxy = awsLambdaFastify(cachedNestApp.instance);

    return proxy(event, context);
};

And here is serverless.yml:

service:
  name: my-service

plugins:
  - serverless-plugin-typescript
  - serverless-plugin-optimize
  - serverless-offline

custom:
  stage: ${opt:stage, self:provider.stage}
  common: ${file(../serverless/common.yml):${self:custom.stage}}

provider:
  name: aws
  runtime: nodejs10.x
  region: ${self:custom.common.REGION}
  environment:
    API_PREFIX: "${self:custom.common.API_PREFIX}"

functions:
  main:
    handler: src/lambda.handler
    events:
      - http:
          method: any
          path: ${self:custom.common.API_PREFIX}/{proxy+}

Expected behavior

To work without errors.

Your Environment

  • node version: 10
  • fastify version: 2.7.1

Lax permissions in CI?

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

Looking at ci.yml, the test job has quite lax permissions:

pull-requests: write
contents: write

Is there a reason this job needs these?
From reviewing the scripts it runs, I can't see any.

Cookies are not handled

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure it has not already been reported

Fastify version

3.17.0

Plugin version

1.4.4

Node.js version

14.x

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

Lambda OS

Description

In the Lambda event, cookies are in the cookies property and not in the headers. See:

{
...
"cookies":["FOO=Bar"]
...
}

But in the current state this value is not mapped. Also, in this context I'm using API Gateway V2.

Steps to Reproduce

Send cookies within Api Gateway v2

Expected Behavior

No response

TypeScript and ESM: This expression is not callable. Type ... has no call signatures

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.1.3

Plugin version

No response

Node.js version

18

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

12.5

Description

Hi,

I have a TS project in ESM mode (type: module) and I'm getting a "not callable" error on awsLambdaFastify.

package.json

{
  "dependencies": {
    "@fastify/aws-lambda": "^3.1.3",
    "fastify": "^4.5.3"
  },
  "devDependencies": {
    "@types/aws-lambda": "^8.10.95",
    "@types/got": "^9.6.12",
    "@types/node": "^17.0.22",
    "ts-node": "^10.9.1",
    "typescript": "^4.6.2"
  },
  "type": "module"
}

tsconfig.json

{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "target": "ESNext",
  },
  "ts-node": {
    "esm": true
  }
}

lambda-fastify.ts

import awsLambdaFastify from '@fastify/aws-lambda'
import app from './app.js'

export const handler = awsLambdaFastify(app)

This will throw a "This expression is not callable." error.

Changing the import statement like this has no impact

import * as awsLambdaFastify from '@fastify/aws-lambda';

I have noticed that the node_modules/@fastify/aws-lambda/index.d.ts file does not contain export = awsLambdaFastify at the end, which I suppose prevents the import from working properly in ESM (adding it back makes it work).
I noticed the same in the release source code zip files of previous versions as well.

Steps to Reproduce

On top of elements from the description, here is the app.ts file.

app.ts

import {fastify} from 'fastify'

const app = fastify()
app.get('/', (request, reply) => reply.send({ hello: 'world' }))

export default app

The error is highlighted by the IDE in the lambda-fastify.ts file, as well as when running:

npx tsc

Expected Behavior

Expect the import awsLambdaFastify to be callable and return a PromiseHandler.

Thank you for the help on this.

Support for event/context to decorate request

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Currently, to capture the raw API Gateway event or context, we need to use the JSON-serialized headers.
While this is workable, I was wondering whether it's possible to instead provide an option to decorate the request with the event and context.

This would be useful to pull out authorizer results and other API Gateway-specific fields, and could reduce the overhead of JSON serialization (and deserialization) needed to read these fields.

I'm a little unsure how to go about this, as this library uses inject and is not a plugin, but I'm happy to give it a go if you can give some guidance :)

Motivation

Take advantage of API Gateway functionality with less computational complexity.

Example

const proxy = awsLambdaFastify(app, {
 decorateRequest: true
})

or

const proxy = awsLambdaFastify(app, {
 requestDecoration: {
  enabled: true,
  eventKey: 'apiGatewayEvent',
  contextKey: 'apiGatewayContext',
 }
})

An in-range update of fastify is breaking the build 🚨

The devDependency fastify was updated from 2.11.0 to 2.12.0.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

fastify is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build failed (Details).

Release Notes for v2.12.0

📚 PR:

  • fix: skip serialization for json string (#1937)
  • Added fastify-casl to Community Plugins (#1977)
  • feature: added validation context to validation result (#1915)
  • ESM support (#1984)
  • fix: adjust hooks body null (#1991)
  • Added mcollina's plugin "fastify-secure-session" (#1999)
  • Add fastify-typeorm-plugin to community plugins (#1998)
  • Remove Azure Pipelines (#1985)
  • Update validation docs (#1994)
  • Drop Windows-latest and Node 6 from CI as its failing. (#2002)
  • doc: fix esm-support anchor (#2001)
  • Docs(Fluent-Schema.md): fix fluent schema repo link (#2007)
  • fix - docs - hooks - error handling (#2000)
  • add fastify-explorer to ecosystem (#2003)
  • Add a recommendations doc (#1997)
  • Fix TOC typo in recommendations (#2009)
  • docs(Typescript): typo (#2016)
  • docs: fix onResponse parameter (#2020)
  • Update template bug.md (#2025)
  • fix replace way enum (#2026)
  • docs: update inject features (#2029)
  • Update Copyright Year to 2020 (#2031)
  • add generic to typescript Reply.send payload (#2032)
  • Shorten longest line (docs) (#2035)
  • docs: OpenJS CoC (#2033)
  • Workaround for using one schema for multiple routes (#2044)
  • docs: inject chainable methods (#1917) (#2043)
  • http2: handle graceful close (#2050)
  • chore(package): update fluent-schema to version 0.10.0 (#2057)
  • chore(package): update yup to version 0.28.1 (#2059)
  • Update README.md (#2064)
  • Added fastify-response-validation to ecosystem (#2063)
  • fix: use opts of onRoute hook (#2060)
  • Fixed documentation typo (#2067)
  • Add missing TS definition for ServerOptions.genReqId function arg (#2076)
  • fix: throw hooks promise rejection (#2070) (#2074)
  • Add docs to stop processing hooks with async (#2079)
Commits

The new version differs by 38 commits.

  • 7a37924 Bumped v2.12.0
  • aacefcd Add docs to stop processing hooks with async (#2079)
  • c052c21 fix: throw hooks promise rejection (#2070) (#2074)
  • 6b39870 Add missing TS definition for ServerOptions.genReqId function arg (#2076)
  • 7fa4bdd Fixed documentation typo (#2067)
  • 6b73e0a fix: use opts of onRoute hook (#2060)
  • 7bb9733 Added fastify-response-validation to ecosystem (#2063)
  • 0a1c1f0 Update README.md (#2064)
  • bd9f608 chore(package): update yup to version 0.28.1 (#2059)
  • e19d078 chore(package): update fluent-schema to version 0.10.0 (#2057)
  • d0c976e http2: handle graceful close (#2050)
  • af8a6ac docs: inject chainable methods (#1917) (#2043)
  • 62f21b1 Workaround for using one schema for multiple routes (#2044)
  • 5258f42 docs: OpenJS CoC (#2033)
  • ac46905 Shorten longest line (docs) (#2035)

There are 38 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Example code is wrong

🐛 Bug Report

The example code for app.js in the README has incorrect code to detect if the file was executed directly.

It shows:

if (require.main !== module) {

instead of

if (require.main === module) {

aws lambda fastify takes 10X longer than http-serverless with firestore

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure it has not already been reported

Fastify version

3.19

Plugin version

1.7.1

Node.js version

14.6

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

Description

My app became so slow that it was barely usable. In the end I managed to pinpoint the problem to aws-lambda-fastify.
A normal call to a Firestore database, which usually takes ~300 ms, takes ~3,000 ms with aws-lambda-fastify.

Steps to Reproduce

I've created a reproduction repo here:

Simply visit this URL and you will see the timings for different calls. The one that changes drastically is the one connecting to Firebase.

https://nuxt-lambdax.vercel.app/

You can view the code at https://github.com/Cosbgn/nuxt_lambda/tree/master/api

Expected Behavior

aws-lambda-fastify should be faster or as fast as http-serverless.

Incorrect splitting of query parameters containing literal commas

The current query string translation logic doesn't distinguish between repeated query string values (e.g. &key=a&key=b) and values that contain URL-encoded literal commas (e.g. &key=a%2Cb). The former should map to a Fastify array value, the latter should not. It appears the API Gateway V2.0 event format has this same hazard.

The two cases could be distinguished by parsing the rawQueryString value (or for performance reasons, doing this only when a potential split is detected).

if (event.queryStringParameters && event.version === '2.0') {
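A minimal sketch of the suggested disambiguation, assuming an HTTP API v2 payload where event is the Lambda event and rawQueryString is available:

// Rebuild query values from rawQueryString so that key=a%2Cb stays a single
// value while key=a&key=b becomes an array.
const params = new URLSearchParams(event.rawQueryString || '')
const query = {}
for (const key of new Set(params.keys())) {
  const values = params.getAll(key)
  query[key] = values.length > 1 ? values : values[0]
}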

Remote IP address passed to app.inject()

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Hi
How possible is it to get the remote IP from the aws lambda environment (somewhere?) and include that in the app.inject() call?

Motivation

I'm using @fastify/rate-limit because I want custom control over the rate limits applied. But because the remote IP address is not known, it thinks the whole world is 127.0.0.1 when a request comes from API Gateway / Lambda.

Example

No response
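One possible approach today, sketched here rather than taken from the report: with the request decoration enabled (the default), the source IP can be read from the Lambda event and fed to @fastify/rate-limit's keyGenerator. The event field names assume the REST API (v1) and HTTP API (v2) payload formats:

app.register(require('@fastify/rate-limit'), {
  max: 100,
  timeWindow: '1 minute',
  keyGenerator: (request) => {
    // prefer the source IP reported by API Gateway, fall back to request.ip
    const ctx = request.awsLambda && request.awsLambda.event && request.awsLambda.event.requestContext
    return (ctx && ((ctx.identity && ctx.identity.sourceIp) || (ctx.http && ctx.http.sourceIp))) || request.ip
  }
})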

Support for response streaming

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

AWS just released the ability to stream a Lambda function's response: https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/

It would be great if this framework supported it.

It requires wrapping the handler in streamifyResponse().

Motivation

This is an extremely useful feature which greatly decreases TTFB.

Example

// API which returns chunked data.
// Locally TTFB is 0, but on Lambda TTFB is 5s.
// Presumably because the handler is not wrapped in streamifyResponse.
const stream = require('stream'); // added to make the snippet self-contained
const fastify_streamtest = require('fastify')(); // added to make the snippet self-contained

fastify_streamtest.get('/chunked', (request, reply) => {
  // Create a buffer to hold the response chunks
  var buffer = new stream.Readable();
  buffer._read = () => {};

  // Generate 5 chunks with 1 second interval
  var count = 5;
  var emit = () => {
    var data = `Hello World ${count}`;
    console.log(`sending "${data}"`);
    buffer.push(data);

    count--;
    if (count > 0) {
      setTimeout(emit, 1000);
    } else {
      console.log('end sending.');
      buffer.push(null);
    }
  };

  emit();
  void reply.type('text/html').send(buffer);
});

Is there any reason the package is not written with TS?

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

A question from the title of the issue :)

If you give me some guidelines, I can rewrite the package when I get a few hours free.
If there are no guidelines from your side, I'm also ok with that :)

Support for MultiStringQueryParameter in ApiGateway V2

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Since API Gateway V2 no longer supports the MultiValueQueryStringParameters field (see the AWS docs),
I would suggest enabling the framework to parse comma-separated query parameters into an array.

API Gateway V2 represents multi-value query strings such as
example.com?query=Value0&query=Value1
as a single comma-separated query parameter:
queryStringParameters: { queryParameter: 'Value0,Value1' }.

Motivation

Enables frameworks like NestJS to receive multi-value queries such as ?value=one&value=two as an array ['one', 'two'] without further modification, just like with API Gateway V1. It might be useful to implement an adapter option that enables or disables parsing of comma-separated query params.

Example

NestJS Example:

<url>?value='foo'&value='bar'

Should in the future result in the query object:
query: { value: ['foo', 'bar'] }

Currently, the above-mentioned query results in the following query object:
query: { value: 'bar' }

NestJS, for instance, can then be used with a Fastify backend without further data manipulation or transformation:

@Get()
getHello(@Query('value') value: string[]): string {
  return value?.toString() // Return value: ['foo', 'bar']
}

Provide the package as an ESModule

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Currently the package cannot be used from within an ES module.
When one tries to do so, an error like the following occurs:

ReferenceError: require is not defined

Unfortunately the approaches mentioned in #89 do not work for me since I have concurrent initialization.

I am bundling my lambda function with webpack.

Motivation

With the package as an ES module, we can take advantage of top-level await to further reduce cold start times.

Example

No response

Setting cookie to reply causes API Gateway v1 to return 500

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.19.1

Plugin version

No response

Node.js version

14.19.2

Operating system

Windows

Operating system version (i.e. 20.04, 11.3, 10)

10

Description

When setting a cookie on the reply using reply.setCookie() from the @fastify/cookie package, the proxy adds a cookies property to the Lambda response.

The response payload with the cookies property does not respect the API Gateway v1 format, and therefore API Gateway always returns a 500 error.

Example response that causes the API Gateway V1 to error:

{
  "statusCode": 200,
  "body": "<!DOCTYPE html><html><head><title>Test</title></head><body></body></html>",
  "headers": {
    "content-type": "text/html; charset=utf-8",
    "content-length": "1024",
    "date": "Mon, 30 May 2022 16:40:00 GMT",
    "connection": "keep-alive"
  },
  "isBase64Encoded": false,
  "cookies": [
    "MY_COOKIE=value; Expires=Mon, 30 May 2022 17:00:00 GMT",
    "OTHER_COOKIE=other; Expires=Mon, 30 May 2022 17:00:00 GMT"
  ],
  "multiValueHeaders": {
    "set-cookie": [
      "MY_COOKIE=value; Expires=Mon, 30 May 2022 17:00:00 GMT",
      "OTHER_COOKIE=other; Expires=Mon, 30 May 2022 17:00:00 GMT"
    ]
  }
}

Example response that works well with the API Gateway V1 :

{
  "statusCode": 200,
  "body": "<!DOCTYPE html><html><head><title>Test</title></head><body></body></html>",
  "headers": {
    "content-type": "text/html; charset=utf-8",
    "content-length": "1024",
    "date": "Mon, 30 May 2022 16:40:00 GMT",
    "connection": "keep-alive"
  },
  "isBase64Encoded": false,
  "multiValueHeaders": {
    "set-cookie": [
      "MY_COOKIE=value; Expires=Mon, 30 May 2022 17:00:00 GMT",
      "OTHER_COOKIE=other; Expires=Mon, 30 May 2022 17:00:00 GMT"
    ]
  }
}

I tried to fiddle with the code, and everything works well when I comment out the following line in index.js:

if (cookies) ret.cookies = cookies

Here are the response format for API Gateway: https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-develop-integrations-lambda.html

Steps to Reproduce

Set up a Lambda behind an API Gateway v1.

Call reply.setCookie('cookieName', 'value') in the Lambda code.

API Gateway returns error 500.

Expected Behavior

The cookies property should not be present in the Lambda response payload when using API Gateway V1.
Only the Set-Cookie headers should be present in the Lambda response payload when using API Gateway V1.

app.inject is not a function

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

3.0.0

Plugin version

3.0.0

Node.js version

14

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

11.5.2

Description

The Error I'm getting from lambda is...

2021-10-10T21:59:37.542Z	38e026e9-9bc5-4501-b94b-a5b8d2764de2	ERROR	Invoke Error 	
{
    "errorType": "TypeError",
    "errorMessage": "app.inject is not a function",
    "stack": [
        "TypeError: app.inject is not a function",
        "    at /var/task/node_modules/aws-lambda-fastify/index.js:47:9",
        "    at new Promise (<anonymous>)",
        "    at Runtime.handler (/var/task/node_modules/aws-lambda-fastify/index.js:46:16)",
        "    at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"
    ]
}

Steps to Reproduce

Index...

"use strict";
const awsLambdaFastify = require('aws-lambda-fastify');
const app = require('./app');
const proxy = awsLambdaFastify(app);
exports.handler = proxy;

app...

"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.app = void 0;
const path_1 = require("path");
const fastify_autoload_1 = require("fastify-autoload");
const app = async (fastify, opts) => {
    void fastify.register(fastify_autoload_1.default, {
        dir: (0, path_1.join)(__dirname, 'plugins'),
        options: opts,
    });
    void fastify.register(fastify_autoload_1.default, {
        dir: (0, path_1.join)(__dirname, 'routes'),
        options: opts,
    });
};
exports.app = app;
exports.default = app;

The above was compiled and deployed from the TypeScript below:

import { join } from 'path';
import AutoLoad, { AutoloadPluginOptions } from 'fastify-autoload';
import { FastifyPluginAsync } from 'fastify';

export type AppOptions = {} & Partial<AutoloadPluginOptions>;

const app: FastifyPluginAsync<AppOptions> = async (
  fastify,
  opts
): Promise<void> => {
  void fastify.register(AutoLoad, {
    dir: join(__dirname, 'plugins'),
    options: opts,
  });

  void fastify.register(AutoLoad, {
    dir: join(__dirname, 'routes'),
    options: opts,
  });
};

export default app;
export { app };

Expected Behavior

No response

Auth Plugin times out

I do not think this is related to the plugin but rather to how I use the hook. I add it to the route like this: { preValidation: [fastify.authenticate] }

This error pops up:

TypeError: Cannot read property 'bind' of undefined
    at opts.(anonymous function).opts.(anonymous function).map.fn (fastify\lib\route.js:223:50)
    at Array.map (<anonymous>)
    at Object.afterRouteAdded (API\node_modules\fastify\lib\route.js:223:37)
    at after (-API\node_modules\fastify\lib\route.js:156:25)
    at Object._encapsulateThreeParam (API\node_modules\avvio\boot.js:419:7)
    at Boot.callWithCbOrNextTick (API\node_modules\avvio\boot.js:341:5)
    at Boot._after (API\node_modules\avvio\boot.js:232:26)
    at Plugin.exec (API\node_modules\avvio\plugin.js:89:17)
    at Boot.loadPlugin (API\node_modules\avvio\plugin.js:175:10)
    at release (API\node_modules\fastq\queue.js:127:16)
    at Object.resume (API\node_modules\fastq\queue.js:61:7)
    at Plugin.finish (API\node_modules\avvio\plugin.js:166:10)
    at toLoad.exec (API\node_modules\avvio\plugin.js:176:12)
    at done (API\node_modules\avvio\plugin.js:115:5)
    at process._tickCallback (internal/process/next_tick.js:61:11)

AWS WebSocket API(s) are not supported?

I was trying to use aws-lambda-fastify with the NestJS framework for API Gateway WebSocket endpoints, but it does not quite seem to work.

A quick look indicates that WebSocket endpoints are not supported?

Error: should have required property '.url',should have required property '.path',should match exactly one schema in oneOf
    at doInject (/serverless/.build/cloud/node_modules/light-my-request/index.js:85:13)
    at inject (/serverless/.build/cloud/node_modules/light-my-request/index.js:74:12)
    at Object.inject (/serverless/.build/cloud/node_modules/fastify/fastify.js:342:14)
    at /serverless/.build/cloud/node_modules/aws-lambda-fastify/index.js:39:9
    at new Promise (<anonymous>)
    at /serverless/.build/cloud/node_modules/aws-lambda-fastify/index.js:38:16
    at /serverless/cloud/src/serverless.ts:77:10
    at step (/serverless/.build/cloud/src/serverless.js:33:23)
    at Object.next (/serverless/.build/cloud/src/serverless.js:14:53)
    at /serverless/.build/cloud/src/serverless.js:8:71
    at new Promise (<anonymous>)
    at __awaiter (/serverless/.build/cloud/src/serverless.js:4:12)
    at exports.handler (/serverless/cloud/src/serverless.ts:70:19)
    at InProcessRunner.run (/serverless/node_modules/serverless-offline/dist/lambda/handler-runner/in-process-runner/InProcessRunner.js:100:16)
    at processTicksAndRejections (internal/process/task_queues.js:97:5)

Use with NestJS

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

Hi all,

Does anyone already use it in production with NestJS?
Is it possible to send me an example?

I searched and didn't find any information about this.
The only article I found (on Medium) uses old versions.

TIA,

Rafael.

Support compressed responses

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the feature has not already been requested

🚀 Feature Proposal

Currently, the only way to specify that the body is binary (i.e. encoded in base64) is by reading the Content-Type and matching it against the binaryMimeTypes option. In those cases, the library responds with the body in base64 perfectly.

But there are cases in which the response has a text MIME type (like application/json) while the body is encoded with gzip or brotli (compressed).

The proposal is to look at the Content-Encoding and treat the body as binary if it is set, here:

const isBase64Encoded = options.binaryMimeTypes.indexOf(contentType) > -1

Adding an option to enable/disable this check would be acceptable.

Motivation

In our scenario, we sometimes deal with "huge" JSON payloads in both requests and responses. The request body works fine compressed, but the response does not. Sometimes the response is large enough that AWS Lambda refuses to return it, but we know that, compressed, it would work fine.

We use @fastify/compress to handle compressed payloads.

Example

With the proposal, nothing extra would be needed to make this work.
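A sketch of how the enforceBase64 option from the Options table above could express this today, assuming the response object it receives exposes its headers:

const awsLambdaFastify = require('@fastify/aws-lambda')
const app = require('./app')

// Treat any response carrying a content-encoding header (e.g. gzip or br
// produced by @fastify/compress) as binary, so it gets base64-encoded.
const proxy = awsLambdaFastify(app, {
  enforceBase64: (response) => Boolean(response.headers['content-encoding'])
})

exports.handler = proxy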

Doesn't work if callback parameter is passed but it's used as an asynchronous handler

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure it has not already been reported

Fastify version

3.18.0

Plugin version

No response

Node.js version

14.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

11.4

Description

The Lambda environment always defines the callback parameter, even if the function is marked as async. This check is causing serverless-offline to break, because the handler returns undefined if the callback parameter is passed even though the function is used asynchronously.

I also decorate my function with an external handler from Sentry, and they require the callback parameter to be passed downstream; I can't be sure whether or not their code relies on it.

Steps to Reproduce

  1. Create a blank Lambda with this content:

module.exports = async (event, context, callback) => {
  console.log(callback);
  return Promise.resolve();
}

  2. Check that it outputs:
[Function (anonymous)]

I don't think this check makes sense if the Lambda environment always passes this parameter. Am I missing something?

Expected Behavior

No response

Updating for v5

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

I'm looking to update this repo for v5. fastify/fastify#5116

Changes Needed (Please let me know if anything might be missing)

  • Update dependencies
  • Update workflows to use v4

Release 3.1.0 doesn't match this repo

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the issue has not already been raised

Issue

Hey,

I just received an update 3.1.0 from npm, but was surprised to see no matching tag or incremented version in the package.json. Just to be sure everything is right: has this change been published intentionally? What has been changed?

Kind regards and thanks for your work,
Florian

App FastifyInstance type error when using ZodTypeProvider

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.26.0

Plugin version

4.0.0

Node.js version

20.x

Operating system

macOS

Operating system version (i.e. 20.04, 11.3, 10)

Sonoma 14.4

Description

When using Fastify type provider ZodTypeProvider from https://github.com/turkerdev/fastify-type-provider-zod we get the following type error when passing our app to awsLambdaFastify():

No overload matches this call.
  Overload 1 of 2, '(app: FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, FastifyTypeProviderDefault>, options?: LambdaFastifyOptions | undefined): PromiseHandler<...>', gave the following error.
    Argument of type 'FastifyZodInstance' is not assignable to parameter of type 'FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, FastifyTypeProviderDefault>'.
      The types returned by 'after()' are incompatible between these types.
        Type 'FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, ZodTypeProvider> & PromiseLike<...>' is not assignable to type 'FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, FastifyTypeProviderDefault> & PromiseLike<...>'.
          Type 'FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, ZodTypeProvider> & PromiseLike<...>' is not assignable to type 'FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, FastifyTypeProviderDefault>'.
            The types of 'addSchema(...).addHook' are incompatible between these types.
              Type '{ <RouteGeneric extends import("<redacted>/node_modules/fastify/types/route").RouteGenericInterface = import("<redacted>/node_modules/fastify/types/route").RouteGenericInterface, ContextConfig = unknown, SchemaCompiler extends import("<redacted>...' is not assignable to type '{ <RouteGeneric extends import("<redacted>/node_modules/fastify/types/route").RouteGenericInterface = import("<redacted>/node_modules/fastify/types/route").RouteGenericInterface, ContextConfig = unknown, SchemaCompiler extends import("<redacted>...'. Two different types with this name exist, but they are unrelated.
                Types of parameters 'hook' and 'hook' are incompatible.
                  Types of parameters 'opts' and 'opts' are incompatible.
                    Type 'RouteOptions<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, any, any, any, ZodTypeProvider, FastifyBaseLogger> & { ...; }' is not assignable to type 'RouteOptions<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, any, any, any, FastifyTypeProviderDefault, FastifyBaseLogger> & { ...; }'.
                      Type 'RouteOptions<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, any, any, any, ZodTypeProvider, FastifyBaseLogger> & { ...; }' is not assignable to type 'RouteOptions<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, any, any, any, FastifyTypeProviderDefault, FastifyBaseLogger>'.
                        Types of property 'handler' are incompatible.
                          Type 'RouteHandlerMethod<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, any, any, any, ZodTypeProvider, FastifyBaseLogger>' is not assignable to type 'RouteHandlerMethod<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, any, any, any, FastifyTypeProviderDefault, FastifyBaseLogger>'.
                            Type 'FastifyTypeProviderDefault' is not assignable to type 'ZodTypeProvider'.
  Overload 2 of 2, '(app: FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, FastifyTypeProviderDefault>, options?: LambdaFastifyOptions | undefined): CallbackHandler<...>', gave the following error.
    Argument of type 'FastifyZodInstance' is not assignable to parameter of type 'FastifyInstance<RawServerDefault, IncomingMessage, ServerResponse<IncomingMessage>, FastifyBaseLogger, FastifyTypeProviderDefault>'.ts(2769)

This boils down to: Type 'FastifyTypeProviderDefault' is not assignable to type 'ZodTypeProvider'. I'm not sure whether that is the actual error or a red herring.

Our Fastify instance type is defined as:

import type {
  FastifyBaseLogger,
  FastifyInstance,
  RawReplyDefaultExpression,
  RawRequestDefaultExpression,
  RawServerDefault,
} from "fastify";
import type { ZodTypeProvider } from "fastify-type-provider-zod";

export type FastifyZodInstance = FastifyInstance<
  RawServerDefault,
  RawRequestDefaultExpression<RawServerDefault>,
  RawReplyDefaultExpression<RawServerDefault>,
  FastifyBaseLogger,
  ZodTypeProvider
>;

I'm not sure whether we're doing something wrong, or whether the problem lies in fastify-type-provider-zod or fastify-aws-lambda. I can post the full, non-truncated type error if that helps. Any help is appreciated!
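For context, one possible stopgap (not confirmed by the maintainers) is to widen the Zod-typed instance back to the default FastifyInstance shape at the call site. A minimal sketch, assuming the FastifyZodInstance type defined above, a hypothetical ./app module that exports it, and tsconfig settings that allow a default import of @fastify/aws-lambda:

import awsLambdaFastify from "@fastify/aws-lambda";
import type { FastifyInstance } from "fastify";
import { app } from "./app"; // hypothetical module exporting a FastifyZodInstance

// Widening the Zod-typed instance to the default FastifyInstance shape matches
// the parameter type expected by awsLambdaFastify() and silences the overload
// error, at the cost of losing ZodTypeProvider typing on this widened reference.
export const handler = awsLambdaFastify(app as unknown as FastifyInstance);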

Steps to Reproduce

Reproducible repo https://github.com/rt-joe/fastify-aws-lambda-zod-type-provider-type-error

With node v20 installed

git clone https://github.com/rt-joe/fastify-aws-lambda-zod-type-provider-type-error
cd fastify-aws-lambda-zod-type-provider-type-error
npm install
npm run type

Expected Behavior

No type error when using a non-default Fastify type provider

NestJS-Fastify

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure it has not already been reported

Fastify version

3.x

Plugin version

1.7.1

Node.js version

14.x

Operating system

Windows

Operating system version (i.e. 20.04, 11.3, 10)

Windows 10

Description

When using const proxy = awsLambdaFastify(cachedNestApp.instance); there is an error on awsLambdaFastify:

Type 'typeof import(".../node_modules/aws-lambda-fastify/index")' has no call signatures. ts(2349)

nestjs/core version 7.6.15

Steps to Reproduce

1. Create a NestJS project with Fastify
2. npm install aws-lambda -D
3. npm install aws-lambda-fastify
4. Add lambda.ts to the src path:
import { NestFactory } from '@nestjs/core';
import { FastifyAdapter, NestFastifyApplication } from '@nestjs/platform-fastify';
import { AppModule } from './app.module';
import { FastifyServerOptions, FastifyInstance, fastify } from 'fastify';
import * as awsLambdaFastify from 'aws-lambda-fastify';
import { Context, APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { Logger } from '@nestjs/common';

interface NestApp {
  app: NestFastifyApplication;
  instance: FastifyInstance;
}

let cachedNestApp: NestApp;

async function bootstrapServer(): Promise<NestApp> {
  const serverOptions: FastifyServerOptions = { logger: true };
  const instance: FastifyInstance = fastify(serverOptions);
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    new FastifyAdapter(instance),
    { logger: !process.env.AWS_EXECUTION_ENV ? new Logger() : console }
  );
  app.setGlobalPrefix(process.env.API_PREFIX);
  await app.init();
  return { app, instance };
}

export const handler = async (event: APIGatewayProxyEvent, context: Context): Promise<APIGatewayProxyResult> => {
  if (!cachedNestApp) {
    cachedNestApp = await bootstrapServer();
  }
  const proxy = awsLambdaFastify(cachedNestApp.instance);
  return proxy(event, context);
};

Expected Behavior

No response

Unable to get fastify/swagger working with fastify/aws-lambda

Prerequisites

  • I have written a descriptive issue title
  • I have searched existing issues to ensure the bug has not already been reported

Fastify version

4.7.0

Plugin version

3.1.3

Node.js version

16.10.0

Operating system

Linux

Operating system version (i.e. 20.04, 11.3, 10)

PopOS 20.04

Description

The /documentation route does not work when @fastify/swagger is used together with @fastify/aws-lambda.

index.ts

const awsLambdaFastify = require("@fastify/aws-lambda");
const { fastify } = require('fastify');

const app = fastify({ logger: true });

app.register(require('@fastify/swagger'), {
  routePrefix: '/documentation',
  swagger: {
    info: {
      title: 'Test swagger',
      description: 'Testing the Fastify swagger API',
      version: '0.1.0'
    },
    consumes: ['application/json'],
    produces: ['application/json'],
  },
  uiConfig: {
    docExpansion: 'full',
    deepLinking: false
  },
  exposeRoute: true
});

app.get('/rest', async (request, reply) => {
  return { hello: 'world' }
});

app.ready().then(() => {
  app.swagger();
});

// app.listen({ port: 3000 })

const proxy = awsLambdaFastify(app, {
  callbackWaitsForEmptyEventLoop: false,
  decorateRequest: false,
});
module.exports = {
  handler: proxy,
};

serverless.yml

service: sls-test
# app and org for use with dashboard.serverless.com
#app: your-app-name
#org: your-org-name
frameworkVersion: '3'

# Add the serverless-webpack plugin
plugins:
  - serverless-esbuild
  - serverless-offline
provider:
  name: aws
  runtime: nodejs14.x
  stage: dev
  region: eu-north-1

functions:
  rest:
    handler: index.handler
    events:
      - http:
          method: ANY
          path: /{any+}
          cors: true

package.json

{
  "name": "sls-test",
  "version": "1.0.0",
  "description": "Serverless webpack example using ecma script",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "devDependencies": {
    "@babel/core": "^7.11.1",
    "@babel/preset-env": "^7.11.0",
    "babel-loader": "^8.1.0",
    "babel-plugin-transform-runtime": "^6.23.0",
    "babel-polyfill": "^6.23.0",
    "babel-preset-env": "^1.6.0",
    "serverless-esbuild": "^1.33.0",
    "serverless-offline": "^11.0.1",
    "serverless-webpack": "^5.3.1",
    "webpack": "^4.35.2"
  },
  "author": "The serverless webpack authors (https://github.com/elastic-coders/serverless-webpack)",
  "license": "MIT",
  "dependencies": {
    "@fastify/aws-lambda": "^3.1.3",
    "@fastify/swagger": "^7.6.1",
    "esbuild": "^0.15.10",
    "fastify": "^4.7.0",
    "server": "^1.0.37"
  }
}

Steps to Reproduce

  • Save the above code
  • Run sls offline start
  • Go to /rest and note that we get a valid response
  • Go to /documentation and note that it redirects and then returns a 404

Running the above code standalone works fine. To verify this:

  • Uncomment this line:
app.listen({ port: 3000 })
  • Comment out these lines:
const proxy = awsLambdaFastify(app, {
  callbackWaitsForEmptyEventLoop: false,
  decorateRequest: false,
});
module.exports = {
  handler: proxy,
};
  • Run node index.js
  • Go to /rest and note that we get a valid response
  • Go to /documentation and note that you will be successfully redirected to the Swagger page
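Not a confirmed fix for this issue, but since the Swagger UI serves static assets (CSS, JS, images) that API Gateway may treat as binary, one thing worth checking is the binaryMimeTypes option from the options table above. A minimal sketch; the MIME types listed are illustrative guesses, and './app' is a hypothetical module exporting the Fastify instance with @fastify/swagger registered:

const awsLambdaFastify = require('@fastify/aws-lambda');
const app = require('./app'); // hypothetical module exporting the app with @fastify/swagger registered

const proxy = awsLambdaFastify(app, {
  callbackWaitsForEmptyEventLoop: false,
  decorateRequest: false,
  // Illustrative binary types for the Swagger UI's static assets; adjust as needed.
  binaryMimeTypes: ['image/png', 'font/woff2'],
});

module.exports = { handler: proxy };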

Expected Behavior

No response

N/A

[Accidentally submitted the issue and there is no way to delete it.]
