
knex-data-api-client's Introduction

knex-aurora-data-api-client

Knex Aurora Data API Client


This is a fork of knex-data-api-client by @alan-cooney, with support for both Postgres and MySQL.

The Knex Aurora Data API Client is a Knex extension that supports the RDS Data API, built using Jeremy Daly's excellent data-api-client module.

Support for transactions and nestTables is included.

Configuration

The library uses the default AWS credentials to connect to the RDS database via the Data API. The data-api-client module that this library uses provides more documentation on the required permissions and on how to enable the Data API for the database.

Use

To use Aurora in MySQL mode:

const knexDataApiClient = require('knex-aurora-data-api-client');
const knex = require('knex')({
  client: knexDataApiClient.mysql,
  connection: {
    secretArn: 'secret-arn', // Required
    resourceArn: 'db-resource-arn', // Required
    database: 'db-name',
    region: 'eu-west-2',
  },
});

To use Aurora in Postgres mode:

const knexDataApiClient = require('knex-aurora-data-api-client');
const knex = require('knex')({
  client: knexDataApiClient.postgres,
  connection: {
    secretArn: 'secret-arn', // Required
    resourceArn: 'db-resource-arn', // Required
    database: 'db-name',
    region: 'eu-west-2',
  },
});

Nested tables support

Note: this significantly increases the amount of data returned from the RDS Data API.

knex().doSomething().options({ nestTables: true });
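As a rough illustration of what nestTables changes, joined rows keyed by table-qualified column names are regrouped per table. The helper and sample row below are invented for this sketch, not the library's internals:

```javascript
// Illustrative only: regroup a flat row whose keys look like
// 'table.column' into one object per table, the way nestTables-style
// output is shaped. Not the library's actual implementation.
function nestRow(flatRow) {
  const nested = {};
  for (const [key, value] of Object.entries(flatRow)) {
    const [table, ...rest] = key.split('.');
    nested[table] = nested[table] || {};
    nested[table][rest.join('.')] = value;
  }
  return nested;
}

// A flat joined row becomes one sub-object per table:
const nested = nestRow({ 'users.id': 1, 'users.name': 'Ada', 'accounts.id': 7 });
console.log(nested);
// → { users: { id: 1, name: 'Ada' }, accounts: { id: 7 } }
```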

Credits

Forked from Skyhook knex-data-api-client and provided under an MIT license.

knex-data-api-client's People

Contributors

alan-cooney, cccross, chipat, codanny, dependabot[bot], husa, jaska120, jonathannen, markusahlstrand, mkotsalainen, rifont, viqueen


knex-data-api-client's Issues

BadRequestException: Table 'knex_migrations' already exists.

I'm trying to make it work with knex migrations, but it fails with a confusing error:

BadRequestException: Table 'knex_migrations' already exists.

Context

My migration file

exports.up = function (knex) {
  return knex.schema.createTable('foobar', (table) => {
    table.increments('id');
  });
};

exports.down = function (knex) {
  return knex.schema.dropTable('foobar');
};

knexfile

const knexDataApiClient = require('knex-aurora-data-api-client');

const config = {
  client: knexDataApiClient.mysql,
  connection: {
    resourceArn: '...',
    secretArn: '...',
    region: '...',
    database: '...',
  },
}

module.exports = {
  development: config,
  staging: config,
  production: config,
};

Once I run knex migrate:latest it fails with:

BadRequestException: Table 'knex_migrations' already exists

Upon checking the RDS instance in the AWS console, I see that it did just create the tables "knex_migrations" and "knex_migrations_lock".

When I rerun knex migrate:latest, I get the same error.
When I remove those tables and rerun knex migrate:latest, I get the same error.

How do we run migrations using the knex migration API?

I tried importing the db and then calling knex.migrate.up, but it gave this error:

Uncaught Exception 	
{
    "errorType": "Error",
    "errorMessage": "Knex: run\n$ npm install rds-data --save\n'secretArn' string value required",
    "stack": [
        "Error: Knex: run",
        "$ npm install rds-data --save",
        "'secretArn' string value required",
        "    at PostgresClientRDSDataAPI.initializeDriver (/var/task/node_modules/knex/lib/client.js:182:13)",
        "    at new Client (/var/task/node_modules/knex/lib/client.js:71:12)",
        "    at new Client_PG (/var/task/node_modules/knex/lib/dialects/postgres/index.js:18:5)",
        "    at new PostgresClientRDSDataAPI (/var/task/node_modules/knex-aurora-data-api-client/src/postgres.js:7:5)",
        "    at Function.knex (/var/task/node_modules/knex/lib/knex-builder/Knex.js:12:28)",
        "    at Object.<anonymous> (/var/task/db.js:5:19)",
        "    at Module._compile (internal/modules/cjs/loader.js:1072:14)",
        "    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1101:10)",
        "    at Module.load (internal/modules/cjs/loader.js:937:32)",
        "    at Function.Module._load (internal/modules/cjs/loader.js:778:12)",
        "    at Module.require (internal/modules/cjs/loader.js:961:19)",
        "    at require (internal/modules/cjs/helpers.js:92:18)",
        "    at Object.<anonymous> (/var/task/migrations.js:4:14)",
        "    at Module._compile (internal/modules/cjs/loader.js:1072:14)",
        "    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1101:10)",
        "    at Module.load (internal/modules/cjs/loader.js:937:32)"
    ]
}

Resolve with Bluebird Promise

Great little library! It works well for the most part. I have encountered an issue when bundling this with webpack, and it seems to be related to how Knex expects (maybe? it's a bit unclear in their code base) Bluebird promises:

this.client.acquireConnection(...).catch(...).disposer

Is there a reason the promises at these lines aren't resolved with Bluebird? It is already imported in the bundle. Uncommenting these lines seems to solve my issues with webpack:

// return Bluebird.resolve(connection);

// return Bluebird.resolve();

When I mark both Knex and this library as externals in webpack, it works without modifying the source above.

And just for the completeness of the use-case: I'm packing my dependencies with webpack to optimize my cold starts on AWS Lambda.

Postgres migrations broke on 1.4.1

Hi,

I just started to use your awesome library. I started with the latest version, 1.4.2, but could not get migrations to work on Postgres. I tried multiple times on a fresh database with no luck.

See my attached log from knex migrate:latest --debug command:

[
  {
    sql: 'select * from information_schema.tables where table_name = ? and table_schema = current_schema()',
    bindings: [ 'knex_migrations' ],
    output: [Function: output]
  }
]
[
  {
    sql: 'select * from information_schema.tables where table_name = ? and table_schema = current_schema()',
    bindings: [ 'knex_migrations_lock' ],
    output: [Function: output]
  }
]
{
  method: 'select',
  options: {},
  timeout: false,
  cancelOnTimeout: false,
  bindings: [],
  __knexQueryUid: 'J8ebRgCSTq9fVCRIBh9Op',
  sql: 'select * from "knex_migrations_lock"'
}
{
  method: 'select',
  options: {},
  timeout: false,
  cancelOnTimeout: false,
  bindings: [],
  __knexQueryUid: 'HRewO9CjQ1BJ0S2vdBO0g',
  sql: 'select "name" from "knex_migrations" order by "id" asc'
}
{
  method: 'update',
  options: {},
  timeout: false,
  cancelOnTimeout: false,
  bindings: [ 1, 0 ],
  __knexQueryUid: 'eDdSGhYPYGhUbJdbAgeC4',
  sql: 'update "knex_migrations_lock" set "is_locked" = ? where "is_locked" = ?',
  returning: undefined
}
Can't take lock to run migrations: Migration table is already locked
If you are sure migrations are not running you can release the lock manually by running 'knex migrate:unlock'
undefined

I can confirm that migrations run fine by downgrading to 1.4.0.

Wrong returning structure PostgreSQL

Right now if you use this connector with PostgreSQL and returning(...) call (knex spec)

It always returns an "array of objects" even when it should return just an "array of values".

Example:

knex('books')
  .returning('id')
  .insert({title: 'Slaughterhouse Five'})

is returning [ { id: 1 } ] instead of the expected [ 1 ]

Only when you request multiple properties, e.g. returning('id', 'title'), should it return [ { id: 1, title: 'Slaughterhouse Five' } ]

Not sure whether the problem is here or in the underlying "data-api-client", but this connector should comply with the Knex specification.

I was expecting to use knex with interchangeable connectors for the Data API and plain PostgreSQL, but this issue makes that harder.
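A spec-compliant connector could post-process the rows before handing them back; the helper below is a hypothetical sketch of that flattening, not the connector's actual code:

```javascript
// Sketch (hypothetical helper): flatten `returning` rows to match the
// Knex spec when exactly one named column was requested.
function normalizeReturning(rows, returningColumns) {
  if (returningColumns.length === 1 && returningColumns[0] !== '*') {
    const col = returningColumns[0];
    return rows.map((row) => row[col]); // [ { id: 1 } ] -> [ 1 ]
  }
  return rows; // multiple columns (or '*') keep the object shape
}

console.log(normalizeReturning([{ id: 1 }], ['id']));
// → [ 1 ]
console.log(normalizeReturning([{ id: 1, title: 'Slaughterhouse Five' }], ['id', 'title']));
// → [ { id: 1, title: 'Slaughterhouse Five' } ]
```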

Aurora MySQL Data API - migration fails on knex_migrations insert step

Environment

Knex version: "0.95.6",
Database + version: Aurora MySQL 5.7 Serverless accessed through Data API
Client: "knex-aurora-data-api-client": "1.5.4",

Bug

When trying to run a migration that creates a simple table, it fails on inserting a row into the knex_migrations table. After the failure, the required table has been created and there is a row in the knex_migrations_lock table; however, the migration fails with the error given below. If the table is created manually, data can be read from it with the same knex connection parameters, so the connection is set up correctly. The error:

migration file "20210602103458_create_example_table.js" failed
migration failed with error: insert into `knex_migrations` (`batch`, `migration_time`, `name`) values (1, '2021-06-02 13:06:32.145', '20210602103458_create_example_table.js') - Database error code: 1292. Message: Data truncation: Incorrect datetime value: '"2021-06-02T11:06:32.145Z"' for column 'migration_time' at row 1

You can reproduce this with basically any table. I created a simple table with 4 int columns in an empty database and still got the error. However, if you manually run the insert from the error message against the database, it works.
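One plausible workaround, sketched below under the assumption that the bad value is the quoted ISO string from the error, is to serialize Date bindings into the datetime literal MySQL accepts. The helper name is hypothetical:

```javascript
// Sketch: format a JS Date as a 'YYYY-MM-DD HH:MM:SS.mmm' MySQL
// datetime literal (UTC), instead of the quoted ISO-8601 string that
// triggered the "Incorrect datetime value" error.
function toMysqlDatetime(date) {
  const pad = (n, width = 2) => String(n).padStart(width, '0');
  return (
    `${date.getUTCFullYear()}-${pad(date.getUTCMonth() + 1)}-${pad(date.getUTCDate())} ` +
    `${pad(date.getUTCHours())}:${pad(date.getUTCMinutes())}:${pad(date.getUTCSeconds())}` +
    `.${pad(date.getUTCMilliseconds(), 3)}`
  );
}

console.log(toMysqlDatetime(new Date('2021-06-02T11:06:32.145Z')));
// → 2021-06-02 11:06:32.145
```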

Migrations failing with 1.2.2

Getting the following error when running a migration that works fine with 1.2.1:

TypeError: Cannot read property 'length' of undefined
at Runner.output (/Users/markusahlstrand/projects/sesamy/api/node_modules/knex/lib/dialects/postgres/schema/compiler.js:28:24)
at PostgresClientRDSDataAPI.processResponse (/Users/markusahlstrand/projects/sesamy/api/node_modules/knex-aurora-data-api-client/src/data-api.js:88:41)

installing extra packages

hi, thanks for making this.

I'm using the NodejsFunction CDK construct, which uses Parcel to bundle. I had to install a few extra packages, including mysql, pg, pg-native and pg-query-stream. Is this normal?

thanks

Subsequent migrations do not work with mysql

First off, thanks for the library, it's been a great help.

We use custom migrations in our application and had trouble running subsequent migrations. The issue stems from the way knex dialects are compiled: methods such as hasTable require a specific shape to be passed to the output method. See the knex hasTable method, whose output callback checks the response length like this:

output: function output(resp) {
  return resp.length > 0;
}

However, the output method is currently being called with data of the shape:

type resp = {
  rows: Array,
  records: Array,
}

This call occurs in the _query method of data-api.js

In this case, the hasTable output method looks for a length attribute on the response object, which doesn't exist and is therefore undefined, so the hasTable output function always returns false.

In our case, we perform migrations when starting the application, and this causes issues when creating the knex_migrations table. The createTable flow first checks whether the table exists using hasTable, which always returns false. Knex then tries to create the table and hits a namespace collision:

Unhandled rejection error: relation "knex_migrations" already exists

Our fix was to modify the _query and processResponse methods to better emulate the same methods in the knex MySQL dialect compiler.
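The mismatch can be sketched as follows; the function below is illustrative only, not the connector's actual processResponse:

```javascript
// Sketch: hand the dialect's `output` hook an array, so checks such as
// `resp.length > 0` behave, instead of the { rows, records } wrapper
// that comes back from the Data API. Names are illustrative.
function processResponse(resp, queryObj) {
  const rows = resp.rows || [];
  return queryObj.output ? queryObj.output(rows) : rows;
}

// The hasTable-style check from the knex dialect:
const hasTableOutput = (resp) => resp.length > 0;

console.log(processResponse({ rows: [{ table_name: 'knex_migrations' }], records: [] },
                            { output: hasTableOutput }));
// → true
console.log(processResponse({ rows: [], records: [] }, { output: hasTableOutput }));
// → false
```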

I'll raise a PR with the fix.

Incompatible with string array column types?

Hi, thanks for the module.

I've run into an issue using the postgres client with certain column types created with t.specificType.

We have some columns created like the example below

export async function up(knex: Knex): Promise<void> {
  return knex.schema.alterTable('files', t => {
    t.specificType('errors', 'text ARRAY')
      .notNullable()
      .defaultTo('{}')
  })
}

When querying this column the data is returned in the following structure:

{errors: {stringValue: []}}

rather than the expected []

Does that mean this client is incompatible with these types of columns?
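A client-side workaround could unwrap such Data API value wrappers before handing rows to the application; the helper below is hypothetical, not part of the library:

```javascript
// Sketch: unwrap a Data API field object such as { stringValue: [] }
// down to its raw value. Any non-wrapper value passes through.
function unwrapField(field) {
  if (field && typeof field === 'object' && !Array.isArray(field)) {
    const keys = Object.keys(field);
    if (keys.length === 1 && keys[0].endsWith('Value')) {
      return field[keys[0]];
    }
  }
  return field;
}

console.log(unwrapField({ stringValue: [] })); // → []
console.log(unwrapField(42));                  // → 42
```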

The library does not close database transactions correctly

When using transactions, the library sends a COMMIT SQL statement instead of using the commit-transaction method of the RDS Data API.

Since the transaction is not committed via the RDS method, it is kept open in AWS RDS for 5 minutes, which means the number of connections keeps increasing. Eventually, connections start to drop.

This issue was well described in jeremydaly/data-api-client#37 (comment).

Our solution was to close transactions properly, as we use a lot of transactions. The knex-aurora-data-api-client implementation for starting a transaction correctly calls the Data API function, but the one for committing only sends a COMMIT command and does not use the commitTransaction function of data-api-client. The Data API then waits 5 minutes for the uncommitted transaction to end: the data in the database was already committed (thanks to the COMMIT command), but the Data API does not consider the transaction committed, because it requires an explicit commit or rollback call (and you don't need to send a SQL COMMIT at all). After changing the implementation, our problem of hitting 500 connections was solved immediately, and we now peak at 20 connections. Hopefully this helps somebody. I don't know what AppSync uses.

This can be reproduced by sending sequential transactions - the number of connections keeps increasing, until they start to be dropped (timeout).
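The fix described above can be sketched with a stub client. The stub stands in for data-api-client; the method names mirror the RDS Data API's CommitTransaction and RollbackTransaction actions, and the dispatch function is hypothetical:

```javascript
// Sketch: route COMMIT/ROLLBACK statements through the Data API's
// transaction-ending calls instead of sending them as plain SQL, so
// RDS does not keep the transaction open for 5 minutes.
function runStatement(client, sql, transactionId) {
  if (/^\s*commit\b/i.test(sql)) return client.commitTransaction({ transactionId });
  if (/^\s*rollback\b/i.test(sql)) return client.rollbackTransaction({ transactionId });
  return client.query({ sql, transactionId });
}

// Stub client that records which method was invoked:
const calls = [];
const stubClient = {
  commitTransaction: (args) => calls.push(['commitTransaction', args]),
  rollbackTransaction: (args) => calls.push(['rollbackTransaction', args]),
  query: (args) => calls.push(['query', args]),
};

runStatement(stubClient, 'COMMIT;', 'tx-1');
console.log(calls[0][0]); // → commitTransaction
```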

Cannot find module 'knex/lib/transaction'

The following versions are used:
"knex": "^0.95.4",
"knex-aurora-data-api-client": "^1.5.2",

When i am deploying the code in lambda and calling the api the following error is what I get:

Cannot find module 'knex/lib/transaction'.

Value of infinity in timestamp columns breaks > v.1.5.0

Problem
I have just tried to upgrade from v1.5.0 to the latest available version to use a fresh version of knex. However, commit 181e410b9dd60abb8d6fffe4f5d9c54765c39a8a seems to break support for the value infinity in a PostgreSQL database, which is a valid value according to the spec.

Background
"8.5.1.4. Special Values
PostgreSQL supports several special date/time input values for convenience, as shown in Table 8-13. The values infinity and -infinity are specially represented inside the system and will be displayed unchanged; but the others are simply notational shorthands that will be converted to ordinary date/time values when read. (In particular, now and related strings are converted to a specific time value as soon as they are read.) All of these values need to be enclosed in single quotes when used as constants in SQL commands."

What is causing it
Now, the timestamp value of "+292278994-08-16 23:00:00" (representing the value infinity) is passed to the match method in https://github.com/markusahlstrand/knex-data-api-client/blob/master/src/types.js#L15, and the array destructuring fails with the error "object null is not iterable (cannot read property Symbol(Symbol.iterator))".

I am not sure who is responsible for converting the infinity value to "+292278994-08-16 23:00:00": AWS, data-api-client or this project. Should this library take the possibility of infinity values into account? With version 1.5.0, infinity values work as expected.
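A defensive fix can be sketched as follows. The regex here is an assumption for illustration; the actual pattern in src/types.js may differ:

```javascript
// Sketch: bail out when the timestamp regex does not match (e.g. for
// the "+292278994-08-16 23:00:00" infinity sentinel), instead of
// destructuring a null match result and crashing.
const TIMESTAMP_RE = /^(\d{4})-(\d{2})-(\d{2}) (\d{2}):(\d{2}):(\d{2})/;

function parseTimestamp(value) {
  const match = value.match(TIMESTAMP_RE);
  if (!match) return value; // leave special values such as infinity untouched
  const [, y, mo, d, h, mi, s] = match;
  return new Date(Date.UTC(+y, +mo - 1, +d, +h, +mi, +s));
}

console.log(parseTimestamp('+292278994-08-16 23:00:00')); // passed through unchanged
console.log(parseTimestamp('2021-06-02 11:06:32'));       // parsed into a Date
```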

Plans to work with AWS SDK for JavaScript (v3)?

I'm seeing in logs that there's a plan to put AWS SDK v2 into maintenance mode in 2023:

"NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023.
Please migrate your code to use AWS SDK for JavaScript (v3)."

Is this something which knex-data-api-client will need to update for?

when using insert with returning *, record types are not applied

From what I can see, column types are only applied for the select method, but we expect them to work for insert(...).returning too.

await domain.schema.createTableIfNotExists('information', (t) => {
  t.bigIncrements('id');
  t.string('dynamo_id').unique().nullable();
  t.jsonb('payload');
});

const [output] = await domain('information')
  .insert({
    id: 1,
    user_id: 'one',
    payload: { name: 'moi', age: 22 },
  })
  .onConflict('user_id')
  .merge()
  .returning('*');

console.info('** payload', output.payload);
console.info('** name', output.payload.name); // returns undefined
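The expected behaviour can be sketched as a coercion step applied to the rows of any statement kind, not just selects; the helper and type map below are hypothetical:

```javascript
// Sketch: apply per-column type coercion (e.g. parsing jsonb columns
// returned as strings) to every returned row, regardless of whether
// the statement was a select, insert or update.
function coerceRows(rows, typeMap) {
  return rows.map((row) =>
    Object.fromEntries(
      Object.entries(row).map(([col, val]) => [col, (typeMap[col] || ((v) => v))(val)])
    )
  );
}

// Hypothetical type map: payload is a jsonb column.
const typeMap = { payload: (v) => (typeof v === 'string' ? JSON.parse(v) : v) };

const [out] = coerceRows([{ id: 1, payload: '{"name":"moi","age":22}' }], typeMap);
console.log(out.payload.name); // → moi
```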

[DOCS] update install/use docs to mention configuring AWS

Overview

While attempting to configure this in a project, I followed the steps as currently outlined: I added a section to my knexfile and provided the values needed. I then received connection-refused errors when attempting to actually run migrations or query against my database.

Cause

The cause turned out to be 100% my fault: I didn't have AWS initialized or configured in my knexfile. I found the solution by referencing the knexfile within this repo.

Suggested Solution

I suggest adding a section to, or expanding, the current installation guide to mention that AWS must additionally be configured in your project. This is probably a no-brainer for people experienced with AWS, but I'm still learning it, and this blocked me for far longer than it should have.

less documentation

A data-api-client contributor suggested first building the query with knex and then using it in a data-api query. I found this package, but it has very little documentation, so could you please give more information about it? In particular, what does doSomething() actually do here, and what kinds of actions can be executed with it?

when using `update.returning` types are not applied

From what I can see, column types are only applied for the select method, but we expect them to work for update(...).returning too.

await domain.schema.createTableIfNotExists('information', (t) => {
  t.bigIncrements('id');
  t.string('dynamo_id').unique().nullable();
  t.jsonb('payload');
});

const [output] = await domain('information')
  .update({
    id: 1,
    user_id: 'one',
    payload: { name: 'moi', age: 22 },
  })
  .returning('*');

console.info('** payload', output.payload);
console.info('** name', output.payload.name); // returns undefined

Support for knex-migrate toolkit

This package really solved a lot of problems for me, so first off, thanks for all the effort with this, I'm really liking it so far. Sadly, though, I had to get rid of the knex-migrate utility, since this package doesn't seem to support it, and it'd be nice to have both.

While migrations work just fine with plain knex.js, trying to use knex-migrate causes the following error: expecting an array or an iterable object but got [object Null].

After investigating a bit, I think this is due to the fact that knex-migrate uses pluck, which is not mapped in the data-api processResponse, though I'm not all that familiar with all this, so I may be mistaken.

TypeScript extension not working

When using the latest knex package directly, I'm able to write migrations in TypeScript/ES6, and that works.
However, after switching to knex-aurora-data-api-client (which depends on Knex 0.21.1), the ts extension seems to have stopped working. If I get rid of types and ES6, it works.
The current version of Knex is 0.95.6; I wonder whether that's related and/or whether the package can be updated to the latest.

This is my knexfile.ts:

const config = {
  client: knexDataApiClient.postgres,
  connection: {
    secretArn: 'arn:aws:secretsmanager:us-east-1:...',
    resourceArn: 'arn:aws:rds:us-east-1:...',
    database: 'mydatabase',
    region: 'us-east-1',
  },
  migrations: {
    config: 'knex_migrations',
    extension: 'ts',
    directory: 'migrations',
    stub: 'migrations.stub.ts',
  },
};

The error I get when running knex migrate:latest:

(screenshot of the error message omitted)

some properties don't exist when using typescript

Hello, I'm trying to configure knex for a mysql aurora DB using typescript

I did the configuration as follows:

import knexDataApiClient from "knex-aurora-data-api-client";
import Knex from "knex";

export const knex = Knex({
  client: knexDataApiClient.mysql,
  connection: {
    secretArn: "secret-arn", // Required
    resourceArn: "db-resource-arn", // Required
    database: "db-name",
    region: "eu-west-2",
  },
});

The code editor points out that secretArn, resourceArn and region don't exist, while other properties like client and database work fine. What could be the reason? Am I doing the configuration wrong?

Error: Property x... doesnt not exist on the staticConnection config | connectionConfigProvider
Thank you
