loopback-connector-saphana's Introduction

loopback-connector-saphana

loopback-connector-saphana is the SAP HANA connector module for loopback-datasource-juggler.

Connector settings

The connector can be configured using the following settings from the data source.

  • host (defaults to 'localhost'): The host name or IP address of the SAP HANA server
  • port (defaults to 30015): The port number of the SAP HANA server
  • username: The user name used to connect to the SAP HANA server
  • password: The password
  • debug (defaults to false): Whether to print debug messages

NOTE: By default, all tables are created in the user's default schema, which is the same as the username.

The SAP HANA connector uses node-hdb as the driver. For more information about configuration parameters, see https://github.com/SAP/node-hdb/blob/master/README.md.
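For example, a data source using this connector might be configured as follows. The data source name, host, and credentials below are placeholders, not values from this project:

```json
{
  "hanadb": {
    "connector": "saphana",
    "host": "hana-server.example.com",
    "port": 30015,
    "username": "strongloop",
    "password": "secret",
    "debug": false
  }
}
```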

Discovering Models

SAP HANA data sources allow you to discover model definition information from existing SAP HANA databases. See the following APIs:

  • dataSource.discoverModelDefinitions([username], fn)
  • dataSource.discoverSchema([owner], name, fn)
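A usage sketch following the signatures listed above. The schema owner and table name are assumptions, and a live SAP HANA connection is required, so the call is wrapped in a function rather than executed:

```javascript
// Sketch: discovering the model definition of an existing table.
// 'STRONGLOOP' (owner) and 'INVENTORY' (table) are hypothetical names.
function discoverInventorySchema(dataSource, cb) {
  dataSource.discoverSchema('STRONGLOOP', 'INVENTORY', function (err, schema) {
    if (err) return cb(err);
    // `schema` is a model definition object like the example in the
    // "Model definition for SAP HANA" section of this README.
    cb(null, schema);
  });
}
```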

Model definition for SAP HANA

The model definition consists of the following properties:

  • name: Name of the model; by default, it is the camel-cased form of the table name
  • options: Model-level options, including the mapping to the SAP HANA schema/table
  • properties: Property definitions, including the mapping to SAP HANA columns
    {
      "name": "Inventory",
      "options": {
        "idInjection": false,
        "hdb": {
          "schema": "strongloop",
          "table": "inventory"
        }
      },
      "properties": {
        "id": {
          "type": "String",
          "required": false,
          "length": 64,
          "precision": null,
          "scale": null,
          "hdb": {
            "columnName": "id",
            "dataType": "varchar",
            "dataLength": 64,
            "dataPrecision": null,
            "dataScale": null,
            "nullable": "NO"
          }
        },
        "productId": {
          "type": "String",
          "required": false,
          "length": 20,
          "precision": null,
          "scale": null,
          "id": 1,
          "hdb": {
            "columnName": "product_id",
            "dataType": "varchar",
            "dataLength": 20,
            "dataPrecision": null,
            "dataScale": null,
            "nullable": "YES"
          }
        },
        "locationId": {
          "type": "String",
          "required": false,
          "length": 20,
          "precision": null,
          "scale": null,
          "id": 1,
          "hdb": {
            "columnName": "location_id",
            "dataType": "varchar",
            "dataLength": 20,
            "dataPrecision": null,
            "dataScale": null,
            "nullable": "YES"
          }
        },
        "available": {
          "type": "Number",
          "required": false,
          "length": null,
          "precision": 32,
          "scale": 0,
          "hdb": {
            "columnName": "available",
            "dataType": "integer",
            "dataLength": null,
            "dataPrecision": 32,
            "dataScale": 0,
            "nullable": "YES"
          }
        },
        "total": {
          "type": "Number",
          "required": false,
          "length": null,
          "precision": 32,
          "scale": 0,
          "hdb": {
            "columnName": "total",
            "dataType": "integer",
            "dataLength": null,
            "dataPrecision": 32,
            "dataScale": 0,
            "nullable": "YES"
          }
        }
      }
    }

Type Mapping

  • Number
  • Boolean
  • String
  • Object
  • Date
  • Array

JSON to SAP HANA Types

  • String|JSON|Text|default: VARCHAR, default length is 1024
  • Number: INTEGER
  • Date: TIMESTAMP
  • Timestamp: TIMESTAMP
  • Boolean: VARCHAR(1)

SAP HANA Types to JSON

  • VARCHAR(1): Boolean
  • VARCHAR|NVARCHAR|ALPHANUM|SHORTTEXT: String
  • VARBINARY: Binary
  • TINYINT|SMALLINT|INTEGER|BIGINT|SMALLDECIMAL|DECIMAL|REAL|DOUBLE: Number
  • DATE|TIME|SECONDDATE|TIMESTAMP: Date
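The SAP HANA-to-JSON rules above can be sketched as a plain lookup. This is an illustration of the documented mapping table, not the connector's own code:

```javascript
// Illustrative sketch of the "SAP HANA Types to JSON" mapping above.
function hanaTypeToJson(hanaType, length) {
  var t = hanaType.toUpperCase();
  // VARCHAR(1) is treated as Boolean per the table above.
  if (t === 'VARCHAR' && length === 1) return 'Boolean';
  if (['VARCHAR', 'NVARCHAR', 'ALPHANUM', 'SHORTTEXT'].indexOf(t) !== -1) return 'String';
  if (t === 'VARBINARY') return 'Binary';
  if (['TINYINT', 'SMALLINT', 'INTEGER', 'BIGINT',
       'SMALLDECIMAL', 'DECIMAL', 'REAL', 'DOUBLE'].indexOf(t) !== -1) return 'Number';
  if (['DATE', 'TIME', 'SECONDDATE', 'TIMESTAMP'].indexOf(t) !== -1) return 'Date';
  return 'String'; // fallback for unlisted types
}

console.log(hanaTypeToJson('VARCHAR', 1)); // → Boolean
console.log(hanaTypeToJson('INTEGER'));    // → Number
console.log(hanaTypeToJson('TIMESTAMP'));  // → Date
```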

Destroying Models

Destroying models may result in errors due to foreign key integrity. Make sure to delete any related model instances before deleting models that have relationships.

Auto Migrate / Auto Update

After making changes to your model properties, you must call Model.automigrate() or Model.autoupdate(). Call Model.automigrate() only for new models, because it drops and recreates existing tables, losing any data in them.

The LoopBack SAP HANA connector creates the following schema objects for a given model:

  • A table, for example, "product" under the default schema of the user
  • A sequence with name "product_seq" if the primary key "id" is auto-increment
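A sketch of choosing between the two operations. The model name is hypothetical, and a connected data source is required, so the calls are wrapped in a function (this uses the dataSource-level automigrate/autoupdate methods from loopback-datasource-juggler, which are equivalent to the Model-level calls mentioned above):

```javascript
// Sketch: syncing the database schema with a (hypothetical) 'Product' model.
function syncDatabase(dataSource, isNewModel, cb) {
  if (isNewModel) {
    // DESTRUCTIVE: drops and recreates the table, discarding existing rows.
    dataSource.automigrate('Product', cb);
  } else {
    // Non-destructive: alters the existing table to match the model.
    dataSource.autoupdate('Product', cb);
  }
}
```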

Running tests

npm test

  • Prerequisite for saphana.discover.test.js: execute tables.sql in SAP HANA Studio first

loopback-connector-saphana's People

Contributors

yangzhaox


loopback-connector-saphana's Issues

executeSQL() should be implemented by Connector

Some of the tests in the test folder fail with the error "executeSQL() should be implemented by the connector". Although the implementation is the same as the SAPHANA.prototype.query method, I think the method name should be changed. After renaming the method, it shows a "callback should be a function" error.

Has anyone seen this error before? Has anyone established a connection to HANA with this connector and been able to perform operations on the database?

SAP HANA Cloud Connection fails: Error: connect ECONNREFUSED

I'm trying to connect the API to SAP HANA Cloud, but when the API starts it gives me this error.

[2019-01-30 16:46:51.543] [ERROR] console - Connection fails: Error: connect ECONNREFUSED 18.232.233.169:20286
It will be retried for the next request.
events.js:167
throw er; // Unhandled 'error' event
^

Error: connect ECONNREFUSED 18.232.233.169:20286
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1113:14)
Emitted 'error' event at:
at DataSource.postInit (H:\odata-server\node_modules\loopback-datasource-juggler\lib\datasource.js:483:26)
at H:\odata-server\node_modules\loopback-connector-saphana\lib\saphana.js:37:21
at H:\odata-server\node_modules\loopback-connector-saphana\lib\saphana.js:72:20
at done (H:\odata-server\node_modules\hdb\lib\Client.js:153:7)
at onopen (H:\odata-server\node_modules\hdb\lib\Client.js:159:14)
at done (H:\odata-server\node_modules\hdb\lib\protocol\Connection.js:137:14)
at Socket.onerror (H:\odata-server\node_modules\hdb\lib\protocol\Connection.js:159:5)
at Socket.emit (events.js:182:13)
at emitErrorNT (internal/streams/destroy.js:82:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
at process._tickCallback (internal/process/next_tick.js:63:19)

I tried this connector with HANA on-premise and everything works perfectly.

Does this connector support SAP HANA Cloud?

SAP HANA Connector: POST failed with 500 Error Code

I created a REST API with the SAP HANA connector, and a POST request failed with HTTP error code 500.

Response Body:

    {
      "error": {
        "name": "Error",
        "status": 500,
        "message": "invalid sequence: CONTACTSeq: line 1 col 26 (at pos 25)",
        "code": 313,
        "sqlState": "HY000",
        "level": 1,
        "position": 25
      }
    }

where CONTACT is my model name.

I tried PUT, GET, and DELETE, and they worked fine. Only the POST call fails with the above error.

500 Error: query method should be declared in connector

Hello @jensonzhao, can you help me with a custom query?
kna1Repository.execute(sqlStmt, []);

    /**
     * Execute a command with given parameters
     * @param {String} command The command such as SQL
     * @param {Object[]} [params] An array of parameters
     * @param {Function} [callback] The callback function
     */
    Connector.prototype.execute = function (command, params, callback) {
      /*jshint unused:false */
      throw new Error('query method should be declared in connector');
    };
