inference-engine-node's Introduction

DISCONTINUATION OF PROJECT.

This project will no longer be maintained by Intel.

This project has been identified as having known security escapes.

Intel has ceased development and contributions including, but not limited to, maintenance, bug fixes, new releases, or updates, to this project.

Intel no longer accepts patches to this project.

Inference Engine Binding for Node.js*

*Other names and brands may be claimed as the property of others.

Prerequisites

Install Node.js.

Download OpenVINO and install it into the default path.

For Windows, install Visual Studio 2019.

For Linux, install build-essential package.

Verified configurations:

  • Node.js 12 LTS
  • OpenVINO 2021.4
  • Windows 10
  • Ubuntu Linux 18.04

Install

The Inference Engine Binding for Node.js supports two operating systems, Windows 10 and Ubuntu 18.04, and two build systems, node-gyp and CMake.

Install on Ubuntu 18.04

To install the Inference Engine Binding for Node.js on Ubuntu 18.04, use the following instructions:

  1. Open a terminal in the repository root folder.
  2. Activate the OpenVINO environment:

If you installed OpenVINO to the default /opt/intel/openvino directory (as root), use the following command:

$ source /opt/intel/openvino/bin/setupvars.sh

If you installed OpenVINO to the home directory ~/intel/openvino, use the following command:

$ source ~/intel/openvino/bin/setupvars.sh
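Either way, you can confirm the environment is active by checking one of the variables that setupvars.sh exports (INTEL_OPENVINO_DIR is one of them):

```shell
# Quick sanity check: setupvars.sh exports INTEL_OPENVINO_DIR among
# other variables; if it is empty, the environment is not active.
if [ -n "$INTEL_OPENVINO_DIR" ]; then
  echo "OpenVINO environment active: $INTEL_OPENVINO_DIR"
else
  echo "OpenVINO environment not active - run setupvars.sh first"
fi
```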

To install the Inference Engine Binding for Node.js using node-gyp, run the following command in the same terminal:

$ npm install

To install the Inference Engine Binding for Node.js using CMake, run the following commands in the same terminal:

  1. Set the NODE_PATH environment variable to the directory of the installed Node.js. For example:
$ export NODE_PATH=/home/user/.nvm/versions/node/v12.17.0/
  2. Create an empty build directory and change into it:
$ mkdir "cmake-build" && cd "cmake-build"
  3. Run cmake to fetch project dependencies and create Unix makefiles, then build the project:
$ cmake -DCMAKE_BUILD_TYPE=Release ../
$ cmake --build . --target inference_engine_node -- -j $(nproc --all)

Install on Windows 10

To install the Inference Engine Binding for Node.js on Windows 10, use the following instructions:

  1. Open a terminal in the repository root folder.
  2. Activate the OpenVINO environment:
> "C:\Program Files (x86)\IntelSWTools\openvino\bin\setupvars.bat"

To install the Inference Engine Binding for Node.js using node-gyp, run the following command in the same terminal:

> npm install

Note: If you see "Error: MSBuild is not set" with VS 2019, set msbuild_path, e.g.

> npm config set msbuild_path "c:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\MSBuild\Current\Bin\MSBuild.exe"

To install the Inference Engine Binding for Node.js using CMake, run the following commands in the same terminal:

  1. Set the NODE_PATH environment variable to the node-gyp cache directory of the installed Node.js version. For example:
> set NODE_PATH=C:\Users\user\AppData\Local\node-gyp\Cache\14.1.0\
  2. Create an empty build directory and change into it:
> mkdir "cmake-build" && cd "cmake-build"
  3. Run cmake to generate the project files, then build the Release configuration:
> cmake ../
> cmake --build . --target inference_engine_node --config Release

Build

$ npm run build
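A quick way to confirm the build succeeded is to look for the compiled addon; with node-gyp the output typically lands under build/Release/ (the exact path and file name below are assumptions — adjust to your setup):

```shell
# Check for the compiled addon. build/Release/ is node-gyp's default
# output directory; the addon file name is assumed here.
ADDON=build/Release/inference_engine_node.node
if [ -f "$ADDON" ]; then
  echo "Addon found: $ADDON"
else
  echo "Addon not found at $ADDON - check the build log"
fi
```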

Note: For a debug build on Windows, open the solution in Visual Studio, change the library path to "C:\Program Files (x86)\IntelSWTools\openvino\inference_engine\lib\intel64\Debug" and the library name to "inference_engined.lib".

Test

Set up the system environment variables for OpenVINO on Windows or Linux, then run:

$ npm test

Example

Set up the OpenVINO system environment variables as in the Test section.

API

API Doc

Development

Coding Style

inference-engine-node's People

Contributors

artyomtugaryov, dependabot[bot], huningxin, ibelem, lionkunonly, rdower


inference-engine-node's Issues

Inference Engine cannot make full use of logical CPUs.

I am trying to use the inference engine to run a network on my CPU. I want to make full use of my logical CPUs, but it usually uses only half of them. Is there any way to use all logical CPUs with the inference engine? And does an increasing number of logical CPUs bring better performance?

System info: Ubuntu 18.04 64-bit
CPU info: Intel(R) Core(TM) i7-8700 CPU
OpenVINO version: 2020.3

@argretzi

Multiple Async Models for MYRIAD

I'm having an issue where instantiating a new network on the NCS2 breaks any existing networks I've instantiated.

I have a pipeline that passes images through several models, and I want to run all of the inference jobs on the NCS2, but I can only seem to get this library to spin up one.

If I run all of the models on the CPU I don't have the same issue -- perhaps an issue in this library?

`npm i inference-engine-node` meets error for version 0.1.5

I tried to install inference-engine-node v0.1.5 in another project, but npm reports an error:

npm ERR! code ENOLOCAL
npm ERR! Could not install from "node_modules/inference-engine-node/example/hello_classification_node" as it does not contain a package.json file.

It is a strange error, because there is a package.json in hello_classification_node. I do not know why this error happens. Could you help solve it? @huningxin

A test bug on Core.getVersions('GPU')

In the file test/core.js, I try to execute core.getVersions('GPU') to make the GPU load the plugins in advance. This operation leads to an error like the following:

 TypeError: Failed to create plugin /opt/intel/openvino_2020.2.120/deployment_tools/inference_engine/lib/intel64/libclDNNPlugin.so for device GPU
Please, check your environment
[CLDNN ERROR]. clGetPlatformIDs error -1001

The Node.js API core.getVersions() binds the native Core::GetVersions() API. It seems to be a problem related to OpenCL, but I cannot find any information about such an error.

Support loading extension

It includes loading CPU extensions by AddExtension and GPU extensions by SetConfig.

The native sample code could be found at https://github.com/opencv/dldt/blob/2020/inference-engine/samples/classification_sample_async/main.cpp#L83 for CPU extension and https://github.com/opencv/dldt/blob/2020/inference-engine/samples/classification_sample_async/main.cpp#L89 for GPU extension.

Please check out more details of extension mechanism at: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Extensibility_DG_Intro.html

For the 2020 release, according to the comment, all ops are supported by MKLDNNPlugin, so there is no built-in CPU extension. The GPU extension is still there.
