intel / inference-engine-node
Bringing hardware-accelerated deep learning inference to Node.js and Electron.js apps.
License: Apache License 2.0
Hi @lionkunonly, what's the license of the hello_classification_electron/main.js you are using? Is it from Electron upstream? Thanks
I'm having an issue where instantiating a new network on the NCS2 breaks any existing networks I've instantiated.
I have a pipeline that passes images through several models, and I want to run all of the inference jobs on the NCS2, but I can only seem to get this library to spin up one.
If I run all of the models on the CPU, I don't have the same issue -- perhaps a bug in this library?
Test "InferRequest.getBlob should throw for invalid arguments" fails against OpenVINO 2020.1.
It maps to Core::GetAvailableDevices. The JS API sketch could be:
partial interface Core {
sequence<DOMString> getAvailableDevices();
};
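A sketch of how the proposed binding might look from the JS side. The Core constructor and the return shape are assumptions, and a stub stands in for the native addon so the snippet runs as-is:

```javascript
// Stub standing in for the native binding; a real implementation would
// call Core::GetAvailableDevices() and convert the result to an array
// of strings, one per discovered device.
class Core {
  getAvailableDevices() {
    // A plausible device list on a desktop with an NCS2 plugged in.
    return ['CPU', 'GPU', 'MYRIAD'];
  }
}

const core = new Core();
const devices = core.getAvailableDevices();
console.log(devices.join(', '));
```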
@huningxin I think inference-engine-node should be updated to 0.5 on npmjs.com, since it now supports OpenVINO 2021.1 and has some API changes.
The gaps between the Node.js API and the Python IE API will be recorded here.
In the file test/core.js, I try to call core.getVersions('GPU') to make the GPU load its plugins in advance. This operation leads to an error like the following:
TypeError: Failed to create plugin /opt/intel/openvino_2020.2.120/deployment_tools/inference_engine/lib/intel64/libclDNNPlugin.so for device GPU
Please, check your environment
[CLDNN ERROR]. clGetPlatformIDs error -1001
The Node.js API core.getVersions() binds the native Core.GetVersions() API. It seems to be a problem related to OpenCL, but I cannot find any information about such an error.
The inference-engine-node on npmjs.com is old. Some APIs, like core.getAvailableDevices(), are not supported.
I tried to install inference-engine-node v0.1.5 in another project, but npm reports an error:
npm ERR! code ENOLOCAL
npm ERR! Could not install from "node_modules/inference-engine-node/example/hello_classification_node" as it does not contain a package.json file.
It is a strange error, because there is a package.json in hello_classification_node. I do not know why this error happens. Could you help solve it? @huningxin
Test "Core.getVersions should return a map" fails. It currently returns an object.
We can fix it later once N-API supports Map; see nodejs/node-addon-api#472 for details.
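In the meantime, a caller who wants a real Map can wrap the plain object themselves. A minimal sketch; the version fields shown are illustrative, not the exact shape the binding returns:

```javascript
// core.getVersions() currently returns a plain object keyed by device
// name; Object.entries() turns it into a Map with the same contents.
const versionsObject = {
  CPU: { description: 'MKLDNNPlugin', apiVersion: '2.1' },
};

const versionsMap = new Map(Object.entries(versionsObject));
console.log(versionsMap.get('CPU').description);  // MKLDNNPlugin
```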
It would be good to get the target device list via the getAvailableDevices API in the Electron.js example. Refer to #16
I'll try to use PreProcess interface to optimize the image resizing and color conversion of node.js image classification example.
After #39 is merged, the createCore API is removed. There are some issues:
npm test fails; the test cases need to be fixed.
The API doc also needs to be fixed.
The Electron example needs to be fixed.
@artyomtugaryov, please help. CC @lionkunonly.
Support for mean and std values is required. Please refer to the API doc of PreProcessChannel.
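A sketch of how per-channel mean/std might look on the JS side, mirroring the native PreProcessChannel's meanValue and stdScale fields. The JS property names are assumptions, and stubs stand in for the native binding so the snippet runs as-is:

```javascript
// Stub mirroring the native PreProcessChannel, which carries a
// per-channel mean value and std scale used to normalize input data.
class PreProcessChannel {
  constructor() {
    this.meanValue = 0;
    this.stdScale = 1;
  }
}

// Stub mirroring PreProcessInfo::init(numberOfChannels), which
// allocates one PreProcessChannel per input channel.
class PreProcessInfo {
  init(numberOfChannels) {
    this.channels = Array.from({ length: numberOfChannels },
                               () => new PreProcessChannel());
  }
}

// Typical ImageNet-style normalization for a 3-channel (RGB) input.
const preProcess = new PreProcessInfo();
preProcess.init(3);
const means = [123.675, 116.28, 103.53];
const scales = [58.395, 57.12, 57.375];
preProcess.channels.forEach((channel, i) => {
  channel.meanValue = means[i];
  channel.stdScale = scales[i];
});
```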
Clicking the icon should navigate to https://github.com/intel/inference-engine-node
I use the inference engine to run a network on my CPU. I want to make full use of my logical CPUs, but it usually uses only half of them. Is there any way to use all logical CPUs with the inference engine? And does increasing the number of logical CPUs in use bring better performance?
System info: Ubuntu 18.04 64-bit
CPU info: Intel(R) Core(TM) i7-8700 CPU
OpenVINO version: 2020.3
It maps to PreProcessInfo. It allows setting up the input color format, resize algorithm, and others. The possible usage could be:
input_info.preProcess.setResizeAlgorithm('resize-bilinear');
input_info.preProcess.setColorFormat('rgbx');
It includes loading CPU extensions by AddExtension and GPU extensions by SetConfig.
The native sample code could be found at https://github.com/opencv/dldt/blob/2020/inference-engine/samples/classification_sample_async/main.cpp#L83 for CPU extension and https://github.com/opencv/dldt/blob/2020/inference-engine/samples/classification_sample_async/main.cpp#L89 for GPU extension.
Please check out more details of extension mechanism at: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_Extensibility_DG_Intro.html
For the 2020 release, according to the comment, all ops are supported by MKLDNNPlugin, so there is no built-in CPU extension. The GPU extension is still there.
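A hypothetical JS shape mirroring the two native calls linked above: AddExtension for a CPU extension library, and SetConfig with a kernel-description file for the GPU. The addExtension/setConfig method names on the JS side are assumptions, and stubs stand in for the native binding so the snippet runs as-is:

```javascript
// Stub standing in for the native binding.
class Core {
  constructor() { this.loaded = []; }
  addExtension(libraryPath, deviceName) {
    // Native: Core::AddExtension(IExtensionPtr, deviceName) loads a
    // shared library with custom layer implementations.
    this.loaded.push({ libraryPath, deviceName });
  }
  setConfig(config, deviceName) {
    // Native: Core::SetConfig({{KEY_CONFIG_FILE, path}}, deviceName)
    // points the GPU plugin at an XML describing custom kernels.
    this.loaded.push({ config, deviceName });
  }
}

const core = new Core();
// CPU extension: a shared library (path here is illustrative).
core.addExtension('libcpu_extension.so', 'CPU');
// GPU extension: an XML config file (path here is illustrative).
core.setConfig({ CONFIG_FILE: 'custom_layers.xml' }, 'GPU');
console.log(core.loaded.length);  // 2
```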