Comments (5)
@bibekyess, please review the return statement in the conversion function: the example from the documentation uses .outputs(), which returns the list of output ports of a node, but your example returns a list of nodes instead. The error message complains about that list. So please replace:
return [ops.grid_sample(node.get_input(0), node.get_input(1), attributes)]
by
return ops.grid_sample(node.get_input(0), node.get_input(1), attributes).outputs()
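Putting that change into the conversion callback discussed in this thread gives roughly the following sketch. The openvino import is deferred into the callback so the pure attribute-translation helper can be run and tested standalone; the integer attribute codes (0/1/2) follow mmcv's grid_sampler convention and are an assumption here, as are the helper names.

```python
# mmdeploy encodes these attributes as integers; GridSample wants strings.
INTERP_MODES = {0: "bilinear", 1: "bicubic", 2: "nearest"}
PADDING_MODES = {0: "zeros", 1: "border", 2: "reflection"}

def translate_attrs(interpolation_mode, padding_mode, align_corners):
    """Translate mmdeploy.grid_sampler attributes to GridSample attributes."""
    return {
        "align_corners": bool(align_corners),
        "mode": INTERP_MODES[interpolation_mode],
        "padding_mode": PADDING_MODES[padding_mode],
    }

def conversion(node):
    # Deferred import: openvino is only needed when the ONNX frontend
    # actually invokes this callback during read_model().
    import openvino.runtime.opset9 as ops

    attributes = translate_attrs(
        node.get_attribute("interpolation_mode", 0),
        node.get_attribute("padding_mode", 0),
        node.get_attribute("align_corners", 0),
    )
    # .outputs() hands the frontend the node's output ports, which is what
    # it expects, rather than a Python list containing the node itself.
    return ops.grid_sample(node.get_input(0), node.get_input(1), attributes).outputs()
```

Note that the integer defaults (0) matter: if a default is already a string such as "bilinear", looking it up in an int-keyed mapping silently yields None.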
from openvino.
Hello @bibekyess,
First, let me explain why you are observing this behavior. You asked why this line is not working:

core.add_extension(OpExtension("grid_sampler", "mmdeploy.grid_sampler"))

Short answer: it does not work because there is no operation named "grid_sampler" in OpenVINO, yet that is the name you are passing as the first parameter to OpExtension. This name must refer to an operation that already exists on the OpenVINO side, either a standard one (see https://docs.openvino.ai/2024/documentation/openvino-ir-format/operation-sets/available-opsets/opset14.html) or one registered as another extension. At that point in your code, nothing named "grid_sampler" exists on the OpenVINO side.
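To make the name-resolution point concrete, here is a toy sketch in plain Python. This is not OpenVINO internals; KNOWN_OPENVINO_OPS and resolve_op are invented for illustration only.

```python
# Toy model of the lookup that OpExtension relies on: the first argument
# must name an operation OpenVINO already knows about.
KNOWN_OPENVINO_OPS = {"GridSample", "Convert", "Multiply"}  # tiny illustrative subset

def resolve_op(name):
    if name not in KNOWN_OPENVINO_OPS:
        raise KeyError(f"no OpenVINO operation named {name!r}")
    return name

resolve_op("GridSample")           # succeeds: a real opset operation
try:
    resolve_op("grid_sampler")     # fails: this name exists only on the mmdeploy side
except KeyError as err:
    print(err)
```

That is why the framework op name "mmdeploy.grid_sampler" has to be mapped onto an existing operation such as GridSample, rather than the other way around.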
Now, about the blog you mentioned:

> I found one nice blog with the same issue as mine: https://blog.openvino.ai/blog-posts/openvino-frontend-extension-samples, but I couldn't follow it because it seems the blog is not self-sufficient. There is C++ code, but I couldn't understand where to put that code inside the openvino repo to build it.
The blog is actually self-sufficient, but it assumes you are writing a C++ application, not a Python one. The code does not need to be put inside openvino; it is just part of your own C++ application. And the code in that blog is exactly what you need: it maps mmdeploy.grid_sampler to an existing operation from the OpenVINO opset, GridSample, specified here: https://docs.openvino.ai/2024/documentation/openvino-ir-format/operation-sets/operation-specs/image/grid-sample-9.html. In the blog, ConversionExtension is used to facilitate the operation translation, and this extension type is also described in the documentation you mentioned: https://docs.openvino.ai/2022.3/openvino_docs_Extensibility_UG_Frontend_Extensions.html#mapping-to-multiple-operations-with-conversionextension.
So, finally, to get this working in Python without any C++ involvement, you need to follow the steps in the code from that blog and rewrite them in Python, namely the part that implements ConversionExtension for grid_sampler ("Step 2. Add Extension"). The main part of that code just translates operation attributes from mmdeploy.grid_sampler semantics to OpenVINO GridSample semantics; it looks straightforward to follow and reproduce the same steps in Python. The document I mentioned (https://docs.openvino.ai/2022.3/openvino_docs_Extensibility_UG_Frontend_Extensions.html#mapping-to-multiple-operations-with-conversionextension) provides an example of how to use Python in this case (the last example on the page, with ThresholdedRelu). Please try it and let us know if you run into issues along the way.
Great! Thank you @slyalin for your detailed response.
Following your suggestions, I wrote this code:
import openvino as ov
import openvino.runtime.opset9 as ops
from openvino.frontend.onnx import ConversionExtension

core = ov.Core()

def conversion(node):
    mapping = {0: "bilinear", 1: "bicubic", 2: "nearest"}
    padmapping = {0: "zeros", 1: "border", 2: "reflection"}
    attributes = {}
    attributes['align_corners'] = node.get_attribute("align_corners", False)
    attributes['mode'] = mapping.get(node.get_attribute("interpolation_mode", "bilinear"))
    attributes["padding_mode"] = padmapping.get(node.get_attribute("padding_mode", "zeros"))
    return [ops.grid_sample(node.get_input(0), node.get_input(1), attributes)]

core.add_extension(ConversionExtension("grid_sampler", "mmdeploy", conversion))
dino_ov = core.read_model("../sample.onnx")
It is giving me a type casting error. Any idea how to solve this?
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[1], line 20
     17 return [ops.grid_sample(node.get_input(0), node.get_input(1), attributes)]
     19 core.add_extension(ConversionExtension("grid_sampler", "mmdeploy", conversion))
---> 20 dino_ov = core.read_model("/home/bibekyess/mmcv/mmdetection/mmdeploy_model/dino/end2end.onnx")

File ~/miniconda3/envs/yolo-env-cpu/lib/python3.9/site-packages/openvino/runtime/ie_api.py:479, in Core.read_model(self, model, weights)
    477     return Model(super().read_model(model, weights))
    478 else:
--> 479     return Model(super().read_model(model))

RuntimeError: Exception from src/inference/src/cpp/core.cpp:92:
Check 'error_message.empty()' failed at src/frontends/onnx/frontend/src/frontend.cpp:122:
FrontEnd API failed with GeneralFailure:
Errors during ONNX translation:
While validating ONNX node '<Node(grid_sampler): /encoder/layers.0/self_attn/grid_sampler_4>':
Unable to cast Python instance of type <class 'list'> to C++ type '?' (#define PYBIND11_DETAILED_ERROR_MESSAGES or compile in debug mode for details)
While validating ONNX node '<Node(grid_sampler): /encoder/layers.0/self_attn/grid_sampler_3>':
Unable to cast Python instance of type <class 'list'> to C++ type '?' (#define PYBIND11_DETAILED_ERROR_MESSAGES or compile in debug mode for details)
While validating ONNX node '<Node(grid_sampler): /encoder/layers.0/self_attn/grid_sampler_2>':
...
Unable to cast Python instance of type <class 'list'> to C++ type '?' (#define PYBIND11_DETAILED_ERROR_MESSAGES or compile in debug mode for details)
I tried some workarounds like ops.convert(node.get_input(0), np.dtype) and ops.convert(node.get_input(0), ov.runtime.Tensor), but they did not work.
GridSample expects this:
Inputs
1: data - Input tensor of type T with data to be sampled. This input is expected to be a 4-dimensional tensor with NCHW layout. Required.
2: grid - A 4-dimensional tensor containing normalized sampling coordinates(pairs of floats). The shape of this tensor is [N, H_out, W_out, 2] and the data type is T1. Required.
Outputs
1: A 4-dimensional tensor of type T with [N, C, H_out, W_out] shape. It contains the interpolated values calculated by this operator.
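The shape contract quoted above can be expressed as a small helper. This is plain Python for illustration only; grid_sample_output_shape is an invented name, not an OpenVINO API.

```python
def grid_sample_output_shape(data_shape, grid_shape):
    """Output shape of GridSample per the spec quoted above.

    data: [N, C, H, W]; grid: [N, H_out, W_out, 2] -> output: [N, C, H_out, W_out].
    """
    n, c, _, _ = data_shape
    n_grid, h_out, w_out, two = grid_shape
    if two != 2:
        raise ValueError("grid's last dimension must be 2 (x, y coordinate pairs)")
    if n != n_grid:
        raise ValueError("data and grid must share the batch dimension N")
    return (n, c, h_out, w_out)
```

Note that both inputs must be 4-dimensional; the dynamic shape[?,?,?,?] in the printout below still satisfies that rank requirement.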
The current node.get_input(0) looks like this: <Output: names[/encoder/layers.0/self_attn/Reshape_5_output_0] shape[?,?,?,?] type: f32>, with input_type <Type: 'float32'>.
Thank you for your time and help! :)
Great! Issue is solved. Thank you so much for your help! :)
@andrei-kochin, if we even have a blog post about enabling this operation, and this question has been raised here, why not add it to the set of supported operations in the ONNX FE? Looks quite straightforward. GFI or something.