Comments (3)
char *input_name = session.GetInputNameAllocated(i, allocator).get();
This is not a C++ support group, but I will have to do it. The API returns a smart pointer, which here is a temporary object, and in C++ a temporary's lifetime ends with the statement that produced it. So you obtain a raw pointer to the buffer, and that buffer is promptly deallocated once the statement finishes executing.
I agree that the API needs improvement.
from onnxruntime.
See example code: onnxruntime/onnxruntime/python/tools/tensorrt/perf/mem_test/main.cpp, lines 51 to 52 at commit 4d2b981.
This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
Related Issues (20)
- Please Add webpack and typescript configuration HOT 2
- How to use I/O binding if input tensor shape is not fixed HOT 2
- [Build] wheels of 1.17/1.18 not found installing with uv HOT 3
- [Mobile] android prod crash: signal 11 (SIGSEGV), code 1 (SEGV_MAPERR) HOT 2
- [Feature Request] prebuilt package with cudnn 9 support HOT 1
- When will onnxruntime make it available to get nodeunit inputs correctly for QLinearConcat HOT 4
- [Build] support for CPython 3.13.0b1
- [Feature Request] why is `FunctionProto` missing from the file "onnxruntime/core/providers/shared_library/provider_wrappedtypes.h"? HOT 3
- [Build] Dockerfile.tensorrt out of date HOT 3
- [Web] I can't use onnruntime-web to load a onnx model in a react web HOT 5
- Unauthorized for `onnxruntime-cuda-12/pypi/simple/`
- [Documentation] The documentation for early versions is missing HOT 2
- pip install failure for onnxruntime==1.17.3 HOT 1
- Index put loop model regression with ort==1.18
- Non-zero status code returned while running Add node. Name:'Add_221'
- can't install on Python HOT 1
- Error in quantize vicuna-7b model from fp16 to int8 HOT 4
- Issue of UNet Model Inference with [model.onnx + model.onnx_data] in ONNX Runtime C++ for Stable Diffusion HOT 9
- Gemm fp8 run error HOT 3
- Where is onnxruntime-openvino==1.18.0 HOT 3