Comments (15)
Check whether your environment is installed correctly. From the log, the fatal error is that no device was found; the other messages are warnings and do not affect execution. [F 5/ 9 20:23:58.112 ...driver/huawei_ascend_npu/model_client.cc:54 InitAclClientEnv] Check failed: (reinterpret_cast(aclrtSetDevice(device_id_)) == ACL_ERROR_NONE): 507033!==0 507033 Unknown ACL error code(507033)
from paddle-lite.
Thank you. Is this device error caused by the Huawei environment, or by the Paddle-Lite / FastDeploy environment? Knowing that would help me narrow down the investigation.
from paddle-lite.
This is most likely a problem with your Ascend environment installation itself; please check the driver installation.
from paddle-lite.
Hello. I ran the Ascend environment health and compatibility checks, fixed the incompatibilities they reported, and also had a Huawei engineer review the setup. The NPU environment now runs normally; the environment versions are as follows:
Running inference with PP-YOLOE-R again still produces the same error.
Should I reinstall Paddle-Lite or FastDeploy, or would it help if I provided the error logs or other information?
The project is about to go through acceptance, so I would greatly appreciate your continued help!
from paddle-lite.
Hello. We have not yet adapted to CANN 8.x; the highest version we have verified is 6.x, so you may need to do the adaptation yourself.
from paddle-lite.
While debugging, you can also enable Ascend logging and inspect the log.
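For reference, CANN's logging switches can be exported before running the demo (a sketch; the level values follow the CANN documentation, where lower numbers are more verbose):

```shell
# Print device-side logs to stdout instead of the log files under the home directory.
export ASCEND_SLOG_PRINT_TO_STDOUT=1
# 0 = debug, 1 = info, 2 = warning, 3 = error; use 0 or 1 while debugging.
export ASCEND_GLOBAL_LOG_LEVEL=1
```

Note that the demo's run.sh already sets these variables with level 3 (error only), so lowering the level there is the quickest way to see why the device initialization fails.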
from paddle-lite.
Please confirm that FastDeploy is actually calling the Paddle-Lite you compiled. Do you need to set export LD_LIBRARY_PATH? This looks like a version mismatch, such that the expected .so was not found.
from paddle-lite.
You could try running our official demo first to rule out interference from FastDeploy; if the official demo works, then integrate with FastDeploy.
from paddle-lite.
Hello. I downloaded PaddleLite-generic-demo.tar.gz and ran a demo with:
./run.sh mobilenet_v1_fp32_224 imagenet_224.txt test linux amd64 huawei_ascend_npu
The program reported an error and exited.
My Paddle-Lite was built from the GitHub source with:
./lite/tools/build_linux.sh --arch=x86 --with_extra=ON --with_log=ON --with_exception=ON --with_nnadapter=ON --nnadapter_with_huawei_ascend_npu=ON --nnadapter_huawei_ascend_npu_sdk_root=/usr/local/Ascend/ascend-toolkit/latest full_publish
It was built directly on the physical machine, not in a container. How do I uninstall Paddle-Lite? My Ascend setup had system compatibility issues at the time, so I plan to uninstall and reinstall Paddle-Lite, or install it in the officially recommended container.
from paddle-lite.
Since you are running the demo rather than Python, just specify the LD_LIBRARY_PATH manually by adjusting the run.sh script; there is no need to uninstall anything.
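As a sketch, the change amounts to prepending the library directory of your own Paddle-Lite build to LD_LIBRARY_PATH before the demo is launched (the build-output path below is an assumption; adjust it to wherever your full_publish build placed libpaddle_full_api_shared.so):

```shell
# Hypothetical location of the full_publish build output; adjust to your tree.
PADDLE_LITE_LIB_DIR=/path/to/Paddle-Lite/build.lite.linux.x86.gcc/inference_lite_lib/cxx/lib
# Prepend it so the demo resolves your freshly built libpaddle_full_api_shared.so
# instead of a stale copy elsewhere on the system.
export LD_LIBRARY_PATH=$PADDLE_LITE_LIB_DIR:$LD_LIBRARY_PATH
```

Prepending (rather than appending) matters: the dynamic loader searches LD_LIBRARY_PATH left to right, so the directory listed first wins when two copies of the library exist.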
from paddle-lite.
Which path do you mean by LD_LIBRARY_PATH, and which part of run.sh needs to change?
The Ascend-related part of run.sh is as follows:
if [[ "$NNADAPTER_DEVICE_NAMES" =~ "huawei_ascend_npu" ]]; then
  HUAWEI_ASCEND_TOOLKIT_HOME="/usr/local/Ascend/ascend-toolkit/latest"
  if [ "$TARGET_OS" == "linux" ]; then
    if [[ "$TARGET_ABI" != "arm64" && "$TARGET_ABI" != "amd64" ]]; then
      echo "Unknown OS $TARGET_OS, only supports 'arm64' or 'amd64' for Huawei Ascend NPU."
      exit -1
    fi
  else
    echo "Unknown OS $TARGET_OS, only supports 'linux' for Huawei Ascend NPU."
    exit -1
  fi
  if [[ ! "$NNADAPTER_CONTEXT_PROPERTIES" =~ "HUAWEI_ASCEND_NPU_SELECTED_DEVICE_IDS" ]]; then
    NNADAPTER_CONTEXT_PROPERTIES="HUAWEI_ASCEND_NPU_SELECTED_DEVICE_IDS=0;${NNADAPTER_CONTEXT_PROPERTIES}"
  fi
  export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/Ascend/driver/lib64/driver:/usr/local/Ascend/driver/lib64:/usr/local/Ascend/driver/lib64/stub:$HUAWEI_ASCEND_TOOLKIT_HOME/acllib/lib64:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/lib64:$HUAWEI_ASCEND_TOOLKIT_HOME/opp/op_proto/built-in
  export PYTHONPATH=$PYTHONPATH:$HUAWEI_ASCEND_TOOLKIT_HOME/fwkacllib/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/acllib/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/toolkit/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/python/site-packages:$HUAWEI_ASCEND_TOOLKIT_HOME/pyACL/python/site-packages/acl
  export PATH=$PATH:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/ccec_compiler/bin:${HUAWEI_ASCEND_TOOLKIT_HOME}/acllib/bin:$HUAWEI_ASCEND_TOOLKIT_HOME/atc/bin
  export ASCEND_AICPU_PATH=$HUAWEI_ASCEND_TOOLKIT_HOME
  export ASCEND_OPP_PATH=$HUAWEI_ASCEND_TOOLKIT_HOME/opp
  export TOOLCHAIN_HOME=$HUAWEI_ASCEND_TOOLKIT_HOME/toolkit
  export ASCEND_SLOG_PRINT_TO_STDOUT=1
  export ASCEND_GLOBAL_LOG_LEVEL=3
fi
Part of the demo's error output is below; the first few lines are the same as the FastDeploy errors:
[F 5/10 20:28:31. 79 .../src/driver/huawei_ascend_npu/utility.cc:315 BuildOMModelToBuffer] Check failed: (reinterpret_cast<ge::graphStatus>(aclgrphBuildModel(ir_graph, options, om_buffer)) == ge::GRAPH_SUCCESS): 1343266818!==0 1343266818 Unknown ATC error code(1343266818)
[F 5/10 20:28:31. 79 .../src/driver/huawei_ascend_npu/utility.cc:315 BuildOMModelToBuffer] Check failed: (reinterpret_cast<ge::graphStatus>(aclgrphBuildModel(ir_graph, options, om_buffer)) == ge::GRAPH_SUCCESS): 1343266818!==0 1343266818 Unknown ATC error code(1343266818)
[F 5/10 20:28:31.258 ...ter/nnadapter/src/runtime/compilation.cc:98 ~Program] Check failed: device_context: No device found.
[F 5/10 20:28:31.258 ...ter/nnadapter/src/runtime/compilation.cc:98 ~Program] Check failed: device_context: No device found.
terminate called after throwing an instance of 'nnadapter::logging::Exception'
what(): NNAdapter C++ Exception:
[F 5/10 20:28:31.258 ...ter/nnadapter/src/runtime/compilation.cc:98 ~Program] Check failed: device_context: No device found.
./run.sh: line 177: 24022 Aborted (core dumped) ./$BUILD_DIR/demo ../assets/models/$MODEL_NAME ../assets/configs/$CONFIG_NAME ../assets/datasets/$DATASET_NAME $NNADAPTER_DEVICE_NAMES "$NNADAPTER_CONTEXT_PROPERTIES" $NNADAPTER_MODEL_CACHE_DIR $NNADAPTER_MODEL_CACHE_TOKEN $NNADAPTER_SUBGRAPH_PARTITION_CONFIG_PATH $NNADAPTER_MIXED_PRECISION_QUANTIZATION_CONFIG_PATH
[ERROR] TBE(24150,python3):2024-05-10-20:28:31.489.433 [../../../../../../latest/python/site-packages/tbe/common/repository_manager/utils/repository_manager_log.py:30][log] [../../../../../../latest/python/site-packages/tbe/common/repository_manager/route.py:61][repository_manager] Subprocess[task_distribute] raise error[]
(the same TBE repository_manager error is printed by seven more worker processes, PIDs 24147-24154)
Process ForkServerPoolWorker-2 through ForkServerPoolWorker-9 then abort with interleaved BrokenPipeError tracebacks; a representative one:
Traceback (most recent call last):
  File "/usr/local/python3.7.5/lib/python3.7/multiprocessing/pool.py", line 127, in worker
    put((job, i, result))
  File "/usr/local/python3.7.5/lib/python3.7/multiprocessing/queues.py", line 364, in put
    self._writer.send_bytes(obj)
  File "/usr/local/python3.7.5/lib/python3.7/multiprocessing/connection.py", line 200, in send_bytes
    self._send_bytes(m[offset:offset + size])
  File "/usr/local/python3.7.5/lib/python3.7/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/usr/local/python3.7.5/lib/python3.7/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
/usr/local/python3.7.5/lib/python3.7/multiprocessing/semaphore_tracker.py:144: UserWarning: semaphore_tracker: There appear to be 39 leaked semaphores to clean up at shutdown
  len(cache))
from paddle-lite.
If your Ascend environment variables are set correctly and the error still occurs, then a separate adaptation is needed; please enable the Ascend logs to find the root cause.
from paddle-lite.
Hello. This problem looks hard to solve; the 8.0.RC1 Huawei software may simply be difficult to support. I plan to start over: uninstall Paddle-Lite completely, downgrade the Ascend software to 6.0.RC1, and test the demo again. A few questions:
1. Do you know how to uninstall Paddle-Lite? The official site does not seem to document it.
2. Is it reasonable to build Paddle-Lite directly on the physical machine, i.e. download it from GitHub and run ./lite/tools/build_linux.sh --arch=x86 --with_extra=ON --with_log=ON --with_exception=ON --with_nnadapter=ON --nnadapter_with_huawei_ascend_npu=ON --nnadapter_huawei_ascend_npu_sdk_root=/usr/local/Ascend/ascend-toolkit/latest full_publish?
3. Does model deployment require downloading Ascend's nnae rather than nnrt?
4. If I only run the demo in PaddleLite-generic-demo.tar.gz, does it require step 2 to be completed first?
from paddle-lite.
1. If Paddle-Lite was not installed via Python, there is nothing to uninstall; it takes effect at runtime once LD_LIBRARY_PATH points to it. If it was installed via pip, uninstall it with pip uninstall paddlelite.
2. Paddle-Lite runs as a dynamic library, so as long as the correct path is set at runtime, both a physical machine and Docker work.
3. We also use nnrt; see below.
4. If your installed environment differs from the one the Paddle-Lite prebuilt libraries target, you may need to recompile. But once your environment is set up, try our precompiled library first and recompile only if it fails.
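To illustrate points 1 and 2, here is a minimal sanity check (a sketch; libpaddle_full_api_shared.so is the library name produced by a full_publish build) that the Paddle-Lite library is actually visible on LD_LIBRARY_PATH before launching:

```shell
# Walk each directory on LD_LIBRARY_PATH and report whether the
# full-API Paddle-Lite shared library is visible to the dynamic loader.
LIB=libpaddle_full_api_shared.so
found=0
IFS=:
for d in $LD_LIBRARY_PATH; do
  if [ -f "$d/$LIB" ]; then
    echo "found $d/$LIB"
    found=1
    break
  fi
done
unset IFS
[ "$found" -eq 1 ] || echo "warning: $LIB not on LD_LIBRARY_PATH"
```

If the warning is printed, the demo would fall back to whatever library the loader finds elsewhere, which matches the "version mismatch / so not found" symptom discussed above.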
from paddle-lite.
I noticed your point about environment variables. Could you help check whether my environment variables contain the correct Paddle-Lite paths you mentioned? My Paddle-Lite was not built through Python.
from paddle-lite.