Comments (6)
It occurs only when the Metal binary is not properly built. Would you like to double-check?
from mlc-llm.
Trying to build the iOS app from source. Everything is fine except for running the app on the iPhone: the app shows "Ready to chat", but after sending a message the app crashes, and Xcode shows:
libc++abi: terminating due to uncaught exception of type tvm::runtime::InternalError: [15:39:04] /Users/relax/src/runtime/relax_vm/lm_support.cc:247:
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
Check failed: uniform_sample <= data[0].first (0.0715982 vs. nan)
Stack trace:
[bt] (0) 1 MLCChat 0x0000000104c9f094 tvm::runtime::detail::LogFatal::Entry::Finalize() + 116
[bt] (1) 2 MLCChat 0x0000000104c9f020 tvm::runtime::detail::LogFatal::Entry::Finalize() + 0
[bt] (2) 3 MLCChat 0x0000000104c9e51c __clang_call_terminate + 0
[bt] (3) 4 MLCChat 0x0000000104d4f8a8 tvm::runtime::relax_vm::SampleTopPFromLogits(tvm::runtime::NDArray, double, double, double) + 1544
[bt] (4) 5 MLCChat 0x0000000104d557dc void tvm::runtime::TypedPackedFunc<int (tvm::runtime::NDArray, double, double, double)>::AssignTypedLambda<int (*)(tvm::runtime::NDArray, double, double, double)>(int (*)(tvm::runtime::NDArray, double, double, double), std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>)::'lambda'(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) const + 232
[bt] (5) 6 MLCChat 0x0000000104cc35ec mlc::llm::LLMChatModule::SampleFromLogitsOnCPU() + 348
[bt] (6) 7 MLCChat 0x0000000104cc1e60 mlc::llm::LLMChatModule::EncodeStep(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>>) + 504
[bt] (7) 8 MLCChat 0x0000000104cc1b54 mlc::llm::LLMChatModule::GetFunction(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::'lambda1'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const + 124
[bt] (8) 9 MLCChat 0x0000000104cc1acc tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<mlc::llm::LLMChatModule::GetFunction(std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char>> const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::'lambda1'(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)>>::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) + 40
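For context, the check that fires is in TVM's top-p sampler (SampleTopPFromLogits): it walks token probabilities in descending order and compares a uniform random draw against the accumulated probability mass. If any logit comes back NaN from the GPU (for instance when the Metal kernels are broken), every softmax probability becomes NaN and the comparison can never succeed. The following is a minimal Python sketch of that failure mode, not the actual TVM implementation; the function name and structure are illustrative only:

```python
import math

def top_p_sample(logits, u, top_p=0.95):
    """Minimal top-p (nucleus) sampling sketch, assuming a plain softmax.

    Returns the index of the sampled token, or None when the cumulative
    check never fires -- which is exactly what happens once a NaN sneaks
    into the logits, since every comparison against NaN is false.
    """
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)                      # one NaN logit makes z (and all probs) NaN
    probs = sorted(((e / z, i) for i, e in enumerate(exps)), reverse=True)
    cum = 0.0
    for p, i in probs:
        cum += p
        if u <= cum:                   # mirrors `uniform_sample <= data[0].first`
            return i
        if cum >= top_p:
            break
    return None                        # NaN path: the check above never passed

print(top_p_sample([2.0, 1.0, 0.1], u=0.5))           # -> 0
print(top_p_sample([2.0, float("nan"), 0.1], u=0.5))  # -> None
```

The real sampler in lm_support.cc works on sorted (cumulative probability, token) pairs in C++ and aborts via CHECK instead of returning None, but the root cause reported here is the same: NaN logits make the comparison unsatisfiable, and NaN logits on iOS typically point at the model/Metal library rather than at the sampler.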
Hey, how did you solve the problem? Can you please share?
from mlc-llm.
Metal binary
Could you please explain which build step the "Metal binary" is produced in?
from mlc-llm.
Sorry, I have not solved this problem yet.
from mlc-llm.
Thanks for replying, really appreciate it. Would you be more specific about which build step causes the Metal binary to not be built properly? Thanks.
from mlc-llm.
We've fixed several related issues in the past month. Could you double-check whether the issue persists? If so, please open a new issue with detailed information so that I can help!
from mlc-llm.
Related Issues (20)
- [Question] Is there an embeddings model in MLC format?
- [Question] Is Apple Silicon Neural Engine (ANE) and Core ML model package format supported? HOT 1
- [Model Request] OpenELM HOT 1
- [Question] Support for Custom Attention Mask
- [Bug] libc++abi: terminating due to uncaught exception of type tvm::runtime::InternalError: [14:02:26] HOT 3
- [Model Request] Microsoft Phi-3 mini Instruct (Faster and better then LLama 3 8B) HOT 2
- [Bug] Unexpected Error: The model weight size may be larger than GPU memory size HOT 5
- [Bug] TVMError: Check failed: (result) is false: Failed to allocate 99121664 bytes with alignment 16 bytes
- AutoTVM optimization? HOT 3
- Phi-3-3.8 billion model [Model Request] HOT 1
- [Question] Omniquant. (AFAIK) scores best for Q. Methods, why no adoption? In any case, is per-tensor quant. best for Mixtral/MoE models? HOT 1
- [Bug] Error: could not compile `regex-syntax`
- [Bug] `mlc_llm chat` throws errors for model `mlc-ai/Qwen1.5-1.8B-Chat-q4f16_1-MLC`
- [Bug] `system-lib-prefix` would be cleared if `device` is not strictly `android` while `mlc_llm compile` HOT 2
- /opt/AI/llm_obj/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc:232:31: error: cannot initialize a parameter of type 'void **' with an rvalue of type 'JNIEnv **' (aka 'JNIEnv_ **') 232 | _jvm->AttachCurrentThread(&env, nullptr); | ^~~~ /usr/local/java/jdk-17.0.11/include/jni.h:1938:37: note: passing argument to parameter 'penv' here 1938 | jint AttachCurrentThread(void **penv, void *args) { | ^ /opt/AI/llm_obj/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc:309:31: error: cannot initialize a parameter of type 'void **' with an rvalue of type 'JNIEnv **' (aka 'JNIEnv_ **') HOT 2
- [Question] What models actually work with function calling?
- [Question] Using FP8 quantization on a RedPajama-INCITE-Chat-3B-v1 model
- Phi-3 mini 4k instruct with MICROSOFT's quantization HOT 3
- Reduced and now seemingly removed support for Mali? HOT 5
- [Bug] Check failed: (args.size() == initial_indices_orig.size()) is false