Comments (4)
clone the repo
TVM is in https://github.com/mlc-ai/mlc-llm/tree/main/3rdparty
MLC_LLM is there, too
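A minimal sketch of that first step, assuming a recursive clone is enough to pull the vendored TVM under 3rdparty/ (network required):

```shell
# One-time setup: clone mlc-llm together with its submodules, so the
# vendored TVM checkout under 3rdparty/ comes along.
git clone --recursive https://github.com/mlc-ai/mlc-llm.git
cd mlc-llm
# The third-party trees (TVM among them) should now be present.
ls 3rdparty
```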
Here's some of the output of mlc_llm compile --help:

--output OUTPUT, -o OUTPUT
    The path to the output file. The suffix determines if the output file
    is a shared library or objects. Available suffixes:
    1) Linux: .so (shared), .tar (objects);
    2) macOS: .dylib (shared), .tar (objects);
    3) Windows: .dll (shared), .tar (objects);
    4) Android, iOS: .tar (objects);
    5) Web: .wasm (web assembly).
    (required)
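The suffix rules above can be condensed into a tiny helper, purely as an illustration (artifact_kind is hypothetical, and the commented compile line only sketches what a web build might look like; the config path is an assumption):

```shell
# Hypothetical helper mirroring the suffix table from the help text.
artifact_kind() {
  case "$1" in
    *.so|*.dylib|*.dll) echo "shared library" ;;
    *.tar)              echo "objects" ;;
    *.wasm)             echo "web assembly" ;;
    *)                  echo "unknown" ;;
  esac
}

artifact_kind model.wasm    # prints: web assembly
artifact_kind libmodel.so   # prints: shared library

# A web build therefore targets a .wasm output, roughly (paths assumed):
# mlc_llm compile ./dist/mlc-chat-config.json --device webgpu -o model.wasm
```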
after you have EMSDK installed per your directions up there, you should be able to build a WASM binary
from mlc-llm.
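For the EMSDK part, this is the standard Emscripten SDK install flow as documented in the emsdk README (the clone location is up to you; network required):

```shell
# One-time Emscripten SDK setup.
git clone https://github.com/emscripten-core/emsdk.git
cd emsdk
./emsdk install latest
./emsdk activate latest
# Put emcc and related tools on PATH for the current shell session.
source ./emsdk_env.sh
emcc --version
```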
MLC_HOME
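If MLC_HOME here means an environment variable pointing at the checkout, a sketch could look like the following; note that both the TVM_HOME pairing and the exact submodule path are assumptions, not confirmed by the thread:

```shell
# Assumption: MLC_HOME points at the mlc-llm clone, and TVM_HOME at the
# vendored TVM (the directory name under 3rdparty/ is assumed).
export MLC_HOME="$HOME/mlc-llm"
export TVM_HOME="$MLC_HOME/3rdparty/tvm"
echo "MLC_HOME=$MLC_HOME"
echo "TVM_HOME=$TVM_HOME"
```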
So this wouldn't work with the direct pip installs for MLC, right? I would have to build it from scratch instead to get a working MLC build for WASM?