Comments (14)
I can see why one would want to load a model at runtime instead of baking it into executable code. One way to achieve this is to have Treelite convert a model (in any supported format) into a common format. Then, at deployment time, you'd ship prediction logic that loads the common-format model at runtime. The format conversion is necessary so that the prediction logic only has to deal with one single format instead of multiple (XGBoost, LightGBM, etc.).
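A minimal sketch of that idea, assuming a hypothetical common format in which each tree node is a plain JSON object (the field names here are illustrative, not an actual Treelite schema): one runtime walks the same node layout no matter which library produced the model.

```python
import json

# Hypothetical common format: every converter (XGBoost, LightGBM, ...)
# would emit nodes of this shape, so the runtime only knows one schema.
MODEL_JSON = """
{
  "feature": 0, "threshold": 0.5,
  "left":  {"leaf": -1.0},
  "right": {"leaf":  1.0}
}
"""

def predict_node(node, row):
    """Walk a single tree: descend left when the split condition holds."""
    if "leaf" in node:
        return node["leaf"]
    branch = "left" if row[node["feature"]] < node["threshold"] else "right"
    return predict_node(node[branch], row)

tree = json.loads(MODEL_JSON)
print(predict_node(tree, [0.2]))  # feature 0 = 0.2 < 0.5, goes left  -> -1.0
print(predict_node(tree, [0.9]))  # feature 0 = 0.9 >= 0.5, goes right -> 1.0
```

The point of the sketch is that only the converters need to know about XGBoost or LightGBM internals; the deployed predictor depends on nothing but the common schema.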
Let me get back to you when I have a time estimate for this feature.
from treelite.
> Let me get back to you when I have a time estimate for this feature.
Great. Let me know if I can help.
> One way to achieve this is to have treelite convert a model (in any supported format) into a common format.
Perhaps the trees could be loaded and then serialized with something like cereal, which provides a lot of format flexibility: portable binary archives, XML, JSON, etc. Thoughts?
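The appeal of cereal is one serialization routine feeding several archive formats. A rough Python analogue of that flexibility, using an illustrative tree node (not Treelite's internal representation): one in-memory structure, multiple portable encodings.

```python
import json
import pickle

# Illustrative tree node; not Treelite's internal representation.
node = {"feature": 2, "threshold": 0.7,
        "left": {"leaf": 0.1}, "right": {"leaf": 0.9}}

# One in-memory structure, several encodings -- the same flexibility
# cereal offers in C++ with its binary/XML/JSON archives.
as_json = json.dumps(node)      # human-readable, cross-language
as_binary = pickle.dumps(node)  # compact binary (Python-specific here;
                                # cereal's portable binary archive is the
                                # cross-platform equivalent in C++)

# Both encodings round-trip to the identical structure.
restored_json = json.loads(as_json)
restored_binary = pickle.loads(as_binary)
```

In C++, cereal achieves this by templating a single `serialize()` method over the archive type, so supporting a new format costs nothing on the model side.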
Let me get back to you when I have a time estimate for this feature.
Any updates on this one?
Background: I have a few regression/detection tasks for mobile devices where this could significantly streamline the xgboost model deployment.
Seems to be fixed by #45?
@headupinclouds I don't think it goes all the way to what you want, since models will still need to be compiled in order to be fed into the runtime.
@headupinclouds Conceptually, to do what you want, we'd need to add a new runtime that directly reads from Protobuf. Any reason why the current approach (generating shared libs) doesn't work for your scenario?
> since models will still need to be compiled in order to be fed into the runtime.
I see, thanks for the clarification.
> Any reason why current approach (generating shared libs) doesn't work for your scenario?
I'm interested in using Treelite as a general-purpose alternative to statically linking XGBoost directly in a couple of real-time, cross-platform computer vision applications, although I see my requirements may not be in line with the current project goals. I'd like a single universal prediction module that supports a variety of pretrained regression and classification tasks, where different model sizes may be required due to varying application-specific and platform constraints: a mobile model <= 1 MB may be required for some builds, while a more accurate 8 MB model may be appropriate for desktop platforms. You mentioned SHARED libs, although I'm guessing this isn't a hard constraint. For iOS, one would have to generate a SHARED lib in the form of a dynamic framework, which takes some work. In my use case, static linking is generally preferable.
@headupinclouds You can use static linking for the prediction runtime. Currently, the model itself will be encoded as a .so file, since Treelite produces executable code for the model. What is the advantage of reading a Protobuf file instead of reading a .so file? Either way, you need to read one file for each model.
I can have one Protobuf file that will work on Linux, Windows, macOS, iOS, Android, and Raspberry Pi, but I would need six model.so files to accomplish this. Right?
@headupinclouds Yes, you'd need to prepare multiple .so/.dll/.dylib files to target multiple platforms. I now understand the benefit of using Protobuf. As I mentioned earlier, to use Protobuf, we'd need to create a new runtime that takes Protobuf as input.
Okay, thanks for the input. Treelite is a great idea.
@headupinclouds and @hcho3 Thanks for the interesting discussion. I followed the tutorial given on https://treelite.readthedocs.io/en/latest/ and generated a .so file.
But now I have no idea how to convert it into a Protobuf file so that I can use the same model on iOS and Android. Any reference link is very much appreciated.
> But now no idea on how to convert it into protobuf file so that I can use same model into iOS and Android. Any reference link is very much appreciated.
That isn't how this library is designed. From the above discussion, this would have to be added as a new feature, but it is quite orthogonal to the current design, so it isn't clear it will necessarily be added. The .so you created is your model, so you need to generate one for each architecture. In the case of iOS, you would have to create a shared framework (essentially a wrapper around an iOS *.dylib).
I decided to remove Protobuf entirely from Treelite. For the use case you described, I suggest that you write a custom runtime that operates on the Treelite C++ object (https://github.com/dmlc/treelite/blob/mainline/include/treelite/tree.h). For example, the Forest Inference Library (FIL) in cuML uses this approach.
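For prototyping a custom runtime before committing to C++ against tree.h, one option is to walk a JSON dump of the model. The sketch below assumes a simplified, array-based node layout; the field names are illustrative and do not match Treelite's actual JSON schema.

```python
# Simplified, array-based tree layout: children are referenced by index,
# the way flattened tree dumps typically store them. Field names are
# illustrative, not Treelite's real schema.
TREE = {
    "nodes": [
        {"feature": 0, "threshold": 3.0, "left": 1, "right": 2},  # node 0: split
        {"leaf": -1.5},                                           # node 1: left leaf
        {"leaf":  2.5},                                           # node 2: right leaf
    ]
}

def predict(tree, row):
    """Iteratively follow split conditions from the root down to a leaf."""
    node = tree["nodes"][0]
    while "leaf" not in node:
        child = node["left"] if row[node["feature"]] < node["threshold"] else node["right"]
        node = tree["nodes"][child]
    return node["leaf"]

print(predict(TREE, [1.0]))  # 1.0 < 3.0, goes left  -> -1.5
print(predict(TREE, [5.0]))  # 5.0 >= 3.0, goes right ->  2.5
```

A production runtime (like FIL) would do the same traversal over the in-memory C++ tree objects, with the nodes packed for cache-friendly access.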