Comments (14)

hcho3 commented on July 28, 2024

I can see why one would want to load a model at runtime instead of baking it into executable code. One way to achieve this is to have treelite convert a model (in any supported format) into a common format. Then, at deployment time, you'd ship prediction logic that loads the common-format model at runtime. The format conversion is necessary so that the prediction logic deals with a single format instead of multiple (xgboost, lightgbm, etc.).
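
For concreteness, a minimal sketch of the conversion half of that idea, assuming the Python frontend treelite exposed at the time (`treelite.Model.load`); the common-format export and the runtime that reads it back are the hypothetical parts, and the file names are placeholders:

```python
# Sketch only: load models trained with different libraries into
# treelite's single in-memory representation.
import treelite

# File names below are placeholders.
xgb_model = treelite.Model.load('xgb_model.model', model_format='xgboost')
lgb_model = treelite.Model.load('lgb_model.txt', model_format='lightgbm')

# Hypothetical common-format export that a prediction runtime could
# read back at deployment time (not an existing treelite API):
# xgb_model.export_common_format('model.bin')
```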

Let me get back to you when I have a time estimate for this feature.

headupinclouds commented on July 28, 2024

> Let me get back to you when I have a time estimate for this feature.

Great. Let me know if I can help.

> One way to achieve this is to have treelite convert a model (in any supported format) into a common format.

Perhaps the trees could be loaded and then serialized with something like cereal, which provides a lot of format flexibility: portable binary archives, XML, JSON, etc. Thoughts?

headupinclouds commented on July 28, 2024

> Let me get back to you when I have a time estimate for this feature.

Any updates on this one?

Background: I have a few regression/detection tasks for mobile devices where this could significantly streamline xgboost model deployment.

headupinclouds commented on July 28, 2024

Seems to be fixed by #45?

hcho3 commented on July 28, 2024

@headupinclouds I don't think it goes all the way to what you want, since models will still need to be compiled in order to be fed into the runtime.

hcho3 commented on July 28, 2024

@headupinclouds Conceptually, to do what you want, we'd need to add a new runtime that directly reads from Protobuf. Any reason why the current approach (generating shared libs) doesn't work for your scenario?

headupinclouds commented on July 28, 2024

> since models will still need to be compiled in order to be fed into the runtime.

I see, thanks for the clarification.

> Any reason why the current approach (generating shared libs) doesn't work for your scenario?

I'm interested in using treelite as a general-purpose alternative to statically linking xgboost directly into a couple of real-time, cross-platform computer vision applications, although I realize my requirements may not be in line with current project goals. I'd like to have a single universal prediction module that supports a variety of pretrained regression and classification tasks, where different model sizes may be required due to varying application-specific and platform constraints: a mobile model <= 1 MB may be required for some builds, while a more accurate 8 MB model may be appropriate for desktop platforms. You mentioned SHARED libs, although I'm guessing this isn't a hard constraint. For iOS, one would have to package a SHARED lib as a dynamic framework, which takes some work. In my use case, static linking is generally preferable.

hcho3 commented on July 28, 2024

@headupinclouds You can use static linking for the prediction runtime. Currently, the model itself is encoded as a .so file, since Treelite produces executable code for the model. What is the advantage of reading a Protobuf file instead of a .so file? Either way, you need to read one file per model.
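
For reference, a minimal sketch of that shared-library workflow, assuming the pre-3.0 `treelite` / `treelite_runtime` Python API (older releases use `treelite_runtime.Batch.from_npy2d` instead of `DMatrix`); file names and the feature count are placeholders:

```python
import numpy as np
import treelite
import treelite_runtime

# Compile the model into native code and emit it as a shared library.
model = treelite.Model.load('my_model.model', model_format='xgboost')
model.export_lib(toolchain='gcc', libpath='./mymodel.so', verbose=True)

# At deployment time, the runtime (which can be linked statically)
# loads the generated library and calls into it.
predictor = treelite_runtime.Predictor('./mymodel.so')
dmat = treelite_runtime.DMatrix(np.random.rand(10, 20))  # 20 = num_feature
preds = predictor.predict(dmat)
```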

headupinclouds commented on July 28, 2024

I can have one protobuf file that will work on Linux, Windows, macOS, iOS, Android and Raspberry Pi, but I would need 6 model.so files to accomplish this. Right?

hcho3 commented on July 28, 2024

@headupinclouds Yes, you'd need to prepare multiple .so/.dll/.dylib files to target multiple platforms. I now understand the benefit of using Protobuf. As I mentioned earlier, to use Protobuf, we'll need to create a new runtime that takes Protobuf as input.

headupinclouds commented on July 28, 2024

Okay, thanks for the input. Treelite is a great idea.

karimkhanvi commented on July 28, 2024

@headupinclouds and @hcho3 Thanks for the interesting discussion. I followed the tutorial given at https://treelite.readthedocs.io/en/latest/ and generated a .so file.

But now I have no idea how to convert it into a protobuf file so that I can use the same model on iOS and Android. Any reference link would be very much appreciated.

headupinclouds commented on July 28, 2024

> But now I have no idea how to convert it into a protobuf file so that I can use the same model on iOS and Android. Any reference link would be very much appreciated.

That isn't how this library is designed. From the above discussion, this would have to be added as a new feature, but it is quite orthogonal to the current design, so it isn't clear it will be added. The .so you created is your model, so you need to generate one for each target architecture. In the case of iOS, you would have to create a shared framework (essentially a wrapper around an iOS *.dylib).
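
For anyone following along, a rough sketch of those per-platform builds, assuming the pre-3.0 treelite Python API (`export_lib` / `export_srcpkg`); toolchains, platforms, and paths are illustrative. The source-package route is usually the practical one for iOS/Android, since the generated C code gets compiled with the target's own toolchain:

```python
import treelite

model = treelite.Model.load('my_model.model', model_format='xgboost')

# Option 1: compile a shared library directly with a locally available
# toolchain (one such build per target platform/architecture).
model.export_lib(toolchain='gcc', libpath='./linux/mymodel.so')

# Option 2: export a self-contained C source package and build it on
# each target (desktop, Android NDK, iOS static lib/framework) with
# that platform's own compiler.
model.export_srcpkg(platform='unix', toolchain='gcc',
                    pkgpath='./mymodel_src.zip', libname='mymodel.so')
```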

hcho3 commented on July 28, 2024

I decided to remove Protobuf entirely from Treelite. For the use case you described, I suggest writing a custom runtime that operates on the Treelite C++ object (https://github.com/dmlc/treelite/blob/mainline/include/treelite/tree.h). For example, the Forest Inferencing Library (FIL) in cuML uses this approach.
