Comments (9)
Optimum is an open-source library that lets you do many of those conversions locally (it powers most/all of those Spaces):
cc @mfuntowicz!
from huggingface.js.
@xenova you have just fully defined a JavaScript Dev's nightmare.
"Just load Gradio, to run the Python code, in an App, that can be put in an iFrame, in the web browser." LOL
On a more serious note, it does look doable; I will look around for similar working examples. I need to be able to:
- load a TensorFlow.js model in .json format with its .bin binary shard files
- convert the model, say to Keras, just to test. I am used to the command line, but a Python script is probably better. Here is the link to the tensorflowjs-converter: https://github.com/tensorflow/tfjs/tree/master/tfjs-converter
- download the converted model
If I can get that working I can probably do the entire process.
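The steps above can be sketched as a small Python script that shells out to the tensorflowjs_converter CLI. This is only a sketch: the file names are placeholders, and it assumes the `tensorflowjs` package is installed so the CLI is on the PATH.

```python
# Sketch of the steps above: drive the tensorflowjs_converter CLI from Python.
# Assumes the tensorflowjs package is installed; paths are placeholders.
import subprocess

def tfjs_to_keras_command(model_json: str, output_path: str) -> list:
    """Build the converter invocation: TF.js layers model -> Keras HDF5."""
    return [
        "tensorflowjs_converter",
        "--input_format=tfjs_layers_model",
        "--output_format=keras",
        model_json,    # the model.json; its .bin shards must sit next to it
        output_path,   # where the converted Keras model is written
    ]

# To actually run the conversion (requires tensorflowjs installed):
# subprocess.run(tfjs_to_keras_command("model.json", "model.h5"), check=True)
```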
I am a bit confused by the whole PR step the other converter examples use. Does the code send a PR, which then automatically does the conversion? That seems strange. Can the pipeline just do what I listed above, or does it need the whole PR step?
cc @xenova 👀
we have a few converters on the Hub already, for instance:
- https://huggingface.co/spaces/safetensors/convert
- https://huggingface.co/spaces/onnx/export
- https://huggingface.co/spaces/huggingface-projects/transformers-to-coreml
so definitely open to building more!
@hpssjellis Hi there! 👋 As mentioned above, we have a few converters already available as Spaces on the Hub, but the only "JS-ready" format would be ONNX (thanks to onnxruntime-web).
From your first message, it seems like quite a specific use case: starting with a TensorFlow.js model and ending with a C-header file (with many steps in between).
That said, it might be worth looking into building an HF Space that handles the full pipeline. Is that something you'd be interested in? As a follow-up, since you seem to already have a Dockerfile/setup to do it all, might I suggest you create an HF Space yourself for it? Considering the typical size of these models (i.e., quite small), I am quite sure you won't need anything above the basic (free) tier.
Thanks @julien-c and @xenova, those look great. I will definitely look into HF Spaces. Yes, my use case is very specific, but it is also very helpful for moving ML models to micro-controllers. Here is an example if you are interested. It works on a cell phone without a connected device, since the Cell Phone Motion button captures the x, y, z acceleration from the phone:
https://hpssjellis.github.io/tinyMLjs/public/acceleration/a00-best-acceleration.html
If an HF Space can provide a pipeline where I can drop in any TensorFlow.js model and get a C-header file out, that would be really helpful.
> If an HF Space can provide a pipeline where I can drop in any TensorFlow.js model and get a C-header file out, that would be really helpful.
That should be possible 👍 I recommend that you check out Gradio, which integrates seamlessly with Hugging Face Spaces (docs). It handles most of the things you are looking to add (e.g., file uploads).
Also, since Gradio runs in Python, you can directly use the tensorflowjs converter you linked above. Alternatively, you can just set up a list of os.system calls - whichever works.
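A minimal sketch of what such a Space could look like, under several assumptions: the user uploads the model.json plus its .bin shards as one zip, the converter writes a Keras .h5 file, and the Gradio wiring (commented out below) is installed on the Space. All names and the file layout are hypothetical.

```python
# Hypothetical Space sketch: accept a zipped TensorFlow.js model, unzip it,
# convert via os.system (as suggested above), and return the converted file.
import os
import tempfile
import zipfile

def converted_path(workdir: str) -> str:
    """Assumed location of the converter's output inside the work dir."""
    return os.path.join(workdir, "model.h5")

def convert_zip(zip_file: str) -> str:
    workdir = tempfile.mkdtemp()
    with zipfile.ZipFile(zip_file) as zf:
        zf.extractall(workdir)           # model.json + .bin shards
    out = converted_path(workdir)
    os.system(                           # one os.system call per pipeline step
        "tensorflowjs_converter --input_format=tfjs_layers_model "
        "--output_format=keras "
        f"{os.path.join(workdir, 'model.json')} {out}"
    )
    return out                           # Gradio serves this as a download

# Gradio wiring (assumes gradio is installed on the Space):
# import gradio as gr
# gr.Interface(fn=convert_zip, inputs=gr.File(type="filepath"),
#              outputs=gr.File()).launch()
```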
> "Just load Gradio, to run the Python code, in an App, that can be put in an iFrame, in the web browser." LOL
😅 Haha, yeah, I totally understand. The Gradio docs are quite extensive, and the example Spaces above should give you some idea of how to get started.
> I am a bit confused by the whole PR step the other converter examples use. Does the code send a PR, which then automatically does the conversion? That seems strange. Can the pipeline just do what I listed above, or does it need the whole PR step?
The reason they submit a PR is just to contribute the converted model back to the original repo (for others to use too). In your case, the easiest approach would probably be to create and upload the converted files to a new model repo (under your account, either private or public). You can use the huggingface_hub Python library for that: https://huggingface.co/docs/huggingface_hub/guides/upload
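The upload step could look like the sketch below. It uses huggingface_hub's `HfApi.create_repo` and `HfApi.upload_folder`; the username, repo name, and folder path are placeholders, and the network calls are commented out since they require a logged-in account.

```python
# Sketch: push the converted files to a new model repo with huggingface_hub.
# Assumes you are logged in (huggingface-cli login); names are placeholders.

def target_repo_id(username: str, model_name: str) -> str:
    """Repo id for the uploaded model, e.g. 'your-username/my-converted-model'."""
    return f"{username}/{model_name}"

# from huggingface_hub import HfApi
# api = HfApi()
# repo_id = target_repo_id("your-username", "my-converted-model")
# api.create_repo(repo_id, repo_type="model", private=True, exist_ok=True)
# api.upload_folder(folder_path="./converted", repo_id=repo_id, repo_type="model")
```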
> we have a few converters on the Hub already, for instance:
> - https://huggingface.co/spaces/safetensors/convert
> - https://huggingface.co/spaces/onnx/export
> - https://huggingface.co/spaces/huggingface-projects/transformers-to-coreml
>
> so definitely open to building more!
@julien-c @xenova these converters seem awesome, but they are all cloud-side (which makes sense for Hugging Face). My group often has to download a model and then work in the field, or just without great internet. I am looking for ways to convert models such as downloaded Hugging Face models, but also models made by the students offsite. Does anyone have any suggestions?
I have found these sites:
https://www.tensorflow.org/lite/models/convert
https://github.com/hhsecond/ml2rt
https://madewithml.com/courses/foundations/utilities/
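For the fully-offline end of the pipeline, the final step (turning a converted .tflite blob into a C-header file for a microcontroller) needs no cloud at all: it is what `xxd -i` does, and it can be mimicked in a few lines of plain Python. A sketch, where the variable name and 12-bytes-per-line layout are arbitrary choices:

```python
def bytes_to_c_header(data: bytes, var_name: str = "model_data") -> str:
    """Emit a byte blob (e.g. a .tflite model) as a C array, like `xxd -i`."""
    lines = [f"const unsigned char {var_name}[] = {{"]
    for i in range(0, len(data), 12):          # 12 bytes per line, like xxd
        chunk = ", ".join(f"0x{b:02x}" for b in data[i:i + 12])
        lines.append(f"  {chunk},")
    lines.append("};")
    lines.append(f"const unsigned int {var_name}_len = {len(data)};")
    return "\n".join(lines)

# Usage (paths are placeholders):
# with open("model.tflite", "rb") as f, open("model.h", "w") as out:
#     out.write(bytes_to_c_header(f.read()))
```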
Much like my JavaScript questions, which ended with this very useful set of static webpages, I am trying to let students locally manipulate the datasets and models they have access to.