
local.ai's Introduction

🎒 local.ai

A desktop app for local, private, secure AI experimentation. Included out of the box are:

  • A known-good model API and a model downloader, with metadata such as recommended hardware specs, model license, and blake3/sha256 hashes
  • A simple note-taking app, with inference config PER note. Each note and its config are saved as plain-text .mdx
  • A model inference streaming server (/completion endpoint, similar to OpenAI)
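The blake3/sha256 hashes in the model metadata can be checked independently once a download finishes. Here is a minimal sketch in TypeScript using Node's built-in crypto; the helper names are hypothetical, not part of local.ai's API:

```typescript
// Sketch: verify a downloaded model file against a published digest.
// The function names are illustrative, not local.ai's actual API.
import { createHash } from "node:crypto";

function sha256Hex(data: Uint8Array): string {
  return createHash("sha256").update(data).digest("hex");
}

function digestMatches(data: Uint8Array, expectedHex: string): boolean {
  // Compare case-insensitively, since digests are often published upper-case.
  return sha256Hex(data) === expectedHex.toLowerCase();
}
```

In practice you would stream the file through the hash instead of reading it fully into memory, but the comparison logic is the same.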

It's made to be used alongside https://github.com/alexanderatallah/window.ai/ as a simple way to get a local inference server up and running in no time. window.ai + local.ai let every web app use AI without incurring any cost to either the developer or the user!

Right now, local.ai uses the https://github.com/rustformers/llm Rust crate at its core. Check them out; they are super cool!

🚀 Install

Go to the site at https://www.localai.app/ and click the button for your machine's architecture. You can also find the builds manually on the GitHub releases page.

Windows and macOS binaries are signed by Plasmo Corp., a company owned by the author of this project (@louisgv).

You may also build from source!

📺 Demo

local.ai.demo.v0.2.x.mp4

🧵 Development

Here's how to run the project locally:

Prerequisites

  1. node >= 18
  2. rust >= 1.69
  3. pnpm >= 8

Workflow

```shell
git submodule update --init --recursive
pnpm i
pnpm dev
```

🪪 License

GPLv3

🤔 Trivia

Why the backpack?

Ties into the bring your own model concept -- Alex from window.ai

Why GPLv3?

Anything AI-related including their derivatives should be open-source for all to inspect. GPLv3 enforces this chain of open-source.

Is there a community?

Where should I ask questions?

I made something with local.ai, where should I post it?

I have some nice things to say about local.ai, where should I post it?

  • Here
  • Also, consider giving the repo a Star ⭐️

The naming seems close to LocalAI?

  • When I first started the project and got the domain localai.app, I had no idea LocalAI was a thing. A friend of mine forwarded me a link to that project mid May, and I was like dang it, let's just add a dot and call it a day (for now...) - local "dot" ai vs LocalAI lol
  • We might rename the project.

Do you accept contribution/PR?

Absolutely! Please note that any contribution toward this repo shall be relicensed under GPLv3. There are many ways to contribute.

local.ai's People

Contributors

llukas22, louisgv, nightscape, regular-baf, tomasmcm, yportne13


local.ai's Issues

Feature Request: CUDA/OpenCL Builds

Just a heads up: llm will soon have support for Metal, CUDA, and OpenCL accelerators (see rustformers/llm#325).
These features are mutually exclusive, meaning you can only pack one backend at a time into a binary, e.g. CPU + Metal or CPU + CUDA.

It would be nice to have ready-made binaries for those backends that can be downloaded and just work.

Filter out downloaded model in selector

We should probably show only models that have not been downloaded, OR whose hash has changed in the directory. Another way to declutter the download selector.
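The filtering rule described above could be sketched as a plain predicate over the catalog; the types and field names here are hypothetical:

```typescript
// Sketch of the decluttering rule: hide a model from the download
// selector when a file with the same name AND the same hash already
// exists on disk. Types and field names are illustrative only.
type ModelCard = { name: string; hash: string };

function filterDownloadable(
  catalog: ModelCard[],
  onDisk: Map<string, string> // file name -> hash of the local copy
): ModelCard[] {
  return catalog.filter((m) => {
    const localHash = onDisk.get(m.name);
    // Keep models that are missing locally, or whose local copy's hash
    // no longer matches the catalog (i.e. the upstream file changed).
    return localHash === undefined || localHash !== m.hash;
  });
}
```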

This is super cool!

Hi! LOOOVE the idea behind this (for privacy reasons, but also because running a model locally is just super cool). I can't get past downloading a model and sending a message; not getting anything back. On an M1 MacBook Air. Thanks!

Type drop down - does it work?

I noticed that the models show the type, but this is actually a dropdown. I was expecting it to be a field, like tokenizer. Are there actually cases where it is possible to change the type there?

BUG | Remote Tokenizer panics during offline invocation

By default, this code path relies on llm's invocation of HF's tokenizer, which in turn does not do any cache check before fetching the tokenizer again from remote into the cache.

We should probably handle this case and fall back to either the built-in tokenizer (llm's), OR a cached tokenizer, if one exists.

^ will need to dig through the cache code of HF again to grab the cache path.
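The cache-first fallback described above could look roughly like this; the callback signatures are made up for the sketch and do not model llm's or HF's actual APIs:

```typescript
// Sketch: check the local cache before hitting the network, so an
// offline invocation never panics when a cached copy exists.
// readCache/fetchRemote are hypothetical stand-ins for the HF cache
// lookup and the remote tokenizer fetch.
async function loadTokenizer(
  repo: string,
  readCache: (repo: string) => Promise<string | null>,
  fetchRemote: (repo: string) => Promise<string>
): Promise<string> {
  const cached = await readCache(repo);
  if (cached !== null) return cached; // offline-safe: never hit the network
  try {
    return await fetchRemote(repo);
  } catch {
    throw new Error(`no cached tokenizer for ${repo} and remote fetch failed`);
  }
}
```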

Apps

We can build more powerful apps tailored to the simple use cases described below:

Ref:

I just wanted to also give some thoughts on this in terms of uses. I would personally primarily use this to help with text transformation (shorter/longer/remove elements/rewrite as a checklist…) since sensitive work documents can’t go into chatgpt. if the app was better suited for this kind of work it would be amazing (i think a lot of people’s work is sensitive and has to stay local)! either way best of luck!!:)

Originally posted by @turnipsforme in #11 (comment)

The Edit and Prompt buttons are a bit confusing

image

Currently the model is only prompted when the user presses the "Prompt" (Brain) button. This can confuse first-time users (and me 😅).

The buttons should be swapped, and "enter" should prompt the model instead of creating one of these yellow boxes (I still don't know what they do). Maybe some tooltips could be added to make clear what each button does 🤔

Code blocks are not escaped correctly

Code blocks don't seem to be escaped correctly:
image

Generated text:

```
def count_up(n): # function definition with n as parameter
 for i in range (2, (n+1)):
 print((i))```
```

Runtime Debugger/Logger

On Mac or Linux I would just run from source, but right now I am on Windows with the signed version, and a way to look at error/debug logs without installing Rust etc. would be nice.

remote vocab

Add a field in the setting to enable remote vocab at model loading (should be associated with the model itself?)

Crash when trying to load Model

I run my instance on a brand-new Windows VM with 64 GB RAM and 22 vCPU cores. But when I try to load a model, the application crashes after a few seconds. I didn't find any logs, so I don't know what to do.

Client library

Make a simple client library for other apps to call local.ai's API.

Add a model explorer

The current download selector is quick and dirty, but it might become useless once we have tons of models.

Might want to show just the recommended models #12 , and a model explorer modal/page dedicated to showcasing model cards and their attributes.

Non-streaming completion API

Support a non-streaming version of the API. Without streaming it's trickier, since we need to load the model and do all kinds of jazz before we can send back a response.

Lack of streaming would likely require:

  • Timeout configuration on the client's side
  • Some way to keep the connection alive and then send a completion body

Would need to experiment and see... but low priority because I personally don't use non-streaming APIs :p (any taker?)
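The gist of the non-streaming path is to drain the token stream server-side and respond once at the end. A rough sketch, with the response shape invented for illustration:

```typescript
// Sketch: buffer the whole token stream and reply once. The
// AsyncIterable stands in for the inference loop; the response
// object's shape is hypothetical, not local.ai's actual schema.
async function completeNonStreaming(
  tokens: AsyncIterable<string>
): Promise<{ completion: string }> {
  let out = "";
  for await (const t of tokens) out += t; // blocks until inference finishes
  return { completion: out };
}
```

This is exactly where the keep-alive concern above comes from: the client sees nothing until the loop ends, so long generations risk idle-connection timeouts.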

Model Config Panel

This will be similar to the Thread config panel, but it persists per model instead!

Model load feedback

When clicking load, the model load number just counts up without necessarily loading (as far as I can tell). Is this intended, or just not implemented more explicitly? (E.g., when loading fails I would expect an error; and why would you actually load more than once? For multiple threads?)

```typescript
export type ModelStats = {
  loadCount: number
}
```
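A loadCount alone can't express failure, which is what the report above is really asking for. One possible shape for richer load feedback, purely illustrative and not local.ai's actual API:

```typescript
// Sketch: a load state the UI could render as loading / ready / failed
// instead of a bare counter. Names are hypothetical.
type ModelLoadState =
  | { kind: "idle" }
  | { kind: "loading"; startedAt: number }
  | { kind: "ready"; loadCount: number }
  | { kind: "error"; message: string };

function onLoadFinished(
  prev: ModelLoadState,
  error?: string
): ModelLoadState {
  if (error !== undefined) return { kind: "error", message: error };
  // Increment the counter only across successful loads.
  const count = prev.kind === "ready" ? prev.loadCount + 1 : 1;
  return { kind: "ready", loadCount: count };
}
```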

Local Cached HF Tokenizer

The current HF tokenizer implementation fetches first and checks the cache after, effectively invalidating the cache on subsequent runs.

We want it to check the cache before panicking; otherwise the remote tokenizer feature is annoying to use offline.

Support for prompt templates

Some models need specific prompt templates to function correctly. It would be great if we could specify a template for a chat session. Some of the most common templates (Alpaca, etc.) could be distributed with the installer. It would also be great if we could define a prompt template folder where users can store templates in a simple text format.
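The idea above amounts to plain-text templates with a placeholder that gets filled in before inference. A minimal sketch; the Alpaca-style wording is approximate, and the {{prompt}} placeholder syntax is an assumption, not a local.ai convention:

```typescript
// Sketch: a template stored as plain text, with a {{prompt}}
// placeholder substituted before the text is sent to the model.
// The template wording below is an approximation of the Alpaca style.
const alpacaTemplate = [
  "Below is an instruction that describes a task.",
  "Write a response that appropriately completes the request.",
  "",
  "### Instruction:",
  "{{prompt}}",
  "",
  "### Response:"
].join("\n");

function applyTemplate(template: string, prompt: string): string {
  // split/join replaces every occurrence of the placeholder.
  return template.split("{{prompt}}").join(prompt);
}
```

Templates in this form could live as ordinary text files in the user-defined template folder the report suggests.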

Consider packaging on Flathub for Linux

AppImage is awesome, and made this project super easy to get running for me, but I think Flatpak would also be nice for better desktop integration and updates along with the system.

Headless Mode (Need for cli app)

Hi, I appreciate that you use Tauri and the effort you have taken. But downloading all the models needs high bandwidth, so I generally use cloud-based IDEs. There is no GUI access, but the internet is super fast because it's in the cloud. It would be very helpful if there were a local.ai CLI app. Storage can be an issue, but I can run multiple instances :)

dolly v2 3B - is this model spammy?

Basically I just asked "who are you?", and while the beginning might have been kind of relevant and OK, it then just keeps going and gets spammier and spammier (I stopped it at some point). If this is normal for this model, maybe it should be removed?

Sample output (excerpt; the transcript continues in the same vein for dozens more lines, drifting into QR-code links, fake customer-service chatter, and broken encoding before it was stopped):

```
umeLime, an artificial intelligent program built by Alibaba's Tmall (online shopping) division and named after the pineapple-flavored fruit for better pronunciation in any languages it supports at this point although it could theoretically support thousands if enough money were thrown at it :)

 I'm your friend - open up my chat window now with qrcode below!! :D https://wowconquests.com/qrgenerator/?secret=qwerty&width=48%7C64&height=128#!/welcome/umaLIME
```
BUG: Fresh run does not work, broken config!

Screenshot 2023-08-16 at 8 06 31 PM So this is what I'm seeing on the back end... and when I go to localhost:8080 I see "pong" as the response. However, when I try to use the models in Flowise I'm not getting anything, just 404s.

Also, the "New Thread" function doesn't seem to work or do anything. It's a vanilla install; I've downloaded a couple of models and those seem to be in place, but nothing else seems to work. Advice appreciated.
