
Local AI Stack

Make it possible for anyone to run a simple AI app that can do document Q&A 100% locally, without having to swipe a credit card 💳. Based on AI Starter Kit.


Have questions? Join AI Stack devs and find me in the #local-ai-stack channel.

Stack

Quickstart

1. Fork and Clone repo

Fork the repo to your GitHub account, then run the following command to clone your fork:

git clone git@github.com:[YOUR_GITHUB_ACCOUNT_NAME]/local-ai-stack.git

2. Install dependencies

cd local-ai-stack
npm install

3. Install Ollama

Instructions are here
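Once Ollama is installed, the app talks to it over its local HTTP API. A small sanity check can confirm the server is reachable before you continue — a sketch, assuming Ollama's default address `http://localhost:11434` and its `/api/tags` endpoint (defaults of Ollama itself, not something this README specifies):

```javascript
// Sketch: verify the local Ollama server is reachable.
// /api/tags lists the models you have pulled locally.
// Requires Node 18+ (global fetch).
async function ollamaIsRunning(base = 'http://localhost:11434') {
  try {
    const res = await fetch(`${base}/api/tags`);
    return res.ok;
  } catch {
    return false; // server not started, wrong port, etc.
  }
}
```

`await ollamaIsRunning()` should return `true` once the Ollama server (or desktop app) is running.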

4. Run Supabase locally

  1. Install Supabase CLI

brew install supabase/tap/supabase

  2. Start Supabase

Make sure you are in the /local-ai-stack directory and run:

supabase start

5. Fill in secrets

cp .env.local.example .env.local

Then get SUPABASE_PRIVATE_KEY by running

supabase status

Copy the service_role key and save it as SUPABASE_PRIVATE_KEY in .env.local.
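After that, `.env.local` should contain the Supabase values. A sketch of what it might look like — only `SUPABASE_PRIVATE_KEY` is named by this README; the URL variable name and port below are assumptions, so keep whatever names `.env.local.example` actually uses:

```
# .env.local (sketch -- keep the variable names from .env.local.example)
NEXT_PUBLIC_SUPABASE_URL=http://localhost:54321   # assumption: local API URL shown by `supabase status`
SUPABASE_PRIVATE_KEY=eyJ...                       # the service_role key from `supabase status`
```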

6. Generate embeddings

node src/scripts/indexBlogLocal.mjs

This script takes in all files from /blogs, generates embeddings using transformers.js, and stores the embeddings along with their metadata in Supabase.
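In outline, one indexing step looks something like the sketch below. The `documents` table name, the `Xenova/all-MiniLM-L6-v2` model, and the env-variable names are illustrative assumptions, not taken from this README:

```javascript
// Sketch of one indexing step: embed a blog post with transformers.js
// and insert it into Supabase. Imports are dynamic so the helper can be
// defined before the packages are resolved.
async function embedAndStore(text, metadata) {
  const { pipeline } = await import('@xenova/transformers');
  const { createClient } = await import('@supabase/supabase-js');

  // The feature-extraction pipeline downloads the ONNX model on first use.
  const extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');
  const output = await extractor(text, { pooling: 'mean', normalize: true });

  const supabase = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL,
    process.env.SUPABASE_PRIVATE_KEY
  );
  // A pgvector column expects a plain number array.
  const { error } = await supabase
    .from('documents')
    .insert({ content: text, metadata, embedding: Array.from(output.data) });
  if (error) throw error;
}
```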

7. Run app locally

Now you are ready to test out the app locally! To do this, simply run npm run dev under the project root and visit http://localhost:3000.

8. Deploy the app

If you want to take the local-only app to the next level, feel free to follow the instructions in AI Starter Kit for using Clerk, Pinecone/Supabase, OpenAI, Replicate, and other cloud-based vendors.

Refs & Credits

local-ai-stack's People

Contributors

dependabot[bot], fmhall, jenniferli23, jmorganca, nirantk, peterp, rajko-rad, sragss, sumanth001, thorwebdev, timqian, ykhli, zeke


local-ai-stack's Issues

More descriptive error

Can't really tell where this error is coming from when running node src/scripts/indexBlogLocal.mjs. Is it failing to connect to Supabase?

node:internal/deps/undici/undici:11576
    Error.captureStackTrace(err, this);
          ^

TypeError: fetch failed
    at Object.fetch (node:internal/deps/undici/undici:11576:11)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async getModelFile (file:////local-ai-stack/node_modules/@xenova/transformers/src/utils/hub.js:468:24)
    at async constructSession (file:////local-ai-stack/node_modules/@xenova/transformers/src/models.js:119:18)
    at async Promise.all (index 1)
    at async BertModel.from_pretrained (file:////local-ai-stack/node_modules/@xenova/transformers/src/models.js:713:20)
    at async AutoModel.from_pretrained (file:////local-ai-stack/node_modules/@xenova/transformers/src/models.js:3767:20)
    at async Promise.all (index 1)
    at async loadItems (file:////local-ai-stack/node_modules/@xenova/transformers/src/pipelines.js:2305:5)
    at async pipeline (file:////local-ai-stack/node_modules/@xenova/transformers/src/pipelines.js:2251:19) {
  cause: ConnectTimeoutError: Connect Timeout Error
      at onConnectTimeout (node:internal/deps/undici/undici:8522:28)
      at node:internal/deps/undici/undici:8480:50
      at Immediate._onImmediate (node:internal/deps/undici/undici:8511:13)
      at process.processImmediate (node:internal/timers:476:21) {
    code: 'UND_ERR_CONNECT_TIMEOUT'
  }
}

Node.js v18.17.0
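The trace points at `getModelFile` in `@xenova/transformers`, so the failing fetch is the model-file download from the Hugging Face hub (a network/proxy problem), not the Supabase connection. One way the script could say so explicitly — a hypothetical helper, not part of the repo:

```javascript
// Hypothetical: map undici error codes to actionable messages.
// UND_ERR_CONNECT_TIMEOUT here means the model-file download timed out.
function describeFetchError(err) {
  const code = err?.cause?.code;
  if (code === 'UND_ERR_CONNECT_TIMEOUT') {
    return 'Timed out downloading model files from the Hugging Face hub ' +
           '(check network/proxy access to huggingface.co).';
  }
  return `fetch failed: ${err?.message ?? 'unknown error'}`;
}
```

Wrapping the `pipeline(...)` call in try/catch and logging `describeFetchError(e)` would make the failure mode obvious.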

Failed to compile error when sending the first message

Hello and thank you for this wonderful repo.

I followed every step, had to install "onnxruntime-node" manually because it was missing after running npm install, and still got this error after launching the server (which itself started without errors). I don't know how to solve it. Thank you for the help.

I tried to install node-loader, but that didn't help.

(./node_modules/@xenova/transformers/node_modules/onnxruntime-node/bin/napi-v3/darwin/arm64/onnxruntime_binding.node Module parse failed: Unexpected character '�' (1:0) You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders (Source code omitted for this binary file))
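For the binary-module parse error above: webpack is trying to bundle onnxruntime-node's native `.node` binding, which it cannot parse. The transformers.js Next.js tutorial suggests marking such packages as external so Node loads them directly at runtime — a sketch; exact option names depend on your Next.js version:

```javascript
// next.config.js (sketch) -- keep native modules out of the webpack bundle.
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Next 13 option; newer Next versions moved this to a top-level
    // `serverExternalPackages` key.
    serverComponentsExternalPackages: ['sharp', 'onnxruntime-node'],
  },
};
module.exports = nextConfig;
```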

Where in the code can you change inference from Ollama to something else?

I have managed to get this to run on Windows with WSL, but the last step fails with the error below.

Where in the code can I try to change it from using Ollama to something OpenAI-compatible (koboldcpp in practice)?

Thanks!

[llm/error] [1:llm:Ollama] [6ms] LLM run errored with error: "Unexpected end of JSON input"

  • error SyntaxError: Unexpected end of JSON input
    at JSON.parse ()
    at parseJSONFromBytes (node:internal/deps/undici/undici:4553:19)
    at successSteps (node:internal/deps/undici/undici:4527:27)
    at fullyReadBody (node:internal/deps/undici/undici:1307:9)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async specConsumeBody (node:internal/deps/undici/undici:4536:7)
    at async createOllamaStream (webpack-internal:///(sc_server)/./node_modules/langchain/dist/util/ollama.js:26:21)
    at async Ollama._call (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/ollama.js:313:26)
    at async Promise.all (index 0)
    at async Ollama._generate (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/base.js:311:29)
    at async Ollama._generateUncached (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/base.js:174:22)
    at async Ollama.call (webpack-internal:///(sc_server)/./node_modules/langchain/dist/llms/base.js:242:34)
    at async POST (webpack-internal:///(sc_server)/./src/app/api/qa-pg-vector/route.ts:61:24)
    at async eval (webpack-internal:///(sc_server)/./node_modules/next/dist/server/future/route-modules/app-route/module.js:242:37)
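The Ollama call lives in the API route shown in the trace (`src/app/api/qa-pg-vector/route.ts`, around line 61, via langchain's `Ollama` LLM). Swapping in an OpenAI-compatible backend such as koboldcpp means constructing a different langchain LLM there. A sketch — the port 5001 default and the configuration option names are assumptions, so check the docs for your langchain version:

```javascript
// Sketch: build a langchain LLM backed by koboldcpp's OpenAI-compatible
// server instead of Ollama. koboldcpp usually listens on port 5001 and
// exposes /v1 endpoints (assumption -- adjust to your setup).
async function makeLocalLlm() {
  const { OpenAI } = await import('langchain/llms/openai');
  return new OpenAI({
    openAIApiKey: 'sk-local', // koboldcpp ignores the key, but the client requires one
    temperature: 0.1,
    configuration: { baseURL: 'http://localhost:5001/v1' },
  });
}
```

In route.ts, replacing the `new Ollama(...)` construction with `await makeLocalLlm()` and keeping the rest of the chain unchanged should be enough to test this.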
