
shreyaskarnik / distillama


Chrome extension to summarize or chat with web pages and local documents using locally running LLMs. Keep all of your data and conversations private.

License: MIT License

Shell 0.17% JavaScript 9.08% TypeScript 68.65% SCSS 0.03% CSS 21.29% HTML 0.78%
chrome-extension llama2 llms local-llm ollama summarization retrieval-augmented-generation langchain mistral-7b readability

distillama's Introduction

Hi there 👋

distillama's People

Contributors

dependabot[bot], shreyaskarnik


distillama's Issues

Support for Ollama hosted in local network

Is your feature request related to a problem? Please describe.
I could not try the extension because my Ollama instance is hosted on a TrueNAS SCALE server. I can reach it using the server's IP and port, but I didn't see any option in the extension to point it there.

Describe the solution you'd like
A settings option where the Ollama host can be entered.

Describe alternatives you've considered
For now, I had to install Ollama on my PC as a workaround.

Additional context
No additional context.
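The requested setting could be sketched as a small helper that resolves the Ollama endpoint from a user-configured host, falling back to Ollama's default local address. All names here are illustrative, not taken from the extension's actual code:

```typescript
// Ollama's documented default listen address.
const DEFAULT_OLLAMA_HOST = "http://localhost:11434";

// Build a full API URL from an optional user-configured host
// (e.g. a TrueNAS SCALE server on the local network).
function resolveOllamaUrl(configuredHost?: string, path = "/api/generate"): string {
  // Use the default when no host is configured; strip trailing slashes
  // so the path joins cleanly.
  const host = (configuredHost ?? DEFAULT_OLLAMA_HOST).replace(/\/+$/, "");
  return `${host}${path}`;
}
```

In a Chrome extension, the configured host would likely be persisted via `chrome.storage` and surfaced in an options page; note that Ollama must also be started with `OLLAMA_HOST` set to accept non-localhost connections.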

LLM in JavaScript directly?

Thanks for your awesome work here. I have a more architectural question: as deployment is a challenge for non-technical users, is it possible to wrap a llama model directly into the Chrome extension JS code? Or are there limitations on Chrome extension memory/storage/CPU that would make that difficult or impossible? Thanks!
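Running a model inside the extension would hinge on browser capabilities such as WebAssembly or WebGPU (which in-browser LLM runtimes typically require), in addition to extension memory limits. A minimal, hypothetical capability check might look like the following; the helper takes the navigator object as a parameter only so it can be exercised outside a browser:

```typescript
// Shape of the (experimental) WebGPU entry point we probe for.
interface NavigatorLike {
  gpu?: { requestAdapter?: unknown };
}

// Returns true when the environment exposes navigator.gpu.requestAdapter,
// the starting point for any WebGPU-based in-browser inference.
function supportsWebGpu(nav: NavigatorLike | undefined): boolean {
  return typeof nav?.gpu?.requestAdapter === "function";
}
```

Even where WebGPU is available, multi-gigabyte model weights would still have to be downloaded and cached, which is a separate constraint from raw compute.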
