macLLM - Use ChatGPT with your Clipboard

This is a super simple Python script that does the following:

  • It watches your clipboard for a trigger (default is the double at symbol, i.e. "@@")
  • If your clipboard starts with this trigger, it sends the clipboard contents to a model (e.g. ChatGPT) and puts the result back into the clipboard (a minimal sketch of this loop is shown below)
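
For illustration, here is a minimal sketch of what such a loop could look like in Python. It assumes the pyperclip library and a hypothetical ask_llm() helper; it is not necessarily how macLLM is implemented.

import time

import pyperclip  # assumption: pyperclip for clipboard access; macLLM may use something else

TRIGGER = "@@"

def watch_clipboard(ask_llm, poll_interval=0.5):
    """Poll the clipboard; when text starts with the trigger, replace it with the model's reply."""
    last = ""
    while True:
        text = pyperclip.paste()
        if text != last and text.startswith(TRIGGER):
            reply = ask_llm(text[len(TRIGGER):])  # hypothetical helper that queries the model
            pyperclip.copy(reply)                 # the result goes back into the clipboard
            text = reply
        last = text
        time.sleep(poll_interval)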

Installation

macLLM now uses the Poetry package manager; make sure you have it installed first (https://python-poetry.org/). Then install macLLM with:

poetry install

Your OpenAI API key should be configured in the environment variable OPENAI_API_KEY. It is also read automatically if it is specified in a .env file. The format is:

OPENAI_API_KEY="your_api_key"
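
For illustration only, and assuming the python-dotenv package (which macLLM may or may not use), the key could be picked up in Python like this:

import os

from dotenv import load_dotenv  # provided by the python-dotenv package (assumption)

load_dotenv()  # reads OPENAI_API_KEY from a .env file if one exists; existing env vars take precedence
api_key = os.environ["OPENAI_API_KEY"]  # fails loudly if the key is not configured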

Now you can run macLLM with:

poetry run macllm

Poetry takes care of all the dependencies; you don't need to install anything else.

How do you use this?

For example, let's assume you forgot the capital of France. You write:

@@capital of france?

You select this text, copy it (i.e. Apple-C), wait 1-2 seconds, and paste (Apple-V).

Paris

Or if you want to tell someone in German that ControlNet is underrated, you would write:

@@translate german: ControlNet is an AI technology that is really underrated

Copy, paste:

ControlNet ist eine AI-Technologie, die wirklich unterschätzt wird.

And of course you can also use this to summarize text, expand text into bullet points, or do anything else ChatGPT can do.

Shortcuts

Shortcuts are shorthand for specific prompts. For example, "@@fix" is a prompt that corrects spelling, and "@@tr-es" is short for "translate to Spanish". You use them in text like this:

@@fix My Canaidian Mooose is Braun.

This gets internally expanded to:

@@Correct any spelling or grammar mistakes in the following text: My Canaidian Mooose is Braun.

Which GPT will correct to:

My Canadian Moose is Brown.

You can add your own shortcuts for prompts in the shortcuts.py file; a rough sketch of how such a shortcut could be defined is included after the list below.

Supported Shortcuts

  • @@exp-email: Write an extremely concise and friendly email based on bullet points.
  • @@exp: Expand the text using sophisticated and concise language.
  • @@fix-de: Correct spelling and grammar mistakes in the following German text.
  • @@fix-fr: Correct spelling and grammar mistakes in the following French text.
  • @@fix: Correct spelling and grammar mistakes in the following text.
  • @@rewrite: Rewrite the text to be extremely concise while keeping the same meaning.
  • @@tr-de: Translate the text from English to German.
  • @@tr-fr: Translate the text from English to French.
  • @@tr-es: Translate the text from English to Spanish.
  • @@emoji: Pick a relevant emoji for the text and reply with only that emoji.
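
As a rough sketch of the underlying idea (the actual contents of shortcuts.py may look different), a shortcut is essentially a prefix mapped to a full prompt:

# Hypothetical structure; the real shortcuts.py may differ.
SHORTCUTS = {
    "@@fix":   "Correct any spelling or grammar mistakes in the following text: ",
    "@@tr-es": "Translate the following text from English to Spanish: ",
}

def expand_shortcut(text: str) -> str:
    """Replace a leading shortcut with its full prompt, keeping the @@ trigger prefix."""
    for shortcut, prompt in SHORTCUTS.items():
        if text.startswith(shortcut):
            return "@@" + prompt + text[len(shortcut):].lstrip()
    return text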

License

Apache 2.0

macllm's Issues

Add a macOS menu bar icon that shows whether a request is in progress or has completed

Add an icon that shows up in the macOS menu bar at the top right (one possible approach is sketched below).
The icon should:

  • Show the macLLM logo when idle
  • Show a spinning animated icon if a request is in progress
  • Show the macLLM logo in red if there was an error with the last request

Clicking on the icon should:

  1. Show the current version of the tool
  2. Show a "Quit" option that allows quitting macLLM.

Add support for locally running Ollama as an LLM backend

Right now this tool requires OpenAI. Instead, the tool should be able to use a locally running Ollama server that it accesses via a local socket connection (a minimal sketch of such a call is included after the list below). Specifically:

  • Add Ollama as an LLM backend
  • Add a command-line option to select OpenAI or Ollama
  • Add a menu toggle to switch between the two modes
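
For reference, a minimal sketch of what such a call could look like, assuming Ollama's default local HTTP endpoint (port 11434) and the requests library; this is an illustration, not existing macLLM code:

import requests  # assumption: plain HTTP via requests; any HTTP client would do

def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to a locally running Ollama server and return its reply."""
    resp = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

# Example usage (the model name is only an example):
# print(ask_ollama("capital of france?"))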
