
rawdog's People

Contributors

biobootloader, eltociear, granawkins, jakethekoenig, kvaky, nl3v, pcswingle, tikendraw


rawdog's Issues

RawDog without OpenAI's ChatGPT

I must acknowledge that rawdog is a masterpiece. However, its dependency on OpenAI as the main LLM provider is a bit of a drawback for me, if I were asked. That is what motivated me to implement the rawdog idea on freely available LLM providers such as Aura, Phind, You, etcetera. I thought that by sharing python-tgpt here, someone might find it useful. Thank you.

ModuleNotFoundError: No module named 'pkg_resources'

randomrandomovic@MacBook-Pro-Random rawdog-main % rawdog
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/bin/rawdog", line 5, in <module>
    from rawdog.__main__ import main
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/rawdog/__main__.py", line 9, in <module>
    from rawdog.llm_client import LLMClient
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/rawdog/llm_client.py", line 9, in <module>
    from litellm import completion, completion_cost
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/__init__.py", line 520, in <module>
    from .utils import (
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 32, in <module>
    import pkg_resources
ModuleNotFoundError: No module named 'pkg_resources'
randomrandomovic@MacBook-Pro-Random rawdog-main %

Python 3.12.1
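A likely fix, assuming a standard setup: pkg_resources is provided by setuptools, which fresh Python 3.12 environments no longer pre-install, so installing setuptools should restore the missing module.

```shell
# pkg_resources ships with setuptools; install it, then verify the import.
pip install setuptools
python -c "import pkg_resources; print('ok')"
```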

rawdog Crashes on wrong commands

Hello, I couldn't reopen #67, so I'm following up here.

First, I'm running rawdog as admin, since I'm unable to run it otherwise; it throws errors.

Second, it seems like it's crashing on wrong commands such as cls to clear the window. That's just a habit, and I know it won't work with rawdog, but does it always crash on wrong commands like in the picture, or is this just me for some reason?

image

Please check #67 update.

I get this error on Windows 10 "AttributeError: module 'os' has no attribute 'uname'. Did you mean: 'name'?"

I've successfully installed and used 'rawdog' on my personal workstation at home. But when I installed and ran it on my corporate workstation I get the error:

AttributeError: module 'os' has no attribute 'uname'. Did you mean: 'name'?

My quick google search suggests that this is because Windows doesn't provide the os.uname function; platform.uname() should be used instead.

Is there a way to make sure 'rawdog' is using the Windows method?

Thanks in advance!
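For reference, a minimal sketch of the cross-platform approach: os.uname() exists only on Unix, while platform.uname() returns the same kind of information on every OS, including Windows.

```python
import os
import platform

# platform.uname() is portable; it works on Windows as well as Unix.
info = platform.uname()
print(info.system, info.release)

# Guarding the Unix-only call, for code that must run everywhere:
if hasattr(os, "uname"):
    print(os.uname().sysname)
else:
    print(platform.uname().system)
```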

Rawdog fails to run on initial install

Ran pip install rawdog-ai
Ran rawdog

Received this error

(base) ➜ ~ rawdog
Traceback (most recent call last):
  File "/Users/jesse/miniconda3/bin/rawdog", line 5, in <module>
    from rawdog.__main__ import main
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/rawdog/__main__.py", line 6, in <module>
    from rawdog.llm_client import LLMClient
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/rawdog/llm_client.py", line 8, in <module>
    from litellm import completion, completion_cost
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/litellm/__init__.py", line 6, in <module>
    from litellm.proxy._types import KeyManagementSystem
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/litellm/proxy/_types.py", line 78, in <module>
    class ModelInfo(LiteLLMBase):
  File "pydantic/main.py", line 197, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 506, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 661, in pydantic.fields.ModelField._type_analysis
  File "pydantic/fields.py", line 668, in pydantic.fields.ModelField._type_analysis
  File "/Users/jesse/miniconda3/lib/python3.9/typing.py", line 852, in __subclasscheck__
    return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class

I looked over the submitted issues and tried installing LLMClient and setuptools, but they were already installed.

`--verbose` flag

I'm trying to configure rawdog to run with Mistral inside Ollama (like #3), and I encountered an error message.

image

It would be nice if rawdog accepted a --verbose flag that would be propagated to litellm options.

WDYT? Would you accept a PR adding it?
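A rough sketch of what such a flag could look like (the argparse wiring here is illustrative, not rawdog's actual CLI; a stub stands in for the litellm module, whose real switch is the litellm.set_verbose attribute that litellm's own error hint mentions):

```python
import argparse

# Stand-in for the litellm module: real code would `import litellm`
# and assign to litellm.set_verbose.
class _LiteLLMStub:
    set_verbose = False

litellm = _LiteLLMStub()

parser = argparse.ArgumentParser()
parser.add_argument("--verbose", action="store_true",
                    help="enable litellm debug logging")
args = parser.parse_args(["--verbose"])  # simulating `rawdog --verbose`

litellm.set_verbose = args.verbose
print(litellm.set_verbose)
```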

Ollama

How do I configure it with ollama?

Default temperature

Currently the default temperature is 1, which is the most imaginative. Since this is dealing with our actual data, wouldn't it be better to set it to 0.1 so it's more exacting?

Unclear installation steps

pip install rawdog-ai doesn't seem to be enough.

I also had to pip install litellm and pip install setuptools.

This should not exist! You're going to bring doom to mankind

You're literally giving ChatGPT full access to your local machine. What could possibly go wrong?

And since it's a remotely running AI, beyond a rogue AI there's also the MITM risk: hackers could target specific users and send malicious commands while pretending to be ChatGPT's API.

In general this is a very bad idea, and I don't want to live in a world from the Terminator franchise.

You'd better stop doing this.

I just hope you will think about this. I'm completely serious, actually. At least when the folks at MS gave their Bing AI a way to search the internet, they added lots of safeguards, and it doesn't execute anything; it just uses the search engine cache.

pyreadline3 is a dependency not installed

It didn't work for me when I first installed it on Windows because it depends on the readline package, which isn't available on Windows; installing pyreadline3 made it work for me. I guess it should be listed as a dependency for Windows machines?
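One way to express that, assuming rawdog declares its dependencies via setuptools (this is an illustrative sketch, not the project's actual setup file), is a PEP 508 environment marker that pulls in pyreadline3 only on Windows:

```python
# setup.py sketch (hypothetical): pyreadline3 installs only on Windows,
# where the standard readline module is unavailable.
from setuptools import setup

setup(
    name="rawdog-ai",
    install_requires=[
        "litellm",
        'pyreadline3; sys_platform == "win32"',  # Windows readline shim
    ],
)
```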

Not working with the gpt-3.5 API?

Hi,

I had to type in my API key for gpt-3.5, and it doesn't seem to work properly.
Is there any way to change the config?

C:\Users\Alex>rawdog How much space do I have on the c:?
API Key (None) not found.
Enter API Key (e.g. OpenAI): sk******

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Error:
{'model': 'gpt-4', 'prompt': 'PROMPT: How much space do I have on the c:?', 'response': None, 'cost': None, 'error': "OpenAIException - Error code: 404 - {'error': {'message': 'The model gpt-4 does not exist or you do not have access to it. Learn more: https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}"}
Error: Execution error: OpenAIException - Error code: 404 - {'error': {'message': 'The model gpt-4 does not exist or you do not have access to it. Learn more: https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

Best regards
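For what it's worth, pointing rawdog at gpt-3.5 should only require changing the model name in the hidden ~/.rawdog/config.yaml, along these lines (a sketch mirroring the config format shown in other issues; gpt-3.5-turbo is the usual API model name, and the key value is a placeholder):

```yaml
llm_api_key: sk-...            # your OpenAI key
llm_base_url: null
llm_custom_provider: null
llm_model: gpt-3.5-turbo
```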

API key prompt when using local ollama model

Hi all, very cool project. I tried running rawdog with a local gguf model via ollama with the configuration suggested in the readme. I am prompted for an API key as follows:

`API Key (None) not found.

Enter API Key (e.g. OpenAI): `

[Edit: I didn't realize rawdog expected ollama to be running in serve mode, mistakenly thought it would interact via 'ollama run.' Got it to work, sorry about that.]

Can't run rawdog on Windows

I am using Python 3.12.2

I ran pip install rawdog-ai

then I initially get this error when I run rawdog

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\rawdog\__main__.py", line 4, in <module>
    import readline
ModuleNotFoundError: No module named 'readline'

Then I ran pip install pyreadline.

Now I'm getting this error

C:\Windows\System32>rawdog
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Scripts\rawdog.exe\__main__.py", line 4, in <module>
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\rawdog\__main__.py", line 4, in <module>
    import readline
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\readline.py", line 34, in <module>
    rl = Readline()
         ^^^^^^^^^^
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\rlmain.py", line 422, in __init__
    BaseReadline.__init__(self)
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\rlmain.py", line 62, in __init__
    mode.init_editing_mode(None)
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\modes\emacs.py", line 633, in init_editing_mode
    self._bind_key('space', self.self_insert)
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\modes\basemode.py", line 162, in _bind_key
    if not callable(func):
       ^^^^^^^^^^^^^^
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\py3k_compat.py", line 8, in callable
    return isinstance(x, collections.Callable)
           ^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'collections' has no attribute 'Callable'
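The second crash is from pyreadline itself, not rawdog: the old, unmaintained pyreadline package references collections.Callable, but Callable moved to collections.abc in Python 3.3 and the alias was removed from the collections namespace in Python 3.10. The maintained fork pyreadline3 (pip install pyreadline3 after uninstalling pyreadline) fixes this. A minimal demonstration of the underlying change:

```python
# On Python 3.10+, Callable lives only in collections.abc; the old
# collections.Callable alias that pyreadline relies on is gone.
import collections
import collections.abc

print(hasattr(collections.abc, "Callable"))  # True
print(hasattr(collections, "Callable"))      # False on Python 3.10+
```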

Update readme

To use local models with ollama a sample configuration is
config.yaml

llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral

Very cool project

Rawdog v0.1.3

RELEASE ON FRIDAY

  1. --leash (G)
  2. Update config system (G)
  3. More initial context (G)
  4. Retry loop (G)
  5. Cost tracking (G)
  6. Streaming for --dry-run (J)
  7. brew / choco (J)
  8. Task Complete experiment (J)
  9. GH Actions black + isort (S)
  10. PyPI release action (S)
  11. Remove extra info before cursor (host..) (G)

API (MAYBE IN TIME?)

  12. Add more fine-tuning examples (all)
  13. Update training examples format (J)
  14. Get API set up (P)
  15. Remove system prompt if using fine-tuned model (J)

LATER

  1. New benchmarks?

No output after inserting API key

I am on Windows at the moment; is it supported?
After it requests my API key I see no output, and I cannot enter CLI mode.

There is no --help either, or anything else I can see.

Add config to make --dry-run default

In environments where I might be recommending this tool to others - I'd feel better having the safer option be the default. Is there a setting for this currently, or would we have to add a flag/config?
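A sketch of what the requested behavior could look like (the dry_run config key and the wiring are illustrative, not rawdog's actual schema): a config.yaml entry supplies the default, and the CLI flag can still enable it explicitly.

```python
import argparse

# As if loaded from ~/.rawdog/config.yaml (hypothetical key name).
config = {"dry_run": True}

parser = argparse.ArgumentParser()
parser.add_argument("--dry-run", dest="dry_run", action="store_true",
                    default=config.get("dry_run", False))
args = parser.parse_args([])  # no flag given: config default wins

print(args.dry_run)
```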

litellm error when model not listed?

A strange one. I'm using ollama as a local model. If config.yaml is:

llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral

rawdog works perfectly.
But when I update config.yaml to:

llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mixtral

so that it uses the mixtral model, rawdog now always gives back an error:

What can I do for you? (Ctrl-C to exit)
> list files    

Error:
 {'model': 'ollama/mixtral', 'prompt': 'PROMPT: list files', 'response': " ```python\nimport os\n\nfiles = [f for f in os.listdir('.') if os.path.isfile(f)]\nfor file in files:\n    print(file)\n```", 'cost': None, 'error': 'Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral\n'}
Error: Execution error: Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral


What can I do for you? (Ctrl-C to exit)

This error seems to originate from litellm's https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json
where indeed mixtral isn't listed but mistral is.

Is it possible to get around this?

CLI: No Way to Change OpenAI API KEY Once Set

Bottom Line Upfront

I accidentally set a bad OpenAI API key when running rawdog. The only way to fix this is to comment out a line of code in llm_client.py or to change the OpenAI API key in the hidden config.yaml file in the home directory. Please add the ability to check the API key and to change it from the CLI while rawdog is running.

How the error appears in the command line

(base) jerrymclaughlin@MacBook ~ % rawdog
   / \__
  (    @\___   ┳┓┏┓┏ ┓┳┓┏┓┏┓
  /         O  ┣┫┣┫┃┃┃┃┃┃┃┃┓
 /   (_____/   ┛┗┛┗┗┻┛┻┛┗┛┗┛
/_____/   U    Rawdog v0.1.1

What can I do for you? (Ctrl-C to exit)
> test


Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Error:
 {'model': 'gpt-4', 'prompt': 'PROMPT: test', 'response': None, 'cost': None, 'error': "OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: export O*************************************************************FZ5v. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"}
Error: Execution error: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: export O*************************************************************FZ5v. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

What can I do for you? (Ctrl-C to exit)

Steps to reproduce

Install rawdog
start rawdog
set the API Key to any value that is not a real API key

Checking the API Key

I set a new environment variable for the OpenAI API key
I ran conda deactivate
I ran echo $OPEN_AI_KEY
The correct API Key was there
I ran conda activate base
I ran echo $OPEN_AI_KEY
The correct API Key was there
Still rawdog returned the same error

Fix

I opened llm_client.py
I commented out
#self.api_key = get_llm_api_key() or os.environ.get("OPENAI_API_KEY")
I added a new line
self.api_key = os.environ.get("OPENAI_API_KEY")
I saved the file

I ran conda activate base
I ran rawdog
Rawdog prompted me to add an API key
I provided a valid API key
Rawdog worked

I found .rawdog/config.yaml in my home directory
config.yaml had the old, wrong API Key
I changed the API key
I saved the file

I opened llm_client.py
I removed this line
self.api_key = os.environ.get("OPENAI_API_KEY")
I replaced it with this line
self.api_key = get_llm_api_key() or os.environ.get("OPENAI_API_KEY")
I saved the file

I ran rawdog
rawdog worked right away.

(base) jerrymclaughlin@Macbook ~ % rawdog
   / \__
  (    @\___   ┳┓┏┓┏ ┓┳┓┏┓┏┓
  /         O  ┣┫┣┫┃┃┃┃┃┃┃┃┓
 /   (_____/   ┛┗┛┗┗┻┛┻┛┗┛┗┛
/_____/   U    Rawdog v0.1.1

What can I do for you? (Ctrl-C to exit)
> hello world

Hello, world!


What can I do for you? (Ctrl-C to exit)

Requested change

Please add the ability to check the API Key and to change it from the CLI while rawdog is running.

Thank you @ApophisUSA for helping me to troubleshoot this problem!
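A minimal sketch of the requested feature, assuming a hypothetical in-REPL meta-command (the names save_llm_api_key and handle_command are illustrative, not rawdog's actual API): a /key command that writes the new key back to config.yaml without hand-editing files.

```python
def save_llm_api_key(key: str) -> None:
    """Stand-in for writing llm_api_key back to ~/.rawdog/config.yaml."""
    print(f"saved key ending in ...{key[-4:]}")

def handle_command(line: str) -> bool:
    """Return True if the input was a meta-command like '/key <value>'."""
    if line.startswith("/key "):
        save_llm_api_key(line.split(" ", 1)[1].strip())
        return True
    return False  # a normal prompt; pass it to the LLM as usual

handle_command("/key sk-test-1234")  # prints: saved key ending in ...1234
```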

litellm.exceptions.NotFoundError: Model not in model_prices_and_context_window.json. You passed model=openai/mistral

litellm.exceptions.NotFoundError: Model not in model_prices_and_context_window.json. You passed model=openai/mistral

File ".../lib/python3.11/site-packages/rawdog/llm_client.py", line 115, in get_script:

            if custom_llm_provider:
                cost = 0
            else:
                cost = (
                    completion_cost(model=model, messages=messages, completion=text)
                    or 0
                )

Fixed:

            if custom_llm_provider:
                cost = 0
            else:
                try:
                    cost = (
                        completion_cost(model=model, messages=messages, completion=text)
                        or 0
                    )
                except Exception:
                    cost = 0  # e.g. model missing from litellm's pricing table

Done.

Error when trying to use rawdog: "ModuleNotFoundError: No module named 'pkg_resources'"

Hello. I have followed the installation instructions, using pip install rawdog-ai.
However, when I run rawdog it gives me:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Python312\Scripts\rawdog.exe\__main__.py", line 4, in <module>
  File "C:\Python312\Lib\site-packages\rawdog\__main__.py", line 6, in <module>
    from rawdog.llm_client import LLMClient
  File "C:\Python312\Lib\site-packages\rawdog\llm_client.py", line 8, in <module>
    from litellm import completion, completion_cost
  File "C:\Python312\Lib\site-packages\litellm\__init__.py", line 518, in <module>
    from .utils import (
  File "C:\Python312\Lib\site-packages\litellm\utils.py", line 32, in <module>
    import pkg_resources
ModuleNotFoundError: No module named 'pkg_resources'

LM Studio

The instructions aren't clear on how to run it against an LM Studio server.
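Since LM Studio's local server exposes an OpenAI-compatible API (by default on port 1234), a configuration along these lines may work; treat it as a guess to adapt, not confirmed rawdog documentation, and the model name is a placeholder for whatever is loaded in LM Studio:

```yaml
llm_api_key: lm-studio                  # placeholder; the local server ignores it
llm_base_url: http://localhost:1234/v1
llm_custom_provider: openai
llm_model: openai/local-model           # model identifier shown in LM Studio
```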
