abanteai / rawdog
Generate and auto-execute Python scripts in the CLI
License: Apache License 2.0
I must acknowledge that rawdog is indeed a masterpiece. However, its dependency on OpenAI as the main LLM provider is a bit of a drawback to me, if I were asked. This is what motivated me to implement the rawdog idea on freely available LLM providers such as Aura, Phind, You, etcetera. I thought that by sharing python-tgpt here, someone would find it useful. Thank you.
randomrandomovic@MacBook-Pro-Random rawdog-main % rawdog
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/bin/rawdog", line 5, in <module>
    from rawdog.__main__ import main
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/rawdog/__main__.py", line 9, in <module>
    from rawdog.llm_client import LLMClient
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/rawdog/llm_client.py", line 9, in <module>
    from litellm import completion, completion_cost
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/__init__.py", line 520, in <module>
    from .utils import (
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 32, in <module>
    import pkg_resources
ModuleNotFoundError: No module named 'pkg_resources'
randomrandomovic@MacBook-Pro-Random rawdog-main %
Python 3.12.1
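For what it's worth, `pkg_resources` is provided by setuptools, which Python 3.12 no longer pre-installs in fresh environments, so `pip install setuptools` should restore it. A quick stdlib check (the helper name is mine, not rawdog's):

```python
import importlib.util

def missing_pkg_resources() -> bool:
    """True when pkg_resources (shipped with setuptools) is unavailable."""
    # Python 3.12 stopped bundling setuptools, so litellm's
    # `import pkg_resources` fails until setuptools is installed.
    return importlib.util.find_spec("pkg_resources") is None

if missing_pkg_resources():
    print("Fix: pip install setuptools")
```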
Hello, I couldn't reopen #67 so I'm following up here.
First, I'm running rawdog as admin because I'm unable to run it otherwise; it raises errors.
Second, it seems to crash on wrong commands such as cls to clear the window. That's just a habit and I know it won't work with rawdog, but does it always crash like in the picture on wrong commands, or is this just me for some reason?
Please check #67 update.
I've successfully installed and used 'rawdog' on my personal workstation at home. But when I installed and ran it on my corporate workstation I get the error:
AttributeError: module 'os' has no attribute 'uname'. Did you mean: 'name'?
My quick Google search suggests that this is because Windows does not implement 'os.uname'; the cross-platform equivalent is 'platform.uname'.
Is there a way to make sure 'rawdog' uses the Windows-compatible method?
Thanks in advance!
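For reference, the portable call is `platform.uname()`, which works on Windows as well as POSIX systems:

```python
import platform

# platform.uname() is implemented on every OS; os.uname() is POSIX-only
# and raises AttributeError on Windows.
info = platform.uname()
print(info.system, info.release, info.machine)
```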
Ran pip install rawdog-ai
Ran rawdog
Received this error
(base) ➜ ~ rawdog
Traceback (most recent call last):
  File "/Users/jesse/miniconda3/bin/rawdog", line 5, in <module>
    from rawdog.__main__ import main
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/rawdog/__main__.py", line 6, in <module>
    from rawdog.llm_client import LLMClient
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/rawdog/llm_client.py", line 8, in <module>
    from litellm import completion, completion_cost
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/litellm/__init__.py", line 6, in <module>
    from litellm.proxy._types import KeyManagementSystem
  File "/Users/jesse/miniconda3/lib/python3.9/site-packages/litellm/proxy/_types.py", line 78, in <module>
    class ModelInfo(LiteLLMBase):
  File "pydantic/main.py", line 197, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 506, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 661, in pydantic.fields.ModelField._type_analysis
  File "pydantic/fields.py", line 668, in pydantic.fields.ModelField._type_analysis
  File "/Users/jesse/miniconda3/lib/python3.9/typing.py", line 852, in __subclasscheck__
    return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class
I looked over the submitted issues and tried installing LLMClient and setuptools, but they were already installed.
I'm trying to configure rawdog to run with Mistral inside of Ollama (like #3), and I encountered an error message.
It would be nice if rawdog accepted a --verbose flag that would be propagated to litellm options.
WDYT? Would you accept a PR adding it?
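A sketch of what the flag could look like. The wiring below is hypothetical (rawdog's actual argument parsing may differ); litellm does expose a module-level debug switch, shown as comments so this snippet stays self-contained:

```python
import argparse

# Hypothetical CLI wiring for the proposed flag.
parser = argparse.ArgumentParser(prog="rawdog")
parser.add_argument("--verbose", action="store_true",
                    help="turn on litellm debug output")
args = parser.parse_args(["--verbose"])

# The flag would then be propagated to litellm like so:
#   import litellm
#   litellm.set_verbose = args.verbose
print(args.verbose)  # True
```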
How to configure with ollama?
Ollama has an api that I think could simplify things.
https://github.com/ollama/ollama-python
Do you have instructions for connecting rawdog and LM Studio?
Providing a configurable OpenAI base URL allows integration with https://localai.io/, a locally-run API that is compatible with the OpenAI API specification.
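For illustration, a config of the same shape as the ollama example in this thread might point at LocalAI like this (the port, provider value, and model name are assumptions based on LocalAI's defaults, not tested against rawdog):

```yaml
llm_api_key: not-needed
llm_base_url: http://localhost:8080/v1
llm_custom_provider: openai
llm_model: gpt-3.5-turbo
```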
Currently the default temperature is 1, which is the most imaginative setting. Since this deals with our actual data, wouldn't it be better to set it to 0.1 so it's more exacting?
pip install rawdog-ai doesn't seem to be enough. I also had to pip install litellm and pip install setuptools.
ChatGPT by OpenAI is expensive. There are alternatives available for those who don't want to use the OpenAI API.
You're literally giving ChatGPT full access to your local machine; what could possibly go wrong?
And since it's a remotely running AI, besides a rogue AI it could also be a MITM attack, where hackers target specific users and send malicious commands while pretending to be ChatGPT's API.
In general this is a very bad idea, and I don't want to live in a world from the Terminator franchise.
You'd better stop doing this.
I just hope you will think about this. I'm completely serious, actually. At least when the folks from MS gave their AI in Bing a way to search the internet, they added lots of safeguards, and it's not executing anything; it's just using the search engine cache.
Didn't work for me when I first installed it on Windows because it depends on the readline package. This isn't available on Windows, but installing pyreadline3 made it work for me. I guess it should be listed as a dependency for Windows machines?
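Packaging-wise, this can be expressed with a PEP 508 environment marker so the extra dependency only applies on Windows (a sketch of a requirements line; rawdog's actual packaging metadata may differ):

```
pyreadline3; sys_platform == "win32"
```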
Hi,
I entered my API key for GPT-3.5 and it doesn't seem to work properly.
Is there any chance to change the config?
C:\Users\Alex>rawdog How much space do I have on the c:?
API Key (None) not found.
Enter API Key (e.g. OpenAI): sk******
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Error:
{'model': 'gpt-4', 'prompt': 'PROMPT: How much space do I have on the c:?', 'response': None, 'cost': None, 'error': "OpenAIException - Error code: 404 - {'error': {'message': 'The model gpt-4
does not exist or you do not have access to it. Learn more: https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}"}
Error: Execution error: OpenAIException - Error code: 404 - {'error': {'message': 'The model gpt-4
does not exist or you do not have access to it. Learn more: https://help.openai.com/en/articles/7102672-how-can-i-access-gpt-4.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
Best regards
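In case it helps others: the 404 above means the key has no gpt-4 access, so pointing rawdog at a model the key can use should work. One way may be to edit the config file in the home directory (the model name below is just an example):

```yaml
# ~/.rawdog/config.yaml -- pick a model your key is entitled to
llm_model: gpt-3.5-turbo
```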
Hi all, very cool project. I tried running rawdog with a local gguf model via ollama with the configuration suggested in the readme. I am prompted for an API key as follows:
`API Key (None) not found.
Enter API Key (e.g. OpenAI): `
[Edit: I didn't realize rawdog expected ollama to be running in serve mode, mistakenly thought it would interact via 'ollama run.' Got it to work, sorry about that.]
I am using Python 3.12.2
I ran pip install rawdog-ai
then I initially get this error when I run rawdog
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\rawdog\__main__.py", line 4, in <module>
    import readline
ModuleNotFoundError: No module named 'readline'
I ran pip install pyreadline.
Now I'm getting this error:
C:\Windows\System32>rawdog
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Scripts\rawdog.exe\__main__.py", line 4, in <module>
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\rawdog\__main__.py", line 4, in <module>
    import readline
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\readline.py", line 34, in <module>
    rl = Readline()
         ^^^^^^^^^^
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\rlmain.py", line 422, in __init__
    BaseReadline.__init__(self)
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\rlmain.py", line 62, in __init__
    mode.init_editing_mode(None)
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\modes\emacs.py", line 633, in init_editing_mode
    self._bind_key('space', self.self_insert)
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\modes\basemode.py", line 162, in _bind_key
    if not callable(func):
       ^^^^^^^^^^^^^^
  File "C:\Users\myuser\AppData\Local\Programs\Python\Python312\Lib\site-packages\pyreadline\py3k_compat.py", line 8, in callable
    return isinstance(x, collections.Callable)
           ^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'collections' has no attribute 'Callable'
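The last frames point at pyreadline itself: it is unmaintained and still references collections.Callable, an alias that was removed in Python 3.10, while the maintained fork pyreadline3 doesn't have this problem (pip uninstall pyreadline, then pip install pyreadline3). The root cause can be seen with:

```python
import collections
import collections.abc
import sys

# Since Python 3.10, Callable lives only in collections.abc;
# pyreadline's isinstance(x, collections.Callable) therefore raises
# AttributeError on modern interpreters.
print(hasattr(collections.abc, "Callable"))   # True everywhere
if sys.version_info >= (3, 10):
    print(hasattr(collections, "Callable"))   # False on 3.10+
```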
To use local models with ollama a sample configuration is
config.yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
Very cool project
--leash
How to configure with Azure OpenAI?
I am on Windows at the moment. Is it supported?
After it requests my API key I see no output and cannot enter CLI mode.
There is no --help either, or anything else I can see.
In environments where I might be recommending this tool to others - I'd feel better having the safer option be the default. Is there a setting for this currently, or would we have to add a flag/config?
A strange one. I'm using ollama as a local model. If config.yaml is:
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mistral
rawdog works perfectly.
but when I update config.yaml to:
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: null
llm_model: ollama/mixtral
so that it uses the mixtral model, rawdog now always returns an error:
What can I do for you? (Ctrl-C to exit)
> list files
Error:
{'model': 'ollama/mixtral', 'prompt': 'PROMPT: list files', 'response': " ```python\nimport os\n\nfiles = [f for f in os.listdir('.') if os.path.isfile(f)]\nfor file in files:\n print(file)\n```", 'cost': None, 'error': 'Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral\n'}
Error: Execution error: Model not in model_prices_and_context_window.json. You passed model=ollama/mixtral
What can I do for you? (Ctrl-C to exit)
This error seems to originate from litellm's https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json,
where indeed mixtral isn't listed and mistral is.
Is it possible to get around this?
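One possible workaround, judging from the cost-computation snippet quoted later in this thread (cost calculation is skipped entirely when a custom provider is set), might be to declare the provider explicitly instead of leaving it null. Untested sketch:

```yaml
llm_api_key: no need
llm_base_url: http://localhost:11434
llm_custom_provider: ollama
llm_model: ollama/mixtral
```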
I accidentally set a bad OpenAI API key when running rawdog. The only way to fix this is to comment out a line of code in llm_client.py or to change the OpenAI API key in the hidden config.yaml file in the home directory. Please add the ability to check the API key and to change it from the CLI while rawdog is running.
(base) jerrymclaughlin@MacBook ~ % rawdog
/ \__
( @\___ ┳┓┏┓┏ ┓┳┓┏┓┏┓
/ O ┣┫┣┫┃┃┃┃┃┃┃┃┓
/ (_____/ ┛┗┛┗┗┻┛┻┛┗┛┗┛
/_____/ U Rawdog v0.1.1
What can I do for you? (Ctrl-C to exit)
> test
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Error:
{'model': 'gpt-4', 'prompt': 'PROMPT: test', 'response': None, 'cost': None, 'error': "OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: export O*************************************************************FZ5v. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}"}
Error: Execution error: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: export O*************************************************************FZ5v. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
What can I do for you? (Ctrl-C to exit)
Install rawdog
start rawdog
set the API Key to any value that is not a real API key
I set a new environment variable for the OpenAI API key
I ran conda deactivate
I ran echo $OPEN_AI_KEY
The correct API Key was there
I ran conda activate base
I ran echo $OPEN_AI_KEY
The correct API Key was there
Still rawdog returned the same error
I opened llm_client.py
I commented out
#self.api_key = get_llm_api_key() or os.environ.get("OPENAI_API_KEY")
I added a new line
self.api_key = os.environ.get("OPENAI_API_KEY")
I saved the file
I ran conda activate base
I ran rawdog
Rawdog prompted me to add an API key
I provided a valid API key
Rawdog worked
I found .rawdog/config.yaml in my home directory
config.yaml had the old, wrong API Key
I changed the API key
I saved the file
I opened llm_client.py
I removed this line
self.api_key = os.environ.get("OPENAI_API_KEY")
I replaced it with this line
self.api_key = get_llm_api_key() or os.environ.get("OPENAI_API_KEY")
I saved the file
I ran rawdog
rawdog worked right away.
(base) jerrymclaughlin@Macbook ~ % rawdog
/ \__
( @\___ ┳┓┏┓┏ ┓┳┓┏┓┏┓
/ O ┣┫┣┫┃┃┃┃┃┃┃┃┓
/ (_____/ ┛┗┛┗┗┻┛┻┛┗┛┗┛
/_____/ U Rawdog v0.1.1
What can I do for you? (Ctrl-C to exit)
> hello world
Hello, world!
What can I do for you? (Ctrl-C to exit)
Please add the ability to check the API Key and to change it from the CLI while rawdog is running.
Thank you @ApophisUSA for helping me to troubleshoot this problem!
litellm.exceptions.NotFoundError: Model not in model_prices_and_context_window.json. You passed model=openai/mistral
File ".../lib/python3.11/site-packages/rawdog/llm_client.py", line 115, in get_script:

```python
if custom_llm_provider:
    cost = 0
else:
    cost = (
        completion_cost(model=model, messages=messages, completion=text)
        or 0
    )
```

Fixed:

```python
if custom_llm_provider:
    cost = 0
else:
    try:
        cost = (
            completion_cost(model=model, messages=messages, completion=text)
            or 0
        )
    except Exception:
        cost = 0
```
Done.
It looks like this field should be self.date rather than self.data, to match the format from _set_from_env().
I don't see where this app connects to OpenAI directly, but it does connect to mentat.ai and sends THEM your OpenAI key, so their fine-tuned model makes a GPT call on your behalf...
But then they have your key.
Hello. I have followed the instructions for installing, using pip install rawdog-ai.
However, when I run "rawdog" it gives me:
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "C:\Python312\Scripts\rawdog.exe\__main__.py", line 4, in <module>
File "C:\Python312\Lib\site-packages\rawdog\__main__.py", line 6, in <module>
from rawdog.llm_client import LLMClient
File "C:\Python312\Lib\site-packages\rawdog\llm_client.py", line 8, in <module>
from litellm import completion, completion_cost
File "C:\Python312\Lib\site-packages\litellm\__init__.py", line 518, in <module>
from .utils import (
File "C:\Python312\Lib\site-packages\litellm\utils.py", line 32, in <module>
import pkg_resources
ModuleNotFoundError: No module named 'pkg_resources'
The instructions aren't clear on how to run it with an LM Studio server.
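LM Studio's local server speaks the OpenAI API (by default at http://localhost:1234/v1), so a config along these lines may work; the provider value and model name here are assumptions (the model should match whatever is loaded in LM Studio):

```yaml
llm_api_key: not-needed
llm_base_url: http://localhost:1234/v1
llm_custom_provider: openai
llm_model: local-model
```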