aiwaves-cn / agents
An Open-source Framework for Autonomous Language Agents
Home Page: http://www.aiwaves-agents.com/
License: Apache License 2.0
I've read all the examples and wonder: is there any way to build an SQL agent using this Agents framework?
OS: macOS
Python version: 3.11
TensorFlow version: 2.13.0
pip install ai-agents
python run.py --agent Single_Agent/shopping_assistant/config.json
AttributeError: module 'tensorflow' has no attribute 'contrib'
Following the error message, I modified the script at:
/opt/anaconda3/envs/agents/lib/python3.11/site-packages/agents/scripts/networks.py
Installed: pip install tensorflow_probability
Imported the dependency:
import tensorflow_probability as tfp
Changed tfd = tf.contrib.distributions to tfd = tfp.distributions
^_^ Problem solved, and the demo now runs normally.
OpenAI is not reachable from within China. Could the authors add support for Azure OpenAI? That would let many companies in China use this framework.
Running it raises the error:
openai.error.APIConnectionError: Error communicating with OpenAI: HTTPSConnectionPool(host='api.openai.com', port=443)
Is an internet connection required to call the LLM? How can this be run locally without network access?
I try to run the given example: python Single_Agent/run_gradio.py --agent Single_Agent/shopping_assistant/config.json, but it returns the following error message:
server: executing `python Single_Agent/gradio_backend.py --agent Single_Agent/shopping_assistant/config.json` ...
Traceback (most recent call last):
File "/home/shijiayi/Projects/KI3-LLM/agents/examples/Single_Agent/gradio_backend.py", line 8, in <module>
from agents.Agent import Agent
ModuleNotFoundError: No module named 'agents.Agent'
How to solve it?
Thanks!!
pip install -e .
cd examples
python Single_Agent/run_gradio.py --agent Single_Agent/shopping_assistant/config.json
Got the following output
server: executing `python Single_Agent/gradio_backend.py --agent Single_Agent/shopping_assistant/config.json` ...
Traceback (most recent call last):
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1130, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/home/user/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 850, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 42, in <module>
from ...modeling_utils import PreTrainedModel
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/modeling_utils.py", line 38, in <module>
from .deepspeed import deepspeed_config, is_deepspeed_zero3_enabled
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/deepspeed.py", line 37, in <module>
from accelerate.utils.deepspeed import HfDeepSpeedConfig as DeepSpeedConfig
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/__init__.py", line 3, in <module>
from .accelerator import Accelerator
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/accelerator.py", line 35, in <module>
from .checkpointing import load_accelerator_state, load_custom_state, save_accelerator_state, save_custom_state
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/checkpointing.py", line 24, in <module>
from .utils import (
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/utils/__init__.py", line 136, in <module>
from .launch import (
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/utils/launch.py", line 33, in <module>
from ..utils.other import is_port_in_use, merge_dicts
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/utils/other.py", line 32, in <module>
from deepspeed import DeepSpeedEngine
File "/home/user/anaconda3/lib/python3.9/site-packages/deepspeed/__init__.py", line 14, in <module>
from . import module_inject
File "/home/user/anaconda3/lib/python3.9/site-packages/deepspeed/module_inject/__init__.py", line 1, in <module>
from .replace_module import replace_transformer_layer, revert_transformer_layer, ReplaceWithTensorSlicing
File "/home/user/anaconda3/lib/python3.9/site-packages/deepspeed/module_inject/replace_module.py", line 15, in <module>
from ..runtime.zero import GatheredParameters
File "/home/user/anaconda3/lib/python3.9/site-packages/deepspeed/runtime/zero/__init__.py", line 6, in <module>
from .partition_parameters import ZeroParamType
File "/home/user/anaconda3/lib/python3.9/site-packages/deepspeed/runtime/zero/partition_parameters.py", line 22, in <module>
from .linear import LinearModuleForZeroStage3, zero3_linear_wrap
File "/home/user/anaconda3/lib/python3.9/site-packages/deepspeed/runtime/zero/linear.py", line 20, in <module>
from deepspeed.runtime.utils import noop_decorator
File "/home/user/anaconda3/lib/python3.9/site-packages/deepspeed/runtime/utils.py", line 19, in <module>
from torch._six import inf
ModuleNotFoundError: No module named 'torch._six'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/user/Developer/_experiments/agents/examples/Single_Agent/gradio_backend.py", line 7, in <module>
from agents.SOP import SOP
File "/home/user/Developer/_experiments/agents/src/agents/__init__.py", line 2, in <module>
from .SOP import *
File "/home/user/Developer/_experiments/agents/src/agents/SOP.py", line 18, in <module>
from LLM.base_LLM import *
File "/home/user/Developer/_experiments/agents/examples/../src/agents/LLM/base_LLM.py", line 6, in <module>
from utils import save_logs
File "/home/user/Developer/_experiments/agents/examples/../src/agents/utils.py", line 25, in <module>
from text2vec import semantic_search
File "/home/user/anaconda3/lib/python3.9/site-packages/text2vec/__init__.py", line 8, in <module>
from text2vec.bertmatching_model import BertMatchModel
File "/home/user/anaconda3/lib/python3.9/site-packages/text2vec/bertmatching_model.py", line 16, in <module>
from transformers import BertForSequenceClassification, BertTokenizer
File "<frozen importlib._bootstrap>", line 1055, in _handle_fromlist
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1121, in __getattr__
value = getattr(module, name)
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1120, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1132, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.bert.modeling_bert because of the following error (look up to see its traceback):
No module named 'torch._six'
Torch version installed:
>>> import torch
>>> print(torch.__version__)
2.0.1+cu118
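Since the traceback bottoms out at `from torch._six import inf`, which PyTorch 2.x removed, one stopgap is to register a shim module before DeepSpeed is imported. This is an assumption, not an official fix; upgrading DeepSpeed to a PyTorch-2-compatible release is the cleaner route.

```python
# Hypothetical compatibility shim: torch._six was removed in PyTorch 2.0, but
# the installed DeepSpeed still runs `from torch._six import inf`. Registering
# a stub module carrying that one symbol restores the import.
import math
import sys
import types

shim = types.ModuleType("torch._six")
shim.inf = math.inf
sys.modules["torch._six"] = shim

# From here on, `from torch._six import inf` resolves again.
from torch._six import inf
print(inf)  # -> inf
```

The shim must run before any module in the DeepSpeed import chain is loaded, e.g. at the very top of the entry script.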
Hello, when running the single-agent shopping agent with a local model, I got an error:
src/agents/Component/ExtraComponent.py, line 77, in func: response_message = json.loads(response["function_call"]["arguments"])
KeyError: 'function_call'
How can I fix this?
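A hedged sketch of a defensive fix, assuming the local model simply returns no OpenAI-style `function_call` field (the exact response shape is an assumption):

```python
# Hypothetical defensive handling: fall back gracefully when the response has
# no "function_call" field instead of raising KeyError.
import json

def extract_function_args(response: dict) -> dict:
    call = response.get("function_call")
    if call is None:
        return {}
    try:
        return json.loads(call.get("arguments", "{}"))
    except json.JSONDecodeError:
        return {}

print(extract_function_args({"content": "plain text answer"}))  # -> {}
```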
Hi, I do not fully understand why you are not using the LangChain primitives, as there are plenty of tools already available. Is there an intent to migrate? Also, where is the roadmap? Thanks.
I've set my own proxy in the config file.
Anyone came into this problem?
openai.error.APIConnectionError: Error communicating with OpenAI: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/embeddings (Caused by SSLError(SSLCertVerificationError(1, "[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: Hostname mismatch, certificate is not valid for 'api.openai.com'. (_ssl.c:997)")))
Hi! It seems like the group chat becomes invite-only once it has over 200 members
pip install -e .
cd examples
python Single_Agent/run_gradio.py --agent Single_Agent/shopping_assistant/config.json
Got the following output:
server: executing `python Single_Agent/gradio_backend.py --agent Single_Agent/shopping_assistant/config.json` ...
[2023-09-26 13:43:39,014] [INFO] [real_accelerator.py:158:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1130, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/home/user/anaconda3/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 850, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/models/bert/modeling_bert.py", line 42, in <module>
from ...modeling_utils import PreTrainedModel
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/modeling_utils.py", line 38, in <module>
from .deepspeed import deepspeed_config, is_deepspeed_zero3_enabled
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/deepspeed.py", line 37, in <module>
from accelerate.utils.deepspeed import HfDeepSpeedConfig as DeepSpeedConfig
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/__init__.py", line 3, in <module>
from .accelerator import Accelerator
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/accelerator.py", line 41, in <module>
from .tracking import LOGGER_TYPE_TO_CLASS, GeneralTracker, filter_trackers
File "/home/user/anaconda3/lib/python3.9/site-packages/accelerate/tracking.py", line 50, in <module>
import wandb
File "/home/user/anaconda3/lib/python3.9/site-packages/wandb/__init__.py", line 26, in <module>
from wandb import sdk as wandb_sdk
File "/home/user/anaconda3/lib/python3.9/site-packages/wandb/sdk/__init__.py", line 5, in <module>
from . import wandb_helper as helper # noqa: F401
File "/home/user/anaconda3/lib/python3.9/site-packages/wandb/sdk/wandb_helper.py", line 6, in <module>
from .lib import config_util
File "/home/user/anaconda3/lib/python3.9/site-packages/wandb/sdk/lib/config_util.py", line 9, in <module>
from wandb.util import load_yaml
File "/home/user/anaconda3/lib/python3.9/site-packages/wandb/util.py", line 53, in <module>
import sentry_sdk # type: ignore
File "/home/user/anaconda3/lib/python3.9/site-packages/sentry_sdk/__init__.py", line 1, in <module>
from sentry_sdk.hub import Hub, init
File "/home/user/anaconda3/lib/python3.9/site-packages/sentry_sdk/hub.py", line 8, in <module>
from sentry_sdk.scope import Scope
File "/home/user/anaconda3/lib/python3.9/site-packages/sentry_sdk/scope.py", line 7, in <module>
from sentry_sdk.utils import logger, capture_internal_exceptions
File "/home/user/anaconda3/lib/python3.9/site-packages/sentry_sdk/utils.py", line 885, in <module>
HAS_REAL_CONTEXTVARS, ContextVar = _get_contextvars()
File "/home/user/anaconda3/lib/python3.9/site-packages/sentry_sdk/utils.py", line 855, in _get_contextvars
if not _is_contextvars_broken():
File "/home/user/anaconda3/lib/python3.9/site-packages/sentry_sdk/utils.py", line 816, in _is_contextvars_broken
from eventlet.patcher import is_monkey_patched # type: ignore
File "/home/user/anaconda3/lib/python3.9/site-packages/eventlet/__init__.py", line 17, in <module>
from eventlet import convenience
File "/home/user/anaconda3/lib/python3.9/site-packages/eventlet/convenience.py", line 7, in <module>
from eventlet.green import socket
File "/home/user/anaconda3/lib/python3.9/site-packages/eventlet/green/socket.py", line 21, in <module>
from eventlet.support import greendns
File "/home/user/anaconda3/lib/python3.9/site-packages/eventlet/support/greendns.py", line 79, in <module>
setattr(dns, pkg, import_patched('dns.' + pkg))
File "/home/user/anaconda3/lib/python3.9/site-packages/eventlet/support/greendns.py", line 61, in import_patched
return patcher.import_patched(module_name, **modules)
File "/home/user/anaconda3/lib/python3.9/site-packages/eventlet/patcher.py", line 132, in import_patched
return inject(
File "/home/user/anaconda3/lib/python3.9/site-packages/eventlet/patcher.py", line 109, in inject
module = __import__(module_name, {}, {}, module_name.split('.')[:-1])
File "/home/user/anaconda3/lib/python3.9/site-packages/dns/asyncquery.py", line 34, in <module>
from dns.query import _compute_times, _matches_destination, BadResponse, ssl, \
File "/home/user/anaconda3/lib/python3.9/site-packages/dns/query.py", line 52, in <module>
import httpx
File "/home/user/anaconda3/lib/python3.9/site-packages/httpx/__init__.py", line 2, in <module>
from ._api import delete, get, head, options, patch, post, put, request, stream
File "/home/user/anaconda3/lib/python3.9/site-packages/httpx/_api.py", line 4, in <module>
from ._client import Client
File "/home/user/anaconda3/lib/python3.9/site-packages/httpx/_client.py", line 30, in <module>
from ._transports.default import AsyncHTTPTransport, HTTPTransport
File "/home/user/anaconda3/lib/python3.9/site-packages/httpx/_transports/default.py", line 30, in <module>
import httpcore
File "/home/user/anaconda3/lib/python3.9/site-packages/httpcore/__init__.py", line 1, in <module>
from ._api import request, stream
File "/home/user/anaconda3/lib/python3.9/site-packages/httpcore/_api.py", line 5, in <module>
from ._sync.connection_pool import ConnectionPool
File "/home/user/anaconda3/lib/python3.9/site-packages/httpcore/_sync/__init__.py", line 1, in <module>
from .connection import HTTPConnection
File "/home/user/anaconda3/lib/python3.9/site-packages/httpcore/_sync/connection.py", line 12, in <module>
from .._synchronization import Lock
File "/home/user/anaconda3/lib/python3.9/site-packages/httpcore/_synchronization.py", line 13, in <module>
import trio
File "/home/user/anaconda3/lib/python3.9/site-packages/trio/__init__.py", line 18, in <module>
from ._core import (
File "/home/user/anaconda3/lib/python3.9/site-packages/trio/_core/__init__.py", line 27, in <module>
from ._run import (
File "/home/user/anaconda3/lib/python3.9/site-packages/trio/_core/_run.py", line 2452, in <module>
from ._io_epoll import EpollIOManager as TheIOManager
File "/home/user/anaconda3/lib/python3.9/site-packages/trio/_core/_io_epoll.py", line 188, in <module>
class EpollIOManager:
File "/home/user/anaconda3/lib/python3.9/site-packages/trio/_core/_io_epoll.py", line 189, in EpollIOManager
_epoll = attr.ib(factory=select.epoll)
AttributeError: module 'eventlet.green.select' has no attribute 'epoll'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/user/Developer/_experiments/agents/examples/Single_Agent/gradio_backend.py", line 7, in <module>
from agents.SOP import SOP
File "/home/user/Developer/_experiments/agents/src/agents/__init__.py", line 2, in <module>
from .SOP import *
File "/home/user/Developer/_experiments/agents/src/agents/SOP.py", line 18, in <module>
from LLM.base_LLM import *
File "/home/user/Developer/_experiments/agents/examples/../src/agents/LLM/base_LLM.py", line 6, in <module>
from utils import save_logs
File "/home/user/Developer/_experiments/agents/examples/../src/agents/utils.py", line 25, in <module>
from text2vec import semantic_search
File "/home/user/anaconda3/lib/python3.9/site-packages/text2vec/__init__.py", line 8, in <module>
from text2vec.bertmatching_model import BertMatchModel
File "/home/user/anaconda3/lib/python3.9/site-packages/text2vec/bertmatching_model.py", line 16, in <module>
from transformers import BertForSequenceClassification, BertTokenizer
File "<frozen importlib._bootstrap>", line 1055, in _handle_fromlist
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1121, in __getattr__
value = getattr(module, name)
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1120, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/home/user/anaconda3/lib/python3.9/site-packages/transformers/utils/import_utils.py", line 1132, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.bert.modeling_bert because of the following error (look up to see its traceback):
module 'eventlet.green.select' has no attribute 'epoll'
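One workaround, assuming the crash comes from eventlet's greendns monkey-patching (a later comment in this thread uses the same switch): set EVENTLET_NO_GREENDNS before anything imports eventlet.

```python
# Possible workaround: eventlet skips its greendns monkey-patching when the
# EVENTLET_NO_GREENDNS environment variable is set, so the failing
# dns -> httpx -> httpcore -> trio import chain is never triggered.
# This must run before anything imports eventlet.
import os

os.environ["EVENTLET_NO_GREENDNS"] = "yes"

# ...then start the app as usual, e.g.:
# python Single_Agent/run_gradio.py --agent Single_Agent/shopping_assistant/config.json
print(os.environ["EVENTLET_NO_GREENDNS"])  # -> yes
```

Setting the variable in the shell before launching (rather than in code) achieves the same effect.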
Hi, I'd like to use this project as a dependency via Poetry; however, when I install the package from git (since the version on PyPI is vastly outdated), it seems all the submodules are missing.
Reproduction steps:
poetry init
poetry add git+https://github.com/aiwaves-cn/agents.git
Inspect site-packages/agents and observe that only top-level Python files are included; in my case:
├── evolve.py
├── __init__.py
├── SOP.py
├── State.py
├── template.py
└── utils.py
I suspect this could be a regression introduced in 2ae5457. I'm no distutils expert, but it seems specifying just packages = ['agents'] will omit all subpackages (e.g. agents.LLM).
Perhaps one option to fix it would be to combine the previously removed find_packages with package_dir, like this?
packages=find_packages(where='src'),
package_dir={"": "src"}
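A small runnable sketch of why this works: with a src/ layout, find_packages must be pointed at src/ to discover agents and its subpackages (the directory names below just recreate a minimal copy of this repo's layout).

```python
# find_packages(where="src") discovers the top-level package *and* its
# subpackages; package_dir={"": "src"} then maps the import root to src/.
import os
import tempfile

from setuptools import find_packages

# Recreate a minimal src/ layout in a temp dir.
root = tempfile.mkdtemp()
for pkg in ("agents", os.path.join("agents", "LLM")):
    os.makedirs(os.path.join(root, "src", pkg))
    open(os.path.join(root, "src", pkg, "__init__.py"), "w").close()

pkgs = sorted(find_packages(where=os.path.join(root, "src")))
print(pkgs)  # -> ['agents', 'agents.LLM']

# In setup.py the corresponding call would be:
# setup(packages=find_packages(where="src"), package_dir={"": "src"})
```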
Running oculist_agent raises a matrix dimension mismatch error. How can I fix this? Thanks!
I am unable to run this locally. There are issues with the imports. Was anyone able to install and run this successfully?
Currently the WebSearchComponent supports Google and Bing organic search results.
agents/src/agents/Component/ToolComponent.py
Line 173 in b4ec704
More data can be obtained in a similar way to Langchain (ref: https://github.com/langchain-ai/langchain/blob/48a04aed7539f47a2a12fbe8b863a7813814ff86/libs/langchain/langchain/utilities/serpapi.py#L128-L215).
We'd be happy to integrate SerpApi into aiwaves-agents.
What do you think about supporting more data from Google, Bing, and other search engines results pages?
{ "config":{
"API_KEY" : "xxxxxxx",
"PROXY" : "",
"MAX_CHAT_HISTORY" : "1000",
"User_Names" : "[\"alexander\"]",
"Embed_Model" : "intfloat/multilingual-e5-large"
},
"LLM_type": "OpenAI",
"LLM": {
"temperature": 0.0,
"model": "gpt-3.5-turbo-16k-0613",
"log_path": "logs/god"
},
"agents": {
"Mike": {
"style":"humorous and professional",
"roles":{
"response_state": "customer service personnel"
}
},
"alexander": {
"style":"cold and unruly",
"roles":{
"response_state": "user"
}
}
},
"root": "response_state",
"relations": {
"response_state": {
"0": "response_state"
}
},
"temperature": 0.6,
"log_path": "logs",
"states": {
"end_state": {
},
"response_state": {
"controller": {
"controller_type": "order"
},
"begin_role": "customer service personnel",
"begin_query": "hello, could i can help you?",
"environment_prompt": "",
"roles": [
"customer service personnel",
"user"
],
"LLM_type": "OpenAI",
"LLM": {
"temperature": 1.0,
"model": "gpt-3.5-turbo-16k-0613",
"log_path": "logs/customer_service"
},
"agent_states": {
"customer service personnel": {
"style": {
"role": "customer service personnel"
},
"task": {
"task": "answer user's question"
},
"rule": {
"rule": "Keep your response as concise as possible."
},
"KnowledgeBaseComponent": {
"top_k": 1,
"type": "QA",
"knowledge_path": "Single_Agent/customer_service/easy_question.json"
}
},
"user": {
}
}
}
}
}
C:\Users\xxxxx\Desktop\agents\examples>python run.py --agent Single_Agent/customer_service/customer_knowledge.json
path Single_Agent/customer_service/easy_question.json
Mike(customer service personnel):hello, could i can help you?
alexander:selam
0.7510939240455627
Traceback (most recent call last):
File "C:\Users\xxxx\Desktop\agents\examples\run.py", line 44, in
run(agents,sop,environment)
File "C:\Users\xxxx\Desktop\agents\examples\run.py", line 35, in run
memory = action.process()
File "C:\Users\xxx\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\Action\base_action.py", line 25, in process
for res in response:
File "C:\Users\xxx\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\LLM\base_LLM.py", line 39, in get_stream
save_logs(log_path, messages, ans)
File "C:\Users\xxx\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\utils.py", line 127, in save_logs
with open(log_file, "w", encoding="utf-8") as f:
OSError: [Errno 22] Invalid argument: 'logs/Mike\2023-09-1801:15:51.json'
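A hypothetical fix for the OSError: Windows forbids ':' in file names, and the log name embeds an "HH:MM:SS" timestamp. Formatting the timestamp without colons yields a path that is valid on every platform (the helper name and layout below are illustrative, not the package's actual save_logs code):

```python
# Build a timestamped log file name that avoids characters Windows rejects.
import os
from datetime import datetime

def safe_log_file(log_dir: str, agent_name: str) -> str:
    stamp = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")  # no ':' characters
    return os.path.join(log_dir, f"{agent_name}_{stamp}.json")

path = safe_log_file("logs", "Mike")
print(":" in os.path.basename(path))  # -> False
```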
cd example
python run.py --agent Multi_Agent/debate/config.json
File "run.py", line 3, in
from agents.SOP import SOP
ModuleNotFoundError: No module named 'agents.SOP'
https://github.com/AI-Engineer-Foundation/agent-protocol
Just read your paper. I love how clean and fundamental your approach to the entire Agents ecosystem is. Curious whether you'd consider using the Agent Protocol we have developed. This OSS project resides under a non-profit called the "AI Engineer Foundation". The mission is to establish industry standards so that people do not have to reinvent the wheel and so that we can collaborate more.
By implementing Agent Protocol, you'd be able to communicate with all other Agents out there that also conform to the same protocol as well as being able to be benchmarked.
AutoGPT is running a hackathon, and they have created a benchmarking tool built with Agent Protocol. Other projects that have adopted us besides AutoGPT are: GPTEngineer, PolyAGI, Beebot, E2B... the list is growing.
Excited to hear back from you!
Hi 🤗 !
Very cool work with the Gradio demos linked in the README! It would be nice to host these awesome demos on the Hugging Face Hub!
Some of the benefits of sharing your models through the Hub would be:
This is a step-by-step guide explaining the process in case you're interested. 😊 This is our docs on Community GPU Grants.
Config:
{
"config": {
"API_KEY": "sk-xxxx",
"PROXY": "http://127.0.0.1:23457",
"API_BASE": "",
"MAX_CHAT_HISTORY": "5",
"MIN_CATEGORY_SIM": "0.7",
"FETSIZE": "3",
"User_Names": "[\"Mike\"]",
"SHOPPING_SEARCH": "Your search url"
},
"LLM_type": "OpenAI",
"LLM": {
"temperature": 0.3,
"model": "gpt-3.5-turbo-16k-0613",
"log_path": "logs/god"
},
"root": "shopping_state",
"relations": {
"shopping_state": {
"0": "shopping_state",
"1": "end_state"
}
},
"agents": {
"Jonh": {
"style": "humorous",
"roles": {
"shopping_state": "Shopping assistant"
}
},
"Mike": {
"style": "indifferent",
"roles": {
"shopping_state": "customer"
}
}
},
"states": {
"end_state": {},
"shopping_state": {
"roles": [
"Shopping assistant",
"customer"
],
"controller": {
"controller_type": "order",
"judge_system_prompt": "",
"judge_last_prompt": "response<end>0<end>",
"judge_extract_words": "end",
"call_system_prompt": "",
"call_last_prompt": "",
"call_extract_words": "end"
},
"environment_prompt": "The current scenario is an AI shopping guide answering customers' questions and recommending corresponding products based on the user's needs to induce users to buy. The main roles are: shopping_assistant (Jonh) is responsible for answering users’ questions, and customers (Mike) come to consult shopping opinions.",
"begin_role": "Shopping assistant",
"begin_query": "Welcome to the store, is there anything you want to buy?",
"agent_states": {
"Shopping assistant": {
"style": {
"role": "Shopping assistant",
"style": "humorous"
},
"task": {
"task": "Guide users to purchase goods"
},
"rule": {
"rule": "Your language should be as concise as possible, without too much nonsense. You have to repeatedly guide customers to purchase goods"
},
"CategoryRequirementsComponent": {
"information_path": [
"Single_Agent/shopping_assistant/bb_info.json",
"Single_Agent/shopping_assistant/toy_info.json"
]
}
},
"customer": {
"style": {
"role": "customer"
},
"task": {
"task": "Make things difficult for the shopping guide and ask him to provide as much product information as possible"
},
"rule": {
"rule": "Your language should be as concise as possible, without too much nonsense. You cannot ask questions repeatedly. You need to ask different aspects each time, and you can also ask questions reasonably. You can choose not to shop, but you must ask for as much information as possible. You need to constantly ask for different aspects of product information and follow up reasonably. Remember, your tone needs to be cold and your style needs to be tricky"
}
}
}
}
}
}
Command:
python run.py --agent Single_Agent/shopping_assistant/config.json
Error Log:
Jonh(Shopping assistant):Welcome to the store, is there anything you want to buy?
Mike: hell
Jonh(Shopping assistant):Hey Mike, welcome to the store! How can I assist you today? Is there something specific you're looking to buy?
Mike: Buy toys
Traceback (most recent call last):
File "/Users/zhangyajun/Documents/CodeWorkSpace/skyjun/agents/examples/run.py", line 44, in <module>
run(agents,sop,environment)
File "/Users/zhangyajun/Documents/CodeWorkSpace/skyjun/agents/examples/run.py", line 34, in run
action = current_agent.step(current_state,user_input) #component_dict = current_state[self.role[current_node.name]] current_agent.compile(component_dict)
File "/Users/zhangyajun/Documents/CodeWorkSpace/skyjun/agents/examples/../src/agents/Agent/Agent.py", line 131, in step
response,res_dict = self.act()
File "/Users/zhangyajun/Documents/CodeWorkSpace/skyjun/agents/examples/../src/agents/Agent/Agent.py", line 153, in act
system_prompt, last_prompt, res_dict = self.compile()
File "/Users/zhangyajun/Documents/CodeWorkSpace/skyjun/agents/examples/../src/agents/Agent/Agent.py", line 218, in compile
response = component.func(self)
File "/Users/zhangyajun/Documents/CodeWorkSpace/skyjun/agents/examples/../src/agents/Component/ExtraComponent.py", line 89, in func
topk_result = matching_category(
File "/Users/zhangyajun/Documents/CodeWorkSpace/skyjun/agents/examples/../src/agents/utils.py", line 357, in matching_category
top_k_cat = torch.topk(total_scores, k=top_k)
RuntimeError: selected index k out of range
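A hedged sketch of one way to avoid this crash, assuming the cause is that the category info files yield fewer categories than top_k expects; plain Python stands in for the torch.topk call:

```python
# torch.topk raises "selected index k out of range" when k exceeds the number
# of scores. Clamping k to the available count avoids the crash.
def top_k_scores(scores, top_k):
    k = min(top_k, len(scores))  # clamp instead of passing top_k blindly
    return sorted(scores, reverse=True)[:k]

print(top_k_scores([0.2, 0.9], top_k=3))  # -> [0.9, 0.2]
```

In matching_category itself the equivalent change would be clamping k before the torch.topk call.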
Could support for Chinese be added or fixed? When running run.py for the single SOP, Chinese text cannot be written into the JSON file.
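A hedged guess at the cause: json.dump escapes non-ASCII text by default (Chinese comes out as \uXXXX sequences), or the file is opened without UTF-8 encoding and raises an encoding error. Opening with encoding="utf-8" and passing ensure_ascii=False keeps the characters intact:

```python
# Write Chinese text to a JSON file without escaping it to \uXXXX sequences.
import json

data = {"reply": "你好，世界"}
with open("log.json", "w", encoding="utf-8") as f:
    json.dump(data, f, ensure_ascii=False, indent=2)

text = open("log.json", encoding="utf-8").read()
print("你好" in text)  # -> True
```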
ModuleNotFoundError: No module named 'utils'
`
from agents import SOP
Traceback (most recent call last):
File "", line 1, in
File "/Users/.../anaconda3/envs/agents/lib/python3.9/site-packages/agents/__init__.py", line 2, in
from .SOP import *
File "/Users/.../anaconda3/envs/agents/lib/python3.9/site-packages/agents/SOP.py", line 18, in
from LLM.base_LLM import *
ModuleNotFoundError: No module named 'LLM'
`
Getting errors when I try the demos, e.g. the software company demo.
While running the code, I get this error:
openai.error.APIConnectionError: Error communicating with OpenAI: Please check proxy URL. It is malformed and could be missing the host.
Also, what should be given for API_BASE and PROXY in the config file?
Are there any server requirements? I tried to run it but failed because my server only has 2.5 GB RAM and 2 cores.
I'm not sure what the problem is, but this demo never completes a full conversation.
“Human-Agent Interation” should be "Human-Agent Interaction"
I use the "chat_bot" config. When I run the program in the terminal, the bot prints:
-- Yang bufan(Yang bufan):hello,What are you looking for me for?
The code stops when it reaches the line "environment.update_memory". How should I deal with this?
When I use pip install ai-agents, the install returns an error message:
ERROR: Cannot install ai-agents==0.0.1, ai-agents==0.0.2 and ai-agents==0.0.3 because these package versions have conflicting dependencies.
The conflict is caused by:
ai-agents 0.0.3 depends on langchain==0.0.308
ai-agents 0.0.2 depends on langchain==0.0.308
ai-agents 0.0.1 depends on gradio==3.39.0
How to solve it?
Thanks!!
PROXY = #Your proxy
API_BASE = # Your api_base
What do the two values PROXY and API_BASE mean?
I have a txt file as my local knowledge base, and I want to use the utils.process_document function to process it. The database JSON file it generates is not the "QA" type, so when I add this tool to my config.json and run run.py, it returns the following error:
AttributeError: 'KnowledgeBaseComponent' object has no attribute 'kb_answers'
The code is here:
else:
for hit in hits:
matching_idx = hit["corpus_id"]
if self.kb_chunks[matching_idx] in temp:
pass
else:
knowledge = knowledge + f"{self.kb_answers[matching_idx]}\n\n"
temp.append(self.kb_answers[matching_idx])
if len(temp) == self.top_k:
break
print(hits[0]["score"])
score = hits[0]["score"]
if score < 0.5:
return {"prompt": "No matching knowledge base"}
else:
print(knowledge)
return {"prompt": "The relevant content is: " + knowledge + "\n"}
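A hypothetical guard for the missing attribute (kb_answers appears to exist only for "QA"-type knowledge bases, judging by the traceback): fall back to the raw text chunks when there are no separate answers.

```python
# Fall back to kb_chunks when the knowledge base has no kb_answers.
def pick_answers(component):
    return getattr(component, "kb_answers", None) or component.kb_chunks

class FakeKB:  # stands in for a KnowledgeBaseComponent built from a txt file
    kb_chunks = ["chunk one", "chunk two"]

print(pick_answers(FakeKB()))  # -> ['chunk one', 'chunk two']
```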
When I run the run_gradio.py from software_company dir, it returns the following error:
KeyError: 'pwd'
The related code is here:
if state == 99:
# finish
fs = [self.cache['pwd']+'/output_code/'+_ for _ in os.listdir(self.cache['pwd']+'/output_code')]
yield gr.File.update(value=fs, visible=True, interactive=True),\
history, \
gr.Chatbot.update(visible=True),\
gr.Button.update(visible=True, interactive=True, value="Start"),\
gr.Button.update(visible=True, interactive=True),\
gr.Textbox.update(visible=True, interactive=True, placeholder="Please input your requirement", value=""),\
gr.Button.update(visible=False)
return
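A hypothetical defensive rewrite of the failing line: self.cache['pwd'] is presumably only set after a run has produced output, so reading it with a fallback avoids the KeyError (the fallback to the current directory is an assumption, not the project's intended behavior):

```python
# Read the cached working directory with a fallback instead of indexing.
import os

cache = {}  # simulates self.cache before 'pwd' has been stored
pwd = cache.get("pwd", os.getcwd())
output_dir = os.path.join(pwd, "output_code")
print(output_dir.endswith("output_code"))  # -> True
```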
PaLM
LLaMA
Claude
It seems strange to me that a paper arguing agents should be usable by technical and non-technical users alike has no documentation on how to set up and run this repo.
I ran run_gradio.py with PyCharm, but it failed. The log is as follows:
Traceback (most recent call last):
File "D:\programs\anaconda3\envs\agents\lib\site-packages\gradio\routes.py", line 442, in run_predict
output = await app.get_blocks().process_api(
File "D:\programs\anaconda3\envs\agents\lib\site-packages\gradio\utils.py", line 861, in __exit__
matplotlib.use(self.original_backend)
File "D:\programs\anaconda3\envs\agents\lib\site-packages\matplotlib\__init__.py", line 1276, in use
plt.switch_backend(name)
File "D:\programs\anaconda3\envs\agents\lib\site-packages\matplotlib\pyplot.py", line 343, in switch_backend
canvas_class = module.FigureCanvas
AttributeError: module 'backend_interagg' has no attribute 'FigureCanvas'
Your software company demo does not work.
My config.json looks like:
{
"config": {
"API_KEY": "sk-mAPN8QMPJKY56xifrpT3BlbkFJCxEKRHRBMfMtNWFALpoq",
"PROXY": "",
"API_BASE": "",
"MAX_CHAT_HISTORY": "1000",
"User_Names": "[\"alexander\"]"
},
"WebSearchComponent": {
"engine_name": "google",
"api": {
"google": {
"cse_id": "94ecfac400840f5",
"api_key": "AIzaSyWP6l-rxAHsV6VFs-mZUPBAhZBOSq-Ck"
},
"bing": "Your bing key"
}
},
"LLM_type": "OpenAI",
...
The cse_id and api_key are correct: I tested them in a separate script.
I call it with python run.py --agent config.json and get:
File "/Users/wodecki/anaconda3/envs/aiagents/lib/python3.10/site-packages/agents/State.py", line 123, in init_components
component_dict["WebSearchComponent"] = WebSearchComponent(
File "/Users/wodecki/anaconda3/envs/aiagents/lib/python3.10/site-packages/agents/Component/ToolComponent.py", line 184, in __init__
assert engine_name in WebSearchComponent.__ENGINE_NAME__
AssertionError
Any idea why?
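The assertion fires when engine_name is not one of the keys in WebSearchComponent.__ENGINE_NAME__. A hedged sketch of what the check tests, with the supported list assumed from the project's docs (Google and Bing); note the config above already uses lowercase "google", so the real cause may lie elsewhere, e.g. where the WebSearchComponent block sits in the config:

```python
# What the failing assertion effectively checks (supported list assumed).
SUPPORTED_ENGINES = ["google", "bing"]

def check_engine(engine_name: str) -> str:
    # Stray whitespace or wrong casing ("Google", "google ") in config.json
    # would trip the component's membership assertion.
    name = engine_name.strip().lower()
    assert name in SUPPORTED_ENGINES, f"unsupported engine: {engine_name!r}"
    return name

print(check_engine("Google"))  # -> google
```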
pip install -e .
cd examples
EVENTLET_NO_GREENDNS="yes" python Single_Agent/run_gradio.py --agent Single_Agent/shopping_assistant/config.json
(Added EVENTLET_NO_GREENDNS="yes" because of #94 (comment).) Got the following output:
server: executing `python Single_Agent/gradio_backend.py --agent Single_Agent/shopping_assistant/config.json` ...
[2023-09-26 13:58:06,608] [INFO] [real_accelerator.py:158:get_accelerator] Setting ds_accelerator to cuda (auto detect)
Traceback (most recent call last):
File "/home/user/Developer/_experiments/agents/examples/Single_Agent/gradio_backend.py", line 7, in <module>
from agents.SOP import SOP
File "/home/user/Developer/_experiments/agents/src/agents/__init__.py", line 2, in <module>
from .SOP import *
File "/home/user/Developer/_experiments/agents/src/agents/SOP.py", line 18, in <module>
from LLM.base_LLM import *
File "/home/user/Developer/_experiments/agents/examples/../src/agents/LLM/base_LLM.py", line 6, in <module>
from utils import save_logs
File "/home/user/Developer/_experiments/agents/examples/../src/agents/utils.py", line 28, in <module>
from langchain.document_loaders import UnstructuredFileLoader
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/__init__.py", line 6, in <module>
from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/agents/__init__.py", line 31, in <module>
from langchain.agents.agent import (
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/agents/agent.py", line 23, in <module>
from langchain.agents.agent_iterator import AgentExecutorIterator
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/agents/agent_iterator.py", line 30, in <module>
from langchain.tools import BaseTool
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/tools/__init__.py", line 25, in <module>
from langchain.tools.arxiv.tool import ArxivQueryRun
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/tools/arxiv/tool.py", line 8, in <module>
from langchain.utilities.arxiv import ArxivAPIWrapper
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/utilities/__init__.py", line 7, in <module>
from langchain.utilities.apify import ApifyWrapper
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/utilities/apify.py", line 3, in <module>
from langchain.document_loaders import ApifyDatasetLoader
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/document_loaders/__init__.py", line 89, in <module>
from langchain.document_loaders.github import GitHubIssuesLoader
File "/home/user/anaconda3/lib/python3.9/site-packages/langchain/document_loaders/github.py", line 37, in <module>
class GitHubIssuesLoader(BaseGitHubLoader):
File "pydantic/main.py", line 198, in pydantic.main.ModelMetaclass.__new__
File "pydantic/fields.py", line 506, in pydantic.fields.ModelField.infer
File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
File "pydantic/fields.py", line 663, in pydantic.fields.ModelField._type_analysis
File "pydantic/fields.py", line 808, in pydantic.fields.ModelField._create_sub_type
File "pydantic/fields.py", line 436, in pydantic.fields.ModelField.__init__
File "pydantic/fields.py", line 552, in pydantic.fields.ModelField.prepare
File "pydantic/fields.py", line 668, in pydantic.fields.ModelField._type_analysis
File "/home/user/anaconda3/lib/python3.9/typing.py", line 852, in __subclasscheck__
return issubclass(cls, self.__origin__)
TypeError: issubclass() arg 1 must be a class
config
{
"config": {
"API_KEY": "sk-",
"PROXY": "http://127.0.0.1:7890",
"API_BASE": "",
"MAX_CHAT_HISTORY": "5",
"MIN_CATEGORY_SIM": "0.7",
"FETSIZE": "3",
"User_Names": "[\"Mike\"]",
"SHOPPING_SEARCH": "Your search url"
},
"LLM_type": "OpenAI",
"LLM": {
"temperature": 0.3,
"model": "gpt-3.5-turbo-16k-0613",
"log_path": "logs/god"
},
"root": "shopping_state",
"relations": {
"shopping_state": {
"0": "shopping_state",
"1": "end_state"
}
},
"agents": {
"Jonh": {
"style": "humorous",
"roles": {
"shopping_state": "Shopping assistant"
}
},
"Mike": {
"style": "indifferent",
"roles": {
"shopping_state": "customer"
}
}
},
"states": {
"end_state": {},
"shopping_state": {
"roles": [
"Shopping assistant",
"customer"
],
"controller": {
"controller_type": "order",
"judge_system_prompt": "",
"judge_last_prompt": "response<end>0<end>",
"judge_extract_words": "end",
"call_system_prompt": "",
"call_last_prompt": "",
"call_extract_words": "end"
},
"environment_prompt": "The current scenario is an AI shopping guide answering customers' questions and recommending corresponding products based on the user's needs to induce users to buy. The main roles are: shopping_assistant (Jonh) is responsible for answering users’ questions, and customers (Mike) come to consult shopping opinions.",
"begin_role": "Shopping assistant",
"begin_query": "Welcome to the store, is there anything you want to buy?",
"agent_states": {
"Shopping assistant": {
"style": {
"role": "Shopping assistant",
"style": "humorous"
},
"task": {
"task": "Guide users to purchase goods"
},
"rule": {
"rule": "Your language should be as concise as possible, without too much nonsense. You have to repeatedly guide customers to purchase goods"
},
"CategoryRequirementsComponent": {
"information_path": [
"Single_Agent/shopping_assistant/bb_info.json",
"Single_Agent/shopping_assistant/toy_info.json"
]
}
},
"customer": {
"style": {
"role": "customer"
},
"task": {
"task": "Make things difficult for the shopping guide and ask him to provide as much product information as possible"
},
"rule": {
"rule": "Your language should be as concise as possible, without too much nonsense. You cannot ask questions repeatedly. You need to ask different aspects each time, and you can also ask questions reasonably. You can choose not to shop, but you must ask for as much information as possible. You need to constantly ask for different aspects of product information and follow up reasonably. Remember, your tone needs to be cold and your style needs to be tricky"
}
}
}
}
}
}
command & error
python run.py --agent Single_Agent/shopping_assistant/config.json
Jonh(Shopping assistant):Welcome to the store, is there anything you want to buy?
Mike:candy
Traceback (most recent call last):
File "/workspace/dhl/agents/examples/run.py", line 44, in <module>
run(agents,sop,environment)
File "/workspace/dhl/agents/examples/run.py", line 34, in run
action = current_agent.step(current_state,user_input) #component_dict = current_state[self.role[current_node.name]] current_agent.compile(component_dict)
File "/workspace/dhl/agents/examples/../src/agents/Agent/Agent.py", line 131, in step
response,res_dict = self.act()
File "/workspace/dhl/agents/examples/../src/agents/Agent/Agent.py", line 153, in act
system_prompt, last_prompt, res_dict = self.compile()
File "/workspace/dhl/agents/examples/../src/agents/Agent/Agent.py", line 218, in compile
response = component.func(self)
File "/workspace/dhl/agents/examples/../src/agents/Component/ExtraComponent.py", line 89, in func
topk_result = matching_category(
File "/workspace/dhl/agents/examples/../src/agents/utils.py", line 345, in matching_category
input_embeder = get_embedding(inputtext)
File "/workspace/dhl/agents/examples/../src/agents/utils.py", line 53, in get_embedding
embed = embedding_model.create(
File "/workspace/dhl/conda/envs/ag/lib/python3.10/site-packages/openai/api_resources/embedding.py", line 33, in create
response = super().create(*args, **kwargs)
File "/workspace/dhl/conda/envs/ag/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
response, _, api_key = requestor.request(
File "/workspace/dhl/conda/envs/ag/lib/python3.10/site-packages/openai/api_requestor.py", line 298, in request
resp, got_stream = self._interpret_response(result, stream)
File "/workspace/dhl/conda/envs/ag/lib/python3.10/site-packages/openai/api_requestor.py", line 700, in _interpret_response
self._interpret_response_line(
File "/workspace/dhl/conda/envs/ag/lib/python3.10/site-packages/openai/api_requestor.py", line 765, in _interpret_response_line
raise self.handle_error_response(
openai.error.RateLimitError: Rate limit reached for default-text-embedding-ada-002 in organization org-lzTj3mnzjnM5lxYLwxprz3di on requests per min. Limit: 3 / min. Please try again in 20s. Contact us through our help center at help.openai.com if you continue to have issues. Please add a payment method to your account to increase your rate limit. Visit https://platform.openai.com/account/billing to add a payment method
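Since the free tier only allows 3 embedding requests per minute, one workaround is to retry with exponential backoff. A sketch (`with_backoff` and its parameters are my own names, not part of the framework):

```python
import time

def with_backoff(fn, retries=5, base_delay=20.0):
    """Call fn(), retrying after rate-limit errors with growing delays."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:  # in practice, catch openai.error.RateLimitError
            if attempt == retries - 1:
                raise  # out of retries; surface the original error
            delay = base_delay * (2 ** attempt)  # 20s, 40s, 80s, ...
            time.sleep(delay)

# Usage sketch: wrap the embedding call that currently raises, e.g.
# result = with_backoff(lambda: embedding_model.create(
#     input=text, model="text-embedding-ada-002"))
```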
Is there any documentation on how to swap in the API of a locally deployed LLM?
Hi, I am having trouble when running it in my local environment. What is proxy and api base that needs to be added in the config file? Where can i generate that information?
When I run multi_agent using run_gradio command, I will get:
raise AccessDenied(self.pid, self._name)
psutil.AccessDenied: (pid=89569)
Traceback (most recent call last):
File "/.../agents_pg/.venv/lib/python3.11/site-packages/psutil/_psosx.py", line 346, in wrapper
return fun(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../agents_pg/.venv/lib/python3.11/site-packages/psutil/_psosx.py", line 503, in connections
rawlist = cext.proc_connections(self.pid, families, types)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
PermissionError: [Errno 1] Operation not permitted (originated from proc_pidinfo(PROC_PIDLISTFDS) 1/2)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/.../agents/examples/Muti_Agent/software_company/run_gradio.py", line 5, in <module>
from gradio_base import WebUI, UIHelper, PORT, HOST, Client
File "/.../agents_pg/agents/examples/Muti_Agent/software_company/../../Gradio_Config/gradio_base.py", line 91, in <module>
PORT = check_port(PORT)
^^^^^^^^^^^^^^^^
File "/.../agents_pg/agents/examples/Muti_Agent/software_company/../../Gradio_Config/gradio_base.py", line 75, in check_port
if is_port_in_use(port+i) == False:
^^^^^^^^^^^^^^^^^^^^^^
File "/.../agents_pg/agents/examples/Muti_Agent/software_company/../../Gradio_Config/gradio_base.py", line 64, in is_port_in_use
for conn in psutil.net_connections():
^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../agents_pg/.venv/lib/python3.11/site-packages/psutil/__init__.py", line 2163, in net_connections
return _psplatform.net_connections(kind)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../agents_pg/.venv/lib/python3.11/site-packages/psutil/_psosx.py", line 248, in net_connections
cons = Process(pid).connections(kind)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/.../agents_pg/.venv/lib/python3.11/site-packages/psutil/_psosx.py", line 353, in wrapper
raise AccessDenied(self.pid, self._name)
psutil.AccessDenied: (pid=89569)
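`psutil.net_connections()` needs elevated privileges on macOS, which is why the port scan in `gradio_base.py` dies with `AccessDenied`. A privilege-free alternative (my own replacement sketch, not the project's code) is to simply try binding the port:

```python
import socket

def is_port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """True if something is already listening on (host, port)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))  # bind succeeds only if the port is free
            return False
        except OSError:
            return True
```

Alternatively, running the script with elevated privileges should sidestep the psutil restriction, at the cost of running the demo as root.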
The home page now serves the default nginx page: http://www.aiwaves-agents.com/
The Discord invite is now invalid: https://discord.gg/DDPBeFt7
Great paper, although running the example configs with Gradio and in the terminal gives this error. Have not tested with FastAPI yet.
C:\Users\bosef\agents\examples>python run.py --agent Single_Agent/customer_service/customer_websearch.json
Traceback (most recent call last):
File "C:\Users\bosef\agents\examples\run.py", line 43, in <module>
agents,sop,environment = init(args.agent)
File "C:\Users\bosef\agents\examples\run.py", line 13, in init
sop = SOP.from_config(config)
File "C:\Users\bosef\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\SOP.py", line 72, in from_config
sop = SOP(**config)
File "C:\Users\bosef\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\SOP.py", line 44, in __init__
self.init_states(kwargs["states"])
File "C:\Users\bosef\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\SOP.py", line 78, in init_states
self.states[state_name] = State(**state_dict)
File "C:\Users\bosef\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\State.py", line 31, in __init__
self.init_components(kwargs["agent_states"])
File "C:\Users\bosef\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\State.py", line 123, in init_components
component_dict["WebSearchComponent"] = WebSearchComponent(
File "C:\Users\bosef\AppData\Local\Programs\Python\Python310\lib\site-packages\agents\Component\ToolComponent.py", line 183, in __init__
assert engine_name in WebSearchComponent.__ENGINE_NAME__
AssertionError
Running the Muti_agent/software_company example locally, I get:
Traceback (most recent call last):
File "/var/www/html/agents/examples/run.py", line 44, in <module>
run(agents,sop,environment)
File "/var/www/html/agents/examples/run.py", line 35, in run
memory = action.process()
File "/var/www/html/agents/examples/../src/agents/Action/base_action.py", line 39, in process
title = extract(all,"title")
NameError: name 'extract' is not defined