Comments (7)
Got it, thanks.
@sonichi do we not use the updated version of tiktoken?
We don't restrict the tiktoken version, so @Nathan-Intergral could you elaborate on the old-version comment?
@sonichi Yeah, the version is not restricted; it's just that the version you download as part of the dependencies doesn't work right now, so it's not a big deal. As I said in the original post, I simply upgraded the dependency in my personal project to fix the issue.
The only real issue is the second one mentioned in token_count_utils.py.
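For anyone hitting the same problem, the workaround described above amounts to upgrading tiktoken in your own environment. The version floor below is my best understanding (support for gpt-4o's o200k_base encoding was added in tiktoken 0.7.0), so treat it as an assumption:

```shell
# Upgrade tiktoken in the active virtualenv; 0.7.0 is believed to be
# the first release that knows about gpt-4o's o200k_base encoding.
pip install --upgrade "tiktoken>=0.7.0"

# Or pin it in requirements.txt alongside autogen:
# tiktoken>=0.7.0
```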
Also, I have been testing further with gpt-4o and I seem to be getting this error fairly often:
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
Here is the full stack trace:
Traceback (most recent call last):
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/src/group_chat/service.py", line 116, in process_metrics_callback
await overseer.a_initiate_chat(manager, message=plan)
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1057, in a_initiate_chat
await self.a_send(msg2send, recipient, silent=silent)
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 682, in a_send
await recipient.a_receive(message, self, request_reply, silent)
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 829, in a_receive
reply = await self.a_generate_reply(sender=sender)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1996, in a_generate_reply
final, reply = await reply_func(
^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/groupchat.py", line 746, in a_run_chat
reply = await speaker.a_generate_reply(sender=self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1996, in a_generate_reply
final, reply = await reply_func(
^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1363, in a_generate_oai_reply
return await asyncio.get_event_loop().run_in_executor(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1361, in _generate_oai_reply
return self.generate_oai_reply(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1300, in generate_oai_reply
extracted_response = self._generate_oai_reply_from_client(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1319, in _generate_oai_reply_from_client
response = llm_client.create(
^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/oai/client.py", line 638, in create
response = client.create(params)
^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/autogen/oai/client.py", line 285, in create
response = completions.create(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 581, in create
return self._post(
^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1233, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 922, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 998, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1046, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 998, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1046, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/nathan/Documents/repo/opspilot-query-auto-agent/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _request
raise self._make_status_error_from_response(err.response) from None
I wasn't able to get the printed response returned, but I'd be happy to try to get it if you could point me to where I should put a print line.
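For anyone else debugging this, one stopgap is to wrap the call that kicks off the chat, log the 500, and retry with backoff. This is only a sketch with stdlib code; the usage names (`overseer`, `manager`, `plan`) come from the stack trace above, and the retry counts and delays are arbitrary:

```python
import time


def retry_on_server_error(fn, exceptions, retries=3, base_delay=1.0):
    """Call fn(), retrying on the given exception type(s) with
    exponential backoff. Re-raises after the last attempt."""
    for attempt in range(retries):
        try:
            return fn()
        except exceptions as err:
            # Log the failure so the 500 body is visible before retrying.
            print(f"attempt {attempt + 1} failed: {err}")
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)


# Usage sketch (illustrative only, not AutoGen's own API):
#   retry_on_server_error(
#       lambda: asyncio.run(overseer.a_initiate_chat(manager, message=plan)),
#       exceptions=openai.InternalServerError,
#   )
```

Since the 'model_error' 500 is transient on OpenAI's side, a couple of retries often gets past it without changing the prompt.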