Comments (8)
from autogen.
- For this screenshot, which agent is it?
- Can you go to Azure OpenAI Studio (the page in your first screenshot), select the gpt-35 model, then click "Open in Playground"? On the Playground page, in the middle panel, click "View code"; you should see the model settings look like this:
openai.api_type = "azure"
openai.api_base = "https://xxxxxxx.openai.azure.com/"
openai.api_version = "xxxxxx"
openai.api_key = ......
Can you confirm they are the same as what you put in the second screenshot?
- Still in "View code", on line 14 there is "engine", which should match the name you put in the second screenshot. Can you also confirm that?
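For reference, a minimal sketch of how those "View code" values typically map onto an AutoGen config_list entry (every value below is a placeholder; older AutoGen versions use "api_base" where newer ones use "base_url"):

```python
# Hedged sketch, not official docs: mapping the Azure Playground
# "View code" values onto an AutoGen config_list entry.
# All values below are placeholders -- substitute your own.
config_list = [
    {
        "model": "gpt-35",                                # the Azure *deployment* name ("engine")
        "api_key": "<your-azure-openai-key>",             # openai.api_key
        "base_url": "https://xxxxxxx.openai.azure.com/",  # openai.api_base
        "api_type": "azure",                              # openai.api_type
        "api_version": "2023-07-01-preview",              # openai.api_version from "View code"
    }
]
print(sorted(config_list[0]))
```

Each field should match the Playground output field-for-field; a mismatch in any one of them produces auth or connection errors.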
from autogen.
Hi @cheng-tan
Really strange issue: if I use the 2023-07-01-preview API version it works, but when I use the 2023-05-15 API version it does not.
2023-07-01-preview is what the Playground code shows.
But if I use 2023-05-15 in another project, it works...
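One way to see what an api-version change actually does: the SDK simply appends api-version as a query parameter to the deployment route, so you can compare the URLs that the two versions resolve to. A small stdlib sketch (endpoint and deployment name are placeholders):

```python
from urllib.parse import urlencode

def chat_completions_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Build the Azure OpenAI REST URL a chat-completions call resolves to."""
    query = urlencode({"api-version": api_version})
    return f"{endpoint}/openai/deployments/{deployment}/chat/completions?{query}"

# Compare the route each candidate api-version produces:
for version in ("2023-05-15", "2023-07-01-preview"):
    print(chat_completions_url("https://xxxxxxx.openai.azure.com", "gpt-35", version))
```

If one version succeeds and the other fails against the same deployment, the resource likely does not accept the failing api-version for that operation; using the version shown in the Playground's "View code" panel is the safe default.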
from autogen.
Do you also face this? I have tried many times, but I still don't know how to solve it.
Thank you very much.
My other error is:
Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_exceptions.py", line 10, in map_exceptions
    yield
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_backends\sync.py", line 206, in connect_tcp
    sock = socket.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\socket.py", line 851, in create_connection
    raise exceptions[0]
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\socket.py", line 836, in create_connection
    sock.connect(sa)
TimeoutError: [WinError 10060]

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_transports\default.py", line 67, in map_httpcore_exceptions
    yield
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_transports\default.py", line 231, in handle_request
    resp = self._pool.handle_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_sync\connection_pool.py", line 268, in handle_request
    raise exc
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_sync\connection_pool.py", line 251, in handle_request
    response = connection.handle_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_sync\connection.py", line 99, in handle_request
    raise exc
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_sync\connection.py", line 76, in handle_request
    stream = self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_sync\connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_backends\sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectTimeout: [WinError 10060]

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\openai\_base_client.py", line 885, in _request
    response = self._client.send(
               ^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_client.py", line 915, in send
    response = self._send_handling_auth(
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_client.py", line 943, in _send_handling_auth
    response = self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_client.py", line 980, in _send_handling_redirects
    response = self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_client.py", line 1016, in _send_single_request
    response = transport.handle_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_transports\default.py", line 230, in handle_request
    with map_httpcore_exceptions():
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "C:\ProgramData\Anaconda3\envs\autogen\Lib\site-packages\httpx\_transports\default.py", line 84, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectTimeout: [WinError 10060]
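For what it's worth, [WinError 10060] in this traceback is a raw TCP connect timeout raised below the OpenAI layer, which usually points at a firewall, proxy, or network problem rather than a bad model config. A quick stdlib check (the host is a placeholder for your real Azure resource host):

```python
import socket

def can_connect(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failures, timeouts, connection refusals
        return False

# Replace the placeholder with your real Azure OpenAI resource host:
print(can_connect("xxxxxxx.openai.azure.com"))
```

If this prints False, the problem is reachability (corporate proxy, VPN, firewall), and no api_version or key change will fix it.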
from autogen.
Also facing the same issue here. When I look at the log, it seems the models are not being updated according to the new information I set in the UI?
INFO: 127.0.0.1:51138 - "POST /api/messages HTTP/1.1" 200 OK
INFO: 127.0.0.1:51146 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51146 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51209 - "POST /api/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51225 - "POST /api/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51257 - "POST /api/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51257 - "GET /build HTTP/1.1" 307 Temporary Redirect
INFO: 127.0.0.1:51257 - "GET /build/ HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51257 - "GET /app-e668c2f8516018ac0a1c.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51262 - "GET /framework-0e118549f33203964972.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /webpack-runtime-5d7173c472e7866cdb8d.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /page-data/app-data.json HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51262 - "GET /page-data/build/page-data.json HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /favicon-32x32.png?v=46131b614287332146ea078703a67d38 HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51265 - "GET /manifest.webmanifest HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /318ce576f236b79fd96f75904c13f6e55c3eee57-bf26f3994740657e0089.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51262 - "GET /component---src-pages-build-tsx-327ca1b88cf4c2d6fc61.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51262 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51262 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51262 - "GET /page-data/gallery/page-data.json HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /page-data/index/page-data.json HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /page-data/index/page-data.json HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /page-data/gallery/page-data.json HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51262 - "GET /component---src-pages-index-tsx-2342fef3effa18a4d864.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51263 - "GET /component---src-pages-gallery-index-tsx-25eaa0ea2f26f1c4361f.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51272 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51272 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51283 - "POST /api/agents HTTP/1.1" 200 OK
INFO: 127.0.0.1:51283 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51285 - "POST /api/agents HTTP/1.1" 200 OK
INFO: 127.0.0.1:51285 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51285 - "POST /api/agents HTTP/1.1" 200 OK
INFO: 127.0.0.1:51285 - "GET /6c05b3ab656a8bfc225a64b3d73a36a92f83c05e-6a87df80060a5da491fc.js HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51285 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51288 - "GET /api/[email protected] HTTP/1.1" 200 OK
INFO: 127.0.0.1:51285 - "GET /images/svgs/welcome.svg HTTP/1.1" 304 Not Modified
INFO: 127.0.0.1:51285 - "GET /api/[email protected]&session_id=763b6ac8-73c8-45b4-8c91-7f5980e5a683 HTTP/1.1" 200 OK
INFO: 127.0.0.1:51285 - "GET /api/[email protected]&session_id=763b6ac8-73c8-45b4-8c91-7f5980e5a683 HTTP/1.1" 200 OK
agent_spec.config LLMConfig(config_list=[{'model': 'gpt-4-1106-preview'}, {'model': 'gpt-3.5-turbo-16k'}, {'model': 'TheBloke/zephyr-7B-alpha-AWQ', 'base_url': 'http://localhost:8000/v1'}], temperature=0.1, cache_seed=None, timeout=600)
The LLMConfig still shows the default list of models, but I updated it in the UI:
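A direct way to check whether the UI update actually reached the agent spec is to print the model names out of the config_list before the agent is constructed. A small sketch using the defaults seen in the log above (replace with whatever your agent is actually given):

```python
# Sketch: inspect which models a config_list actually carries, so it can
# be compared against what the UI claims. These entries copy the
# defaults from the agent_spec.config line in the log above.
config_list = [
    {"model": "gpt-4-1106-preview"},
    {"model": "gpt-3.5-turbo-16k"},
    {"model": "TheBloke/zephyr-7B-alpha-AWQ", "base_url": "http://localhost:8000/v1"},
]

models = [entry["model"] for entry in config_list]
print(models)  # if this still lists the defaults, the UI update never reached the agent spec
```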
from autogen.
Remember that the deployment name should be the same as the model name. Then it should work.
from autogen.
@monuminu Do you mean the name in green in the screenshot above should be the same as the model name in the Azure OpenAI deployment, or something else? These are the deployments I have:
Why does the log show these three models, gpt-4-1106-preview, gpt-3.5-turbo-16k, and TheBloke/zephyr-7B-alpha-AWQ, while I have different models set up (as shown in the previous screenshot)? Shouldn't it be gpt-4, gpt-3.5-turbo-16k, and dall-e-3?
INFO: 127.0.0.1:51285 - "GET /api/[email protected]&session_id=763b6ac8-73c8-45b4-8c91-7f5980e5a683 HTTP/1.1" 200 OK
agent_spec.config LLMConfig(config_list=[{'model': 'gpt-4-1106-preview'}, {'model': 'gpt-3.5-turbo-16k'}, {'model': 'TheBloke/zephyr-7B-alpha-AWQ', 'base_url': 'http://localhost:8000/v1'}], temperature=0.1, cache_seed=None, timeout=600)
from autogen.
@lz-chen Can you try deleting the default models and adding your new model? Are you able to follow steps 2 and 3 in the comment above to verify your configs are correct?
from autogen.