ComfyUI-KepOpenAI Issues
openai_api_key
Hello,
Great idea for the plugin! To make our lives easier, would it be possible to define the OpenAI API key in a file somewhere instead of an environment variable?
Thank you
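Supporting a key file alongside the environment variable is straightforward. A minimal sketch, assuming a plain-text key file next to the node (the default filename `openai_api_key.txt` and the helper name are illustrative, not part of the actual plugin):

```python
import os
from pathlib import Path

def load_api_key(key_file: str = "openai_api_key.txt") -> str:
    """Return the OpenAI API key from a local file, falling back to
    the OPENAI_API_KEY environment variable."""
    path = Path(key_file)
    if path.is_file():
        # Strip trailing newline/whitespace that editors often add.
        return path.read_text(encoding="utf-8").strip()
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key:
        raise RuntimeError(
            f"No API key found: create {key_file} or set OPENAI_API_KEY"
        )
    return key
```

The returned string can then be passed as `api_key` when the node constructs its OpenAI client.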
Doesn't seem to work anymore; should prefer GPT-4o
Error occurred when executing KepOpenAI_ImageWithPrompt:

Error code: 404 - {'error': {'message': 'The model `gpt-4-vision-preview` has been deprecated, learn more here: https://platform.openai.com/docs/deprecations', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

  File "/src/ComfyUI/execution.py", line 152, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
  File "/src/ComfyUI/execution.py", line 82, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
  File "/src/ComfyUI/execution.py", line 75, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
  File "/src/ComfyUI/custom_nodes/ComfyUI-KepOpenAI/nodes.py", line 40, in generate_completion
    response = self.open_ai_client.chat.completions.create(
  File "/root/.pyenv/versions/3.10.14/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
  File "/root/.pyenv/versions/3.10.14/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 646, in create
    return self._post(
  File "/root/.pyenv/versions/3.10.14/lib/python3.10/site-packages/openai/_base_client.py", line 1266, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/root/.pyenv/versions/3.10.14/lib/python3.10/site-packages/openai/_base_client.py", line 942, in request
    return self._request(
  File "/root/.pyenv/versions/3.10.14/lib/python3.10/site-packages/openai/_base_client.py", line 1046, in _request
    raise self._make_status_error_from_response(err.response) from None
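The 404 means the hard-coded `gpt-4-vision-preview` model has been retired. One way to fix this without touching the rest of the request is to translate retired model names before the `chat.completions.create` call in nodes.py. A hedged sketch (the fallback table is an assumption about a suitable replacement, not an official mapping; `gpt-4o` also accepts image inputs so the message body should not otherwise need to change):

```python
# Map retired OpenAI vision model names to a currently available model.
# Assumption: gpt-4o is an acceptable substitute for the deprecated
# gpt-4-vision-preview in this node's image+prompt use case.
DEPRECATED_MODELS = {
    "gpt-4-vision-preview": "gpt-4o",
}

def resolve_model(name: str) -> str:
    """Return a still-available model name for a possibly retired one."""
    return DEPRECATED_MODELS.get(name, name)

# In nodes.py generate_completion, the call would become something like:
# response = self.open_ai_client.chat.completions.create(
#     model=resolve_model("gpt-4-vision-preview"),
#     ...
# )
```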
Feature request - Add LM Studio as an option
Hi,
First, thank you for this node - I have been using it for many months with great success.
On a few occasions OpenAI has been too restrictive and the costs add up, so I've been wanting to use open-source LLaVA models instead.
Would it be possible to add the LM Studio server as an option alongside the OpenAI API? Or even show LM Studio as an option with a picklist of the available LLaVA / vision models in your node?
I've read that the LM Studio server works similarly to OpenAI in its request/response format, but I don't know whether that makes it easy to add to your nodes: https://lmstudio.ai/docs/local-server
Thank you,
Eric
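Because LM Studio's local server exposes an OpenAI-compatible endpoint (`http://localhost:1234/v1` by default, per the linked docs), the node would mainly need a configurable base URL. A sketch of how the client configuration could branch, assuming a simple backend switch (the backend names and placeholder key are illustrative, not existing node options):

```python
def client_kwargs(backend: str = "openai", api_key: str = "") -> dict:
    """Build keyword arguments for constructing an openai.OpenAI client.

    For the LM Studio backend the server ignores the API key, but the
    openai client still requires a non-empty string, so a placeholder
    value is supplied.
    """
    if backend == "lm_studio":
        # Default LM Studio local server address from its docs.
        return {"base_url": "http://localhost:1234/v1", "api_key": "lm-studio"}
    return {"api_key": api_key}

# Usage (assuming the openai package is installed):
# from openai import OpenAI
# client = OpenAI(**client_kwargs("lm_studio"))
```

The same `chat.completions.create` call would then work against whichever backend was selected, with the model name coming from the locally loaded LLaVA model in the LM Studio case.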