
comfyui-deploy's Introduction

ComfyUI Deploy

An open-source ComfyUI deployment platform: a Vercel for generative workflow infrastructure (serverless hosted GPUs with vertical integration with ComfyUI).

Join Discord to chat more or visit Comfy Deploy to get started!

Check out our latest Next.js starter kit with Comfy Deploy.

How it works

  1. Comfy Deploy Dashboard (https://comfydeploy.com) or self-hosted version
  2. Machines (long-running, on-premise ComfyUI machines and serverless)
  3. Supports RunPod, Modal, and hosted ComfyDeploy machines (powered by Modal)
comfydeploy1.mp4

Setting up a basic SD txt2img API

comfydeploy_base-sd-setup.mp4

Comfy Deploy Plugin Installation

The plugin lets you set up your machine as a target machine and upload workflows directly from it.

  1. cd custom_nodes
  2. git clone https://github.com/BennyKok/comfyui-deploy.git
  3. Go to https://comfydeploy.com or your self-hosted version
    • Machines -> Add Machines
      • Enter a name and the URL of your machine (set up ngrok to get a public URL for your machine)
      • Or create a new ComfyDeploy machine (pick any custom nodes).

Use cases

  1. Deploy a complicated ComfyUI workflow with a versioning system
  2. Manage versions and easily preview the output of different generated versions
  3. Persistent APIs generated for production and staging environments (see the request sketch after this list)
  4. Run the same ComfyUI workflow across different remote machines

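For the persistent API use case, here is a minimal request sketch in TypeScript. The endpoint path, field names, and response shape are assumptions made for illustration (a bearer-token-authenticated run endpoint); check the Comfy Deploy dashboard and API docs for the exact contract.

    // Hedged sketch: trigger a deployment run over HTTP. The endpoint path,
    // body fields, and response shape below are assumptions, not the confirmed API.
    const API_BASE = "https://www.comfydeploy.com/api"; // or your self-hosted URL
    const API_TOKEN = process.env.COMFY_DEPLOY_API_TOKEN!;

    async function runDeployment(deploymentId: string, inputs: Record<string, string>) {
      const res = await fetch(`${API_BASE}/run`, {
        method: "POST",
        headers: {
          Authorization: `Bearer ${API_TOKEN}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ deployment_id: deploymentId, inputs }),
      });
      if (!res.ok) throw new Error(`Run request failed: ${res.status}`);
      return (await res.json()) as { run_id: string };
    }

The same call can target either the production or the staging deployment simply by swapping the deployment ID, which is what makes persistent per-environment APIs useful.
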
Status & Timeline

Work in progress; contributors welcome! Please join the Discord -> https://discord.gg/EEYcQmdYZw

Primary goal -> release v0.1.0 of stable Comfy Deploy

Major areas

  • Security enforcement
  • Error handling
  • QoL workflow improvements
  • API usage examples
  • Load balancing
  • Workflow dependency checking (custom nodes)
  • Remote machines
  • Serverless machines (possible to set up a clean environment via Salad, Modal, etc.)
  • LCM real-time WebSocket image generation

Tech Stack

  • Shadcn UI

  • NextJS

  • Clerk (Auth)

  • Neon / Vercel Postgres (Database)

  • Drizzle (ORM)

  • R2 / S3 (Object Storage)

Development

  1. git clone https://github.com/BennyKok/comfyui-deploy
  2. cd web
  3. bun i
  4. Start Docker
  5. cp .env.example .env.local
  6. Replace JWT_SECRET with the output of openssl rand -hex 32
  7. Get local Clerk dev keys for NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY and CLERK_SECRET_KEY
  8. Keep a terminal open running bun run db-dev
  9. Run the local migration to create the initial schema: bun run migrate-local
  10. Finally, start the Next.js dev server with bun dev

Schema Changes

  1. bun run generate
  2. bun run migrate-local
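
To make the flow concrete, a schema change is an edit to a Drizzle table definition like the hypothetical one below (the table and columns are invented for illustration; the project's real schema lives in the web package). After editing, bun run generate emits the SQL migration and bun run migrate-local applies it.

    // Hypothetical Drizzle table definition, for illustration only.
    import { pgTable, text, timestamp } from "drizzle-orm/pg-core";

    export const examplesTable = pgTable("examples", {
      id: text("id").primaryKey(),
      label: text("label").notNull(),          // e.g. a new column you just added
      createdAt: timestamp("created_at").defaultNow(),
    });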

Special Thanks

  • comfyui
  • oss/acc

Self Hosting with Vercel

Video Tutorial Created by Ross and Syn

Build command

next build && bun run migrate-production

Install command

npx [email protected] install

Env key setup

POSTGRES_URL=

NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=
CLERK_SECRET_KEY=

SPACES_ENDPOINT="http://localhost:4566"
SPACES_ENDPOINT_CDN="http://localhost:4566"
SPACES_BUCKET="comfyui-deploy"
SPACES_KEY="xyz"
SPACES_SECRET="aaa"

# generate using -> openssl rand -hex 32
JWT_SECRET=

# r2 settings
SPACES_REGION="auto"
SPACES_CDN_FORCE_PATH_STYLE="true"
SPACES_CDN_DONT_INCLUDE_BUCKET="true"

# digital ocean settings
SPACES_REGION="nyc3"
SPACES_CDN_FORCE_PATH_STYLE="false"

# s3 settings
SPACES_REGION="nyc3"
SPACES_CDN_DONT_INCLUDE_BUCKET="false"
SPACES_CDN_FORCE_PATH_STYLE="true"
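
The SPACES_* variables correspond to a standard S3-compatible client configuration. Below is an illustrative sketch of how such variables typically feed an S3 client (using the AWS SDK v3); it is not the project's actual storage code, and which variable maps to which client option is an assumption.

    import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

    // Illustrative only: SPACES_* style variables wired into an S3-compatible
    // client (R2, DigitalOcean Spaces, LocalStack, or AWS S3).
    const s3 = new S3Client({
      region: process.env.SPACES_REGION!,              // "auto" for R2, "nyc3" for DO/S3
      endpoint: process.env.SPACES_ENDPOINT,           // e.g. http://localhost:4566 for LocalStack
      forcePathStyle: process.env.SPACES_CDN_FORCE_PATH_STYLE === "true",
      credentials: {
        accessKeyId: process.env.SPACES_KEY!,
        secretAccessKey: process.env.SPACES_SECRET!,
      },
    });

    export async function uploadExample(body: Uint8Array) {
      await s3.send(new PutObjectCommand({
        Bucket: process.env.SPACES_BUCKET!,
        Key: "outputs/example.png",
        Body: body,
      }));
    }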

comfyui-deploy's People

Contributors

bennykok, ecjojo, emmanuelmr18, haohaocreates, hongsiu, jiseopx, karrixlee, mortlsyn, nicholaskao1029, san45600


comfyui-deploy's Issues

Fly builder out of memory

Running locally, the code reports an error: Failed to find Server Action "null". This request might be from an older or newer deployment. Original error: Invariant: Missing 'next-action' header.

Error: Failed to find Server Action "null". This request might be from an older or newer deployment. Original error: Invariant: Missing 'next-action' header.
at t3 (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:39:1693)
at /data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:38:7079
at AsyncLocalStorage.run (node:async_hooks:338:14)
at t4 (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:38:6387)
at rk (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:39:25940)
at /data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:39:28648
at AsyncLocalStorage.run (node:async_hooks:338:14)
at Object.wrap (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:16172)
at /data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:39:28536
at AsyncLocalStorage.run (node:async_hooks:338:14)
at Object.wrap (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:36:15420)
at r_ (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:39:28463)
at rW.render (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/compiled/next-server/app-page.runtime.dev.js:39:32559)
at doRender (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:1394:44)
at cacheEntry.responseCache.get.routeKind (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:1555:34)
at ResponseCache.get (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/response-cache/index.js:49:26)
at DevServer.renderToResponseWithComponentsImpl (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:1463:53)
at /data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:992:121
at NextTracerImpl.trace (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/lib/trace/tracer.js:104:20)
at DevServer.renderToResponseWithComponents (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:992:41)
at DevServer.renderErrorToResponseImpl (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:2114:35)
at async pipe.req.req (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:1988:30)
at async DevServer.pipeImpl (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/base-server.js:911:25)
at async /data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/dev/next-dev-server.js:331:20
at async Span.traceAsyncFn (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/trace/trace.js:151:20)
at async DevServer.handleRequest (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/dev/next-dev-server.js:328:24)
at async invokeRender (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/lib/router-server.js:163:21)
at async handleRequest (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/lib/router-server.js:357:24)
at async requestHandlerImpl (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/lib/router-server.js:366:13)
at async Server.requestListener (/data/lidong/Comfyui2/ComfyUI/custom_nodes/comfyui-deploy/web/node_modules/next/dist/server/lib/start-server.js:144:13)

Cannot find downloaded model in the checkpoint model list

Hello there, I'm new to Comfy Deploy and I'm encountering some problems.

The workflow I'm using:

The Example SDXL Turbo workflow, I cloned it.

The problem I ran into:

I added Juggernaut V10 into storage; the path is checkpoints/Juggernaut-X-RunDiffusion-NSFW.safetensors.

The model URL is: https://huggingface.co/RunDiffusion/Juggernaut-X-v10/resolve/main/Juggernaut-X-RunDiffusion-NSFW.safetensors

It downloaded successfully, but when I try to use it in the workflow, I cannot find it in the checkpoint dropdown list.

My question is: am I doing this the wrong way, or did I miss a step? What should I do to add a model from Hugging Face and use it as a checkpoint?

Thanks!

405 error in test run


Hello, I experienced a 405 error when I tried to do a test run in the workflow tab after configuring the following setup.
I'm curious about the cause:

  1. Installed comfy-deploy plugin on my PC and uploaded a workflow.
  2. Opened comfyui on port 8188 on my PC and used ngrok to connect it to a domain.
  3. Registered the domain as an endpoint in the custom machine, setting the type as comfy-deploy-serverless.
  4. Pressed 'run' in the workflow tab and entered a test prompt.
  5. Got a 405 error

Adding models as dependency fails

Hi!
I'm trying to add models to a machine for a workflow, but no matter how I install the model, it doesn't work.
Edit: here's the model and custom node I'm installing: https://github.com/kohya-ss/ControlNet-LLLite-ComfyUI?tab=readme-ov-file
The custom node gets installed but its model stays undefined instead of: controllllite_v01032064e_sdxl_canny_anime.safetensors

Ways I've tried:
1 - On the comfydeploy.com workflow dependencies page, adding the model via the ComfyUI Manager search dropdown, the Civitai search dropdown, or a pasted Hugging Face URL all lead to the same result: a green notification pops up saying the model is being downloaded, but then nothing else happens. The models list is never updated, and running the workflow results in an error saying the model is missing.

2 - Editing the workflow, going into ComfyUI Manager, and installing the model from there. This seems to work, but when I try to deploy, the new dependency isn't picked up.

Thanks for letting me know how to properly add models, checkpoints, or ControlNets as dependencies.

Issue with uploading images to S3

Hi, I was trying to upload files to an S3 bucket, but I keep getting a 403 Forbidden error.
The bucket policy and permissions are correct and work perfectly via the AWS CLI, but not via comfy-deploy.

I tried with localhost as well; files are uploaded, but the images show up broken in the UI.

Could you explain how to set up the S3 bucket config for this project?
Thanks!

relation "comfyui_deploy.users" does not exist

pg_proxy-1 | 2024/02/19 09:11:59 Got 6 bytes pg->client: WgAAAAVJ
postgres-1 | 2024-02-19 09:11:59.959 UTC [195] ERROR: relation "comfyui_deploy.users" does not exist at character 66
postgres-1 | 2024-02-19 09:11:59.959 UTC [195] STATEMENT: select "id", "username", "name", "created_at", "updated_at" from "comfyui_deploy"."users" "usersTable" where "usersTable"."id" = $1 limit $2
pg_proxy-1 | 2024/02/19 09:11:59 Got 5 bytes client->pg: WAAAAAQ=
pg_proxy-1 | 2024/02/19 09:11:59 failed to read from socket: EOF

Clerk does not seem to log in

[help] After deploying successfully according to the documentation, the page continually refreshes and the screen remains blank.

[clerk middleware debug info]
[clerk debug start: authMiddleware]
URL debug, {
"url": "http://localhost:3000/",
"method": "GET",
"headers": "{"accept":"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,/;q=0.8,application/signed-exchange;v=b3;q=0.7","accept-encoding":"gzip, deflate","accept-language":"zh-CN,zh;q=0.9","connection":"close","cookie":"__client_uat=1711597789; ph_your-api-key_posthog=%7B%22distinct_id%22%3A%22018e832d-080a-7853-bd0e-a4494146a9bc%22%2C%22%24sesid%22%3A%5B1711614327889%2C%22018e8419-5e06-79d9-8321-a5c7fbd7d5f0%22%2C1711613238790%5D%7D; __clerk_db_jwt=eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJkZXYiOiJkdmJfMmVJZ3dqa0c4dWNJQWtUdm9pMlFXeVQ4bkV5IiwiaWQiOiJjbGllbnRfMmVJZ3l1RW83UGlFZlNNeVhQbDZwbXJwTHlKIiwicm90YXRpbmdfdG9rZW4iOiI0MHo2OXA4Z2NkZ2twem1lNnBmYWF5ZW04ZTg1b3JwNGk3c3RpcmwyIn0.holImn5TZZ9Xasg39J-X-tpAtsVTHOJHZo-q-uwptfDmeAhYImen2Xrk9uXXeBucN9cNHyRydRU9gA_aJoTjxrA-4riHNlcFslJHrS3K9NwKLV1LHyP06Xfi2CtCCPx5Mbeo33U_8HOTdI1rH1pFgPfVP-08-J-wr5bYhdZ5DVmWzolUjDTKGNz88YeEQ7FC99eUznazZqhn1u2YZ3gPg_s80heXx5S_bcnzak2MA1SiiM71d1_Y9GlR7dpOZSAM4sGf60NwWrJp3KaVJWIiYTCBJW-IE17KGQdHUnQRYdBHiB8NbWZOSV2rU5GDrZS2h1CryvAdn6EE2myFdwWFZA; __session=eyJhbGciOiJSUzI1NiIsImNhdCI6ImNsX0I3ZDRQRDExMUFBQSIsImtpZCI6Imluc18yZUlVZzBJNDROcHpUaVBlZ3dPUlNrcW9lM0EiLCJ0eXAiOiJKV1QifQ.eyJhenAiOiJodHRwOi8vMTIzLjYwLjc4LjExNDo4MDAzIiwiZXhwIjoxNzExNjE2NTYzLCJpYXQiOjE3MTE2MTY1MDMsImlzcyI6Imh0dHBzOi8vbGVhZGluZy1hbmNob3Z5LTI2LmNsZXJrLmFjY291bnRzLmRldiIsIm5iZiI6MTcxMTYxNjQ5Mywic2lkIjoic2Vzc18yZUloMjZsSkdXT1UyMm9Va3ZCWEpaVWEwNHciLCJzdWIiOiJ1c2VyXzJlSVdKdkg2STlGdml0RkttVXZIT0NDOXM4cyJ9.oAGEd9DbQX1ECVc86HEPJKeXPzrW7PCgArJFl1EjsFooVitk9nWabEYrtlvGCS7vFEC6fI6gPfj9TIg-K3wIMTyruLgEqlHaYDbW2-8cDNweo9MELKG2k1T_vY_qhfKi4EaMRRd8vSD6ta8xiR0tn-_MoJH_18-GcPerIrv5ynvL6A6fTKw0REMQnbPX4CKcGO-h3agyben6vMel8UIdwZsxUf8YW6lj-nv51aJS8uJXSDDSLbLL4h6Hs266ZEY6YGfDzwkuEZGRGrYgokKo-j0dNaYQVVzjTHHb1ZcPVvNfihHydg-dfHnAbXddBSBw7SW1FLFlqVXhd9xHyOriLQ","host":"123.60.78.114","referer":"http://x.x.x.x:8003/\",\"upgrade-insecure-requests\":\"1\",\"user-agent\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36","x-forwarded-for":"47.96.236.37, 47.96.236.37","x-forwarded-host":"123.60.78.114","x-forwarded-port":"3000","x-forwarded-proto":"http","x-real-ip":"47.96.236.37"}",
"nextUrl": "http://localhost:3000/",
"clerkUrl": "http://x.x.x.x/"
}
Options debug, {
"debug": true,
"beforeAuth": false,
"afterAuth": false
}
authenticateRequest state is interstitial, {
"status": "interstitial",
"reason": "cross-origin-referrer",
"message": "",
"frontendApi": "leading-anchovy-26.clerk.accounts.dev",
"publishableKey": "pk_test_bGVhZGluZy1hbmNob3Z5LTI2LmNsZXJrLmFjY291bnRzLmRldiQ",
"isSatellite": false,
"domain": "",
"proxyUrl": "",
"signInUrl": "",
"isSignedIn": false,
"isInterstitial": true,
"isUnknown": false,
"token": null
}
[clerk debug end: authMiddleware] (@clerk/nextjs=4.29.3,next=14.1.0)

[nginx config]
server {
    listen 8003;
    server_name _;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_cache_bypass $http_pragma;
        proxy_no_cache $http_pragma;
    }
}

Modal Integration

As per the README, it is "possible to set up a clean environment via Salad, Modal, etc."

Modal is perfectly suited to deploy comfyui-deploy, pun intended, as it can "scale to infinity, and to zero." In other words, Modal can handle deployment and scaling of ComfyUI by @comfyanonymous.

Check out their image, video, and 3D use cases here.

Modal became generally available on Oct 10, or 74 days ago.

For coding inspiration, check out a1111_webui.py by @modal-labs or running Stable Diffusion WebUI using Modal by 14nigo.net.

Love

I freaking love your project, man!

I just came to say this. I wanted you to know.

It saved me so much time from having to craft Cogs to run flows on Replicate.

Freaking awesome!

Thank you for this!

Running `bun dev` throws Request is not defined error

Hi guys,

great effort! I came across this project while searching for a way to deploy my ComfyUI workflows, and it is just what I was looking for. I tried to get it running locally, but I'm running into issues. I managed to set up the Docker images with docker-compose, but when I run bun dev I get this error:

~/code/comfyui-deploy/web ‹main*› » bun dev
$ next dev
/Users/florian.vallen/code/comfyui-deploy/web/node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/next/dist/server/web/spec-extension/request.js:28
class NextRequest extends Request {
                          ^

ReferenceError: Request is not defined
    at Object.<anonymous> (/Users/florian.vallen/code/comfyui-deploy/web/node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/next/dist/server/web/spec-extension/request.js:28:27)
    at Module._compile (node:internal/modules/cjs/loader:1165:14)
    at Object.Module._extensions..js (node:internal/modules/cjs/loader:1219:10)
    at Module.load (node:internal/modules/cjs/loader:1043:32)
    at Function.Module._load (node:internal/modules/cjs/loader:878:12)
    at Module.require (node:internal/modules/cjs/loader:1067:19)
    at Module.mod.require (/Users/florian.vallen/code/comfyui-deploy/web/node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/next/dist/server/require-hook.js:64:28)
    at require (node:internal/modules/cjs/helpers:103:18)
    at Object.<anonymous> (/Users/florian.vallen/code/comfyui-deploy/web/node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/next/dist/server/web/spec-extension/adapters/next-request.js:37:18)
    at Module._compile (node:internal/modules/cjs/loader:1165:14)

Does anyone know what I can do about this?

Thanks, Florian

Load Dynamic Number of Images As Input

I would like to be able to provide a list of images as input in certain cases. This is achieved easily in ComfyUI by uploading individual pictures and making a batch out of them, but there is no dynamic way to automatically generate the batch from a list of pictures. As a complex example showing some edge cases of input: say I'm doing a pic2pic image generation, providing a batch of images to IPAdapter, an individual picture for the latent pic2pic step, and a picture batch for a Roop-like face-swap step.

The desired inputs are:
IpAdapter Image: 1 to N pictures
Input Image: 1 picture
RoopSwap Image: 1 to N pictures

If there are custom nodes that support adding N pictures dynamically and are already compatible with ComfyDeploy, that would also be a great solution.
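
To make the request concrete, a hypothetical run payload for such list-valued inputs might look like the sketch below (these input IDs and list-typed external inputs do not exist today; this only illustrates the desired shape).

    // Hypothetical input payload for the feature request above.
    const inputs = {
      ipadapter_images: [                                 // 1 to N pictures
        "https://example.com/ref-01.png",
        "https://example.com/ref-02.png",
      ],
      input_image: "https://example.com/source.png",      // exactly 1 picture
      roop_swap_images: [                                  // 1 to N pictures
        "https://example.com/face-01.png",
      ],
    };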

Crashing on restart from ComfyUI-Manager

python_embeded\python.exe: Can't open file '\ComfyUI\custom_nodes\comfyui-deploy\ComfyUI\main.py'. I think it has to be in the main directory of ComfyUI, but it's looking in the nodes folder.

Example Page

A page showing some basic comfyui workflow setup with ComfyDeploy

It could use the workflow share feature, with a list of hardcoded public shares listed somewhere.

No such file or directory: 'comfy-deploy.log'

This issue happens when I start ComfyUI (Windows portable version). Everything seems to work but it just can't find the specific log file.

Could be due to path issues since I am using the portable version of ComfyUI.

Error:

FileNotFoundError: [Errno 2] No such file or directory: 'E:\\AI\\ComfyUI_windows_portable\\ComfyUI\\comfy-deploy.log'
ERROR:aiohttp.server:Error handling request
Traceback (most recent call last):
  File "E:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\aiohttp\web_protocol.py", line 452, in _handle_request
    resp = await request_handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\aiohttp\web_app.py", line 543, in _handle
    resp = await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\aiohttp\web_middlewares.py", line 114, in impl
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AI\ComfyUI_windows_portable\ComfyUI\server.py", line 47, in cache_control
    response: web.Response = await handler(request)
                             ^^^^^^^^^^^^^^^^^^^^^^
  File "E:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-deploy\custom_routes.py", line 160, in websocket_handler
    await send_first_time_log(sid)
  File "E:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-deploy\custom_routes.py", line 466, in send_first_time_log
    with open(log_file_path, 'r') as file:
         ^^^^^^^^^^^^^^^^^^^^^^^^

TypeError: 'NoneType' object is not iterable error while running workflow

I started getting issues with workflow_api since the latest update. I tried many options but no luck. Can you tell what's causing this issue when running the workflow from both the UI and the API?

Traceback (most recent call last):
File "/home/ComfyUI/compfyui/lib/python3.10/site-packages/aiohttp/web_protocol.py", line 452, in _handle_request
resp = await request_handler(request)
File "/home//ComfyUI/compfyui/lib/python3.10/site-packages/aiohttp/web_app.py", line 543, in _handle
resp = await handler(request)
File "/home/ComfyUI/compfyui/lib/python3.10/site-packages/aiohttp/web_middlewares.py", line 114, in impl
return await handler(request)
File "/home/ComfyUI/server.py", line 41, in cache_control
response: web.Response = await handler(request)
File "/home/ComfyUI/server.py", line 53, in cors_middleware
response = await handler(request)
File "/home/ComfyUI/custom_nodes/comfyui-deploy/custom_routes.py", line 193, in comfy_deploy_run
apply_random_seed_to_workflow(workflow_api)
File "/home/ComfyUI/custom_nodes/comfyui-deploy/custom_routes.py", line 100, in apply_random_seed_to_workflow
for key in workflow_api:
TypeError: 'NoneType' object is not iterable

Support ComfyUIDeployExternalNumber and ComfyUIDeployExternalNumberInt

When using number nodes, ComfyUIDeployExternalNumber and ComfyUIDeployExternalNumberInt, the values are not replaced when running a workflow.

createRun.ts:106 only accounts for text nodes.

    // Replace the inputs
    if (inputs && workflow_api) {
      for (const key in inputs) {
        Object.entries(workflow_api).forEach(([_, node]) => {
          if (node.inputs["input_id"] === key) {
            node.inputs["input_id"] = inputs[key];
            // Fix for external text default value
            if (node.class_type == "ComfyUIDeployExternalText") {
              node.inputs["default_value"] = inputs[key];
            }
          }
        });
      }
    }

There are a few issues here.

  1. While inputs[key] has the type string | number, it will always be a string here. If you just pass node.inputs["default_value"] = inputs[key]; then ComfyUI will fail when running the workflow.
  2. If you provide a number value in one of these inputs in a run, then clear it in the next run, the value is set to "" instead of the original undefined, so once you fix the first issue you also have to validate empty strings.

This is a quick and dirty fix I implemented to get around this. I also found that the node.inputs["input_id"] = inputs[key]; line does absolutely nothing.

    // Replace the inputs
    if (inputs && workflow_api) {
      for (const key in inputs) {
        Object.entries(workflow_api).forEach(([_, node]) => {
          if (node.inputs["input_id"] === key) {
            if (inputs[key] === "") return;

            let value: string | number = inputs[key].toString();

            if (node.class_type === "ComfyUIDeployExternalNumber") {
              value = parseFloat(value);
            } else if (node.class_type === "ComfyUIDeployExternalNumberInt") {
              value = parseInt(value);
            }

            node.inputs["default_value"] = value;
          }
        });
      }
    }

Checkpoint loading problem

Error occurred when executing CheckpointLoaderSimple:

Error while deserializing header: HeaderTooLarge

File "/comfyui/execution.py", line 153, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "/comfyui/execution.py", line 83, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/comfyui/execution.py", line 76, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/comfyui/nodes.py", line 476, in load_checkpoint
out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"))
File "/comfyui/comfy/sd.py", line 427, in load_checkpoint_guess_config
sd = comfy.utils.load_torch_file(ckpt_path)
File "/comfyui/comfy/utils.py", line 13, in load_torch_file
sd = safetensors.torch.load_file(ckpt, device=device.type)
File "/usr/local/lib/python3.10/site-packages/safetensors/torch.py", line 308, in load_file
with safe_open(filename, framework="pt", device=device) as f:

We're working with a VPS, need parallelization

Would it be possible to massively parallelize this at the step level somehow? It would be interesting to run e.g. LCM video gen for 6 s across 1000 machines vs 10 minutes across 1 (in theory the same overall usage, modulo how models are cached).

My answer:

Massively parallelizing a task like video generation across multiple machines is theoretically feasible and could significantly reduce total processing time. Here's how it might work:

  1. Task Splitting: The video generation task would be divided into smaller segments, each segment processed by a different machine. For a 6-second video, if each machine handles a fraction of a second, 1000 machines could process simultaneously.

  2. Infrastructure: Cloud services with auto-scaling capabilities would be ideal for spinning up multiple instances. Container orchestration systems like Kubernetes could manage the deployment and scaling of these instances.

  3. Load Balancing: A load balancer would distribute the task segments across the available machines to ensure even workload distribution.

  4. Caching Models: To optimize performance, models could be cached locally on each machine or on a fast, distributed cache that all instances can access quickly.

  5. Coordination: A master process would be necessary to coordinate the task distribution, track the progress of each segment, and assemble the final video.

  6. Networking and Bandwidth: Network bandwidth and latency need consideration, as the transfer of large datasets and models between machines can become a bottleneck.

  7. Cost Considerations: Although the overall usage might be the same, cloud providers often charge for the number of instances, data transfer, and storage, which could affect the cost.

  8. Parallelization Overhead: There's an overhead associated with parallelization (e.g., initialization, synchronization), which might not make the scaling perfectly linear.

In practice, achieving optimal parallelization requires careful planning and consideration of the bottlenecks and overheads associated with distributed computing. It's a common approach in large-scale computing tasks and is particularly relevant in fields like machine learning and data processing, where tasks can be easily parallelized and distributed across multiple computing nodes.
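
As a toy illustration of points 1, 3, and 5 (splitting, distribution, and reassembly), here is a fan-out/fan-in sketch in TypeScript. All endpoints and payload shapes are hypothetical; a real pipeline would also handle retries, model caching, and video encoding.

    // Toy fan-out/fan-in: split a job into segments, dispatch them to a pool
    // of worker URLs, and reassemble the results in order.
    async function renderSegment(workerUrl: string, segmentIndex: number): Promise<Uint8Array> {
      const res = await fetch(`${workerUrl}/render`, {       // hypothetical worker endpoint
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ segment: segmentIndex }),
      });
      if (!res.ok) throw new Error(`Segment ${segmentIndex} failed: ${res.status}`);
      return new Uint8Array(await res.arrayBuffer());
    }

    async function renderInParallel(workers: string[], segmentCount: number): Promise<Uint8Array[]> {
      // Round-robin the segments over the available workers (the load-balancing step).
      const tasks = Array.from({ length: segmentCount }, (_, i) =>
        renderSegment(workers[i % workers.length], i)
      );
      return Promise.all(tasks); // fan-in: Promise.all preserves segment order
    }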
