
ComfyUI - Native DynamiCrafter

DynamiCrafter that works natively with ComfyUI's nodes, optimizations, ControlNet, and more.

[Example output clips: DynamiCrafter_00298, DynamiCrafter_00327]

Note

While this is still considered WIP (or beta), everything should be fully functional and adaptable to various workflows.

Getting Started

Go to your custom_nodes directory in ComfyUI and install by running:

git clone https://github.com/ExponentialML/ComfyUI_Native_DynamiCrafter.git

Important

This is a rapid-release project. If there are any issues installing from main, the last stable branch is here. If everything is working fine you can ignore this, but you will miss out on the latest features.

Installation

The pruned UNet checkpoints have been uploaded to HuggingFace. Each variant is fully functional.

https://huggingface.co/ExponentialML/DynamiCrafterUNet

Instructions

You will also need a VAE, the CLIP model used with Stable Diffusion 2.1, and the Open CLIP Vision model. All of the necessary model downloads are at that link.

If you already have the base SD models, you do not need to download them (just use the CheckpointLoaderSimple node without the model part).

Place the DynamiCrafter models inside ComfyUI_Path/models/dynamicrafter_models

If you are downloading the CLIP and VAE models separately, place them under their respective paths in the ComfyUI_Path/models/ directory.
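The expected layout can be sanity-checked with a short script. This is an illustrative sketch: the helper names are hypothetical, and the vae / clip / clip_vision subfolder names assume ComfyUI's standard models directory layout.

```python
import os

def expected_model_dirs(comfy_root):
    """Map model types to the directories this node pack expects.

    Paths are illustrative; adjust them to match your install.
    """
    return {
        "dynamicrafter": os.path.join(comfy_root, "models", "dynamicrafter_models"),
        "vae": os.path.join(comfy_root, "models", "vae"),
        "clip": os.path.join(comfy_root, "models", "clip"),
        "clip_vision": os.path.join(comfy_root, "models", "clip_vision"),
    }

def missing_model_dirs(comfy_root):
    """Return the expected directories that do not exist yet."""
    return [path for path in expected_model_dirs(comfy_root).values()
            if not os.path.isdir(path)]

if __name__ == "__main__":
    for name, path in expected_model_dirs("ComfyUI").items():
        print(f"{name}: {path}")
```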

Usage

  • model: The loaded DynamiCrafter model.

  • clip_vision: The CLIP Vision Checkpoint.

  • vae: A Stable Diffusion VAE. If it works with SD 2.1 or earlier, it will work here.

  • image_proj_model: The Image Projection Model that is in the DynamiCrafter model file.

  • images: The input images necessary for inference. If you are doing interpolation, you can simply batch two images together, check the toggle (see below), and everything will be handled automatically.

  • use_interpolation: Use the interpolation mode with the interpolation model variant. You can interpolate any two frames (images), or predict the rest using one input.

  • fps: Controls the speed of the video. If you're using a 256-based model, the highest plausible value is 4.

  • frames: The number of frames to generate. If you're doing interpolation, the maximum is 16. This is strictly enforced, as higher values don't work properly (blurry results).

  • model (output): The patched model output, to be connected to a Sampler.

  • empty_latent: An empty latent with the same size and frame count as the processed images.

  • latent_img: If you're doing Img2Img-based workflows, use this latent.
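For interpolation mode, the two endpoint frames only need to be batched along the first dimension. A minimal sketch, with NumPy standing in for ComfyUI's [batch, height, width, channel] image tensors (ComfyUI itself uses torch tensors, but the shapes are the same; the helper name is illustrative):

```python
import numpy as np

def batch_for_interpolation(first_frame, last_frame):
    """Stack two H x W x C images into a [2, H, W, C] batch,
    matching the layout the images input expects."""
    if first_frame.shape != last_frame.shape:
        raise ValueError("both frames must share the same height, width, and channels")
    return np.stack([first_frame, last_frame], axis=0)

start = np.zeros((256, 256, 3), dtype=np.float32)  # placeholder first frame
end = np.ones((256, 256, 3), dtype=np.float32)     # placeholder last frame
batch = batch_for_interpolation(start, end)
print(batch.shape)  # (2, 256, 256, 3)
```

With use_interpolation enabled, a two-image batch like this is all the node needs; the in-between frames are predicted automatically.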

ControlNet Support

You can now guide video generation in various ways by applying ControlNet to the spatial (image) portion of DynamiCrafter. The ControlNets are based on SD 2.1, so you must download them at the link below (thanks @thibaud!).

ControlNet 2.1: https://huggingface.co/thibaud/controlnet-sd21

After you download them, you can use them as you would with any other workflow.

Tips

Tip

You don't have to use the latent outputs. As long as you use the same frame length (as your batch size) and the same height and width as your image inputs, you can use your own latents. This means you can experiment with inpainting and so on.

Tip

You can choose which frame is used as the init frame by using VAE Encode Inpaint or Set Latent Noise Mask. Set the mask at the beginning of the batch to full black, while the rest are full white. This also means you can do interpolation with regular models. As these workflows are more advanced, examples will arrive at a future date.
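That masking scheme can be sketched as follows, with NumPy standing in for the mask tensor. The helper name is hypothetical, and the convention assumed here (0.0 = black = keep the frame, 1.0 = white = renoise/generate) matches the tip above; verify it against the node you feed the mask into.

```python
import numpy as np

def init_frame_mask(frames, height, width, init_index=0):
    """Per-frame noise mask: 0.0 (black) on the init frame so it is kept,
    1.0 (white) on every other frame so those frames are generated."""
    mask = np.ones((frames, height, width), dtype=np.float32)
    mask[init_index] = 0.0
    return mask

mask = init_frame_mask(frames=16, height=64, width=64)
```

Changing init_index moves the init frame elsewhere in the batch, which is how the same trick enables interpolation with regular models.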

TODO

  • Add various workflows.
  • Add advanced workflows.
  • Add support for Spatial Transformer options.
  • Add ControlNet support.
  • Ensure attention optimizations are working properly.
  • Add autoregressive nodes (this may be a separate repository)
  • Add examples. (For more, check here).

Credits

Thanks to @Doubiiu for open-sourcing DynamiCrafter! Please support their work, and please follow any license terms they uphold.

comfyui_native_dynamicrafter's People

Contributors

exponentialml


comfyui_native_dynamicrafter's Issues

encountered an error when running this workflow using ComfyUI on Colab

Hi, I am using ComfyUI on Colab, and I encountered a problem when running this workflow; it seems that the DynamiCrafter model I downloaded was not recognized. I have saved the DynamiCrafter model at /content/ComfyUI/models/checkpoints/dynamicrafter_models.

Failed to validate prompt for output 49:

  • RescaleCFG 52:
    • Return type mismatch between linked nodes: model, DynamiCrafter != MODEL
  • VAELoader 53:
    • Value not in list: vae_name: '1.5\vae-ft-mse-840000-ema-pruned.safetensors' not in ['Anime.vae.pt', 'AnythingmodelVAEv4.0.pt', 'VAE84.vae.pt', 'orangemix.vae.pt', 'sdxl_vae.safetensors']
  • DynamiCrafterProcessor 40:
    • Exception when validating inner node: tuple index out of range
      Output will be ignored

Error occurred when executing KSampler: CrossAttention.efficient_forward() got an unexpected keyword argument 'value'

Error occurred when executing KSampler:

CrossAttention.efficient_forward() got an unexpected keyword argument 'value'

File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\nodes.py", line 1344, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\nodes.py", line 1314, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs) # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 313, in motion_sample
return orig_comfy_sample(model, noise, *args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\control_reference.py", line 47, in refcn_sample
return orig_comfy_sample(model, *args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 111, in uncond_multiplier_check_cn_sample
return orig_comfy_sample(model, *args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\sample.py", line 37, in sample
samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 761, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 663, in sample
return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 650, in sample
output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 629, in inner_sample
samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 534, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\utils_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\k_diffusion\sampling.py", line 154, in sample_euler_ancestral
denoised = model(x, sigmas[i] * s_in, **extra_args)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 272, in call
out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 616, in call
return self.predict_noise(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 619, in predict_noise
return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 258, in sampling_function
out = calc_cond_batch(model, conds, x, timestep, model_options)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\samplers.py", line 216, in calc_cond_batch
output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Native_DynamiCrafter\nodes.py", line 128, in _forward
x_out = apply_model(
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI-Advanced-ControlNet\adv_control\utils.py", line 63, in apply_model_uncond_cleanup_wrapper
return orig_apply_model(self, *args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\comfy\model_base.py", line 97, in apply_model
model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 751, in forward
h = forward_timestep_embed(
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 38, in forward_timestep_embed
x = layer(x, context)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 554, in forward
x = block(x, context=context, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 364, in forward
return checkpoint(self._forward, input_tuple, self.parameters(), self.checkpoint)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\common.py", line 94, in checkpoint
return func(*inputs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 425, in _forward
n = self.attn1(n, context=context_attn1, value=value_attn1)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\AI\ComfyUI-aki-v1.3\ComfyUI-aki-v1.3\python\lib\site-packages\torch\nn\modules\module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)

Sizes of tensors must match except in dimension 1. Expected size 1280 but got size 1024 for tensor number 1 in the list.

Used your workflow and got an error
Sizes of tensors must match except in dimension 1. Expected size 1280 but got size 1024 for tensor number 1 in the list.

Error occurred when executing KSampler:

Sizes of tensors must match except in dimension 1. Expected size 1280 but got size 1024 for tensor number 1 in the list.

File "D:\ai\ComfyUI-aki-v1.1\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "D:\ai\ComfyUI-aki-v1.1\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "D:\ai\ComfyUI-aki-v1.1\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "D:\ai\ComfyUI-aki-v1.1\nodes.py", line 1369, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "D:\ai\ComfyUI-aki-v1.1\nodes.py", line 1339, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 22, in informative_sample
raise e
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs) # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 267, in motion_sample
return orig_comfy_sample(model, noise, *args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\comfy\sample.py", line 100, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1380, in KSampler_sample
return _KSampler_sample(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 705, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1399, in sample
return _sample(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 610, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 548, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
File "D:\ai\ComfyUI-aki-v1.1\python\lib\site-packages\torch\utils_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\comfy\k_diffusion\sampling.py", line 154, in sample_euler_ancestral
denoised = model(x, sigmas[i] * s_in, **extra_args)
File "D:\ai\ComfyUI-aki-v1.1\python\lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\python\lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 286, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
File "D:\ai\ComfyUI-aki-v1.1\python\lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\python\lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 273, in forward
return self.apply_model(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1012, in apply_model
out = super().apply_model(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 270, in apply_model
out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 250, in sampling_function
cond_pred, uncond_pred = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options)
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI-TiledDiffusion.patches.py", line 4, in calc_cond_uncond_batch
return calc_cond_uncond_batch_original_tiled_diffusion_39266ca6(model, cond, uncond, x_in, timestep, model_options)
File "D:\ai\ComfyUI-aki-v1.1\comfy\samplers.py", line 222, in calc_cond_uncond_batch
output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI_Native_DynamiCrafter\nodes.py", line 128, in _forward
x_out = apply_model(
File "D:\ai\ComfyUI-aki-v1.1\comfy\model_base.py", line 96, in apply_model
model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
File "D:\ai\ComfyUI-aki-v1.1\python\lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\python\lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 723, in forward
context = context_processor(context, num_video_frames, img_emb=img_emb)
File "D:\ai\ComfyUI-aki-v1.1\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 318, in context_processor
context = torch.cat([context, img_emb.to(context.device, context.dtype)], dim=1

Conflict with DynamiCrafter

449 | chaojie | ComfyUI-DynamiCrafter (https://github.com/chaojie/ComfyUI-DynamiCrafter) | "Better Dynamic, Higher Resolution, and Stronger Coherence!"
Conflicted Nodes: DynamiCrafterLoader [ComfyUI_Native_DynamiCrafter]

613 | ExponentialML | ComfyUI - Native DynamiCrafter | "DynamiCrafter that works natively with ComfyUI's nodes, optimizations, ControlNet, and more."
Conflicted Nodes: DynamiCrafterLoader [ComfyUI-DynamiCrafter]

Please rename or fix this, since I need both the ControlNet support in your version and the motion scale function in the original version.
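The conflict arises because ComfyUI registers nodes by the keys of each pack's NODE_CLASS_MAPPINGS dict, so two packs exporting the same DynamiCrafterLoader key collide and only one survives. A hypothetical sketch (the class names and pack dicts below are illustrative, not the actual code of either repository):

```python
# Hypothetical sketch: ComfyUI builds one global node registry from every
# pack's NODE_CLASS_MAPPINGS, so duplicate keys silently overwrite each other.
class NativeDynamiCrafterLoader:
    pass

class OtherDynamiCrafterLoader:
    pass

native_pack = {"DynamiCrafterLoader": NativeDynamiCrafterLoader}
other_pack = {"DynamiCrafterLoader": OtherDynamiCrafterLoader}

registry = {}
registry.update(native_pack)
registry.update(other_pack)  # same key: the later import wins, one loader is lost

# Renaming one pack's key (as this issue requests) lets both packs coexist.
renamed_pack = {"NativeDynamiCrafterLoader": NativeDynamiCrafterLoader}
fixed_registry = {**renamed_pack, **other_pack}
```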

Error occurred when executing KSampler: CrossAttention.efficient_forward() got an unexpected keyword argument 'value'

Error occurred when executing KSampler:

CrossAttention.efficient_forward() got an unexpected keyword argument 'value'

File "D:\tools\ComfyUI-aki-v1.2\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\nodes.py", line 1369, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\nodes.py", line 1339, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs) # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 267, in motion_sample
return orig_comfy_sample(model, noise, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\sample.py", line 100, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1380, in KSampler_sample
return _KSampler_sample(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 705, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1399, in sample
return _sample(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 610, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 548, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\utils_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\k_diffusion\sampling.py", line 137, in sample_euler
denoised = model(x, sigma_hat * s_in, **extra_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 286, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 273, in forward
return self.apply_model(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1012, in apply_model
out = super().apply_model(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 270, in apply_model
out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 250, in sampling_function
cond_pred, uncond_pred = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI-TiledDiffusion.patches.py", line 4, in calc_cond_uncond_batch
return calc_cond_uncond_batch_original_tiled_diffusion_aaa7df7f(model, cond, uncond, x_in, timestep, model_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\samplers.py", line 222, in calc_cond_uncond_batch
output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep_, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_Native_DynamiCrafter\nodes.py", line 128, in _forward
x_out = apply_model(
^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\comfy\model_base.py", line 96, in apply_model
model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 751, in forward
h = forward_timestep_embed(
^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 38, in forward_timestep_embed
x = layer(x, context)
^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 554, in forward
x = block(x, context=context, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 364, in forward
return checkpoint(self._forward, input_tuple, self.parameters(), self.checkpoint)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\common.py", line 94, in checkpoint
return func(*inputs)
^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 425, in _forward
n = self.attn1(n, context=context_attn1, value=value_attn1)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\tools\ComfyUI-aki-v1.2.ext\Lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

CrossAttention.efficient_forward() got an unexpected keyword argument 'value'

Error occurred when executing KSampler //Inspire:

CrossAttention.efficient_forward() got an unexpected keyword argument 'value'

File "E:\Blender_ComfyUI\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "E:\Blender_ComfyUI\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "E:\Blender_ComfyUI\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack\inspire\a1111_compat.py", line 77, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise, noise_mode, incremental_seed_mode=batch_seed_mode, variation_seed=variation_seed, variation_strength=variation_strength)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack\inspire\a1111_compat.py", line 42, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack\modules\impact\sample_error_enhancer.py", line 9, in informative_sample
return original_sample(*args, **kwargs) # This code helps interpret error messages that occur within exceptions but does not have any impact on other operations.
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\comfyui-diffusion-cg\recenter.py", line 29, in sample_center
return SAMPLE(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI-AnimateDiff-Evolved\animatediff\sampling.py", line 267, in motion_sample
return orig_comfy_sample(model, noise, *args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\comfy\sample.py", line 100, in sample
samples = sampler.sample(noise, positive_copy, negative_copy, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1380, in KSampler_sample
return _KSampler_sample(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 705, in sample
return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1399, in sample
return _sample(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 610, in sample
samples = sampler.sample(model_wrap, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 548, in sample
samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
File "E:\Blender_ComfyUI\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\comfy\k_diffusion\sampling.py", line 137, in sample_euler
denoised = model(x, sigma_hat * s_in, **extra_args)
File "E:\Blender_ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 286, in forward
out = self.inner_model(x, sigma, cond=cond, uncond=uncond, cond_scale=cond_scale, model_options=model_options, seed=seed)
File "E:\Blender_ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 273, in forward
return self.apply_model(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_smZNodes\smZNodes.py", line 1012, in apply_model
out = super().apply_model(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 270, in apply_model
out = sampling_function(self.inner_model, x, timestep, uncond, cond, cond_scale, model_options=model_options, seed=seed)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 250, in sampling_function
cond_pred, uncond_pred = calc_cond_uncond_batch(model, cond, uncond, x, timestep, model_options)
File "E:\Blender_ComfyUI\ComfyUI\comfy\samplers.py", line 222, in calc_cond_uncond_batch
output = model_options['model_function_wrapper'](model.apply_model, {"input": input_x, "timestep": timestep, "c": c, "cond_or_uncond": cond_or_uncond}).chunk(batch_chunks)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\nodes.py", line 128, in _forward
x_out = apply_model(
File "E:\Blender_ComfyUI\ComfyUI\comfy\model_base.py", line 96, in apply_model
model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
File "E:\Blender_ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 751, in forward
h = forward_timestep_embed(
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 38, in forward_timestep_embed
x = layer(x, context)
File "E:\Blender_ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 554, in forward
x = block(x, context=context, **kwargs)
File "E:\Blender_ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 364, in forward
return checkpoint(self._forward, input_tuple, self.parameters(), self.checkpoint)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\common.py", line 94, in checkpoint
return func(*inputs)
File "E:\Blender_ComfyUI\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 425, in _forward
n = self.attn1(n, context=context_attn1, value=value_attn1)
File "E:\Blender_ComfyUI\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
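Both tracebacks above share a root cause: lvdm's `BasicTransformerBlock._forward` calls `self.attn1(n, context=context_attn1, value=value_attn1)`, but another extension (smZNodes in this stack) has monkey-patched the attention layer's forward with an `efficient_forward` whose signature has no `value` parameter, so Python raises `TypeError`. One defensive pattern is to make the patched forward tolerate extra keyword arguments. The sketch below is hypothetical (the tensors are stand-ins, not the real smZNodes implementation); it only illustrates the kwargs-tolerant signature:

```python
def efficient_forward(x, context=None, **kwargs):
    """Sketch of a kwargs-tolerant attention entry point.

    `value=` is popped from **kwargs so callers that pass it (as lvdm's
    BasicTransformerBlock does) no longer raise TypeError. The return
    value here is a placeholder for the real attention math.
    """
    value = kwargs.pop("value", None)
    if context is None:
        context = x          # self-attention: keys come from x itself
    if value is None:
        value = context      # by default, values follow the keys
    return x, context, value  # stand-in for query/key/value projection
```

With this signature, `efficient_forward(n, context=context_attn1, value=value_attn1)` is accepted instead of crashing; an equivalent workaround on the user side is disabling the conflicting extension so the stock forward is not replaced.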

Error occurred when executing DynamiCrafterLoader:

Error(s) in loading state_dict for Resampler:
Missing key(s) in state_dict: "latents", "proj_in.weight", "proj_in.bias", "proj_out.weight", "proj_out.bias", "norm_out.weight", "norm_out.bias", "layers.0.0.norm1.weight", "layers.0.0.norm1.bias", "layers.0.0.norm2.weight", "layers.0.0.norm2.bias", "layers.0.0.to_q.weight", "layers.0.0.to_kv.weight", "layers.0.0.to_out.weight", "layers.0.1.0.weight", "layers.0.1.0.bias", "layers.0.1.1.weight", "layers.0.1.3.weight", "layers.1.0.norm1.weight", "layers.1.0.norm1.bias", "layers.1.0.norm2.weight", "layers.1.0.norm2.bias", "layers.1.0.to_q.weight", "layers.1.0.to_kv.weight", "layers.1.0.to_out.weight", "layers.1.1.0.weight", "layers.1.1.0.bias", "layers.1.1.1.weight", "layers.1.1.3.weight", "layers.2.0.norm1.weight", "layers.2.0.norm1.bias", "layers.2.0.norm2.weight", "layers.2.0.norm2.bias", "layers.2.0.to_q.weight", "layers.2.0.to_kv.weight", "layers.2.0.to_out.weight", "layers.2.1.0.weight", "layers.2.1.0.bias", "layers.2.1.1.weight", "layers.2.1.3.weight", "layers.3.0.norm1.weight", "layers.3.0.norm1.bias", "layers.3.0.norm2.weight", "layers.3.0.norm2.bias", "layers.3.0.to_q.weight", "layers.3.0.to_kv.weight", "layers.3.0.to_out.weight", "layers.3.1.0.weight", "layers.3.1.0.bias", "layers.3.1.1.weight", "layers.3.1.3.weight".

File "G:\AI\comfy2\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "G:\AI\comfy2\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "G:\AI\comfy2\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "G:\AI\comfy2\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\nodes.py", line 364, in load_dynamicrafter
image_proj_model = get_image_proj_model(image_proj_dict)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "G:\AI\comfy2\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\utils\model_utils.py", line 96, in get_image_proj_model
ImageProjModel.load_state_dict(state_dict)
File "G:\AI\comfy2\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 2153, in load_state_dict
raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
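The missing-key list above means the checkpoint handed to the Resampler contains none of its expected tensors (`latents`, `proj_in.weight`, ...). That usually indicates the loaded file is not a DynamiCrafter UNet carrying the image-projection weights, or that its keys carry an extra prefix. A minimal sketch of the prefix-remapping idea is below; the `image_proj_model.` prefix is an assumption for illustration, so inspect your checkpoint's actual keys before relying on it:

```python
def strip_prefix(state_dict, prefix="image_proj_model."):
    """Sketch: drop a leading prefix from checkpoint keys so they match
    the Resampler's expected names ('latents', 'proj_in.weight', ...).

    The prefix default is hypothetical; print(state_dict.keys()) first
    to see what your file actually contains.
    """
    return {
        (k[len(prefix):] if k.startswith(prefix) else k): v
        for k, v in state_dict.items()
    }
```

If the keys are absent entirely (not just prefixed), no remapping helps; the fix is downloading the pruned DynamiCrafter UNet checkpoints linked in the Installation section rather than a base SD model.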

Error occurred when executing DynamiCrafterLoader:

name 'ff_in' is not defined

File "C:\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\nodes.py", line 357, in load_dynamicrafter
model = model_base.BaseModel(
^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI\comfy\model_base.py", line 62, in __init__
self.diffusion_model = unet_model(**unet_config, device=device, operations=operations)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\networks\openaimodel3d.py", line 484, in __init__
TemporalTransformer(
File "C:\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 617, in __init__
self.transformer_blocks = nn.ModuleList([
^
File "C:\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 618, in <listcomp>
BasicTransformerBlock(
File "C:\ComfyUI\custom_nodes\ComfyUI_Native_DynamiCrafter\lvdm\modules\attention.py", line 310, in __init__
self.ff_in = ff_in or inner_dim is not None
^^^^^
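This `NameError` means attention.py line 310 assigns `self.ff_in = ff_in or inner_dim is not None` while `ff_in` is not among the constructor's parameters in that (likely stale) copy of the file; updating the custom node to the current main branch is the real fix. The sketch below only illustrates why the name must exist in the `__init__` signature; the `False` default is an assumption for illustration:

```python
class BasicTransformerBlock:
    """Minimal sketch of the scoping issue behind the NameError.

    `ff_in` must be a constructor parameter (or otherwise in scope);
    the default of False here is hypothetical, not the node's actual
    signature.
    """
    def __init__(self, dim, ff_in=False, inner_dim=None):
        # Without `ff_in` in the signature, the next line raises
        # NameError: name 'ff_in' is not defined.
        self.ff_in = ff_in or inner_dim is not None
```

Users hitting this on an up-to-date install should re-run `git pull` inside the custom node directory and restart ComfyUI so the patched attention.py is actually loaded.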
