This set of nodes applies various activation functions directly to latent tensors. Have fun and experiment with different non-linear transformations of the latent space!
The following activation functions are included in this pack:
- ReLU: ReLU(x) = max(0, x)
- Sigmoid: Sigmoid(x) = 1 / (1 + e^(-x))
- Tanh: Tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
- Leaky ReLU: Leaky ReLU(x) = max(0.01x, x)
- ELU: ELU(x) = x if x > 0, else ELU(x) = α(e^x - 1)
- Softplus: Softplus(x) = log(1 + e^x)
- Swish: Swish(x) = x * Sigmoid(x)
- GELU: GELU(x) = 0.5x(1 + tanh(√(2/π)(x + 0.044715x^3)))
- SELU: SELU(x) = λx if x > 0, else SELU(x) = λα(e^x - 1), where λ ≈ 1.0507 and α ≈ 1.67326
- Mish: Mish(x) = x * tanh(Softplus(x))
- PReLU: PReLU(x) = x if x > 0, else PReLU(x) = ax
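For reference, the formulas above can be sketched as plain-Python scalar functions (the nodes themselves operate on PyTorch latent tensors, so this is only an illustration of the math, not the node implementation):

```python
import math

def sigmoid(x):
    # Sigmoid(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # ReLU(x) = max(0, x)
    return max(0.0, x)

def elu(x, alpha=1.0):
    # ELU(x) = x if x > 0, else alpha * (e^x - 1)
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def softplus(x):
    # Softplus(x) = log(1 + e^x)
    return math.log1p(math.exp(x))

def swish(x):
    # Swish(x) = x * Sigmoid(x)
    return x * sigmoid(x)

def mish(x):
    # Mish(x) = x * tanh(Softplus(x))
    return x * math.tanh(softplus(x))
```

In the nodes these functions are applied element-wise to every value of the latent tensor.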
1. Clone or download this repository to your `ComfyUI/custom_nodes` directory.
2. In your ComfyUI workflow, add one of the activation nodes (e.g., ReLU Activation) after a node that outputs a latent tensor (such as `KSampler` or `LoadLatent`).
3. Connect the output of the activation node to a `VAEDecode` node to generate an image from the transformed latent.
4. Adjust the following parameters to control the effect of the activation:
   - Strength: Determines the intensity of the activation function (how much of the transformed latent is mixed with the original).
   - Add to Original: If enabled, the activated latent is added to the original latent; if disabled, the original latent is replaced.
   - Normalize: If enabled, the transformed latent is normalized to zero mean and unit variance.
   - Clamp: If enabled, the values of the transformed latent are clamped to a specified range (`Clamp Min` and `Clamp Max`).
   - Composite: If enabled, the activated latent is composited (blended) with the upscaled original latent.
   - Blend Amount: Controls the blending ratio during compositing (0.0 to 1.0).
   - Additional Parameters: Certain activation functions have additional parameters (e.g., `alpha` for ELU, `beta` and `threshold` for Softplus, `negative_slope` for Leaky ReLU, etc.).
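As a rough mental model of how Strength, Add to Original, and Clamp might interact, here is a hypothetical sketch (the names and mixing logic are assumptions for illustration; the real nodes operate on torch tensors, and their exact blending code may differ):

```python
def apply_activation(latent, fn, strength=1.0, add_to_original=False,
                     clamp=False, clamp_min=-1.0, clamp_max=1.0):
    """Hypothetical sketch of parameter interaction.

    `latent` is a flat list of floats standing in for a latent tensor,
    and `fn` is one of the activation functions applied element-wise.
    """
    activated = [fn(v) for v in latent]
    if add_to_original:
        # add the strength-scaled activation on top of the original latent
        out = [v + strength * a for v, a in zip(latent, activated)]
    else:
        # replace the original: linearly mix original and activated by strength
        out = [(1 - strength) * v + strength * a
               for v, a in zip(latent, activated)]
    if clamp:
        # restrict values to the [clamp_min, clamp_max] range
        out = [min(max(v, clamp_min), clamp_max) for v in out]
    return out
```

With `strength=0.0` the latent passes through unchanged; with `strength=1.0` and Add to Original disabled, the activation fully replaces the original values.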
Contributions are welcome! Feel free to submit issues, fork the repository, and create pull requests.
This project is licensed under the MIT License.