
biniou's Introduction

  • 👋 Hi, I’m @Woolverine94
  • 👀 I’m interested in generative AI and GNU/Linux
  • 📫 How to reach me : [email protected]

biniou's People

Contributors

woolverine94


biniou's Issues

AnimateLCM: Assertion failed, number < max_number in function 'icvExtractPattern'

Describe the bug

[ERROR:[email protected]] global cap.cpp:643 open VIDEOIO(CV_IMAGES): raised OpenCV exception:

OpenCV(4.9.0) /io/opencv/modules/videoio/src/cap_images.cpp:274: error: (-215:Assertion failed) number < max_number in function 'icvExtractPattern'

To Reproduce
Steps to reproduce the behavior:

  1. Build image with CUDA support
  2. Run container with docker run -it --gpus=all --restart=always -p 7860:7860 -v //e/AI/biniou/outputs:/home/biniou/biniou/outputs -v //e/AI/biniou/models:/home/biniou/biniou/models -v //e/AI/biniou/cache:/home/biniou/.cache/huggingface -v //e/AI/biniou/gfpgan:/home/biniou/biniou/gfpgan biniou:latest
  3. Choose Video -> AnimateLCM (didn't change any settings)
  4. Type in prompt & hit Execute
  5. Wait until the error occurs

Expected behavior
Video output.

Console log

>>>[AnimateLCM 📼 ]: starting module
Keyword arguments {'safety_checker': StableDiffusionSafetyChecker(
  (vision_model): CLIPVisionModel(
    (vision_model): CLIPVisionTransformer(
      (embeddings): CLIPVisionEmbeddings(
        (patch_embedding): Conv2d(3, 1024, kernel_size=(14, 14), stride=(14, 14), bias=False)
        (position_embedding): Embedding(257, 1024)
      )
      (pre_layrnorm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
      (encoder): CLIPEncoder(
        (layers): ModuleList(
          (0-23): 24 x CLIPEncoderLayer(
            (self_attn): CLIPAttention(
              (k_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (v_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (q_proj): Linear(in_features=1024, out_features=1024, bias=True)
              (out_proj): Linear(in_features=1024, out_features=1024, bias=True)
            )
            (layer_norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
            (mlp): CLIPMLP(
              (activation_fn): QuickGELUActivation()
              (fc1): Linear(in_features=1024, out_features=4096, bias=True)
              (fc2): Linear(in_features=4096, out_features=1024, bias=True)
            )
            (layer_norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
          )
        )
      )
      (post_layernorm): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
    )
  )
  (visual_projection): Linear(in_features=1024, out_features=768, bias=False)
)} are not expected by AnimateDiffPipeline and will be ignored.
The config attributes {'center_input_sample': False, 'flip_sin_to_cos': True, 'freq_shift': 0, 'mid_block_type': 'UNetMidBlock2DCrossAttn', 'only_cross_attention': False, 'attention_head_dim': 8, 'dual_cross_attention': False, 'class_embed_type': None, 'addition_embed_type': None, 
'num_class_embeds': None, 'upcast_attention': False, 'resnet_time_scale_shift': 'default', 'resnet_skip_time_act': False, 'resnet_out_scale_factor': 1.0, 'time_embedding_type': 'positional', 'time_embedding_dim': None, 'time_embedding_act_fn': None, 'timestep_post_act': None, 'conv_in_kernel': 3, 'conv_out_kernel': 3, 'projection_class_embeddings_input_dim': None, 'class_embeddings_concat': False, 'mid_block_only_cross_attention': None, 'cross_attention_norm': None, 'addition_embed_type_num_heads': 64} were passed to UNetMotionModel, but are not expected and will be ignored. Please verify your config.json configuration file.
[ERROR:[email protected]] global cap.cpp:643 open VIDEOIO(CV_IMAGES): raised OpenCV exception:

OpenCV(4.9.0) /io/opencv/modules/videoio/src/cap_images.cpp:274: error: (-215:Assertion failed) number < max_number in function 'icvExtractPattern'


>>>[AnimateLCM 📼 ]: generated 1 batch(es) of 1
>>>[AnimateLCM 📼 ]: Settings : Model=emilianJR/epiCRealism | Adapter=wangfuyun/AnimateLCM | Sampler=LCM | Steps=4 | CFG scale=2 | Video length=16 frames | Size=512x512 | GFPGAN=True | Token merging=0 | nsfw_filter=True | Prompt=breathing heart | Negative prompt= | Seed List=9893499623
>>>[Text2Video-Zero 📼 ]: leaving module
>>>[biniou 🧠]: Generation finished in 00:01:33 (93 seconds)
Traceback (most recent call last):
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/queueing.py", line 407, in call_prediction
    output = await route_utils.call_process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/route_utils.py", line 226, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/blocks.py", line 1559, in process_api
    data = self.postprocess_data(fn_index, result["prediction"], state)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/blocks.py", line 1447, in postprocess_data
    prediction_value = block.postprocess(prediction_value)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/components/video.py", line 273, in postprocess
    processed_files = (self._format_video(y), None)
                       ^^^^^^^^^^^^^^^^^^^^^
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/components/video.py", line 350, in _format_video
    video = self.make_temp_copy_if_needed(video)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/components/base.py", line 233, in make_temp_copy_if_needed
    temp_dir = self.hash_file(file_path)
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/biniou/biniou/env/lib/python3.11/site-packages/gradio/components/base.py", line 197, in hash_file
    with open(file_path, "rb") as f:
         ^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: 'outputs/1713882195_259951_9893499623.mp4'

Hardware

  • RAM size: 32GB
  • Processor physical core number or vcore number : 24
  • GPU : NVIDIA GeForce RTX 3080

Desktop

  • latest NVIDIA Studio drivers
  • Windows 10 22H2
  • Docker for Windows with WSL2 / CUDA support

error on ubuntu install

ERROR: Could not find a version that satisfies the requirement diffusers==git+https://github.com/huggingface/diffusers.git@6246c70d2150dcc58415facfe5c199f49b4d2af1 (from versions: 0.0.1, 0.0.2, 0.0.3, 0.0.4, 0.1.0, 0.1.1, 0.1.2, 0.1.3, 0.2.0, 0.2.1, 0.2.2, 0.2.3, 0.2.4, 0.3.0, 0.4.0, 0.4.1, 0.4.2, 0.5.0, 0.5.1, 0.6.0, 0.7.0, 0.7.1, 0.7.2, 0.8.0, 0.8.1, 0.9.0, 0.10.0, 0.10.1, 0.10.2, 0.11.0, 0.11.1, 0.12.0, 0.12.1, 0.13.0, 0.13.1, 0.14.0, 0.15.0, 0.15.1, 0.16.0, 0.16.1, 0.17.0, 0.17.1, 0.18.0, 0.18.1, 0.18.2, 0.19.0, 0.19.1, 0.19.2, 0.19.3, 0.20.0, 0.20.1, 0.20.2, 0.21.0, 0.21.1, 0.21.2, 0.21.3, 0.21.4, 0.22.0, 0.22.1, 0.22.2, 0.22.3, 0.23.0, 0.23.1, 0.24.0, 0.25.0, 0.25.1, 0.26.0, 0.26.1, 0.26.2, 0.26.3)
ERROR: No matching distribution found for diffusers==git+https://github.com/huggingface/diffusers.git@6246c70d2150dcc58415facfe5c199f49b4d2af1
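pip rejects `diffusers==git+…` because `==` expects a released version from PyPI, not a VCS URL; a Git reference has to use direct-reference syntax. A sketch of the corrected requirement line, assuming the intent was to pin the commit named in the error:

```
# requirements.txt — PEP 508 direct reference instead of "=="
diffusers @ git+https://github.com/huggingface/diffusers.git@6246c70d2150dcc58415facfe5c199f49b4d2af1
```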

docker issues

After using the CUDA Dockerfile to build the image, there is a SyntaxError in the controlnet.py file.

docker run -it --gpus all --restart=always -p 7870:7860 \
-v /mnt/user/ai/biniou/outputs:/home/biniou/biniou/outputs \
-v /mnt/user/ai/biniou/models:/home/biniou/biniou/models \
-v /mnt/user/ai/biniou/cache:/home/biniou/.cache/huggingface \
-v /mnt/user/ai/biniou/gfpgan:/home/biniou/biniou/gfpgan \
biniou:latest

Traceback (most recent call last):
  File "/home/biniou/biniou/webui.py", line 14, in <module>
    from ressources import *
  File "/home/biniou/biniou/ressources/__init__.py", line 21, in <module>
    from .controlnet import *
  File "/home/biniou/biniou/ressources/controlnet.py", line 137
    match preprocessor_controlnet:
    ^
SyntaxError: invalid syntax
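`match` statements were added in Python 3.10, so this SyntaxError usually means the container is running an older interpreter (another issue below shows `/usr/lib/python3.9` inside the image). A hedged sketch of a 3.9-compatible rewrite of such a dispatch (the preprocessor names and return values are made up for illustration):

```python
import sys


def select_preprocessor(preprocessor_controlnet: str) -> str:
    # On Python >= 3.10 this could be written as:
    #   match preprocessor_controlnet:
    #       case "canny": return "canny_model"
    #       case "openpose": return "openpose_model"
    #       case _: return "default_model"
    # On 3.9 the same dispatch works with a dict (or if/elif):
    dispatch = {
        "canny": "canny_model",        # hypothetical mapping
        "openpose": "openpose_model",  # hypothetical mapping
    }
    return dispatch.get(preprocessor_controlnet, "default_model")


# A launcher can fail early with a clear message instead of a SyntaxError:
assert sys.version_info >= (3, 9), "this sketch needs Python 3.9+"
```

A version guard like `sys.version_info >= (3, 10)` at startup would turn the cryptic SyntaxError into an actionable message.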

Use UTF-8 for saving files with text (since it can be non-ASCII)

Describe the bug
Translation into a language with non-Latin (i.e. non-ASCII-encodable, e.g. Cyrillic) script fails due to an attempt to write into a file in ascii encoding.

To Reproduce
Steps to reproduce the behavior:

  1. Go to "Text/nllb translation"
  2. Choose a non-Latin-script language (e.g. Russian) as the target (output) language
  3. Request translation of any text (e.g. "Hello")
  4. See the error

Expected behavior
Translation completes normally

Console log
Only the relevant part

Traceback (most recent call last):
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/gradio/queueing.py", line 407, in call_prediction
    output = await route_utils.call_process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/gradio/route_utils.py", line 226, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/gradio/blocks.py", line 1550, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/gradio/blocks.py", line 1185, in call_function
    prediction = await anyio.to_thread.run_sync(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/gradio/utils.py", line 661, in wrapper
    response = f(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/env/lib/python3.11/site-packages/gradio/utils.py", line 661, in wrapper
    response = f(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/ressources/common.py", line 327, in wrap_func
    result = func(*args, **kwargs)
            ^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/ressources/nllb.py", line 296, in text_nllb
    filename_nllb = write_file(output_nllb)
                    ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/trolley813/development/experimental/biniou/ressources/common.py", line 248, in write_file
    savefile.write(content)
UnicodeEncodeError: 'ascii' codec can't encode characters in position 0-4: ordinal not in range(128)

Screenshots
Probably not needed, since it's described in the text above.

Hardware (please complete the following information):

  • RAM size: 64 GB
  • Processor physical core number or vcore number: 16 (32)
  • Storage type of biniou installation : HDD
  • Free storage space for biniou installation : 200GB+
  • GPU : NVIDIA GeForce 2080 Super

Desktop (please complete the following information):

  • OS: EndeavourOS (Arch Linux)
  • Browser: Firefox
  • Version: 120
  • Python version : 3.11.6

Additional informations

  • I access the webui through the ip 127.0.0.1 (browser and biniou installation on the same system) :
    [X] Yes [ ] No

Additional context
The solution (which worked for me) is to open the relevant files in UTF-8 mode, though I'm not sure there aren't other places that need the same fix. Here's the change in ressources/common.py:

def write_file(*args):
    timestamp = time.time()
    savename = f"outputs/{timestamp}.txt"
    content = ""
    for data in args:
        content += f"{data} \n"
    # encoding must be passed as a keyword argument
    with open(savename, 'w', encoding="utf-8") as savefile:
        savefile.write(content)
    return savename

def write_seeded_file(seed, *args):
    timestamp = time.time()
    savename = f"outputs/{seed}_{timestamp}.txt"
    content = ""
    for data in args:
        content += f"{data} \n"
    # encoding must be passed as a keyword argument
    with open(savename, 'w', encoding="utf-8") as savefile:
        savefile.write(content)
    return savename
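One pitfall worth noting: the third positional argument of `open()` is `buffering`, not `encoding`, so `open(savename, 'w', "utf-8")` would itself raise a TypeError. The encoding has to be passed by keyword. A self-contained sketch (the `write_text` helper is illustrative, not biniou code):

```python
import os
import tempfile


def write_text(path: str, content: str) -> None:
    # open(path, "w", "utf-8") would pass "utf-8" as the buffering
    # argument and raise TypeError; encoding must be a keyword.
    with open(path, "w", encoding="utf-8") as savefile:
        savefile.write(content)


# Round-trip a Cyrillic string regardless of the locale's default encoding:
path = os.path.join(tempfile.mkdtemp(), "out.txt")
write_text(path, "Привет")
with open(path, encoding="utf-8") as f:
    assert f.read() == "Привет"
```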

Can't enable CUDA on Windows

Biniou on Windows 10 64-bit. Hardware: Intel i5 9th gen, 16 GB RAM, GeForce 560. CUDA 12.1.

I go to global settings, switch to cuda, click on update. When I restart Biniou, the log says:

"c:\Users\User\biniou>.\env\Scripts\activate
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.1.0+cu121 with CUDA 1201 (you have 2.1.0+cpu)
Python 3.11.6 (you have 3.11.5)
Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers)
Memory-efficient attention, SwiGLU, sparse and more won't be available.
Set XFORMERS_MORE_DETAILS=1 for more details

[biniou 🧠]: Up and running at https://192.168.100.235:7860/?__theme=dark
Running on local URL: https://0.0.0.0:7860

To create a public link, set share=True in launch()."

Does this mean my CUDA version isn't correct, or that my PyTorch version is not correct? How do I switch PyTorch manually?
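The log itself answers part of the question: torch is `2.1.0+cpu`, a CPU-only build, while the installed xFormers expects `2.1.0+cu121`. The local version suffix after the `+` is what distinguishes the builds; the usual manual fix is reinstalling torch from PyTorch's cu121 wheel index (see pytorch.org for the exact pip command). A small helper, ours for illustration, that checks the suffix:

```python
def is_cuda_build(torch_version: str) -> bool:
    """Return True if a torch version string names a CUDA build (e.g. 2.1.0+cu121)."""
    _, _, local = torch_version.partition("+")
    return local.startswith("cu")


# The two versions from the warning above:
assert is_cuda_build("2.1.0+cu121")
assert not is_cuda_build("2.1.0+cpu")
```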

Empty settings tabs

Describe the bug
Every tab except the first in settings is empty

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'Global settings'
  2. Click on tabs
  3. See error

Expected behavior
Non-empty tabs with related settings

Console log

>>>[biniou 🧠]: Up and running at https://192.168.1.72:7860/?__theme=dark
IMPORTANT: You are using gradio version 3.50.2, however version 4.29.0 is available, please upgrade.
--------
Running on local URL:  https://0.0.0.0:7860

To create a public link, set `share=True` in `launch()`.
>>>[Kandinsky 🖼️ ]: starting module
>>>[GFPGAN-mini 🔎]: enhanced 1 image
>>>[Kandinsky 🖼️ ]: generated 1 batch(es) of 1
>>>[Kandinsky 🖼️ ]: Settings : Model=kandinsky-community/kandinsky-2-2-decoder | Sampler=Euler | Steps=25 | CFG scale=4 | Size=512x512 | GFPGAN=True | Prompt=a cute kitten | Negative prompt=
>>>[Kandinsky 🖼️ ]: leaving module
>>>[biniou 🧠]: Generation finished in 00:06:49 (409 seconds)
>>>[Kandinsky 🖼️ ]: starting module
E:\ai\Biniou\biniou>REM ***********************

E:\ai\Biniou\biniou>REM *** BINIOU LAUNCHER ***

E:\ai\Biniou\biniou>REM ***********************

E:\ai\Biniou\biniou>call venv.cmd

E:\ai\Biniou\biniou>.\env\Scripts\activate
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
>>>[biniou 🧠]: Up and running at https://192.168.1.72:7860/?__theme=dark
IMPORTANT: You are using gradio version 3.50.2, however version 4.29.0 is available, please upgrade.
--------
Running on local URL:  https://0.0.0.0:7860

To create a public link, set `share=True` in `launch()`.
ERROR:asyncio:Exception in callback _ProactorBasePipeTransport._call_connection_lost(None)
handle: <Handle _ProactorBasePipeTransport._call_connection_lost(None)>
Traceback (most recent call last):
  File "D:\Python311\Lib\asyncio\events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "D:\Python311\Lib\asyncio\proactor_events.py", line 165, in _call_connection_lost
    self._sock.shutdown(socket.SHUT_RDWR)
ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host
(the same ConnectionResetError traceback repeats many more times)
>>>[Kandinsky 🖼️ ]: starting module
>>>[GFPGAN-mini 🔎]: enhanced 1 image
>>>[Kandinsky 🖼️ ]: generated 1 batch(es) of 1
>>>[Kandinsky 🖼️ ]: Settings : Model=kandinsky-community/kandinsky-2-2-decoder | Sampler=Euler | Steps=25 | CFG scale=4 | Size=512x512 | GFPGAN=True | Prompt=a cute kitten | Negative prompt=
>>>[Kandinsky 🖼️ ]: leaving module
>>>[biniou 🧠]: Generation finished in 00:06:49 (409 seconds)
>>>[Kandinsky 🖼️ ]: starting module

Screenshots

Hardware (please complete the following information):

  • RAM size: 32GB
  • Processor physical core number or vcore number : 6
  • Storage type of biniou installation : HDD
  • Free storage space for biniou installation : 294GB free, 25GB deployed
  • GPU : NVIDIA GeForce GTX 1060 6GB

Desktop (please complete the following information):

  • OS: Windows 11 22H2
  • Browser: Chrome (Thorium)
  • Version: 124.0.6367.218
  • Python version: 3.11.6

Smartphone (please complete the following information):

  • N/A

Additional informations

  • I access the webui through the ip 127.0.0.1 (browser and biniou installation on the same system) :
    [X] Yes [ ] No

Additional context
Add any other context about the problem here.

Windows 11 documentation

I want to test out biniou; however, I am afraid to run biniou_netinstall.exe because of the warning in the instructions, as I don't want to break my Windows install. I need to install it using a venv myself, and I need clear instructions.

I've tried installing it and figuring out the dependencies, but now I am stuck on this error:

(venv) E:\Ai__Project\biniou>python webui.py
A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
>>>[biniou 🧠]: Up and running at https://192.168.1.73:7860/?__theme=dark
Exception in thread Thread-4 (run):
Traceback (most recent call last):
  File "C:\Python311\Lib\threading.py", line 1038, in _bootstrap_inner
    self.run()
  File "C:\Python311\Lib\threading.py", line 975, in run
    self._target(*self._args, **self._kwargs)
  File "E:\Ai__Project\biniou\venv\Lib\site-packages\uvicorn\server.py", line 65, in run
    return asyncio.run(self.serve(sockets=sockets))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\asyncio\runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "E:\Ai__Project\biniou\venv\Lib\site-packages\uvicorn\server.py", line 69, in serve
    await self._serve(sockets)
  File "E:\Ai__Project\biniou\venv\Lib\site-packages\uvicorn\server.py", line 76, in _serve
    config.load()
  File "E:\Ai__Project\biniou\venv\Lib\site-packages\uvicorn\config.py", line 399, in load
    self.ssl: ssl.SSLContext | None = create_ssl_context(
                                      ^^^^^^^^^^^^^^^^^^^
  File "E:\Ai__Project\biniou\venv\Lib\site-packages\uvicorn\config.py", line 112, in create_ssl_context
    ctx.load_cert_chain(certfile, keyfile, get_password)
FileNotFoundError: [Errno 2] No such file or directory
Traceback (most recent call last):
  File "E:\Ai__Project\biniou\webui.py", line 9668, in <module>
    demo.queue(concurrency_count=8).launch(
  File "E:\Ai__Project\biniou\venv\Lib\site-packages\gradio\blocks.py", line 2033, in launch
    ) = networking.start_server(
        ^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Ai__Project\biniou\venv\Lib\site-packages\gradio\networking.py", line 207, in start_server
    raise OSError(
OSError: Cannot find empty port in range: 7860-7860. You can specify a different port by setting the GRADIO_SERVER_PORT environment variable or passing the `server_port` parameter to `launch()`.

(venv) E:\Ai__Project\biniou>
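The final OSError means port 7860 was still held, most likely by the earlier instance that crashed on the missing certificate files. Gradio honours the GRADIO_SERVER_PORT environment variable; a sketch that probes for a free port before launching (the probing helper is ours, not part of gradio):

```python
import os
import socket


def find_free_port(start: int = 7860, tries: int = 20) -> int:
    """Return the first TCP port >= start that can be bound locally."""
    for port in range(start, start + tries):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
            except OSError:
                continue  # port busy, try the next one
            return port
    raise RuntimeError("no free port found")


# Gradio reads this variable when launch() has no explicit server_port:
os.environ["GRADIO_SERVER_PORT"] = str(find_free_port())
```

Alternatively, pass `server_port=` directly to `launch()`, as the error message suggests.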

can't start app using docker

There was a problem when trying to write in your cache folder (/home/biniou/.cache/huggingface/hub). Please, ensure the directory exists and can be written to.
There was a problem when trying to write in your cache folder (/home/biniou/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
Traceback (most recent call last):
  File "/home/biniou/biniou/webui.py", line 14, in <module>
    from ressources import *
  File "/home/biniou/biniou/ressources/__init__.py", line 2, in <module>
    from .llamacpp import *
  File "/home/biniou/biniou/ressources/llamacpp.py", line 11, in <module>
    os.makedirs(model_path_llamacpp, exist_ok=True)
  File "/usr/lib/python3.9/os.py", line 225, in makedirs
    mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: './models/llamacpp/'
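The PermissionError suggests the bind-mounted volumes are owned by a different UID than the container's biniou user; pre-creating the host directories with matching ownership usually fixes it. A small pre-flight check in the spirit of the failing `os.makedirs` call (the helper and paths are illustrative, not biniou code):

```python
import os


def ensure_writable(path: str) -> None:
    """Create path if needed; fail early with a clear message if not writable."""
    try:
        os.makedirs(path, exist_ok=True)
    except PermissionError as err:
        raise SystemExit(
            f"{path}: not writable by uid {os.getuid()}; "
            "fix ownership of the bind-mounted host directory"
        ) from err
    if not os.access(path, os.W_OK):
        raise SystemExit(f"{path}: exists but is not writable")


# Example: ensure_writable("./models/llamacpp") before the import that crashed.
```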


IOPaint models

Is your feature request related to a problem? Please describe.
No problems

Describe the solution you'd like
I'd like biniou to include the models used by https://github.com/Sanster/IOPaint

Describe alternatives you've considered
Using IOPaint itself

Additional context
-

Prompt Kandinsky

Discussed in #8

Originally posted by koinkoin-project December 16, 2023
Hello,

I used the SD prompt generator to generate the prompt for an image (I'm trying to make a logo); I'm sharing my log here:

>>>[Kandinsky 🖼️ ]: starting module Token indices sequence length is longer than the specified maximum sequence length for this model (694 > 77). Running this sequence through the model will result in indexing errors The following part of your input was truncated because CLIP can only handle sequences up to 77 tokens: ['tti. by raymond swanland and anato finnstark and greg rutkowski, oil painting, cgsociety, national geographic, best practice, best practice, from artstation, world renowned artists, from deviantart, trending on artstation, pinterest, by greg rutkowski, 4 k, insane details, intricate, elegent, cinematic, 8 k, unreal engine, video game graphics, path traced, 8 k, hyperrealistic, octane render, 3 d, ray tracing, cinematic lighting, path tracing, unreal engine, hyper resolution, insanely detailed and intricate, elegant, digital painting, mystical, mystical, mist, earthtone colors, smooth, clear focus, cinematic lighting, masterpiece, clear, crisp focus, environmental light, ultra realistic, hyper - detailed, high definition, depth of field, 4 k, sharp focus, insanely detailed and intricate, masterpiece, cinematic lighting, award winning, trending on artstation, cgsociety, pixiv, zbrush central, concept art, behance hd 8 k ultrawide angle, by greg rutkowski, by eddie mendoza, by jesper ejsingwayje, by rhads, by rhads, makoto shinkai and lois van baarle, by rhads, by rhads, by monetto, by moebius, by ross tronoke, by rossdraws, by rhads, by rhads, deviantart, iridescent, by beeple, ross, by ross tran biowarender, makoto shatj kings, octane, by makoto shinkai, by christopher balaskas, matte painting, by ridley scott, red and alena a volumetric lighting, 8 k, octane render, high quality, john how dofraut, cgsoc leyendecker, by kelly kelly morelsk, by alena, 4 k, colorful and edward hopper, 3 d render, logotype, highly detailed, john howfrautk and john how dofka, logotype, 8 k 8 k, wizard the alena beeple render, logotype 8 k resolution, digital 2 d render, 
intricate 8 k, cinematic, hyperrealistic, high resolution, high quality, wizard treebeard, wizard the forest, a landscape, an forest, environment, concept art, in the depth of gods, scenic landscape, hyper detail, alan sao bua render, a hyper - realism, environmental character, turbulent sea, ultra photorealistic alan treebeard, 8 k, dramatic lighting, cinematic, anamorphic, cinematic by biolumination, high coherence, atmospheric, anamorphic cinematic by godlike, highly detailed and detailed painting 8 k, concept art, path tracing, cinematic art, alphonse mucha 4 k 8 k, high resolution, unreal engine, dada digital 2 d and wizard herpher art, high epcot, by k. logo, studio quality, atmospheric watermark, atmospheric hyper detailed, an ultra - detailed, hyper - detailed, high librarian, magical tree, magical forest hooligan dark atmosphere, logotype 8 k and daedric druid lense, studio quality'
(followed by a long run of <|endoftext|> padding tokens; log truncated)
><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftex
t|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoft
ext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|><|endoftext|>']

I used Kandinsky-2-1 and it crashed. Relaunching this module with the default parameters raises the error. The Kandinsky-2-2-decoder module does not raise an error with this prompt.
