
dain-app's Introduction

DAIN-APP Application

This is the source code for the video interpolation application Dain-App, developed on top of the source code of DAIN (Dain GIT Project).

Table of Contents

  1. Introduction
  2. Citation
  3. Requirements and Dependencies
  4. Installation
  5. Running application with interface
  6. Running application with command line
  7. Slow-motion Generation
  8. Training New Models
  9. Google Colab Demo

Introduction

Dain-App comes with a user interface and a command-line script to help new users start using it with little change to the code. You can also get the Windows binary from the build section.

You can see a few results in these YouTube videos: Turning animations to 60FPS; Turning Sprite Art to 60FPS; Turning Stop Motion to 60FPS; Turning ANIME P1 to 60FPS; Turning ANIME P2 to 60FPS.

Citation

If you find the code and datasets useful in your research, please cite:

@article{Dain-App,
	title={Dain-App: Application for Video Interpolations},
	author={Gabriel Poetsch},
	year={2020}
}
@inproceedings{DAIN,
    author    = {Bao, Wenbo and Lai, Wei-Sheng and Ma, Chao and Zhang, Xiaoyun and Gao, Zhiyong and Yang, Ming-Hsuan}, 
    title     = {Depth-Aware Video Frame Interpolation}, 
    booktitle = {IEEE Conference on Computer Vision and Pattern Recognition},
    year      = {2019}
}
@article{MEMC-Net,
     title={MEMC-Net: Motion Estimation and Motion Compensation Driven Neural Network for Video Interpolation and Enhancement},
     author={Bao, Wenbo and Lai, Wei-Sheng and Zhang, Xiaoyun and Gao, Zhiyong and Yang, Ming-Hsuan},
     journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
     doi={10.1109/TPAMI.2019.2941941},
     year={2018}
}	

Requirements and Dependencies

  • numba=0.51.2
  • numpy=1.19.2
  • opencv-python=4.4.0.46
  • pillow=8.0.1
  • pyqt5=5.15.1
  • python=3.8.5
  • scikit-learn=0.23.2
  • scipy=1.5.4
  • torch=1.7.0+cu110
  • torchvision=0.8.1+cu110
  • tqdm=4.51.0
  • ffmpeg

Installation

If you use Linux, you should use a conda environment. The environment from requirements.txt has been tested on Ubuntu 18.04 and Arch Linux.

conda create --name DAIN-APP --file requirements.txt
conda activate DAIN-APP
pip install 'opencv-contrib-python==4.4.0.46'

Some tips:

  • Installing pyqt5 via pip may cause a crash. If you see this error:

    QObject::moveToThread: Current thread (0x55c1b72bc630) is not the object's thread (0x55c1b72f1eb0). Cannot move to target thread (0x55c1b72bc630)

try installing pyqt via conda instead:

pip uninstall pyqt5
pip uninstall pyqt5-sip
conda install pyqt


Then, run the script below from inside the Dain-App directory to compile the remaining modules. For more details, please refer to the installation instructions in the DAIN README.

$ ./build-app.sh

Running application with interface

$ python my_design.py

Or run the script to automatically initialize conda and start the GUI:

$ ./GUI.sh

Running application with command line

You can see all CLI options with:

python my_design.py -cli -h

An example of a working command:

python my_design.py -cli --input "gif/example.gif" -o "example_folder/" -on "interpolated.gif" \
    -m "model_weights/best.pth" -fh 3 --interpolations 2 --depth_awarenes 0 --loop 0 -p 0 \
    --alpha 0 --check_scene_change 10 --png_compress 0 --crf 1 \
    --pixel_upscale_downscale_before 1 --pixel_downscale_upscale_after 1 --pixel_upscale_after 1 \
    --mute_ffmpeg 1 --split_size_x -1 --split_size_y -1 --split_pad 150 --half 0 \
    --step_extract 1 --step_interpolate 1 --batch_size 1 --use_benchmark 0 \
    --force_flow 1 --smooth_flow 0 --downscale -1 --fast_mode 0
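The CLI processes a single input per run. For batches (e.g. a folder of clips), one approach is a small wrapper that invokes the CLI once per file; the helper below is a hypothetical sketch, not part of Dain-App, and it builds only a minimal subset of the flags from the example above.

```python
import subprocess
from pathlib import Path

def build_cmd(input_path, out_dir, model="model_weights/best.pth", interpolations=2):
    """Assemble one my_design.py CLI invocation (minimal flag subset)."""
    return [
        "python", "my_design.py", "-cli",
        "--input", str(input_path),
        "-o", str(out_dir),
        "-on", Path(input_path).stem + "_interpolated.gif",
        "-m", model,
        "--interpolations", str(interpolations),
    ]

def batch_interpolate(src_dir, out_dir, pattern="*.gif"):
    """Run Dain-App once per matching file instead of passing several inputs."""
    for clip in sorted(Path(src_dir).glob(pattern)):
        subprocess.run(build_cmd(clip, out_dir), check=True)
```

Running one process per file also sidesteps the multi-input crash reported in the issues below.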

Training New Models

The Dain-App training code is currently broken. To train new models, use the DAIN GitHub repository and import the resulting models into Dain-App.

Google Colab Demo

Contact

Gabriel Poetsch

License

See MIT License

dain-app's People

Contributors

bostwickenator, burguerjohn, ferrahwolfeh, heylonnhp, huhai463127310


dain-app's Issues

Blank error on attempted interpolation

An error popup appears with a blank message, and the console window reads this:

"App Crash Log Traceback (most recent call last):
File "my_design.py", line 72, in run
dain_class.RenderVideo(myRenderData)
File "my_DAIN_class.py", line 1190, in RenderVideo
self.StepRenderInterpolation(self.myRenderData)
File "my_DAIN_class.py", line 1073, in StepRenderInterpolation
images = self.DoInterpolation(X1, X2, loops)
File "my_DAIN_class.py", line 817, in DoInterpolation
image = interpolate_(self.model, self.myRenderData, x0, x1, False)
File "my_DAIN_class.py", line 202, in interpolate_
y_s,offset,filter = gl_model(X0, X1, padding, myRenderData.flowForce, myRenderData.SmoothFlow, myRenderData.ShareFlow, convert, not bool(myRenderData.fastMode))
File "site-packages\torch\nn\modules\module.py", line 727, in _call_impl
File "networks\DAIN.py", line 268, in forward
File "networks\DAIN.py", line 441, in forward_flownets
File "site-packages\torch\nn\modules\module.py", line 727, in _call_impl
File "PWCNet\PWCNet.py", line 329, in forward
File "PWCNet\PWCNet.py", line 201, in warp
AssertionError"

I'm not sure what's going on; I haven't gotten a single successful attempt yet. I've used other AI networks for upscaling images, audio separation and whatnot without issues, but this isn't working. As far as I can tell from my attempts to install the requirements, I already have them all.

How do I create a working version from the source code?

When I run the Windows executable version (1.0 alpha) I run out of memory, as I only have 16GB. I see there is a resume fix; how can I add that fix to the Windows executable version? Alternatively, how do I run the latest version of the source code? Sorry, I don't have a clue about development stuff.

Output video frame rate problem

Frames are lost and the video's running time becomes longer. The source video is 29.97 FPS and the output is set to 60 FPS, but the final video comes out at 47 FPS (47.68350 FPS).

DAIN output FPS

Inputted 832x1080 video into DAIN at flat 30.0FPS and set to interpolate at 2x.

Expected output - 60.0FPS
Actual output - 59.974FPS

Is there any way to set it to output at true 60.0FPS? I'm trying to splice audio from the original 30.0FPS video in, but the output video is a different length.
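One hedged explanation: if the source is actually an NTSC-style rational rate (30000/1001, about 29.97 fps) rather than a flat 30.0, doubling it lands near 59.94 fps, and container rounding can nudge the reported value further. Exact fractions make this easy to check (a sketch, not Dain-App code):

```python
from fractions import Fraction

def doubled_rate(num, den):
    """Exact frame rate after one 2x interpolation pass."""
    return Fraction(num, den) * 2

ntsc_doubled = doubled_rate(30000, 1001)   # 60000/1001, about 59.94 fps
flat_doubled = doubled_rate(30, 1)         # exactly 60 fps
```

If an exact 60.0 fps container is required for audio splicing, re-timing the interpolated frames (for example with ffmpeg's output -r 60 option) before muxing the original audio back in is a common workaround.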

"CUDA out of memory" on RTX3090

Hello,

Got this error while trying to interpolate a 1-hour 24FPS 576p video into 48FPS, close to its end (frame 83101 of 95203):

RuntimeError: CUDA out of memory. Tried to allocate 1.14 GiB (GPU 0; 24.00 GiB total capacity; 439.51 MiB already allocated; 21.30 GiB free; 632.00 MiB reserved in total by PyTorch)

After I restarted the system and deleted the last interpolated frames, I got this:

RuntimeError: CUDA out of memory. Tried to allocate 8.66 GiB (GPU 0; 24.00 GiB total capacity; 144.61 MiB already allocated; 13.53 GiB free; 8.39 GiB reserved in total by PyTorch)

I have no idea why all this free memory can't be used here.

r7 5800x, 32GB@4000MHz, RTX3090

Google Colab Demo error

Attempting to execute your Colab Demo throws an error: "ModuleNotFoundError: No module named 'filterinterpolation_cuda'"

Error while creating conda environment

I recently built a Manjaro machine with a GTX 1070 to try to compile DAIN-APP from scratch and fix the memory leak issue.

I added pytorch to my channels

conda config --append channels pytorch

Then I tried creating the conda environment:

conda create --name DAIN-APP --file requirements.txt

And now I am only missing one package:

Collecting package metadata (current_repodata.json): done
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed

PackagesNotFoundError: The following packages are not available from current channels:

  - mindepthflowprojection-cuda==0.0.0=pypi_0

Current channels:

  - https://repo.anaconda.com/pkgs/main/linux-64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/linux-64
  - https://repo.anaconda.com/pkgs/r/noarch
  - https://conda.anaconda.org/conda-forge/linux-64
  - https://conda.anaconda.org/conda-forge/noarch
  - https://conda.anaconda.org/pytorch/linux-64
  - https://conda.anaconda.org/pytorch/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.

What is this package, and where do I find it? Do I even need it? Do I remove it from the requirements.txt and attempt a compile?

I can't seem to find it anywhere, and a quick google search doesn't really seem to turn up much either.

mindepthflowprojection-cuda=0.0.0=pypi_0
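For what it's worth, a `=pypi_0` suffix in a conda requirements export marks a package that was pip-installed into the original environment; mindepthflowprojection-cuda is one of DAIN's locally compiled CUDA extensions, so no conda channel will ever provide it. One workaround (an untested sketch, not an official recipe) is to filter those entries out before solving the environment and let build-app.sh compile the extensions afterwards:

```python
def conda_resolvable(lines):
    """Drop pip-local ("pypi_0") entries that no conda channel provides."""
    return [line for line in lines if "pypi_0" not in line]

sample = [
    "numpy=1.19.2\n",
    "mindepthflowprojection-cuda==0.0.0=pypi_0\n",
]
filtered = conda_resolvable(sample)  # only the numpy line survives
# Write the filtered list to requirements-conda.txt and then run:
#   conda create --name DAIN-APP --file requirements-conda.txt
```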

Ultra HFR support (real time 240fps, 360fps, 480fps on true 240Hz, 360Hz, 480Hz displays)

Hello, I'm the founder of Blur Busters and the creator of the world's most popular motion tests, Blur Busters TestUFO (www.testufo.com), commonly used by owners of high-Hz monitors.

Anybody who sees the UFO icon on a reviewer website is using the free tests that I invented; they're very popular among high-Hz gamers.

I am looking for tools to help create Ultra HFR content (real-time 240fps, 480fps and 1000fps on real-time 240Hz, 480Hz and 1000Hz displays). Traditionally, the method was to use a slo-mo camera and speed up the footage using the instructions in the Ultra HFR FAQ.

I tried to research the DAIN tool to see if it will support Ultra HFR, e.g. converting 24fps to 240fps video files (for 240Hz monitors), or 60fps to 360fps video files (for 360Hz monitors), but the documentation prominently shows only 60fps and 60Hz.

Doubling the Hz halves motion blur without needing black frames (or CRT-style strobing), so Ultra HFR has had some recent buzz, especially since the new Christie digital cinema projector now supports 480fps 480Hz operation. If you've seen a 120Hz iPad, or played at 120Hz at full frame rate for an extended period, you have already seen the benefits.

I have an ASUS 360 Hz PG259QN gaming monitor, so I want to create test 240fps and 360fps video files out of existing 30fps and 60fps video recordings I have here, with far better quality than classic interpolation.

I wrote an article called Frame Rate Amplification Technology, and I consider your DAIN part of this universe. New improvements have occurred as of late, so I'm writing a sequel article later this year that will mention DAIN too.

BONUS feature request: add support for DAIN to convert existing slo-mo files to real-time motion. For example, converting a 120fps slo-mo file (stored in a 30fps container) and outputting at a custom frame rate (e.g. 240fps) for a 240Hz monitor; that way, the ffmpeg step can be skipped. I'm willing to do the ffmpeg step first, though, as long as there is support for ultra-high output frame rates.
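Dain-App's interpolation modes multiply the frame count by powers of two (2x, 4x, 8x), so reaching a given Ultra HFR target means choosing the smallest number of doubling passes that meets or exceeds it, then resampling down. The arithmetic, sketched under that power-of-two assumption:

```python
import math

def loops_for_target(source_fps, target_fps):
    """Smallest number of 2x passes so that source_fps * 2**loops >= target_fps."""
    return max(0, math.ceil(math.log2(target_fps / source_fps)))

loops_for_target(24, 240)   # 4 passes: 24 * 16 = 384 fps, then resample to 240
loops_for_target(60, 360)   # 3 passes: 60 * 8  = 480 fps, then resample to 360
```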

How does the experimental.pth perform?

Hi, thanks for sharing your code!

I found that there is an experimental.pth model weight in your repo, and I wonder how this new model performs.

Does it produce a better visual experience?
Is this model trained on special datasets?

render complete but no output video

As the title says: the command prompt says the render is complete and all the frames are interpolated, but when I open the output folder it is empty; nothing is there. And every time I retry step three (since the frames are already interpolated), it just says the render is completed, but there is still no video. What could be the problem?

Is it expected to see duplicate objects?

First of all, thanks for the tool!

Second, is it expected to get results like this? If yes, any suggestions for fixing it? I tried as many combinations of settings as I could.
image

If it's not clear, the arrow is duplicated. It seems it just faded between the frames instead of actually 'moving' the pixels. I understand fading is necessary, but I'd imagine that when a pixel moves that far away, it should switch from fading to moving (I also understand it's a hard problem to solve and that ML models aren't really built from if-thens, but I'm just thinking out loud here).

Blank error on attempt

An error message box pops up that says "Error:" and nothing else. The console window reads the following. Any ideas what's going on here? I haven't been able to produce anything with the program yet, no matter what settings or methods I use.

"App Crash Log Traceback (most recent call last):
File "my_design.py", line 72, in run
dain_class.RenderVideo(myRenderData)
File "my_DAIN_class.py", line 1190, in RenderVideo
self.StepRenderInterpolation(self.myRenderData)
File "my_DAIN_class.py", line 1073, in StepRenderInterpolation
images = self.DoInterpolation(X1, X2, loops)
File "my_DAIN_class.py", line 817, in DoInterpolation
image = interpolate_(self.model, self.myRenderData, x0, x1, False)
File "my_DAIN_class.py", line 202, in interpolate_
y_s,offset,filter = gl_model(X0, X1, padding, myRenderData.flowForce, myRenderData.SmoothFlow, myRenderData.ShareFlow, convert, not bool(myRenderData.fastMode))
File "site-packages\torch\nn\modules\module.py", line 727, in _call_impl
File "networks\DAIN.py", line 268, in forward
File "networks\DAIN.py", line 441, in forward_flownets
File "site-packages\torch\nn\modules\module.py", line 727, in _call_impl
File "PWCNet\PWCNet.py", line 329, in forward
File "PWCNet\PWCNet.py", line 201, in warp
AssertionError"

Unused frames hang in memory

image
I can't process a 20-minute video because all processed frames remain in memory and there is no free space for a new frame.
You say "Only VRAM using in DAINAPP"; wtf?

IndexError: list index out of range

When I try to render a video in step 3 (steps 1 and 2 work without problems), I get this message:
Crash [Saved on crash_log.txt]:
Check out Discord for possible solutions:
Check Errors fix on Discord.

Error: list index out of range

In crash log I've got message:
App Crash Log Traceback (most recent call last):
File "my_design.py", line 72, in run
File "my_DAIN_class.py", line 1194, in RenderVideo
File "my_DAIN_class.py", line 1144, in StepCreateVideo
File "my_DAIN_class.py", line 526, in create_video2
IndexError: list index out of range

The program, source video, and target location for the video all have names and subdirectories without spaces.
I get this message before the 3rd step, and the error when I click YES; when clicking NO, the program doesn't start:
"This may rename a few files inside the interpolated_frames folder, are you sure?"
Settings for the render are in these images:
screen 7
1 screen
2 screen
3 screen
4 screen
5 screen
6 screen

Specification is:
I7-3770
16GB (4x4GB) DDR3 1600MHz Dual channel
GTX 1060 6GB
Gigabyte Z77-DS3H (rev 1.0)
Files and program in 2 separate HDD drives.
What do I need to do to render video without errors?

Hardware chart from 1st step to 3rd and error
Hardware chart

From text console in another file:
Log.txt

PS: Sorry for my English if it's wrong.

Empty interpolated frames folder

I created a PNG sequence by removing duplicate frames and am now trying step 2, but I get an empty interpolated frames folder. I'm sort of new to editing videos. Can anyone help?

'int' object is not subscriptable

I get this crash when running the program:

App Crash Log Traceback (most recent call last):
  File "my_design.py", line 72, in run
  File "my_DAIN_class.py", line 1190, in RenderVideo
  File "my_DAIN_class.py", line 1022, in StepRenderInterpolation
TypeError: 'int' object is not subscriptable

OS: win10 20h2 10.0.19042.746
CPU: [email protected]
MEM: 32G DDR4
GPU: RTX3090 whit driver version 457.51

Unrecognised symbol

issue

Hi, I am trying to use your model to interpolate some frames, but I keep running into this issue. I have a GIF with 4 frames total at 1 fps; the frames are 256x256. Could you please tell me how I can overcome this symbol issue?

No video output?????

I have waited about 6 hours for the bloody frames to be generated, and now they won't be output to a video for some reason... I have tried many things with DAIN to get it to work; does anyone know how I can get all the frames and output them as a video in a way that works?

Sky extreme artifacts

In large areas of color gradients there is a very bad artifact, as shown here:

Editor.3.mp4

Too old graphics card

I tried to start the program using an Nvidia GT 630, a really old GPU. On startup the program reports that it cannot work without CUDA 5.0 and above, which my GPU does not support; at least I suppose so, since I have installed the latest version of CUDA.

Here is the notice
photo_2021-03-03_01-17-55

Error when there is more than one input file

When I try to render 2 or more files:

App Crash Log Traceback (most recent call last):
File "my_design.py", line 72, in run
File "my_DAIN_class.py", line 1190, in RenderVideo
File "my_DAIN_class.py", line 1069, in StepRenderInterpolation
UnboundLocalError: local variable 'loops' referenced before assignment

Feels sad because I have about 200 little videos with 10-15 FPS framerates... it will be hard to do them manually.
