
davos's Introduction

davos


davos is an FTP download automation tool that allows you to scan various FTP servers for files you are interested in. This can be used to configure file feeds as part of a wider workflow.

Why use davos?

A fair number of services still rely on "file-drops" to transport data from place to place. A common practice is to configure a cron job that periodically triggers FTP/SFTP programs to download those files. davos does much the same, but adds a web UI on top of the whole process, making these schedules easier to manage.

How it works

Hosts

All periodic scans (Schedules) require a host to connect to. These can be added individually:

https://raw.githubusercontent.com/linuxserver/davos/master/docs/host.png

Schedules

Each schedule contains all of the required information pertaining to the files it is interested in. This includes the host it needs to connect to, where to look for the files, where to download them, and how often:

https://raw.githubusercontent.com/linuxserver/davos/master/docs/schedule1.png

It is also possible to limit what the schedule downloads by applying filters to each scan. davos will only download files that match its list of given filters. If no filters are applied to a schedule, all files will be downloaded. Each schedule also keeps an internal record of what it scanned in the previous run, so it won't download the same file twice.

https://raw.githubusercontent.com/linuxserver/davos/master/docs/schedule2.png
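As a rough illustration of the matching behaviour described above (and of the 2.2.0 change where '*' resolves to zero or more characters), a wildcard filter can be evaluated along the following lines. This is a hypothetical sketch in Java, not davos's actual matching code:

import java.util.regex.Pattern;

// Hypothetical sketch: evaluate a davos-style wildcard filter (e.g. "report-*.csv")
// where '*' matches zero or more characters, as described in the 2.2.0 changelog.
// Illustration only, not code taken from davos.
public class FilterExample {

    static boolean matches(String filter, String fileName) {
        // Quote the filter literally, then re-enable each '*' as ".*"
        String regex = Pattern.quote(filter).replace("*", "\\E.*\\Q");
        return Pattern.compile(regex).matcher(fileName).matches();
    }

    public static void main(String[] args) {
        System.out.println(matches("report-*.csv", "report-2021-06.csv")); // true
        System.out.println(matches("report-*.csv", "report-.csv"));        // true: '*' may match nothing
        System.out.println(matches("report-*.csv", "summary.txt"));        // false: not downloaded
    }
}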

Once each file has been downloaded, davos can also notify you via Pushbullet, as well as send downstream requests to other services. This is particularly useful if another service makes use of the file davos has just downloaded.

https://raw.githubusercontent.com/linuxserver/davos/master/docs/schedule3.png
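The downstream requests mentioned above are configured with a URL, and the 2.2.1 changelog notes that $filename can be resolved inside that URL. As a hedged sketch of the idea (the endpoint below is made up, and this is not davos's own code), such a call might look like this:

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustration only: substitute $filename into a configured URL and fire a
// downstream request once a file has been downloaded. The URL is a placeholder.
public class DownstreamCallExample {

    public static void main(String[] args) throws IOException, InterruptedException {
        String configuredUrl = "http://other-service.local/api/ingest?file=$filename"; // hypothetical endpoint
        String downloadedFile = "backup-2021-06-09.tar.gz";

        String resolvedUrl = configuredUrl.replace("$filename", downloadedFile);

        HttpRequest request = HttpRequest.newBuilder(URI.create(resolvedUrl)).GET().build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Downstream service responded with " + response.statusCode());
    }
}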

Running

Finally, schedules can be started or stopped at any point, using the schedules listing UI:

https://raw.githubusercontent.com/linuxserver/davos/master/docs/list.PNG

Changelog

  • 2.2.2

    • Updated log4j dependency to 2.16.0, accounting for CVE-2021-44228
  • 2.2.1

    • Fixed bug where lastRunTime got reset whenever a change was made to a schedule.
    • General refactoring of code, plus added unit tests.
    • Allow $filename resolution in URLs of API calls.
  • 2.2.0

    • The filter pattern matcher now resolves '*' to zero or more characters, rather than one or more.
    • The scanned items list can now be cleared.
    • Added a Last Run field to the scanned items modal.
    • Included readthedocs documentation!
    • Added SNS capability to notifications area
    • Updated FTPS connections to run over Explicit TLS, rather than Implicit SSL
      • This may or may not break existing schedules that use FTPS prior to 2.2.0.
    • Improved some areas of DEBUG logging
    • Schedules page now automatically updates when files are downloading
    • Added identity file authentication for SFTP connections
    • Included a version checker to help prompt users when a new version is available
      • Full disclosure: This makes a GET request to GitHub to ascertain the latest release version.
  • 2.1.2

    • Fixed NaN bug caused by empty files (Div/0)
    • Fixed recursive delete issue for directories in FTP and SFTP connections.
  • 2.1.1

    • Fixed primitive issue on Schedule model for new fields
  • 2.1.0

    • Mandatory filtering allows schedules to only download files when at least one filter has been set.
    • Form validation on Hosts and Schedule pages
    • New theme
    • Inverse filtering allows schedules to download files that DO NOT match provided filters.
    • "Test Connection" button added to Hosts page
    • Schedules can now delete the remote copy of each file once the download has completed. This is separate to the Post-download actions.
    • New intervals: "Every minute" and "Every 5 minutes"

davos's People

Contributors

joshstark

davos's Issues

No Filter = No Download

Change the filter download behaviour to be "If no label present, don't download anything" instead of "No label, grab everything".

Does not sync changed files

Is it possible to pull files again after they have been scanned by the schedule?

For instance... I have a scheduled job that sees a new file and downloads it, but I can't move it yet because it is locked by a process (basically a live log). Not a big deal. However, when the job runs again it does not see any changes made to the file, so it never downloads and overwrites the previous version.

Looking to see how to keep syncing modified files.

Q - clear some items from database but not all

Is there a way, either manually by editing files or through the app, to remove only single items from the cached list of already downloaded files?

Is it possible to connect to the running container and remove things from the DB?

Clear items in "Last scanned" list

Allow for ability to delete items in last scanned list for a schedule. This is to allow for subsequent runs to pick up previously scanned files.

Sub-folders scanning

Hello !

Firstly, Davos is exactly the app I've been looking for for a long time 👍

I installed Davos with Docker, configured my host, and created a schedule to copy some folders that I need to download. My schedule settings (to begin with) are very basic: download everything new, with Recursive mode on (folders with their contents). So I haven't modified anything except activating Recursive mode.

The FTP has this following directory structure :

/
 directory0/ <--------- Davos doesn't download files/folders above this one
            directoryA/ 
                       directory1/
                                 fileA
                       file1 
                       file2
            directoryB/ <--------- Davos downloads files/folders above this one
                       directory1/
                                 fileA
                       file1
                       file2  

When I set up the Host Directory with the directory0 path, no files or folders are triggered.
When I set up the Host Directory with directoryA/B, folders and files are triggered and downloaded.

Do you know why?

Thanks for your help !

Identity file, ssh private key not working.

Running davos latest as of today. When using IdentityFile and testing the connection it throws an error:

 There was an error: invalid privatekey: [B@6e153327

Keys were just generated with ssh-keygen, no extra options provided; the key works as expected when I do ssh user@host -i ./davos_rsa.

Davos is definitely seeing the key: if I provide a non-existent path it complains that it doesn't exist, so the problem seems to be with the key parsing; there are also no read permission issues.

Happened using the following ssh version: OpenSSH_8.2p1 Ubuntu-4ubuntu0.2, OpenSSL 1.1.1f 31 Mar 2020

Here I provide the public and private key for reproducing; no security concerns, these keys were generated for testing and are not used anywhere.

Public key

ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDeSJdBaZfxf3OoEHr+rGitwtsxPKGvJF+cj29bS5z+U68hAKNe+GnHoQJW1qMVdr+jtGiEjjj13Hc5n7jPpi7BqQOZ7Q/X8i2Pp3DFd/d8YiNoVH1fY5ZRTjxFlUcU8YVA7KudCst+1dEmbsQm2fv5qyLvLH3DHPvTBoWWY2zFSCmD39Sb/e3307jP72f8fRv9brsaFiTWsvTO6HsEyA9Yj5V0IScH6DJjRCtsSrbwnFEOhjWd9nNTDGYgW/90EccXFNNbrY1fX0lHqqOaJsPwGiM9cDw89fZr5stHtSuyIca02Dvn9cl1JBh7xUuPmNRULQo8fV7zhKagXYb6FQlXiO3RdLtWy563BctC24KXdWDb/XtZ3MNg85Z9QAcSZz7V/Iqwxd6I7TlztmzTaSFlTq7GhIn40S4LySxtRYVTUzw3KSxlfUBZmTQjR8KuxokMI8e8lB8MtjPDHcIM2a7mmR0hFmpjQGNVpzgE5JJVeUClS8KJ5wXf4ioWSPdZPIM= muniter@ubuserver

Private key

-----BEGIN OPENSSH PRIVATE KEY-----
b3BlbnNzaC1rZXktdjEAAAAABG5vbmUAAAAEbm9uZQAAAAAAAAABAAABlwAAAAdzc2gtcn
NhAAAAAwEAAQAAAYEA3kiXQWmX8X9zqBB6/qxorcLbMTyhryRfnI9vW0uc/lOvIQCjXvhp
x6ECVtajFXa/o7RohI449dx3OZ+4z6YuwakDme0P1/Itj6dwxXf3fGIjaFR9X2OWUU48RZ
VHFPGFQOyrnQrLftXRJm7EJtn7+asi7yx9wxz70waFlmNsxUgpg9/Um/3t99O4z+9n/H0b
/W67GhYk1rL0zuh7BMgPWI+VdCEnB+gyY0QrbEq28JxRDoY1nfZzUwxmIFv/dBHHFxTTW6
2NX19JR6qjmibD8BojPXA8PPX2a+bLR7UrsiHGtNg75/XJdSQYe8VLj5jUVC0KPH1e84Sm
oF2G+hUJV4jt0XS7VsuetwXLQtuCl3Vg2/17WdzDYPOWfUAHEmc+1fyKsMXeiO05c7Zs02
khZU6uxoSJ+NEuC8ksbUWFU1M8NyksZX1AWZk0I0fCrsaJDCPHvJQfDLYzwx3CDNmu5pkd
IRZqY0BjVac4BOSSVXlApUvCiecF3+IqFkj3WTyDAAAFiHv3C2579wtuAAAAB3NzaC1yc2
EAAAGBAN5Il0Fpl/F/c6gQev6saK3C2zE8oa8kX5yPb1tLnP5TryEAo174acehAlbWoxV2
v6O0aISOOPXcdzmfuM+mLsGpA5ntD9fyLY+ncMV393xiI2hUfV9jllFOPEWVRxTxhUDsq5
0Ky37V0SZuxCbZ+/mrIu8sfcMc+9MGhZZjbMVIKYPf1Jv97ffTuM/vZ/x9G/1uuxoWJNay
9M7oewTID1iPlXQhJwfoMmNEK2xKtvCcUQ6GNZ32c1MMZiBb/3QRxxcU01utjV9fSUeqo5
omw/AaIz1wPDz19mvmy0e1K7IhxrTYO+f1yXUkGHvFS4+Y1FQtCjx9XvOEpqBdhvoVCVeI
7dF0u1bLnrcFy0Lbgpd1YNv9e1ncw2Dzln1ABxJnPtX8irDF3ojtOXO2bNNpIWVOrsaEif
jRLgvJLG1FhVNTPDcpLGV9QFmZNCNHwq7GiQwjx7yUHwy2M8MdwgzZruaZHSEWamNAY1Wn
OATkklV5QKVLwonnBd/iKhZI91k8gwAAAAMBAAEAAAGBANdWWXmcEv94ahHZjV2kpnAXAg
OL6lJimWFxLv6xnLBhX5pIJPx/CPLEzyBTJIBJntO3lT09Dn9YCgQ/8GjxZABmfL+kgaHA
0lSFcGFMm+vaotSSbTZ4oom3kfoS6F6or1+7J3GmoIcKGmjyC4Jb0JgJK3mqj1bygB7qBY
YwYZIpG1bPAwfvkpZwfGysT/+xL+lvWUCnTR7VFQZQ/8QdD4jK6I0tBMPLNO0ngC2Tn/Au
bvP0HoMd2pEMxO76UNof0QFLXxbm4lI5CkHcIifBPnCrhG16zWb51Kem1Ld27UxZUMup8K
Wo0PfVzUuobrQsBTkXyhTd1yfHG6ptvHdvbASz4lK7sZYaZf/NCNG8TU5PNiOUKkgwumTn
HZEDVfZBcci3AkGNgqkp9m/5shSsXPd/6PM6NqeXsvK0sKTl06ZK95M8M7BYAyGNPt2++L
wwoMQ7BMDPLslQdK2byD9fcNKSIF3rPiUmEbrQx5LybvE58Gt7DLJuzrr7nqIpbvdfuQAA
AMA+UDzJcqWH1dUoBVWua9TIahSaiVDUNXNz31Sz8IAOO+ufm3X07OutKrTVWFPP6tbLPu
S0dIos38rp/nJK1sKGZmWDy/RoQUQnOKUeQun94wBS1aAVUDIjOnF1XkUbhK77J7M/LlT7
KbHZZAKgi5NmU5G2fISWiBqv4qgi6g3Tpx20YciDID5ZaG1kAq4JhvRnl1Ys2qjA17VsXW
qlOHgOm3bgLRixjt8afNo0td5BOxg+8RuRmvYs1e/a7ThnOrQAAADBAP1TGrDT1nOkjokW
/mxTAUDJyRK7+CQDl787gbc8SWYiEjIcvy6aVblvI1nR5+MgGCBJHwmtCO8aS02VfvN9Sw
/ui8XfxWd++a7koxW31mu2JsCk5WgIa4fs3eSeajZTN5nCJ4JVAEUjp9LKb1FnrEbyMFCc
NphMBtCtgbG/fMxPjZJdz/t2U3thuMXN6LhgyoaAqWPjkuxzW4dx4r8GTK1Ymlbgx3Zjn4
vKOjQcJH2xcrvhtj6Wf3h38J4t8FO81QAAAMEA4KGQJInEbqlP/xxFpQmT0J++6DyN/gn6
1NTyTnRbVYxzy0XKTAndvKvmlSXJSmfd7XGb/tfaF+PQDaCrY7FKGHWDMxGv8NLZB7bNIF
MVOi0WGqzrbP0gXsrRIKhkFYDL6sEfkkNJlT099fflqPJkAKLvQrrxngSKScZNz+/Jmbmu
C9BWR2yI3xHtqnFzTmLzOS8A5V+HtXF6HgyMe2tDeV01UbNttO5RQzfY6wU4CN1n/xG782
YJPth6c/CnD1/3AAAAEW11bml0ZXJAdWJ1c2VydmVyAQ==
-----END OPENSSH PRIVATE KEY-----
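For context: davos's stack traces elsewhere on this page show it uses JSch for SFTP, and older JSch releases only parse the classic PEM private key format. A key in the newer OpenSSH format (the "-----BEGIN OPENSSH PRIVATE KEY-----" header above) can be rejected with exactly this kind of "invalid privatekey" error; re-exporting the key as PEM (for example with ssh-keygen -p -m PEM -f davos_rsa) is a common workaround. A minimal, hypothetical sketch of identity-file authentication with JSch, for reference only:

import com.jcraft.jsch.JSch;
import com.jcraft.jsch.JSchException;
import com.jcraft.jsch.Session;

// Hypothetical sketch of identity-file authentication with JSch; the path and
// host name are examples, not davos configuration.
public class IdentityFileExample {

    public static void main(String[] args) throws JSchException {
        JSch jsch = new JSch();
        jsch.addIdentity("/config/davos_rsa");            // example path; must be a key format JSch can parse

        Session session = jsch.getSession("user", "example.host", 22);
        session.setConfig("StrictHostKeyChecking", "no"); // matches davos's current behaviour (see the issue further down)
        session.connect();
        session.disconnect();
    }
}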

Extremely slow download

Hi, thanks for that nice app!
I just set it up successfully; however, downloads are extremely slow, while my usual FTP client is 100x faster.
I have used your docker compose, adding Traefik (removing Traefik does not correct the issue).
Any idea what's going on?
Thx


version: "2.1"
services:
  davos:
    image: lscr.io/linuxserver/davos
    container_name: davos
    environment:
      - PUID=1000
      - PGID=100
    volumes:
      - /srv/Docker/davos:/config
      - /srv/Multimedia/_a_convertir:/download
    ports:
      - 8080:8080
    expose:
      - 8080
    networks:
      - traefik_proxy
    restart: unless-stopped
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.davos.rule=HostHeader(`davos.DOMAIN.fr`)"
      - "traefik.http.routers.davos.entrypoints=http,https"
      - "diun.enable=true"

networks:
  traefik_proxy:
    external:
      name: traefik_proxy

"Delete Host File" doesn't work in some cases

Hey,

unfortunately there seems to be an issue with the "Delete Host File" feature. I am still trying to figure out when exactly and why the issue occurs, but here is what I found out.

When you download two folders with several files inside, while having the "Delete Host File" checkbox activated the removal of the folder fails and the 2nd folder does not get downloaded.

In this example I tried to download two folders called [HorribleSubs].Naruto.Shippuuden.-.489.[720p] and [HorribleSubs].Naruto.Shippuuden.-.489.[720p]2 which contained 3 files each.

Here is the debug log:

2017-03-26 00:22:10.801 - DEBUG - [ProgressListener] - Progress downloaded: 81.07043679903812%
2017-03-26 00:22:22.691 - INFO - [FilesAndFoldersTranferStrategy] - Successfully downloaded file.
2017-03-26 00:22:22.691 - INFO - [FilesAndFoldersTranferStrategy] - Running post download actions on [HorribleSubs].Naruto.Shippuuden.-.489.[720p]
2017-03-26 00:22:22.691 - DEBUG - [TransferStrategy] - Running actions...
2017-03-26 00:22:22.691 - INFO - [MoveFileAction] - Executing move action: Moving [HorribleSubs].Naruto.Shippuuden.-.489.[720p] to /download/
2017-03-26 00:22:22.693 - INFO - [MoveFileAction] - File successfully moved!
2017-03-26 00:22:22.693 - DEBUG - [TransferStrategy] - Finished running actions...
2017-03-26 00:22:22.694 - INFO - [SFTPConnection] - Deleting remote file at path: /files/completed/[HorribleSubs].Naruto.Shippuuden.-.489.[720p]
2017-03-26 00:22:22.694 - DEBUG - [SFTPConnection] - Path is for a directory, so calling channel#rmdir()
2017-03-26 00:22:22.818 - DEBUG - [SFTPConnection] - channel threw exception. Assuming file not deleted
2017-03-26 00:22:22.818 - ERROR - [DownloadFilesWorkflowStep] - Unable to complete download. Error was: Unable to delete file on remote server
2017-03-26 00:22:22.818 - DEBUG - [DownloadFilesWorkflowStep] - Stacktrace
io.linuxserver.davos.transfer.ftp.exception.DownloadFailedException: Unable to delete file on remote server
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.deleteRemoteFile(SFTPConnection.java:174) ~[classes!/:?]
        at io.linuxserver.davos.schedule.workflow.DownloadFilesWorkflowStep.runStep(DownloadFilesWorkflowStep.java:53) [classes!/:?]
        at io.linuxserver.davos.schedule.workflow.FilterFilesWorkflowStep.runStep(FilterFilesWorkflowStep.java:69) [classes!/:?]
        at io.linuxserver.davos.schedule.workflow.ConnectWorkflowStep.runStep(ConnectWorkflowStep.java:36) [classes!/:?]
        at io.linuxserver.davos.schedule.workflow.ScheduleWorkflow.start(ScheduleWorkflow.java:44) [classes!/:?]
        at io.linuxserver.davos.schedule.RunnableSchedule.run(RunnableSchedule.java:43) [classes!/:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_121]
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_121]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_121]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_121]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_121]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_121]
        at java.lang.Thread.run(Thread.java:745) [?:1.8.0_121]
Caused by: com.jcraft.jsch.SftpException: Directory is not empty
        at com.jcraft.jsch.ChannelSftp.throwStatusError(ChannelSftp.java:2833) ~[jsch-0.1.50.jar!/:?]
        at com.jcraft.jsch.ChannelSftp.rmdir(ChannelSftp.java:2109) ~[jsch-0.1.50.jar!/:?]
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.deleteRemoteFile(SFTPConnection.java:164) ~[classes!/:?]
        ... 12 more
2017-03-26 00:22:22.821 - INFO - [DownloadFilesWorkflowStep] - Clearing current queue and will still continue to next step
2017-03-26 00:22:22.821 - DEBUG - [SFTPClient] - Disconnecting from channel
2017-03-26 00:22:22.821 - DEBUG - [SFTPClient] - Disconnecting from session
2017-03-26 00:22:22.823 - INFO - [ScheduleWorkflow] - Finished schedule run: Download completed Files from FTPServer
2017-03-26 00:22:22.823 - DEBUG - [RunnableSchedule] - Workflow finished
2017-03-26 00:22:22.823 - DEBUG - [RunnableSchedule] - Saving newly scanned files against schedule

Edit:
I tried to find the issue, but from the code it seems that only single files are supported at this point. I think the API calls behave like "rm" and "rmdir", and we would need something like "rm -r folder".

If I get that right, we would just need to copy-paste the "download" method and change every download command to the corresponding rm command. Unfortunately my programming skills have seen better days (used to be pretty good with Fortran) and I actually have no clue how to use GitHub correctly. It would be very nice if someone could take over from here.

Edit2:
Here is a vague implementation of the fix:

public void delete(FTPFile file, String localFilePath) {

    String path = FileUtils.ensureTrailingSlash(file.getPath()) + file.getName();
    String cleanLocalPath = FileUtils.ensureTrailingSlash(localFilePath);

    try {

        if (file.isDirectory()) {
            // Empty the directory first, then remove the directory itself.
            deleteDirectoryAndContents(file, cleanLocalPath, path);
            channel.rmdir(path);
        } else {
            channel.rm(path);
        }

    } catch (SftpException e) {
        throw new DownloadFailedException("Unable to delete file " + path, e);
    }
}

private void deleteDirectoryAndContents(FTPFile file, String localDownloadFolder, String path) throws SftpException {

    List<FTPFile> subItems = listFiles(path).stream().filter(removeCurrentAndParentDirs()).collect(Collectors.toList());

    String fullLocalDownloadPath = FileUtils.ensureTrailingSlash(localDownloadFolder + file.getName());

    for (FTPFile subItem : subItems) {

        String subItemPath = FileUtils.ensureTrailingSlash(subItem.getPath()) + subItem.getName();

        if (subItem.isDirectory()) {

            // Recurse into the sub-directory, then remove it once it is empty.
            String subLocalFilePath = FileUtils.ensureTrailingSlash(fullLocalDownloadPath);
            deleteDirectoryAndContents(subItem, subLocalFilePath, FileUtils.ensureTrailingSlash(subItemPath));
            channel.rmdir(subItemPath);
        }

        else {

            channel.rm(subItemPath);
        }
    }
}

Since I am getting old, I have no clue how to commit this or how to use the Logger correctly, but in theory you just need to append this to "SFTPConnection.java" and replace "channel.rmdir(path)" with "delete(file, localFilePath)".

Note that you would need a dummy filepath with this old man's solution. If you want to make it pretty you would have to delete all remnants of the lazy copy-pasting.

I hope that kind of helps.

I'm out. Cheers

Doesn't transfer files when there is a space in the folder name

Hi,

Just started using this today, and straight away struggled to download folders from the remote host. After playing with it for a while, thinking I had made a mistake, I then noticed that some folders were being downloaded from another schedule, but only folders that didn't have a space in the name.

After testing this by making folders with just a single text file in them and then adding spaces to the folder names, I can confirm that this is a problem.

EDIT: Turns out that it's not just spaces. For example, I tested with the following folder name and got the same result, with no files from the folder's contents being downloaded: Test.Test[2015].TEST_1

Just to clarify, the folder itself is actually created on the local machine, but none of its contents are transferred to it.

Allow to run on certain times

I've been happily using Davos for several months now. However, when it runs it clogs up my internet connection entirely. I would like to see an option that only allows Davos to run within a certain timeframe, so I could let it download only at night when I'm not using the internet anyway.

Where is searched items list stored?

Does anyone know where the searched items list is stored? I often find myself wanting to remove a single item from the list so it can be redownloaded, but I can't figure it out.

The only workaround I have is to delete tons of things at the source and then clear the list, leaving only the items I want redownloaded plus new items, which is very cumbersome.

Encryption after grabbing the data

Hey Guys,

Mostly the image works perfectly. But it would be a nice extra if the image could automatically encrypt the data after grabbing it (as a built-in function).
AES-256 RAR encryption should mostly be enough.

For example, Duplicati's solution isn't my thing, because their backup encryption only works through them.

Regards. #34

Multithread download

Hey !

Thank you for your amazing job!

It would be great to have multithreaded downloads to increase download speed.

Add more flexible intervals

This tool is exactly what I wanted except I need it to run every minute. Seems like it would not be too complicated to allow the user to override it using a text box.

Thanks!

maybe connecting through proxy?

It would be great if connections through a (SOCKS5) proxy were an option.

I don't like all the people I deal with knowing where I connect from all the time... For now, I'm using FileZilla over Tor / my own VPS Shadowsocks proxy for my SFTP needs, but would really like to use the automation available in davos. Thanks in advance.

better filter support

The filters only seem to work at the root level of the directory, e.g.

with invert selected,
a *.nfo filter will still allow download of somefolder/somefile.nfo

Consider allowing regex?
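As a sketch of what the request could look like, applying an (optionally inverted) regex filter to the full relative path rather than just the file name; this is an illustration of the idea, not existing davos behaviour:

import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

// Illustration of the feature request: filter on the whole relative path with a
// regex, with optional inversion. Not davos code.
public class PathFilterExample {

    public static void main(String[] args) {
        List<String> remotePaths = List.of(
                "somefolder/somefile.nfo",
                "somefolder/movie.mkv",
                "readme.nfo");

        Pattern nfoFilter = Pattern.compile(".*\\.nfo$");
        boolean invert = true; // invert = download only what does NOT match the filter

        List<String> toDownload = remotePaths.stream()
                .filter(path -> invert != nfoFilter.matcher(path).matches())
                .collect(Collectors.toList());

        System.out.println(toDownload); // [somefolder/movie.mkv]
    }
}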

Is there Active mode support?

I'm trying to automatically pull files off an old FTP server that only supports active mode. I got the following error when trying to sync from it. I assume this error is because active mode isn't supported?

2021-02-04 21:15:46.619 - ERROR - [DownloadFilesWorkflowStep] - Unable to complete download. Error was: Unable to list files in directory /Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/Hdd1/System/
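For reference, davos doesn't appear to expose an active/passive toggle in the UI. With Apache Commons Net, a common Java FTP library (whether davos uses it for plain FTP isn't confirmed here), the switch between the two modes looks like this hedged sketch:

import java.io.IOException;

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

// Sketch only: shows how active vs. passive mode is selected with Apache Commons
// Net. Host, credentials and path are placeholders.
public class ActiveModeExample {

    public static void main(String[] args) throws IOException {
        FTPClient ftp = new FTPClient();
        ftp.connect("old-ftp-server.local", 21);
        ftp.login("user", "password");

        ftp.enterLocalActiveMode();      // active mode: the server connects back to the client for data
        // ftp.enterLocalPassiveMode();  // passive mode: the client opens the data connection (the usual default)

        FTPFile[] files = ftp.listFiles("/Hdd1/System/");
        System.out.println("Listed " + files.length + " entries");

        ftp.logout();
        ftp.disconnect();
    }
}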

Unable to download files

Hi,

I've installed davos on my home server; it connects perfectly to the remote server and lists the directories, but when it starts downloading it always throws an exception on a random folder and never actually downloads anything.

I have:

  • Remote server on port 22 (SFTP)
  • home server (UnRaid) with a docker of davos installed
  • WD My book live, mounted on the UnRaid server and passed on to the davos container

Below is a snippet of the debug log; also, all the directories exist on both the remote and local servers.

I've tried to use a different folder and the issue is still there.

2017-10-01 11:06:06.564 - DEBUG - [SFTPConnection] - Creating new local directory /Media/Series/XTV/
2017-10-01 11:06:06.564 - DEBUG - [FileUtils] - Directory was not created!
2017-10-01 11:06:06.565 - DEBUG - [SFTPConnection] - FTPFile[name=season01-poster.jpg,size=50266,path=/home2/nctc25esj1/Media/Series/XTV/,lastModified=2017-10-01T03:09:49.000+04:00,directory=false]
2017-10-01 11:06:06.565 -  INFO - [SFTPConnection] - Downloading /home2/nctc25esj1/Media/Series/XTV/season01-poster.jpg to /Media/Series/XTV/
2017-10-01 11:06:06.565 - DEBUG - [SFTPConnection] - Performing channel.get from /home2/nctc25esj1/Media/Series/XTV/season01-poster.jpg to /Media/Series/XTV/
2017-10-01 11:06:06.565 - DEBUG - [SFTPConnection] - Progress listener has been enabled
2017-10-01 11:06:06.694 - ERROR - [DownloadFilesWorkflowStep] - Unable to complete download. Error was: Unable to download file /home2/nctc25esj1/Media/Series/XTV
2017-10-01 11:06:06.695 - DEBUG - [DownloadFilesWorkflowStep] - Stacktrace
io.linuxserver.davos.transfer.ftp.exception.DownloadFailedException: Unable to download file /home2/nctc25esj1/Media/Series/XTV
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.download(SFTPConnection.java:66) ~[classes!/:?]
        at io.linuxserver.davos.schedule.workflow.transfer.FilesAndFoldersTranferStrategy.transferFile(FilesAndFoldersTranferStrategy.java:28) ~[classes!/:?]
        at io.linuxserver.davos.schedule.workflow.DownloadFilesWorkflowStep.runStep(DownloadFilesWorkflowStep.java:50) [classes!/:?]
        at io.linuxserver.davos.schedule.workflow.FilterFilesWorkflowStep.runStep(FilterFilesWorkflowStep.java:69) [classes!/:?]
        at io.linuxserver.davos.schedule.workflow.ConnectWorkflowStep.runStep(ConnectWorkflowStep.java:36) [classes!/:?]
        at io.linuxserver.davos.schedule.workflow.ScheduleWorkflow.start(ScheduleWorkflow.java:44) [classes!/:?]
        at io.linuxserver.davos.schedule.RunnableSchedule.run(RunnableSchedule.java:43) [classes!/:?]
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_131]
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [?:1.8.0_131]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_131]
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [?:1.8.0_131]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: com.jcraft.jsch.SftpException:
        at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:980) ~[jsch-0.1.50.jar!/:?]
        at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:878) ~[jsch-0.1.50.jar!/:?]
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.doGet(SFTPConnection.java:114) ~[classes!/:?]
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.downloadDirectoryAndContents(SFTPConnection.java:145) ~[classes!/:?]
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.download(SFTPConnection.java:61) ~[classes!/:?]
        ... 13 more
Caused by: java.io.FileNotFoundException: /Media/Series/XTV (No such file or directory)
        at java.io.FileOutputStream.open0(Native Method) ~[?:1.8.0_131]
        at java.io.FileOutputStream.open(FileOutputStream.java:270) ~[?:1.8.0_131]
        at java.io.FileOutputStream.<init>(FileOutputStream.java:213) ~[?:1.8.0_131]
        at java.io.FileOutputStream.<init>(FileOutputStream.java:101) ~[?:1.8.0_131]
        at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:956) ~[jsch-0.1.50.jar!/:?]
        at com.jcraft.jsch.ChannelSftp.get(ChannelSftp.java:878) ~[jsch-0.1.50.jar!/:?]
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.doGet(SFTPConnection.java:114) ~[classes!/:?]
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.downloadDirectoryAndContents(SFTPConnection.java:145) ~[classes!/:?]
        at io.linuxserver.davos.transfer.ftp.connection.SFTPConnection.download(SFTPConnection.java:61) ~[classes!/:?]
        ... 13 more
2017-10-01 11:06:06.700 -  INFO - [DownloadFilesWorkflowStep] - Clearing current queue and will still continue to next step
2017-10-01 11:06:06.701 - DEBUG - [SFTPClient] - Disconnecting from channel
2017-10-01 11:06:06.701 - DEBUG - [SFTPClient] - Disconnecting from session

Algorithm negotiation fail connecting to Ubuntu 20.04 server

When I attempt to connect to my server via SFTP by adding it as a host, it comes back with the following error:

There was an error: Algorithm negotiation fail. My address for connection is 192.168.1.161 and port 22. I can connect to the server using FileZilla with no issues using the SFTP method and credentials. Below is the log capture, if this helps any.

2021-06-09 22:12:35.474 - INFO - [LoggingManager] - Logging level now set at DEBUG
2021-06-09 22:12:42.263 - DEBUG - [SettingsServiceImpl] - Calling out to GitHub to check for new version (https://raw.githubusercontent.com/linuxserver/davos/LatestRelease/version.txt)
2021-06-09 22:12:42.291 - DEBUG - [SettingsServiceImpl] - GitHub responded with a 200, and body of 2.2.1
2021-06-09 22:12:42.291 - DEBUG - [VersionChecker] - Current version: 2.2.1, Remote version: 2.2.1
2021-06-09 22:12:42.291 - DEBUG - [VersionChecker] - Remote version is not newer
2021-06-09 22:12:58.323 - INFO - [HostServiceImpl] - Attempting to test connection to host
2021-06-09 22:12:58.324 - DEBUG - [HostServiceImpl] - Credentials: MYUSERNAME : MYPASSWORD
2021-06-09 22:12:58.324 - DEBUG - [HostServiceImpl] - Making connection on port 22
2021-06-09 22:12:58.324 - DEBUG - [SFTPClient] - Configuring connection credentials and options on session
2021-06-09 22:12:58.324 - DEBUG - [SFTPClient] - Username:
2021-06-09 22:12:58.356 - ERROR - [APIController] - Failed to connect to host
2021-06-09 22:12:58.356 - DEBUG - [APIController] - Exception: io.linuxserver.davos.transfer.ftp.exception.ClientConnectionException: Unable to connect to host 192.168.1.161 on port 22 at io.linuxserver.davos.transfer.ftp.client.SFTPClient.connect(SFTPClient.java:45) ~[classes!/:?] at io.linuxserver.davos.delegation.services.HostServiceImpl.testConnection(HostServiceImpl.java:98) ~[classes!/:?] at io.linuxserver.davos.web.controller.APIController.testConnection(APIController.java:193) [classes!/:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_275] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_275] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_275] at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_275] at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:220) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:134) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:116) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at 
javax.servlet.http.HttpServlet.service(HttpServlet.java:648) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846) [spring-webmvc-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) [tomcat-embed-websocket-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:89) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:77) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:197) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.3.4.RELEASE.jar!/:4.3.4.RELEASE] at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at 
org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1410) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_275] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_275] at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.5.6.jar!/:8.5.6] at java.lang.Thread.run(Thread.java:748) [?:1.8.0_275] Caused by: com.jcraft.jsch.JSchException: Algorithm negotiation fail at com.jcraft.jsch.Session.receive_kexinit(Session.java:582) ~[jsch-0.1.50.jar!/:?] at com.jcraft.jsch.Session.connect(Session.java:320) ~[jsch-0.1.50.jar!/:?] at com.jcraft.jsch.Session.connect(Session.java:183) ~[jsch-0.1.50.jar!/:?] at io.linuxserver.davos.transfer.ftp.client.SFTPClient.configureSessionAndConnect(SFTPClient.java:86) ~[classes!/:?] at io.linuxserver.davos.transfer.ftp.client.SFTPClient.connect(SFTPClient.java:41) ~[classes!/:?] ... 56 more
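One plausible cause: the bundled JSch 0.1.50 (visible in the stack trace) is from 2013 and lacks several key-exchange and host-key algorithms that the OpenSSH server on Ubuntu 20.04 enables by default, so the two sides may have no algorithm in common. Upgrading the JSch dependency would be the more likely real fix, but as a hedged debugging aid, JSch's own logger can be enabled to print exactly which algorithms each side proposes:

import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Logger;

// Debugging sketch, not davos code: JSch's logger prints the kex, cipher and
// host-key proposals from both client and server during connect(), which shows
// which list has no overlap when "Algorithm negotiation fail" is thrown.
public class JschNegotiationLogging {

    public static void main(String[] args) {
        JSch.setLogger(new Logger() {

            @Override
            public boolean isEnabled(int level) {
                return true; // log everything, including the algorithm proposals
            }

            @Override
            public void log(int level, String message) {
                System.out.println("[jsch:" + level + "] " + message);
            }
        });
        // ...then create a Session and connect() as usual; the proposals appear in the output.
    }
}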

Add "Run Now" button

It would be nice if each schedule had a "Run Now" button so in addition to the scheduled run, it could be run right then if need be.

Davos freezes -- Most common cause?

Just wondering if you happen to know what may cause Davos to freeze. I'm running it just fine on a few machines, but it freezes on my old MacBookPro -- I haven't limited its resource usage, and Docker Desktop on Mac allows full use of my Mac's resources. Any insight would help. Thanks!

Updated my docker container today, now getting an error

I just updated my docker container of davos today. I am now getting this whenever I hit the URL:

[screenshot]

I went to check my logs and I am getting:

SEVERE: Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoClassDefFoundError: Could not initialize class sun.security.ssl.SSLSessionImpl] with root cause

Nothing else has changed. Any ideas?

Show Last Run Time

Show last run time next to "Files on the host during the last scan" button.

Not downloading new files in subdirectory

Davos doesn't seem to be seeing new files which are added to a subdirectory. For example, I have my root dir that it connects to, and under that there is a folder where new content is added. The first time it ran it downloaded everything in this subdirectory, but now it doesn't see the new files which have been added since.

Implement StrictHostKeyChecking for SFTP

When I first started writing davos (it was originally a redo of auto-ftp), I envisaged it as a pet project to help hone my skills in various libraries and frameworks. I liked Spring; Java is my language of choice, and we toyed with file transfers a fair bit at work. I wanted to see if I could incorporate these into a single app.

However, I cut some corners during the initial coding and hard coded the StrictHostKeyChecking to "no", just so I could minimise the amount of effort required to get SFTP up and running.

Since davos appears to have kicked off a little bit, I should really take this as a priority to change as it leaves users vulnerable to Man-in-the-middle attacks.

My proposed solution is to make use of an app-specific known_hosts file in /config and save the host keys there. Verification will take place on the Edit Host screen and new hosts will require this verification to take place by the user (via a confirm box).

I am trying to work out the best way for existing users to do this so if anyone has any ideas, I'm all ears.
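As a rough sketch of the proposal (the JSch calls are real API, but the /config path and overall flow are the proposal above, not shipped davos code), once a host's key has been verified and written to the app-specific known_hosts file, connections could be made strictly like this:

import com.jcraft.jsch.HostKey;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.JSchException;
import com.jcraft.jsch.Session;

// Sketch of the proposed behaviour: use an app-specific known_hosts file under
// /config and enable strict host key checking. With "yes", connections to hosts
// whose keys are not already in that file are rejected, so the Edit Host
// verification step described above would add the key first.
public class StrictHostKeyExample {

    public static void main(String[] args) throws JSchException {
        JSch jsch = new JSch();
        jsch.setKnownHosts("/config/known_hosts");         // proposed app-specific known_hosts

        Session session = jsch.getSession("user", "example.host", 22);
        session.setConfig("StrictHostKeyChecking", "yes"); // reject unknown or changed host keys
        session.connect();

        // The stored keys (host, type, fingerprint) are what the confirm box could show the user.
        for (HostKey key : jsch.getHostKeyRepository().getHostKey()) {
            System.out.println(key.getHost() + " " + key.getType() + " " + key.getFingerPrint(jsch));
        }

        session.disconnect();
    }
}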
