Below are the manual tasks I perform frequently on all the self-managed Ubuntu servers I own (alongside the automated backup and update tasks).
Here is a post on automatically backing up one server to another via crontab entries. I have a number of Ubuntu servers on Vultr and Digital Ocean (cloud servers from as low as $2.50 a month).
I also perform manual backups to ensure files are backed up.
Manually Backing Up Files
First I create the following folder structure on my macOS desktop, one per server I need to back up (a one-liner to create these follows the list).
My Server 01
- ~/Desktop/www.myserver01.com/
- ~/Desktop/www.myserver01.com/www
- ~/Desktop/www.myserver01.com/db
- ~/Desktop/www.myserver01.com/nginx
- ~/Desktop/www.myserver01.com/misc
My Server 02
- ~/Desktop/www.myserver02.com/
- ~/Desktop/www.myserver02.com/www
- ~/Desktop/www.myserver02.com/db
- ~/Desktop/www.myserver02.com/nginx
- ~/Desktop/www.myserver02.com/misc
etc. (repeat for each server)
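Rather than clicking the folders out one at a time, the same structure can be created from the macOS Terminal in one line per server (a minimal sketch; substitute your own server names):

mkdir -p ~/Desktop/www.myserver01.com/{www,db,nginx,misc}
mkdir -p ~/Desktop/www.myserver02.com/{www,db,nginx,misc}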
I then use Forklift 3 (not a paid endorsement, I just love it) to manually back up files from the server. I copy files that are reachable via the existing SFTP connections in Forklift (e.g. WWW, MySQL, NGINX, MongoDB, etc.).
I simply drag and drop the important files in Forklift (from the remote SFTP connection to a local folder).
SFTP copy progress can be viewed in Forklift.
FYI: SFTP is not the fastest transfer protocol. Only 39 MB (4,055 items) was downloaded in 8 hours over ADSL, from a server over 400 ms away. High-latency servers do not play well with SFTP:
ping myserver01.com
PING myserver01 (ip_removed): 56 data bytes
64 bytes from 45.x.x.x: icmp_seq=0 ttl=53 time=450.338 ms
64 bytes from 45.x.x.x: icmp_seq=1 ttl=52 time=423.412 ms
64 bytes from 45.x.x.x: icmp_seq=2 ttl=52 time=458.129 ms
64 bytes from 45.x.x.x: icmp_seq=3 ttl=53 time=462.419 ms
TIP: Consider zipping smaller files first (into one larger archive), or use rsync instead.
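For comparison, a rough rsync equivalent of the drag-and-drop copy might look like the below (a sketch only; the username, hostname and paths are placeholders for your own setup). rsync compresses in transit and only re-copies changed files on later runs, which helps a lot on high-latency links:

rsync -avz --progress username@myserver01.com:/www/ ~/Desktop/www.myserver01.com/www/
rsync -avz --progress username@myserver01.com:/etc/nginx/ ~/Desktop/www.myserver01.com/nginx/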
Pre-Zipping Files Before Backing Up
I used this command to pre-compress entire folders before downloading them over SFTP.
cd / && zip -r -9 /www.zip /www && zip -r -9 /nginx.zip /etc/nginx/ && ls -al
I created a bash script on each server to prepare the files to back up.
#!/bin/bash
# Remove any previous archives, then re-zip the WWW and NGINX folders
sudo rm -f /www.zip
sudo rm -f /nginx.zip
sudo zip -r -9 /www.zip /www
sudo zip -r -9 /nginx.zip /etc/nginx/
Don’t forget to make the script file executable
chmod +x _ManualBackup.sh
Now I call the script to compress files before backup
sudo bash _ManualBackup.sh
TIP: Add other things to back up to the script (e.g. MongoDB and MySQL dumps).
Below is my script to back up WWW, NGINX, MySQL and MongoDB and zip them before copying over SFTP.
#!/bin/bash

# Backup WWW Task
echo "Backing up WWW"
sudo rm -f /www.zip
sudo zip -r -9 /www.zip /www
echo "Finished Backing up WWW"

# Backup NGINX Task
echo "Backing up NGINX"
sudo rm -f /nginx.zip
sudo zip -r -9 /nginx.zip /etc/nginx/
echo "Finished backing up NGINX"

# Backup MySQL Task
echo "Backing up MySQL Databases"
sudo rm -f /*.sql
USER="***********"
PASSWORD="*********************************"
databases=$(mysql -u $USER -p$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)
for db in $databases; do
    # Skip the MySQL system databases and anything starting with an underscore
    if [[ "$db" != "information_schema" ]] && [[ "$db" != "performance_schema" ]] && [[ "$db" != "mysql" ]] && [[ "$db" != _* ]]; then
        echo "Dumping database: $db"
        mysqldump -u $USER -p$PASSWORD --databases $db > /$db.sql
    fi
done
sudo zip -r -9 /dbs.zip /*.sql
sudo rm -f /*.sql
echo "Finished Backing up MySQL Databases"

# Backup MongoDB Task
echo "Backing up MongoDB"
sudo rm -f /mongodb.zip
sudo zip -r -9 /mongodb.zip /mongodb*
echo "Finished Backing up MongoDB"

# Done
ls -al /*.zip
echo "Done"
Thanks to this thread for help with exporting each MySQL database to a separate file.
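If you later want this script to run on a schedule rather than on demand (see the crontab post linked above), a crontab entry along these lines would do it (a sketch; it assumes the script was saved as /root/_ManualBackup.sh and is added to root's crontab):

# Prepare the backup archives every day at 3am
0 3 * * * /bin/bash /root/_ManualBackup.sh > /var/log/manual_backup.log 2>&1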
Note: The warning “Using a password on the command line interface can be insecure.” will be shown when exporting a database from the CLI.
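One common way to avoid that warning (and to keep the password out of the script) is a MySQL option file readable only by root, so the -u/-p arguments can be dropped. A minimal sketch, assuming the credentials live in a hypothetical /root/.my.cnf:

# /root/.my.cnf (chmod 600) - hypothetical credentials file
[client]
user=backup_user
password=your_password_here

mysql and mysqldump read this file automatically when run as root, so the SHOW DATABASES and mysqldump calls in the script no longer need the password on the command line.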
Now I can manually download the four output files from each server.
cd /
ls -al
-rw-r--r-- 1 username username   1615478 Dec 30 13:47 /dbs.zip
-rw-r--r-- 1 username username    753094 Dec 30 13:48 /mongodb.zip
-rw-r--r-- 1 username username     42222 Dec 30 13:47 /nginx.zip
-rw-r--r-- 1 username username 239327652 Dec 30 13:47 /www.zip
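The zips can be dragged down over SFTP in Forklift as before, or pulled from the Terminal with scp into the matching local folders (a sketch; the username and hostname are placeholders):

scp username@myserver01.com:/www.zip ~/Desktop/www.myserver01.com/www/
scp username@myserver01.com:/dbs.zip ~/Desktop/www.myserver01.com/db/
scp username@myserver01.com:/nginx.zip ~/Desktop/www.myserver01.com/nginx/
scp username@myserver01.com:/mongodb.zip ~/Desktop/www.myserver01.com/misc/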
Snapshot Tasks
Taking snapshots in Vultr is a great way to back up too.
Manually Updating the Server
I also connect to the server via SSH and check for package updates
/usr/lib/update-notifier/apt-check --human-readable
9 packages can be updated.
3 updates are security updates.
Now I can manually update the packages.
sudo apt-get update && sudo apt-get upgrade
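Before rebooting, Ubuntu will tell you whether the upgrade actually needs one; it drops a flag file when a restart is required (a quick check to run on the server):

[ -f /var/run/reboot-required ] && cat /var/run/reboot-required || echo "No reboot required"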
A quick reboot ensures the updated packages (and any new kernel) are actually in use.
sudo shutdown -r now
More to come (viewing logs for errors, firewall activity and stats, backing up PHP, etc.).
Hope this helps someone.
Revision History
v1.0 Initial Version