Setup Autobackup on Linux

Introduction

Autobackup is an automated backup system that pulls data from remote servers and stores it on the backup server. In our case, we can use an Rcs Storage Instance with a large amount of disk space and back up all our Compute Instances to it in order to prevent data loss.

Requirements

  • rsync

Installation

You can install Autobackup easily on your system using git. Go ahead and clone the repository:

mkdir -p /opt/
git clone https://github.com/fbrandstetter/Autobackup.git /opt/autobackup/
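
To confirm the clone succeeded, you can list the repository contents; among other files, you should see the ones referenced later in this guide:

ls /opt/autobackup/
# should include backup.sh, serverlist.template, default-excludes.template and empty, among others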

Configuration

Before we can start backing up data from our Compute Instances, we need to gain access to them. For that, we'll create an SSH key on our Storage Instance and grant it access to all Compute Instances. We'll start by creating the key:

ssh-keygen

Next, we have to copy our public key to the Compute Instances. Print the contents of ~/.ssh/id_rsa.pub and append them to ~/.ssh/authorized_keys on each Compute Instance:

cat ~/.ssh/id_rsa.pub
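
Alternatively, if ssh-copy-id is available on your Storage Instance and password authentication is still enabled on the Compute Instance, it can append the key for you:

ssh-copy-id root@COMPUTE_INSTANCE_1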

If you don't already use public keys to access your Compute Instances, you may have to enable the authorized keys file in the SSH server configuration first. Open /etc/ssh/sshd_config on the Compute Instances and uncomment the following line:

AuthorizedKeysFile %h/.ssh/authorized_keys
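
After editing /etc/ssh/sshd_config, reload the SSH service so the change takes effect. Depending on the distribution, the service may be named sshd or ssh:

systemctl reload sshd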

Connection

Once you have added the Storage Instance's SSH key on all Compute Instances, try connecting to one of them (to avoid issues later, make sure the connection to all servers works):

ssh root@COMPUTE_INSTANCE_1

You should be able to log in without being prompted for a password or anything else.
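
A short loop like the one below checks all instances in one go; replace the placeholder hostnames with your own Compute Instances. BatchMode makes ssh fail instead of prompting for a password, so problems surface immediately:

for host in COMPUTE_INSTANCE_1 COMPUTE_INSTANCE_2; do
    ssh -o BatchMode=yes root@"$host" true && echo "$host: OK" || echo "$host: FAILED"
done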

Configure Autobackup

Autobackup also requires some configuration to function properly. Open /opt/autobackup/backup.sh; all configuration is stored in the bash script itself. Take a look at the following lines and adapt them to fit your needs (an example follows the list below):

BACKUPDIR=""
PASSWORD=""
FREEUPSPACE=""
MAXUSED=""
  • BACKUPDIR: The folder in which all backups will be stored.
  • PASSWORD: The password used to encrypt the backups.
  • FREEUPSPACE: Defines whether the script should delete old backups when the disk is full.
  • MAXUSED: Defines the disk usage threshold at which the script stops backing up or starts deleting old backups.
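
As an example, a configuration could look like the following. The exact values accepted by FREEUPSPACE and MAXUSED depend on the script itself, so treat these as placeholders and adapt them to what backup.sh expects:

BACKUPDIR="/backups"                  # folder where all backups are stored
PASSWORD="a-long-random-passphrase"   # used to encrypt the backups, keep it safe
FREEUPSPACE="yes"                     # placeholder: whether old backups may be deleted when space runs low
MAXUSED="90"                          # placeholder: disk usage threshold before cleanup or stop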

Add Server

All servers to back up are stored in the /opt/autobackup/serverlist.template file using the following format:

<SERVER_HOSTNAME OR IP>|<USERNAME FOR AUTHENTICATION>|<EXCLUDE LIST>
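
For example, two entries could look like this; the hostnames are placeholders, webserver is a server-specific exclude list (described below), and empty is the exclude list shipped with the repository:

web1.example.com|root|webserver
db1.example.com|root|empty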

Global Excludes

By default, Autobackup backs up the entire server, meaning it tries to download / recursively. Because not everyone needs the entire system backed up, you can add global excludes (which apply to every server) as well as server-specific excludes, which apply only to individual servers. All global excludes are stored in /opt/autobackup/default-excludes.template. The file is prefilled with /proc and /dev; you can add further folders and file extensions by simply adding new lines:

/proc
/dev
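
For example, you might also exclude other virtual filesystems or temporary data; the entries below are only suggestions and not part of the shipped template:

/sys
/run
/tmp
*.log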

Server-specific Excludes

Because most people run different types of servers (e.g. web servers and database servers), each server can have its own exclude list. The format of the server-specific exclude files is the same as the global one. Create a new file and name it after the EXCLUDE_LIST you set for the server in the server list. If you don't want an exclude list for a particular server, set it to empty in the server list. The file called empty already comes with the repository clone; it is empty so that nothing extra is excluded, while the default excludes still take effect.
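
For example, to create the webserver exclude list referenced in the server list example above (assuming the exclude files live next to the script in /opt/autobackup/, like the empty file does), you could exclude some cache directories:

cat > /opt/autobackup/webserver << 'EOF'
/var/cache
/var/tmp
EOF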

Restore Data

In an ideal world, we never have to restore our encrypted backups. However, if we run into issues and need to retrieve our backed-up data, it's quite easy to restore. You can restore any backup file using the following commands; openssl will prompt for the PASSWORD configured in backup.sh:

openssl aes-256-cbc -d -salt -in BACKUP.tar.aes -out BACKUP.restored.tar
mkdir backup/
tar -xvf BACKUP.restored.tar -C backup/

Replace BACKUP.tar.aes with the filename of the backup you want to restore; BACKUP.restored.tar will be the filename of the decrypted archive. The example above already includes the next steps, which are:

  • Create a new folder
  • Extract the decrypted archive into that folder
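
Before extracting, you can also list the contents of the decrypted archive to make sure the decryption worked:

tar -tvf BACKUP.restored.tar | head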

Conclusion

Autobackup is a fully automated and quite smart backup script that handles backups for us. A huge plus is that the data is encrypted with a password that can be nearly unlimited in length. As long as you keep your password secure and make it long enough, nobody will be able to access your data in any reasonable amount of time. Happy Hacking!
