Backups: the most effective methods for various activities


Backups have been a hot topic lately, and SIM-Networks is happy to contribute to the discussion.

No wonder: given how actively malware evolves, and how often it stays a step ahead of antivirus software, it is only logical to build your IT infrastructure around data backups. Instead of spending all your resources on preventing attacks and fighting viruses, you can save time and money by restoring the system and up-to-date data from backups.

Besides, up-to-date backups can soften the impact of force majeure or human error and neutralize the consequences of a hardware failure, whatever its cause. As you may know, one of the golden rules for any system administrator is: whenever you set up a new server, start with backups!

You can create backups on your own; there are plenty of tools that you can easily find on Google. However, unless you are a seasoned professional system administrator, we recommend entrusting the task to someone who has the necessary knowledge and can set up backups while taking full responsibility for the outcome.

Make sure that copies of critical information are created on a regular basis and stored remotely, as far from the originals as possible.

Regular backups are essential because the information you restore has to be as up to date as possible. For example, if your system is hit by a virus and the only way to recover your precious data is to restore it from a backup, you might be upset to discover that the most recent copy of your accounting reports dates back to the previous month.

The reason it’s important to store your backups remotely is simple: if, say, your backup storage shares a server with the main system and that server dies – you lose literally everything. For good.

To sum up, make sure you have a backup schedule and store your backups remotely.
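Both rules can be sketched in a few lines of code. The helper below (names and paths are our own illustration, not any particular backup tool) creates a timestamped archive of a data directory and copies it to a second location standing in for remote storage; in production that copy would travel over the network instead.

```python
# Minimal sketch of a scheduled, off-site backup step: archive the data
# directory with a timestamped name and copy the archive to a second
# location that stands in for remote storage. All paths are examples.
import shutil
import tarfile
import time
from pathlib import Path

def backup_and_ship(source_dir: str, local_dir: str, remote_dir: str) -> Path:
    """Create a timestamped .tar.gz of source_dir and copy it off-site."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    Path(local_dir).mkdir(parents=True, exist_ok=True)
    Path(remote_dir).mkdir(parents=True, exist_ok=True)
    archive = Path(local_dir) / f"backup-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source_dir, arcname=Path(source_dir).name)
    # In a real setup this copy would go over the network (rsync, S3, etc.).
    return Path(shutil.copy2(archive, remote_dir))
```

Run from cron (or any scheduler), this gives you both a regular cadence and an off-site copy that survives the loss of the original server.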

Main criteria for choosing backup software

If you do want to take the risk and set up backups on your own, experts recommend choosing backup software based on the following four general criteria:

  • efficient resource consumption: the backup software has to operate as autonomously as possible (that is, be as automated as possible and not take up too much of your time), with the lowest possible system load, and run as fast as possible;
  • recovery rate: the software should restore your data from a backup as quickly as possible so that your business processes are not affected; ideally, it should enable direct work with the data backups;
  • data protection and security: the backup software has to provide a high level of protection, using both cryptographic and hardware tools (protection of data transmission channels to the storage system, data protection during the backup operation, the ability to resume an interrupted session);
  • flexibility: the backup software has to be suited for all types of data (since you cannot predict which of them you might deem critical and choose to back up to another storage) and offer a choice of backup methods while remaining fully functional under any of them.

It is important to note that the modern software used by professional admins always meets these requirements. Moreover, qualified professionals with hands-on experience in setting up backups can choose the best fit for any case. That’s why we strongly recommend seeking expert help for such activities, so you don’t lose the backups you need later on simply because they were overwritten with the wrong data (unfortunately, this happens when the wrong backup method or the wrong storage volume is chosen).

So, let us move on to the backup methods – as mentioned, there are several ways to do backups. They differ in how the information is copied and compressed.

Full backup

The name is self-explanatory: each time a backup activity is scheduled, a full copy of the whole system is created – or, more precisely, a full copy of all the data you selected when creating the backup task. To reduce the final size of the backup, all data are compressed into an archive. So, with full backups, your storage receives regular updates in which most data are duplicated (since most data do not change over time). This process consumes a lot of resources (see point 1 in the list of criteria above): storage space, backup creation and processing time, computing power, and traffic for transferring archives to a remote storage system. Even though the full backup method used to be very popular thanks to its high reliability, today it is considered inefficient. For example, if you need shallow retention (less than two weeks) or frequent backups (once every 24 hours, or once every couple of hours), a full backup is overkill.


The deduplication mechanism makes things somewhat more efficient by detecting and removing duplicate data in full copies. It can be set up by special software at the storage-system, server, or client level. Some statistics show that deduplication results are quite impressive – savings from 90% to 98%.
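The core idea of deduplication fits in a short sketch (block size and function names are our own illustration): data is split into fixed-size blocks, each unique block is stored once under its hash, and each file keeps only a list of hashes from which it can be reassembled.

```python
# Sketch of block-level deduplication: split data into fixed-size blocks,
# store each unique block once (keyed by its SHA-256 digest), and keep a
# per-file list of digests so the original can be reassembled.
import hashlib

BLOCK_SIZE = 4096  # illustrative block size

def dedup_store(data: bytes, store: dict) -> list:
    """Store the unique blocks of `data` in `store`; return the recipe."""
    recipe = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicate blocks stored only once
        recipe.append(digest)
    return recipe

def dedup_restore(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its block recipe."""
    return b"".join(store[d] for d in recipe)
```

If three full copies contain mostly identical blocks, the store grows only by the blocks that actually differ – which is exactly where the 90–98% figures come from.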

The only notable advantage of a full backup is the recovery rate: data are restored from a single archive much faster than from an incremental or a differential backup.

Nowadays, the full backup method is typically used in combination with other, less resource-hungry methods. Sometimes this approach is referred to as a mixed, or synthetic, backup.
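The synthetic idea can be sketched in a few lines: an older full copy and the increments taken since are merged into a new full copy without touching the production system. Backups here are modeled as simple filename-to-content mappings, purely for illustration.

```python
# Toy sketch of a 'synthetic' full backup: merge the last full copy with
# the incremental backups taken since, producing a new full copy without
# reading the live system again. Each backup is a filename->content dict.
def synthetic_full(full: dict, increments: list) -> dict:
    """Apply each increment (in chronological order) on top of the full copy."""
    merged = dict(full)
    for inc in increments:
        merged.update(inc)  # later versions of a file win
    return merged
```

For example, merging a full copy with two later increments yields a fresh full copy that can seed the next backup cycle.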

Incremental backup

This one is much faster and more cost-effective than a full backup, since it only backs up the files that have changed since the previous backup was created. The mechanism of incremental backup is simple: the time when a full backup was created is chosen as the initial backup point X0 (say, Monday midnight); a backup of the files that have changed or appeared since X0 is created at point X1 (say, Tuesday midnight); a backup of the files altered or created since X1 is made at point X2 (Wednesday midnight); and so on, until at point Xn the cycle closes and the next full backup is created.

Incremental Backup Infographics

This method is much more economical than the others in terms of resources, storage usage, time, and data transmission traffic. However, if you need to recover data from a backup, the recovery proceeds in phases through points Xn-1 … X2, X1, X0 – all the way back to the last full backup inclusively – which means it can take a lot of time.
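The selection step of an incremental backup can be sketched as a modification-time scan, roughly what tools like `tar --listed-incremental` or `find -newer` do under the hood (the function name is our own):

```python
# Sketch of incremental file selection: pick only the files modified or
# created after the previous backup point, by comparing mtimes.
import os

def changed_since(root: str, last_backup_time: float) -> list:
    """Paths under `root` modified or created after `last_backup_time`."""
    changed = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_backup_time:
                changed.append(path)
    return sorted(changed)
```

Only the returned files go into the increment; the flip side, as noted above, is that a restore must replay the full copy plus every increment in order.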

Differential backup

It outruns incremental backups in terms of data recovery speed – recovery takes far less time, since only the full copy X0 and the latest differential copy are needed and, hence, there is no phased recovery. However, in terms of the space required in the storage system, differential backups are comparable to full backups, which means they do not help save storage space or traffic.

Differential Backup Infographics

A differential backup means that copies are created on an accrual basis: every file modified since the full backup is copied again at each subsequent backup point. The sequence looks as follows: X0, X1, X12, X123, …, X123…n.

In other words, differential backups are bulky, and it is hard to predict how much space they will take up in the storage system.

The difference between an incremental and a differential backup is self-explanatory. In fact, it can be described with one word. Just compare the following two statements:

  • an incremental backup processes the files that have been created or modified since the time the previous backup was made;
  • a differential backup processes the files that have been created or modified since the time the previous full backup was made.
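That one-word difference can be shown in code: both methods select files by a cutoff time, but incremental measures from the previous backup, while differential always measures from the previous full backup (function names and the mtimes mapping are our own illustration):

```python
# The one-word difference, in code: both methods filter files by a cutoff
# mtime; only the reference point differs.
def select_incremental(mtimes: dict, prev_backup: float) -> set:
    """Files changed since the PREVIOUS backup (full or incremental)."""
    return {f for f, t in mtimes.items() if t > prev_backup}

def select_differential(mtimes: dict, prev_full: float) -> set:
    """Files changed since the previous FULL backup."""
    return {f for f, t in mtimes.items() if t > prev_full}
```

With a full backup at t=0 and a later backup at t=10, a file changed at t=5 is skipped by the next incremental run but picked up again by every differential run.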

A delta backup (delta block-level, or delta-style, backup) is one of the forms of a differential backup. This method presupposes that only the changes made to files are backed up, rather than rewriting the modified data completely. In other words, a file is backed up partially, not in full. However, the delta block-level method only applies to modified files, not newly created ones – which is why new files are copied in full.

The advantages of this method are high creation speed, excellent space savings, and a small amount of redundant data (compared to incremental and differential backups). You might think everyone should be using the delta method; in practice they don’t, because both creating such backups and restoring the data require special software. Besides, restoring data from a delta backup takes a long time: the information is reassembled from a jigsaw of modified pieces. Still, this method works great when you need continuous data protection (a file is backed up immediately after it is created or modified – something vaguely resembling the Word autosave feature) or when bandwidth is low and you need to save backups to a remote data storage.
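A block-level delta can be sketched as follows (a toy model with a tiny block size; for simplicity it assumes both versions of the file have the same length, which real tools do not require):

```python
# Sketch of a delta (block-level) backup: compare the new file content
# against the previous version block by block and keep only the blocks
# that changed. Restoring rebuilds the new version from old + delta.
BLOCK = 4  # tiny illustrative block size

def delta_backup(old: bytes, new: bytes, block: int = BLOCK) -> dict:
    """Map block index -> new content for every block that differs."""
    delta = {}
    for i in range(0, len(new), block):
        if new[i:i + block] != old[i:i + block]:
            delta[i // block] = new[i:i + block]
    return delta

def delta_restore(old: bytes, delta: dict, block: int = BLOCK) -> bytes:
    """Rebuild the new version by patching the old one (same length assumed)."""
    out = bytearray(old)
    for idx, data in delta.items():
        out[idx * block:(idx + 1) * block] = data
    return bytes(out)
```

Only the changed blocks travel to storage, which is why the method shines on low-bandwidth links; the restore side, conversely, has to stitch the pieces back together.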

The binary patching method introduced by software developers works in a similar fashion, since parts of modified files are backed up; however, a different unit of comparison is used (blocks for the delta method, bits of information for this one).

Still, one should keep in mind that both of these methods are used in conjunction with a differential or an incremental backup, not on their own.

Sometimes mirroring is referred to as a backup option – it is used, say, in RAID 1 at the hardware level or to create mirror websites. In fact, a mirror is just a copy of the files: the modifications accumulated over a period of time are not archived or systematized.

Over the last 12–15 years, backup technologies have undergone critical changes that made people re-evaluate the effectiveness of older approaches and come up with new ones. For example, snapshot technology, which lets you take instant ‘images’ of the file system and then use them to build a backup, has enabled fast and easy cloud backups without stopping the virtual machines. Besides, when used in the cloud, snapshots help save storage resources, since they take up no space on the client’s disk.
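The trick behind instant snapshots is copy-on-write: taking a snapshot records nothing up front, and old block contents are preserved only when they are later overwritten. A toy model (our own names, not any real storage API):

```python
# Toy copy-on-write snapshot, the idea behind filesystem and VM snapshots:
# creating a snapshot is instant; old blocks are saved only when the live
# volume later overwrites them.
class CowVolume:
    def __init__(self, blocks: dict):
        self.blocks = blocks   # live data: block id -> content
        self.snapshots = []    # each snapshot: preserved pre-write blocks

    def snapshot(self) -> int:
        """Instant: just opens an empty snapshot record."""
        self.snapshots.append({})
        return len(self.snapshots) - 1

    def write(self, block_id, content):
        """Preserve the old content in every open snapshot, then overwrite."""
        for snap in self.snapshots:
            snap.setdefault(block_id, self.blocks.get(block_id))
        self.blocks[block_id] = content

    def read_snapshot(self, snap_id: int) -> dict:
        """The volume as it looked when the snapshot was taken."""
        view = dict(self.blocks)
        view.update(self.snapshots[snap_id])
        # Blocks that did not exist at snapshot time were preserved as None.
        return {k: v for k, v in view.items() if v is not None}
```

Because the snapshot stores only blocks that changed afterwards, it consumes next to no space at creation time – which is what makes snapshot-based cloud backups so cheap and fast.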

To sum up: when setting up Backup-as-a-Service in our SIM-Cloud, we analyzed a number of approaches to backup creation in terms of their efficiency, opted for the incremental backup, and optimized it so that our average RTO (recovery time objective) varies between 15 and 30 minutes, depending on the amount of data. It is safe to say that our cloud BaaS meets all the above-mentioned criteria of a top-grade backup.

Those advocating the classic ‘hardware’ approach can take advantage of our remote storage rental option for backups – it’s reliable, it’s secure, it’s cutting-edge. In the meantime, our seasoned support team will help you to set up the best possible backup mode for your system.

Author: Alisa Kandeeva
