Backup has been a very popular topic in data security for the past several years. SIM-Networks, too, recognizes the critical value of backups.
A popular topic inevitably becomes mythologized. We humans tend to fill the gaps in our knowledge with false facts and subjective assessments. This has happened with backups and the question of how hosting providers should organize them. Should a hosting provider set up backups for its customers by default? You can find the answer in our article Backups and hosters: some myths, some truth…
It's no wonder that backup is such a popular topic. Malware evolves actively, often faster than antivirus software progresses, so it's only logical to base your information security on systematic data backups. Instead of spending all your resources on preventing attacks and fighting viruses, you can save time and money by restoring the system and the data from backups.
Besides, recent backups can neutralize the impact of force majeure, human error, or hardware failure. As you may know, one of the rules for any system administrator is: whenever you set up a new server, start with backups!
You can create backups on your own; a quick Google search turns up plenty of tools for the job. But if you are not an experienced system administrator, we recommend asking for help from experts who have the necessary knowledge and can set up backups while taking full responsibility for the outcome.
Make sure that the copies of critical information are created regularly and stored remotely, as far from the original ones as possible.
Regular backups are essential because the restored information has to be as up-to-date as possible. For example, if your system was hit by a virus and the only way to get your critical data back is to recover it from a backup, you might be upset to discover that the most recent copy of your accounting reports dates back to the previous month.
The reason it's essential to store your data remotely is simple: if your backup storage shares a server with the main system and that server dies, you lose everything, literally for good.
So, make sure you have a backup schedule and store your backups remotely.
If you do want to take the risk and set up backups for your data on your own, experts recommend choosing backup software based on the following four essential characteristics:
It is important to note that nowadays, all backup software used by professional admins complies with the requirements listed above. Moreover, top-notch professionals with hands-on backup experience can choose the perfect option for any case. That's why we strongly recommend seeking help from such professionals. Surely that is better than losing essential data simply because it was overwritten with the wrong data (which, unfortunately, happens when backups are set up incorrectly).
So, let's review the various backup methods. They differ in how information is copied and compressed.
The name is self-explanatory: each time a backup activity is scheduled, a full copy of the whole system is created, or, to be more specific, a full copy of all the data you selected when creating the backup task. All data are compressed into an archive to minimize the final size of the backup copy. So, with full backups you keep receiving updates of a dataset in which most data are duplicated, since most of the data do not change over time. This process consumes a lot of resources (see point 1 in the list of criteria for choosing a backup): storage space, time to create the backup, computing capacity, and traffic for transferring backup archives to remote data storage. In the past, the full backup method was very popular due to its high reliability, but today it is considered inefficient. For example, if you need backups with a low RPO (less than two weeks) or frequent backups (once every 24 hours, or once every couple of hours), a full backup is too resource-intensive.
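At its core, a full backup is just "archive and compress everything, every time." A minimal Python sketch of that idea, using the standard tarfile module (the directory and file names are purely illustrative):

```python
import os
import tarfile
import tempfile

def full_backup(src_dir: str, archive_path: str) -> int:
    """Create a full backup: archive and compress everything under src_dir.

    Returns the size of the resulting archive in bytes.
    """
    with tarfile.open(archive_path, "w:gz") as tar:
        # arcname keeps paths inside the archive relative to src_dir
        tar.add(src_dir, arcname=os.path.basename(src_dir))
    return os.path.getsize(archive_path)

# Demo: back up a throwaway directory with one file in it
src = tempfile.mkdtemp()
with open(os.path.join(src, "report.txt"), "w") as fh:
    fh.write("accounting data")
archive = os.path.join(tempfile.mkdtemp(), "full-backup.tar.gz")
size = full_backup(src, archive)
```

Note that every run produces a complete new archive, even if nothing changed, which is exactly the resource problem described above.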
Deduplication helps raise the efficiency of a full backup by detecting and removing duplicate data in the full copies. It can be set up with special tools at the data storage, server, or client layer. Some statistical reviews report quite impressive deduplication results, from 90% to 98%.
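The mechanics behind those savings can be sketched in a few lines: split the data into blocks, hash each block, and store each unique block only once. This is a toy illustration of block-level deduplication, not any particular product's implementation:

```python
import hashlib

def deduplicate(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks and keep each unique block once.

    Returns the block store plus a 'recipe' of hashes to rebuild the data.
    """
    store, recipe = {}, []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # duplicate blocks stored only once
        recipe.append(digest)
    return store, recipe

def reassemble(store, recipe):
    """Rebuild the original data from the recipe of block hashes."""
    return b"".join(store[h] for h in recipe)

# Simulate recurring full backups where most blocks repeat
data = b"A" * 4096 * 10 + b"B" * 4096 * 10
store, recipe = deduplicate(data)
savings = 1 - len(store) / len(recipe)  # 2 unique blocks out of 20
```

In this contrived example, 20 referenced blocks shrink to 2 stored blocks, a 90% saving, which is the same order of magnitude the reviews cited above report for real datasets.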
The one certain advantage of a full backup is recovery time: data are restored from a single archive much faster than from an incremental or differential backup.
Nowadays, the full backup method is typically used in combination with other methods that are not that resource-intensive. Sometimes this approach is referred to as a mixed or synthetic backup.
This method is much quicker and more efficient than a full backup, since it only backs up the files that have changed since the previous backup was created. The mechanics of an incremental backup are simple: the time when a full backup was created is chosen as the initial backup point Х0 (e.g., Monday midnight); a backup of the files that have been changed or created since Х0 is made at point Х1 (Tuesday midnight); a backup of the files that have been changed or created since Х1 is made at point Х2 (Wednesday midnight); … at point Хn, the cycle closes and the next full backup is created.
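The "changed since the previous point" rule can be sketched with file modification times. This is a deliberately minimal illustration (real tools also track deletions, permissions, and other metadata), and all names here are made up for the demo:

```python
import os
import shutil
import tempfile

def backup_incremental(src_dir, dest_dir, last_backup_time):
    """Copy only files modified after last_backup_time (the previous point X)."""
    copied = []
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) > last_backup_time:
                rel = os.path.relpath(path, src_dir)
                target = os.path.join(dest_dir, rel)
                os.makedirs(os.path.dirname(target), exist_ok=True)
                shutil.copy2(path, target)  # copy2 preserves timestamps
                copied.append(rel)
    return sorted(copied)

# Demo: only the file touched after the reference point is copied
src, dst = tempfile.mkdtemp(), tempfile.mkdtemp()
for name in ("old.txt", "new.txt"):
    with open(os.path.join(src, name), "w") as fh:
        fh.write(name)
os.utime(os.path.join(src, "old.txt"), (1_000, 1_000))  # pretend it's ancient
copied = backup_incremental(src, dst, last_backup_time=2_000)
```

After each run, the current time becomes the new reference point, so the next backup again captures only fresh changes.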
An incremental backup is far more economical in terms of storage space, time, and data traffic than other backup methods. However, if you need to recover data from a backup, the recovery proceeds in phases through points Хn-1…Х2, Х1, Х0, back to the last full backup inclusive. This means recovery can take considerably more time.
This type of backup outperforms incremental backups in terms of data recovery time: it takes far less time, since only the full copy Х0 and the latest differential copy are needed and, hence, there is no phased recovery. However, in terms of the space required in the data storage system, differential backups are comparable to full backups, which means this method does not save storage space or traffic.
A differential backup means that backups are created on an accrual basis: every file modified since the full backup is copied again at each subsequent backup point. The sequence looks as follows: Х0, Х1, Х1+Х2, Х1+Х2+Х3, … Х1+…+Хn.
In other words, the copies are bulky, and it is complicated to calculate the required space in the data storage.
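The accrual behavior, and the two-piece recovery it buys, can be simulated with simple dictionaries mapping file names to versions. This is an abstract sketch of the bookkeeping, not real file I/O:

```python
def differential_points(changes_per_day):
    """Each differential point re-copies EVERY file changed since the
    full backup X0; the reference point never advances."""
    changed_since_full = {}
    points = []
    for day in changes_per_day:
        changed_since_full.update(day)
        points.append(dict(changed_since_full))  # accrual: copies grow over time
    return points

def restore(full_snapshot, diff_point):
    """Recovery needs only X0 plus the latest differential copy."""
    state = dict(full_snapshot)
    state.update(diff_point)
    return state

full = {"a.txt": "v0", "b.txt": "v0", "c.txt": "v0"}  # the full backup X0
daily_changes = [{"a.txt": "v1"}, {"b.txt": "v1"}, {"a.txt": "v2"}]
diffs = differential_points(daily_changes)
```

The differential copies here hold 1, then 2, then 2 files, while an incremental scheme would store only each day's single change; the payoff is that restoring the latest state needs just `full` and `diffs[-1]`.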
It's easy to grasp the difference between an incremental and a differential backup; in fact, it comes down to a single word. Just compare the following two sentences:
A delta backup (also called a delta block-level backup) is a modification of the differential backup. With this method, only the changed parts of a file are backed up, rather than the entire modified file; i.e., a file is backed up partially, not fully. However, the delta block-level method only applies to modified files, not newly created ones, which is why new files are copied in full.
The advantages of this method are high-speed backup creation, space savings, and far less redundant data (compared to incremental and differential backups). You may ask why everyone doesn't use delta block-level backups. The answer is that this backup type needs specific software both to create the backups and to recover the data. Besides, a delta block-level backup takes a long time to recover data, since the information has to be assembled from a jigsaw of modified pieces. Still, this method works great when you need continuous data protection (a file is backed up immediately after it is created or modified, somewhat like Word's autosave feature) or when you need to store backups in remote data storage over a low-bandwidth connection.
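Both sides of the trade-off, tiny backups but jigsaw-style recovery, show up in a toy block-diff sketch like the following. The block size and byte strings are arbitrary demo values:

```python
def delta_backup(old, new, block_size=4):
    """Back up only the blocks of `new` that differ from `old`.

    Brand-new files (old is None) are copied in full, as noted above.
    """
    if old is None:
        return {"full": new}
    changed = {}
    n_blocks = (len(new) + block_size - 1) // block_size
    for i in range(n_blocks):
        nb = new[i * block_size:(i + 1) * block_size]
        ob = old[i * block_size:(i + 1) * block_size]
        if nb != ob:
            changed[i] = nb  # store only the modified block
    return {"delta": changed, "new_len": len(new)}

def apply_delta(old, backup, block_size=4):
    """Recovery: reassemble the file from old blocks plus changed blocks."""
    if "full" in backup:
        return backup["full"]
    out = bytearray()
    n_blocks = (backup["new_len"] + block_size - 1) // block_size
    for i in range(n_blocks):
        out += backup["delta"].get(i, old[i * block_size:(i + 1) * block_size])
    return bytes(out[:backup["new_len"]])

old = b"AAAABBBBCCCC"
new = b"AAAAXXXXCCCC"  # only the middle block changed
bak = delta_backup(old, new)
```

The backup stores one 4-byte block instead of the whole 12-byte file, but recovery has to stitch unchanged and changed blocks back together, which is exactly why restores from delta backups are slow.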
Binary patching, which comes from software development, works similarly to the delta block-level method, because parts of modified files are backed up; however, a different unit of comparison is used (blocks for delta backups, bits of information for binary patching).
Still, both of these methods are used in combination with a differential or incremental backup, not on their own.
Mirroring is also sometimes referred to as a backup option, since it's used, e.g., in RAID 1 hardware architecture or to create mirror websites. However, it is just a copy of files: the modified files accumulated over a set time are not archived or systematized.
Over the past 12-15 years, backup technologies have changed drastically, making people reevaluate the efficiency of older approaches and come up with new ones. One example is snapshot technology, which lets you take instant 'shots' of the file system that are then used to assemble a backup. Nowadays, snapshot technology is widely used for fast and easy cloud backups without having to stop the instances. Besides, when used in the cloud, snapshots help you save on data storage space, as they don't take up any space on the customer's drive.
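Why a snapshot is "instant" and initially free in terms of space is easiest to see with a copy-on-write sketch: taking a snapshot only freezes references to the current blocks, and data is duplicated only when a block is written afterwards. This toy class is an illustration of the principle, not of any specific storage system:

```python
class CowVolume:
    """Toy copy-on-write volume: a snapshot freezes references to the
    current blocks instantly; only blocks written AFTER the snapshot get
    new copies, so taking a snapshot consumes essentially no extra space."""

    def __init__(self):
        self._blocks = {}  # block id -> data

    def write(self, block_id, data):
        # Replaces the live reference; a snapshot still points at the old data
        self._blocks[block_id] = data

    def snapshot(self):
        # Copies only the block *table*, not the block data itself
        return dict(self._blocks)

vol = CowVolume()
vol.write("b1", b"original")
snap = vol.snapshot()          # instant 'shot' of the volume
vol.write("b1", b"modified")   # the live volume diverges afterwards
```

After the later write, the live volume holds the new data while the snapshot still sees the original, which is what lets a backup be assembled from a consistent point in time without stopping the instance.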
Sure, if you prefer to do everything yourself, setting up a backup manually on your home computer will not be a problem. However, even in this case there is a risk: something may go wrong, and valuable photos, books, videos, or rocket-science calculations may accidentally not be saved, or be saved with a defect that makes it impossible to restore them from the backup. And what about office infrastructure? How do you configure a backup of all the corporate data? Here, we strongly recommend relying on the competent experts of your hosting provider. Ordering a backup setup and space for remote backup storage in Germany is very simple.
Backing up business-critical data has become imperative these days. Backup is now part of the comprehensive measures that ensure enterprise information security. We talked about it in our article Corporate cybersecurity: How to defend information values
If you rent our SIM-Cloud IaaS, it's easy to order SIM-Cloud BaaS (Backup-as-a-Service) in a couple of clicks. Everything is already configured and will be connected automatically as soon as you give the word. By the way, when our engineers developed SIM-Cloud BaaS, they chose incremental backup after analyzing the efficiency of the different backup types. Our cloud backup is optimized to provide an RTO (recovery time objective) of 15 to 30 minutes on average, depending on the amount of data. SIM-Cloud BaaS from SIM-Networks meets all the essential characteristics of high-quality backups listed above.
You can choose in which of our two datacenters in Germany to arrange your backup storage. The first option is Local storage: your backups are stored in the same datacenter where your main infrastructure is deployed, which makes it possible to optimize RTO and RPO. The second option is Remote storage: backups are sent to a datacenter remote from the one hosting the main infrastructure. Data recovery in this case will be a bit slower, but the protection is more reliable. If you are not sure which option to choose, contact our Customer Care team and our experts will help you find the best solution.
For adherents of classic hardware, we offer rental of reliable, secure, high-end remote storage for backups. And, of course, our highly qualified support experts will help you configure all the necessary backup options for your system.