Avoid Disasters – Design a Proper Backup Strategy

In the digital age, everything revolves around data. For a business, losing data costs time and money, drives away customers, invites lawsuits, and can even destroy the business entirely. This is not mere conjecture: it is fact. For this reason, we back up our data to keep operations running smoothly at all times.

In this post, we will look at which data should be backed up and why, debunk several common myths about data backup, and identify best practices for backing up data.

Why Does Data Backup Matter?

Simply put, a data backup is a copy of sensitive and business-critical data stored in a secure location. As new data comes in, it should be backed up promptly. Then, if the original data is lost or corrupted, all is not lost, and your business can carry on as usual.

Root Causes of Data Breach in 2017

Why might data get lost or corrupted in the first place? Nowadays, cyber attacks are the leading cause, accounting for 47% of all data loss events. Human error clocks in at nearly a third, at 29%. The remaining quarter comes down to system errors and glitches. With a 400 percent increase in data loss events since 2012, data backup has never been more important.

Ponemon Institute LLC conducted a study covering 413 companies in 13 countries. What it found was sobering: these companies suffered average total costs of $3.62 million when data was lost or corrupted. Furthermore, each breached customer account or individual file cost them about $141 on average.

2017 Per capita cost of data breach compared to last 4-year average

According to Gartner’s research, nearly 43 percent of all businesses would fail outright if they lost all of their data. In such a catastrophic event, a business would lose its electronic invoices and email records. Client accounts, contract statuses, accounts payable: all out the door. Marketing, sales, and commission plans would survive only on paper. Human Resources and payroll would be paralyzed. It is the kind of trouble only deep pockets can save you from.

None of that need concern us, however, provided our data is backed up. So let’s take a look at what data should be backed up.

What Should You Backup?

Let us consider what kind of data should be a priority for backup. To begin with, never rely on a single copy: back up the backup itself. For instance, a business website should keep at least two copies of its content, databases, and emails.

In general, operating systems can easily be reinstalled and may not need a backup. The same goes for most software programs. The only caveat is customized settings. In that case, take a storage snapshot at the moment the settings are customized; it serves as a known-good state for the program to return to after a system crash or system file corruption.
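As an illustration, here is a minimal sketch of such a settings snapshot on Linux, using tar. The paths and file names are hypothetical; in practice you would point it at the program's real configuration directory (for example, something under /etc).

```shell
# Hypothetical sketch: snapshot an application's customized settings so a
# known-good state can be restored after a crash. A temp dir stands in
# for the real settings directory (e.g. /etc/myapp).
CONFIG_DIR=$(mktemp -d)                      # stand-in for the settings dir
echo "theme=dark" > "$CONFIG_DIR/app.conf"   # a sample customized setting

SNAPSHOT=/tmp/settings-snapshot.tar.gz
tar -czf "$SNAPSHOT" -C "$CONFIG_DIR" .      # archive the settings
tar -tzf "$SNAPSHOT"                         # list the archive to verify
```

Restoring the snapshot later is the reverse: `tar -xzf "$SNAPSHOT" -C "$CONFIG_DIR"`.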

As we continue, let’s debunk the myths about data backup with some cold facts featured in the following infographic.

Infographics: Myths Vs Facts About Backup

Whether it’s human error or a malicious software attack, data backup is the best way to mitigate the fallout of a data breach.

Best Backup Practices

  1. Choose a Backup Storage & Backup Location
  2. Choose a Reliable Backup Automation System
  3. Secure Multiple Backup Copies
  4. Set an Appropriate Backup Frequency
  5. Test Your Backup

There are many different backup strategies out there to safeguard data. Let’s discuss the common strategies programmers prefer for securing data, web content, and databases. In this section, we cover three incremental levels of data backup strategy.

Real World Example of Website Backup in Action

1) Manage Offline Local Backup
i)  Setup Daily Backup
ii) Setup Weekly Backup To Additional Secondary Storage Devices
iii) Setup Monthly offsite location copy
2) Version control to have better management of your changes
3) Web hosting backup

1) Manage Offline Local Backup

“It’s always better to keep one copy offline”

If you are looking for a backup and recovery solution, the first one you will hear about is the 3-2-1 backup strategy. Most IT consultants and IT geeks will tell you that 3-2-1 is the best practice for data backup and recovery at the primary level.

So what is the 3-2-1 backup strategy, and how do you implement it? Basically, it is a very simple and efficient strategy.

I) Setup Daily Backup

First of all, keep at least three copies of your data: the original plus two backups. If you are using Linux or Mac, you can create a backup with a simple command.

Suppose you want to back up a project and schedule it automatically on daily, weekly, and monthly cycles using the 3-2-1 strategy.

i) If your website project lives in /var/www/html/project and a second hard drive is mounted at /media/hdd2, you can take the backup with the rsync command.

The syntax of an rsync backup is: rsync arguments source-directory destination-directory

For example:

rsync -ar /var/www/html/project /media/hdd2/backup/

In the example above, a backup of “project” is placed in the backup folder on the second hard drive.

How do you automate this as a daily backup?

You need to install the cronie package first.

yum install cronie

Then open the crontab file.

vi /etc/crontab

A crontab entry that schedules a command at a particular time uses the following format:

min hr dom mon dow command

To schedule the backup daily at 1:30 AM, add a line in the above format to the crontab file:

30 01 * * * rsync -ar /var/www/html/project /media/hdd2/backup/

Then restart the crond service

service crond restart

Now the backup will be taken automatically every day at 1:30 AM.

For Windows, you can use one of the many good backup tools available on the market.

It is always better to have multiple backups on multiple devices. If you lose your online backup copies, you can still restore from the local ones.

II) Setup Weekly Backup To Additional Secondary Storage Devices

Next, keep two copies of the data on two different storage devices: for example, one copy on an internal drive and a second copy on an external device such as a remote server, an external hard drive, or a NAS. It is better to keep these two copies on the local site itself, because they are easily accessible and allow quick restoration.

If you want to back up to a remote server using rsync and SSH, follow these steps:
i) First, complete the setup of SSH key-based authentication.
ii) Then the following command will back up your project folder into the remote server’s backup folder:

rsync -a -e ssh /var/www/html/project/ [email protected]:/backup

How do you automate this as a weekly backup? Put the following line in the crontab file:

00 04 * * 0 rsync -a -e ssh /var/www/html/project/ [email protected]:/backup

Your project’s data will be copied to the remote Linux server automatically every Sunday at 4:00 AM (the 0 in the day-of-week field means Sunday).
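Step i) above, SSH key-based authentication, is what lets the scheduled rsync run without a password prompt. A hedged sketch (the key path is a temporary directory for illustration, and user@remote-server is a placeholder for your real backup host):

```shell
# Hypothetical sketch: generate a passwordless SSH key pair so the weekly
# rsync job can authenticate non-interactively.
KEYDIR=$(mktemp -d)
ssh-keygen -t ed25519 -f "$KEYDIR/id_ed25519" -N "" -q   # key pair, no passphrase

# Then install the public key on the backup server (run once, interactively):
#   ssh-copy-id -i "$KEYDIR/id_ed25519.pub" user@remote-server
ls "$KEYDIR"
```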

III) Setup Monthly offsite location copy

Lastly, keep one copy of the backup at an offsite location, such as cloud storage. An offsite backup on Dropbox or Google Drive ensures there is no single point of failure, meaning your data stays safe even through natural calamities.

There is no such thing as a perfect backup strategy, but the 3-2-1 strategy is a great start for backup recovery.
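Putting the three tiers together, a single crontab could look like the following sketch. The paths, the remote host, and the rclone remote name “gdrive” are illustrative assumptions; the monthly line requires rclone to be installed and configured separately with `rclone config`.

```shell
# Hypothetical crontab implementing all three 3-2-1 tiers.

# Daily at 01:30 -- local copy to a second internal drive
30 01 * * * rsync -ar /var/www/html/project /media/hdd2/backup/

# Weekly, Sunday at 04:00 -- copy to a remote server over SSH
00 04 * * 0 rsync -a -e ssh /var/www/html/project/ user@remote-server:/backup

# Monthly, 1st at 05:00 -- offsite cloud copy (assumes an rclone remote "gdrive")
00 05 1 * * rclone sync /media/hdd2/backup/ gdrive:project-backup
```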

2) Version control to have better management of your changes

The next way to keep your data well managed is a version control system. What if you have a team working on the same project and someone accidentally changes a file or folder? That can easily happen when everyone saves into the same folder, and it gets very confusing. That’s why you need a version control system such as Git, hosted on a service like Bitbucket.

The basic flow is as follows:

  1. The production version of your main code is stored in the master branch.
  2. The development version of your code is stored in the develop branch.
  3. When you plan to add a feature to your website, create a third branch, “feature”, off the develop branch.
  4. When your developers are about to finish and release a product version, create a new release branch and make the final changes there.
  5. When the release branch is finalized, merge it into the master branch.
  6. If a bug is found that must be fixed immediately, create a “hotfix” branch off the master branch. After the fix, merge the hotfix branch into both “master” and “develop” so the change is applied everywhere.
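The branching flow above can be sketched with plain Git commands in a throwaway repository. The branch names follow the common Gitflow convention; “feature/login” and the version numbers are made up for illustration.

```shell
# Hypothetical Gitflow walkthrough in a temporary repository.
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q -b master
git config user.email demo@example.com
git config user.name "Demo"
git commit -q --allow-empty -m "initial"

git checkout -q -b develop master              # development branch (step 2)
git checkout -q -b feature/login develop       # feature branch (step 3)
git commit -q --allow-empty -m "login feature"
git checkout -q develop
git merge -q --no-ff -m "merge feature" feature/login

git checkout -q -b release/1.0 develop         # release branch (step 4)
git checkout -q master
git merge -q --no-ff -m "release 1.0" release/1.0   # step 5

git checkout -q -b hotfix/1.0.1 master         # hotfix off master (step 6)
git commit -q --allow-empty -m "urgent fix"
git checkout -q master
git merge -q --no-ff -m "apply hotfix" hotfix/1.0.1
git checkout -q develop
git merge -q --no-ff -m "apply hotfix" hotfix/1.0.1

git branch --list                              # all branches now exist
```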

With a version control system, multiple people can work on a single project simultaneously without interfering with one another’s work. The system tracks character-level changes to every file for every user, allowing a complete backtrack through all versions of each file.

The basic idea of version control is to manage multiple versions of a project; not every version control web application provides a backup service. A service like Bitbucket offers an additional “zero downtime backup” feature to secure a project’s data. Zero Downtime Backup lets you take backups as often as you need (e.g., hourly or daily) without causing downtime for any user or build agent.

It allows multiple developers or website designers to work in isolation without impacting each other’s work, and it lets each of them test and build their own version.

3) Web hosting backup

What if your business has a large amount of data to back up? Traditional backup methods may not work efficiently; they simply take too much time. That is where Continuous Data Protection (CDP) comes into the picture. CDP is a cost-effective solution that delivers high performance at any time with very low impact on server and disk performance.

CDP picks up every change to a file and stores data backups at the intervals set in your backup policy. You can restore the backed-up data from the available restore points on request, at any time.

Recovery points: CDP restores backups as recovery points. The first recovery point, called the initial replica, is performed only once, as long as the storage medium is not replaced.

Further recovery points can then be scheduled daily, weekly, and monthly as a foolproof plan for development projects. Each recovery point contains only the changes since the last one. You can also configure old recovery points to be merged automatically after some time.

How does CDP work?

First, purchase a CDP backup service. Then specify the files whose data you want to secure with CDP, and identify and allocate space on the target backup server. The more space you have for backups, the farther back in time you can go to restore files.

Here are the features you should consider before purchasing CDP backup from a hosting service provider:

i) What type of storage devices do they use to back up the data?
Check what type of disk is used for the CDP backup server. It is better if they use SSD drives for CDP backup storage.
ii) What backup frequencies are offered: daily, weekly, monthly?
Most hosting providers offer daily or weekly backup frequency. It is better to choose one that provides daily backups.
iii) What about recovery points?
A recovery point determines at what time your backed-up data will be available when needed. The best hosting providers can restore from any given point in time.
iv) What retrieval methods do they provide?
Check the backup retrieval options. You should be able to retrieve the entire dataset or just a specific folder when you need it.

So, if your data is an important asset for your business, diversify its backups. The techniques above are among the most reliable approaches to backup available.


Just like any other administrative task, data backup can seem like a huge waste of time if managed inefficiently. That is why one of the main takeaways from this piece should be to identify the backup frequency appropriate for your needs.

In discussing recent studies on data breaches and common myths vs. facts about data backup, we saw how costly it can be not to back up data. We then reviewed best practices for managing data backups.

Finally, feel free to share questions, concerns, or your own stories about data backup in the comments section below.

Rahul Vaghasia

Rahul is CEO at AccuWebHosting.com. He shares his web hosting insights at AccuWebHosting blog. He mostly writes on the latest web hosting trends, WordPress, storage technologies, Windows and Linux hosting platforms.
