
Data is the lifeblood of every business. The inability to access that data could result in damage that takes some organizations years to repair. In some cases, that lack of access may very well spell the end of the road.
Your future is only as promising as your business continuity plan, and your backup strategy is an integral cog in the machine. With that in mind, we have outlined a handful of fundamental strategies that will help make your business resilient in the face of any disaster.

  1. Protect Data at Every Level

Enterprise applications such as MySQL, Exchange Server, and Hyper-V function as independent systems. They come complete with their own user roles, access policies, and security features. They're also responsible for generating and managing data that is likely as important as the data you protect on your operating system. Whether it's handled with a native tool baked into the system or an all-in-one solution, that data should be included in your backup plan. Complete system protection must cover file data, configuration data, application data, and beyond.

  2. Back Up Your Backups

Most IT professionals are well aware of the 3-2-1 Rule. It's a data protection principle that preaches a more reliable way to back up sensitive information. Here's how it breaks down:

  • Keep at least three copies of your data.
  • Store those copies on at least two different types of media.
  • Keep at least one of those copies offsite.

Whether it's the result of a deteriorated DVD or a broken flash drive, backups fail on a regular basis. A second copy provides some added assurance, but a third gives you the peace of mind that comes from knowing your data can be recovered even if two backups happen to fail.
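As a quick illustration, the 3-2-1 rule can be expressed as a simple check over a backup plan. This is a hypothetical sketch in Python; the `BackupCopy` type and the media names are ours for illustration, not part of any particular product:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    medium: str   # e.g. "nas", "disk", "tape", "cloud"
    offsite: bool

def satisfies_3_2_1(copies):
    """At least 3 copies, on at least 2 media types, with 1 offsite."""
    return (
        len(copies) >= 3
        and len({c.medium for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

plan = [
    BackupCopy("nas", offsite=False),   # local NAS appliance
    BackupCopy("disk", offsite=True),   # drive rotated to the backup site
    BackupCopy("cloud", offsite=True),  # cloud copy
]
print(satisfies_3_2_1(plan))  # True
```

Note that three copies of the same disk in the same room would fail this check, which is exactly the point of the rule.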

  3. Store Backups in Multiple Locations

The "1" component of the 3-2-1 rule stresses the importance of keeping your backups in different places. Creating backups of your blog might protect you from attacks on the web server, but what happens when your local server is hit with malware or hardware failure? If that server houses all of your backups, having three copies won't do you any good. Keeping one copy on your local NAS appliance, another at your backup site, and a third in the cloud is an example of how you can make sure your data is always available.

  4. Set Logical Recovery Goals

You want to rebound from a disaster as soon as possible. Instead of winging it and hoping for the best, map out that road to recovery by setting specific recovery goals. Recovery point objective (RPO) and recovery time objective (RTO) are essential to business continuity planning. While the names suggest a degree of commonality, they are two entirely different parameters that play an equally important role in devising a backup plan that best suits your business requirements.

RPO refers to the maximum period of time in which data is lost during a disruption. The best way to sum it up is as an estimate of how much data you can afford to lose amid a disaster. This parameter is vital when it comes to determining how often you need to back up your data. If you set your RPO at eight hours, that means you have decided that you can lose eight hours' worth of work and whatever data is produced within that interval. To meet that objective, you would need to perform a backup at least once every eight hours. Anything less frequent could hinder your ability to recover that data.
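To make the arithmetic concrete, here is a minimal sketch (hypothetical, in Python) that checks whether a given backup schedule honors an RPO at an arbitrary failure time; the function and variable names are ours:

```python
from datetime import datetime, timedelta

def meets_rpo(backup_times, failure_time, rpo):
    """True if the newest backup taken before the failure is no older
    than the RPO, i.e. at most one RPO's worth of work is lost."""
    last = max(t for t in backup_times if t <= failure_time)
    return failure_time - last <= rpo

rpo = timedelta(hours=8)
start = datetime(2015, 6, 1, 0, 0)
backups = [start + timedelta(hours=8 * i) for i in range(4)]  # every 8 hours

# Failure 7.5 hours after the last backup: within the 8-hour RPO.
print(meets_rpo(backups, start + timedelta(hours=7, minutes=30), rpo))  # True
# Failure 9 hours after the last backup (schedule stopped): RPO missed.
print(meets_rpo(backups, start + timedelta(hours=33), rpo))             # False
```

The worst case always lands just before the next scheduled backup, which is why the backup interval must not exceed the RPO.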

RTO refers to the target time designated to recover from an incident. In other words, it defines how long an organization can afford to be inoperable before the business is negatively affected. Let's say you set your RTO at three hours. That means you essentially have three hours to get your servers, networking equipment, or telecommunications back up and running. It's a pretty small window, so you would need to invest a lot of time and resources into disaster preparation to make sure that objective is achieved.
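A matching sketch for RTO (again hypothetical, with our own names): treat each recovery drill as a measured duration and flag any that miss the target. Running this check after every drill tells you whether your preparation actually supports the objective:

```python
from datetime import timedelta

RTO = timedelta(hours=3)  # assumed three-hour recovery objective

def rto_missed(drills, rto=RTO):
    """Return the recovery drills whose measured duration exceeded the RTO."""
    return [d for d in drills if d > rto]

drills = [
    timedelta(hours=2, minutes=40),  # within the 3-hour window
    timedelta(hours=3, minutes=15),  # missed the objective
]
print(len(rto_missed(drills)))  # 1
```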

Whereas RPO is focused on backup frequency, RTO sets the tone for business continuity as a whole. Used correctly, both can provide a much-needed degree of measured guidance when responding to a disaster.

  5. Make Data Security a Priority

The same technologies that help increase business efficiency and agility have made us more vulnerable than ever. From cloud apps to mobile devices and everything in between, each piece of technology we implement is yet another attack vector for the bad guys to lock onto. Data is always under fire. Organizations must make a concerted effort to develop a business continuity strategy that protects against sophisticated outside attacks as well as the ever-looming threat of human error from within. A business continuity plan is not complete without a strong focus on data security.

Conclusion

Losing even the smallest amount of data can have game-changing ramifications for your business. To protect it, you need to be prepared for every possible disaster scenario. Although it’s merely one piece of the business continuity puzzle, an effective backup plan can make sure you retain as much of that data as possible.

Comments

  • VMware Player is not a Type 1 hypervisor, and therefore does not have better performance than Virtualbox "because it runs directly on the hardware."

  • Yes, a span size of two means that each span is as small as possible. So a span size of two in RAID 100 means that you are actually getting RAID 10 without anything extra (it is the middle RAID 0 that is eliminated.) So the advice is good, basically you always want a span size of two if the option exists. Some controllers cannot handle a RAID 10 large enough to accommodate all attached drives and so larger spans are required. Typically this does not happen until you have at least ~18 drives or so.

  • The one question I have coming out of this results from the conversation that I believe prompted this blog post, namely this thread on SpiceWorks:

    http://community.spiceworks.com/topic/548896-raid-10-2-spans-a-cautionary-tale-it-can-happen-to-you

    The recommendation/default for at least one DELL controller model was a span-size of 2, with comments referring to this being referred to as the optimal configuration for larger arrays. Is there any evidence to support this being the optimal configuration? Your blog post, and my (albeit limited) understanding of RAID would suggest that this advice is flawed. Then again, maybe I am misunderstanding something at a fundamental level?

    Furthermore, would there be any benefit to adding in multiple RAID-0 layers above the RAID-100 so that the member size of all arrays involved is kept as small as possible?

  • I like the article. To be honest, I've seen many posts in newspapers, magazines, and even blogs that cover open source as it is, without glorifying or condemning it, just staying neutral.

    I'd like to add some other software like Thunderbird (for email), Git (for developers), and maybe replace Notepad++ with Geany/Gedit/Kate (or the text editor of your preference); otherwise I like your choices, and those are apps that I use a lot, even though at my workplace they don't want to replace them.

    • Hey Dom, depending on where you're located there are a number of ways you can dispose of VHS tapes. Most thrift shops will take them off your hands, assuming they're actual movies and not simply blank tapes. Another option is to use Greendisk (greendisk.com), which allows you to mail in your old VHS tapes for recycling. Beyond that, there may be some options specific to your location (there are waste recycling facilities that can handle this type of trash all over), a quick Google search might reveal some of them.

  • Hi there, I think your web site may be having internet browser compatibility problems.
    Whenever I look at your web site in Safari, it looks fine
    however when opening in I.E., it has some overlapping issues.
    I simply wanted to provide you with a quick heads up!
    Besides that, wonderful site!

    • Thanks for letting us know, we really appreciate it. Do you happen to know which version of IE you're using? I know that sometimes the older versions don't cooperate. I can't seem to reproduce the results you're seeing, but we're looking into it. Thanks again for bringing this to our attention.

  • I think you are missing the point entirely here. I have a home with 5 PCs all running same Windows OS version and same versions of Office. MOST of the file data on the machines are copies of same files on other machines: the Windows OS files and Office binaries. I want to backup full system snapshot images (not just photos and music) daily to a NAS on my LAN, or even a headless Windows machine acting as a NAS (like the old Windows Home Server product). I want the bandwidth savings of laptops backing up over wifi to notice that those windows files are already stored and not transmit them over wifi. I also want the total NAS storage of all combined backups reduced so that I can copy the NAS storage to either external drive for offsite storage, or more interesting up to the cloud for redundancy. ISP bandwidth caps, limited upstream bandwidth, and cloud storage annual cost per GB mean that deduplicated backup storage is essential. The cost of additional local storage is NOT the only consideration.

    I don't care about Windows Server's integrated deduplication. The deduplication has to be part of the backup system itself, especially if you are doing cluster or sector level deduplication, to avoid sending the duplicate data over the wire to the data storage in the first place.

    I've been looking at different backup solutions to replace Windows Home Server (a decade-old product that offered deduplication), and your product looked very interesting, but unfortunately the lack of built-in deduplication rules it out for me. I can only imagine how this affects 100-desktop customers when I won't even consider it for 5-desktop home use.

    • Thank you for your comments. We appreciate all points of view on this topic.

      I agree that ISP bandwidth caps, limited upstream bandwidth, and cloud storage cost per GB show how critical it is to minimize data transmissions offsite. I also believe that, much like modems and Betamax video tapes, the bandwidth limits of today are giving way to greater access everywhere. For example, Google Fiber is now available to some of my peers at the office. Cellular LTE and satellite technologies are also increasing bandwidth for small business and home offices. At the same time, our data consumption and data creation are increasing at a rate that may outpace this increased supply of bandwidth. Either way, there are ways to work around data transmission limits.

      One way we help with data transmission over slower networks is by incorporating WAN acceleration and bandwidth scheduling technologies into our offsite replication tools. These allow you not only to make the most efficient use of available bandwidth but also to schedule your data replication during off-peak hours. Another way we help with data transmission is through compression. Deduplication is, after all, simply another form of data compression, one that reduces the near-side (source) data before it is transmitted over the wire to the target.

      In your case, you could use our product to store images on a local volume which has deduplication. You could then replicate data over the wire to offsite storage using ImageManager or some other tool. Many of our customers do this very thing.

      Keep in mind that the deduplication process has to occur at some point: either at the source or at the target. If you wanted to deduplicate your 5 PCs you would be best served with a BDR solution that can read each of those PCs, see the duplicate files on each, and avoid copying those files to storage. In this example, deduplication would occur on your BDR but you're still reading data from each PC over the wire to your BDR. In addition, your BDR would control the index for data stored on a separate volume or perhaps has the storage volume incorporated in the BDR. This creates a single point of failure because if your BDR crashes then the backup images for your 5 PCs wouldn't be recoverable and current backup processes cease.

      At StorageCraft we focus on the recovery. Our philosophy means that we take the smallest, fastest backup images we can, and then we give you ways to automatically test those images for reliability, compress them into daily/weekly/monthly files according to your retention policy, and replicate those images locally and offsite. This gives you a solid foundation from which to recover those images quickly to almost any new environment. I have yet to see a faster, more reliable solution among our competitors.

      Cheers,
      Steven

  • Regarding Shadowprotect desktop:
    I am looking for the following capabilities
    1. Windows 8.1 compatibility
    Everything I've examined says Win 8 but nothing about Win 8.1
    2. I want to be able to do the following on an ACER S-3:
    320gb hd with Win 8.1
    create an image of the 320gb drive.
    Install a 120gb drive in the ACER.
    Install the image to the 120gb drive.
    I am assuming that I can boot from the Shadowprotect
    CD, use an external usb connected dock with the 320gb
    image, and successfully install the image from the
    external dock to restore to the 120gb drive installed in the ACER.
    3. Does Shadowprotect take care of setting up the needed
    partition and format for the target drive (120gb in this case)

    I've looked at several of the alternatives to your product
    posing the same questions above and get vague or downright
    misleading answers to my items 1, 2 AND 3 above.

    If I purchase your product will I be able to do what I
    want as stated in items 1,2 and 3 above?

    I have done exactly what I described in items 1,2 and 3
    above for WIN 7 using a product called EZGIG II and am
    pleased with the results. I am looking for the same
    capability for Win 8.1.

    Please advise,
    Joe O'Loughlin

  • Hello,

    I'm just wondering if any of you have actually tested this scenario in the end and come to any conclusion since this article was published.

    Thank you!

