Cloud Computing Myths and Realities

Whenever we witness the rapid adoption of a technology, we must also sift through the hyperbole and misconceptions that follow. Cloud computing is one of those technologies where everyone has an opinion, and yet there is still a lot of confusion when it comes to separating fact from myth.

One might expect confusion and misinformation when so much money is on the line. Gartner predicts the cloud computing market will reach $411 billion by 2020. The good news for businesses is that this market is competitive, and choices abound. The big cloud providers such as Amazon, Microsoft, and Google garner a lot of attention, but smaller, more nimble niche players are thriving as well.

As you make decisions about your infrastructure, it is wise to separate myth from reality. Knowing what to expect before beginning any major deployment will put your company in a better position to take advantage of the cloud's features. Now that cloud services are mainstream, let us look at several myths and realities surrounding the market.

Top Cloud Computing Myths

  1. Loss of Control – What you actually lose is the chore of maintaining a physical server. That means you will not be required to swap hard drives or install extra RAM as your business scales. This frees up time to manage your data, which is still yours to control. You still decide who accesses the data and how, and you retain control over data flows and work processes.
  2. Cloud will Kill On-Prem – Many applications will run in the cloud, but some will not, or porting them will be too big an inconvenience. You may be using legacy code or have internal dependencies that make migration too expensive or difficult. And some companies have security contracts that flat-out forbid moving sensitive applications and data to the public cloud. It is not an all-or-nothing proposition. Moving the 75% that works well in the cloud will free up time to focus on the 25% that needs more attention on-prem.
  3. More Sensitive to External Threats – The cloud does not remove the threat of data breaches, DDoS attacks, or other external attacks. Several high-profile attacks have proven this, but it is not a solid reason to avoid the cloud. Cloud providers all have security measures in place, such as encryption and sophisticated firewalls, to fend off most attacks. A small company may not have a security expert on staff, while Google has over 750 experts dedicated to keeping its network and your data secure.
  4. The Cloud is More Expensive – It might be. It depends on several factors, such as the amount of data you store, the number of users and applications, and your backup needs. What the cloud does very well is scale to your needs quickly, which means you only pay for the computing power you need. There is no need to build out oversized servers for capacity you might not need for years. The cloud provider builds the cost of security and maintenance into its service, so you do not need staff dedicated exclusively to those tasks.

Top Cloud Computing Realities

  1. Faster to Market – For many years, consultants promised the cloud would lower the total cost of ownership while freeing companies to invest the savings elsewhere. This made sense because IT needed a way to justify the move to the cloud. But TCO is just one part of the equation. The ability to scale, innovate, and bring products to market faster has proven to have a greater impact on business than cost savings alone.
  2. More than One Cloud – The public cloud garners more attention than private clouds, in both media coverage and investment. But the public cloud is not the only player. Private and hybrid clouds play a role for many companies. In fact, many companies deploy hybrid clouds that integrate data and processes across both public and private clouds. A study from SUSE shows hybrid cloud strategies growing faster than both private and public clouds. There is no such thing as one true cloud.
  3. There is More than Amazon – Pundits used to refer to Microsoft as the 800 lb. gorilla because it dominated the software landscape for decades. And certainly, a lot of people speak in similar terms about Amazon's cloud presence. Yes, Amazon offers many cloud services, and a lot of Fortune 500 companies rely on them. But Amazon is far from the only player. Google, Microsoft, and Rackspace also offer competing cloud services and integrate their own technology into their offerings, as Microsoft does with its .NET-based Azure platform.
  4. Demand for IT Staff – As the cloud became popular, many IT people believed their jobs were in jeopardy. If a company moved everything to the cloud, it would not need staff to handle IT tasks anymore. That notion has not come to fruition. The cloud is a tool. It still requires people with strong IT skills to plan, implement, and monitor. The cloud does not replace your IT staff unless you had people installing hard drives and RAM all day. It frees your IT staff to work on more strategic goals and projects instead of installing Windows Server updates.

Conclusion

Every business has unique needs, and not all of them will be met by the cloud. Consultants pushing cloud, hybrid, or on-prem services are sure to point out the advantages of their offerings while playing up the negatives of the competition. The cloud is especially ripe for this approach given the complexity and confusion that come with such a business-critical service.

There will always be laggards who do not believe anyone can secure their data as well as they can, or who worry the cloud will make their skills obsolete. Once you get past the hype and dogma, you can determine what type of cloud service is best for your company, along with which provider will serve you best.

Choosing the right cloud storage is one of the most important decisions your business makes. It’s no myth that the StorageCraft® Recovery Solution copies your data and keeps it on our own Cloud to ensure all your important files are maintained and accessible. Contact us today for more information on our services and tips on staying safe from the cybersecurity threats of the day.

Comments

  • VMware Player is not a Type 1 hypervisor, and therefore does not have better performance than VirtualBox "because it runs directly on the hardware."

  • Yes, a span size of two means that each span is as small as possible. So a span size of two in RAID 100 means that you are actually getting RAID 10 without anything extra (it is the middle RAID 0 that is eliminated). So the advice is good; basically, you always want a span size of two if the option exists. Some controllers cannot handle a RAID 10 large enough to accommodate all attached drives, and so larger spans are required. Typically this does not happen until you have roughly 18 drives or more.
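
    To make the span-size arithmetic above concrete, here is a small illustrative Python sketch. The drive names and the raid100_layout helper are hypothetical, not controller firmware; they only show how drives group for a given span size, and why a span size of two collapses RAID 100 into plain RAID 10.

      def raid100_layout(num_drives, span_size):
          """Group drives into spans of mirrored pairs.

          Top level : RAID 0 stripe across the spans
          Each span : RAID 0 stripe across its mirrored pairs (i.e. RAID 10)
          Each pair : RAID 1 mirror of two drives
          """
          if span_size % 2 or num_drives % span_size:
              raise ValueError("span size must be even and divide the drive count")
          drives = [f"disk{i}" for i in range(num_drives)]
          spans = []
          for start in range(0, num_drives, span_size):
              span = drives[start:start + span_size]
              # inside a span, drives pair off into RAID 1 mirrors
              pairs = [span[i:i + 2] for i in range(0, span_size, 2)]
              spans.append(pairs)
          return spans

      # 8 drives, span size 2: each span holds a single mirrored pair, so the
      # middle RAID 0 has nothing to stripe over; this is effectively RAID 10.
      print(raid100_layout(8, 2))

      # 8 drives, span size 4: two pairs per span, a genuine three-level RAID 100.
      print(raid100_layout(8, 4))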

  • The one question I have coming out of this results from the conversation that I believe prompted this blog post, namely this thread on SpiceWorks:

    http://community.spiceworks.com/topic/548896-raid-10-2-spans-a-cautionary-tale-it-can-happen-to-you

    The recommendation/default for at least one Dell controller model was a span size of 2, with comments describing this as the optimal configuration for larger arrays. Is there any evidence to support this being the optimal configuration? Your blog post, and my (albeit limited) understanding of RAID, would suggest that this advice is flawed. Then again, maybe I am misunderstanding something at a fundamental level?

    Furthermore, would there be any benefit to adding in multiple RAID-0 layers above the RAID-100 so that the member size of all arrays involved is kept as small as possible?

  • I like the article. To be honest, I've seen many posts in newspapers, magazines, and even blogs about open source, but rarely one that presents it as it is, neither glorified nor condemned, just neutral.

    I'd like to add some other software, like Thunderbird (for email) and Git (for developers), and maybe replace Notepad++ with Geany/Gedit/Kate (or whichever text editor you prefer; yours being Notepad++). Otherwise I like your choices, and those are apps I use a lot, even if at my workplace they don't want to replace them.

    • Hey Dom, depending on where you're located, there are a number of ways you can dispose of VHS tapes. Most thrift shops will take them off your hands, assuming they're actual movies and not simply blank tapes. Another option is to use Greendisk (greendisk.com), which allows you to mail in your old VHS tapes for recycling. Beyond that, there may be some options specific to your location (there are waste recycling facilities that can handle this type of trash all over); a quick Google search might reveal some of them.

  • Hi there, I think your web site may be having browser compatibility problems. Whenever I look at your web site in Safari, it looks fine; however, when opening it in IE, it has some overlapping issues. I simply wanted to give you a quick heads up. Besides that, wonderful site!

    • Thanks for letting us know, we really appreciate it. Do you happen to know which version of IE you're using? I know that sometimes the older versions don't cooperate. I can't seem to reproduce the results you're seeing, but we're looking into it. Thanks again for bringing this to our attention.

  • I think you are missing the point entirely here. I have a home with 5 PCs, all running the same Windows OS version and the same versions of Office. Most of the file data on the machines consists of copies of the same files on the other machines: the Windows OS files and Office binaries. I want to back up full system snapshot images (not just photos and music) daily to a NAS on my LAN, or even a headless Windows machine acting as a NAS (like the old Windows Home Server product). I want the backup to notice that those Windows files are already stored and not transmit them over wifi, saving bandwidth when the laptops back up. I also want the total NAS storage of all combined backups reduced so that I can copy the NAS storage either to an external drive for offsite storage or, more interestingly, up to the cloud for redundancy. ISP bandwidth caps, limited upstream bandwidth, and cloud storage annual cost per GB mean that deduplicated backup storage is essential. The cost of additional local storage is NOT the only consideration.

    I don't care about Windows Server's integrated deduplication. The deduplication has to be part of the backup system itself, especially if you are doing cluster or sector level deduplication, to avoid sending the duplicate data over the wire to the data storage in the first place.

    I've been looking at different backup solutions to replace Windows Home Server (a decade-old product that offered deduplication), and your product looked very interesting, but unfortunately the lack of built-in deduplication rules it out for me. I can only imagine how this affects 100-desktop customers when I won't even consider it for 5-desktop home use.

    • Thank you for your comments. We appreciate all points of view on this topic.

      I agree that ISP bandwidth caps, limited upstream bandwidth, and cloud storage cost per GB show how critical it is to minimize offsite data transmission. I also believe that, much like modems and Betamax tapes, the bandwidth limits of today are giving way to faster access everywhere. For example, Google Fiber is now available to some of my peers at the office. Cellular LTE and satellite technologies are also increasing bandwidth for small business and home offices. At the same time, our data consumption and creation are increasing at a rate that may outpace this increased supply of bandwidth. Either way, there are ways to work around data transmission limits.

      One way we help with data transmission over slower networks is by incorporating WAN acceleration and bandwidth scheduling technologies into our offsite replication tools. These allow you not only to make the most efficient use of available bandwidth but also to schedule your data replication during off-peak hours. Another way we help with data transmission is through compression. Deduplication is, after all, simply another form of data compression, one that reduces data on the near side (the source) before it is transmitted over the wire to the target.

      In your case, you could use our product to store images on a local volume that has deduplication enabled. You could then replicate data over the wire to offsite storage using ImageManager or some other tool. Many of our customers do this very thing.

      Keep in mind that the deduplication process has to occur at some point: either at the source or at the target. If you wanted to deduplicate your 5 PCs, you would be best served with a BDR solution that can read each of those PCs, see the duplicate files on each, and avoid copying those files to storage. In this example, deduplication would occur on your BDR, but you are still reading data from each PC over the wire to the BDR. In addition, your BDR would control the index for data stored on a separate volume, or perhaps have the storage volume incorporated into the BDR itself. This creates a single point of failure: if your BDR crashes, the backup images for your 5 PCs would not be recoverable, and current backup processes would cease. (A rough sketch of this source-versus-target trade-off appears after this reply.)

      At StorageCraft we focus on recovery. Our philosophy means that we take the smallest, fastest backup images we can, and then we give you ways to automatically test those images for reliability, compress them into daily/weekly/monthly files according to your retention policy, and replicate those images locally and offsite. This gives you a solid foundation from which to recover those images quickly to almost any new environment. I have yet to see a faster, more reliable solution among our competitors.

      Cheers,
      Steven
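
      As a rough illustration of the source-versus-target distinction discussed above, here is a minimal, hypothetical Python sketch of source-side block-level deduplication. The backup_file helper and its parameters are invented for the example and do not describe StorageCraft's products; the idea is simply to hash each fixed-size block and ship only the blocks the target has not already stored.

        import hashlib

        BLOCK_SIZE = 4096  # bytes per block; real products choose their own sizes

        def backup_file(path, target_hashes, send_block):
            """Read a file in fixed-size blocks and ship only blocks the target lacks.

            target_hashes : set of block digests already stored on the target
            send_block    : callable that transmits (digest, data) over the wire
            """
            sent = skipped = 0
            with open(path, "rb") as f:
                while True:
                    block = f.read(BLOCK_SIZE)
                    if not block:
                        break
                    digest = hashlib.sha256(block).hexdigest()
                    if digest in target_hashes:
                        skipped += 1          # duplicate block never crosses the wire
                    else:
                        send_block(digest, block)
                        target_hashes.add(digest)
                        sent += 1
            return sent, skipped

      With five machines carrying largely identical OS and Office files, nearly everything after the first machine's backup would hit the skipped branch; the trade-off is that the source must hold or query the target's hash index, which is exactly the design decision being weighed in this thread.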

  • Regarding ShadowProtect Desktop, I am looking for the following capabilities:

    1. Windows 8.1 compatibility. Everything I've examined says Win 8, but nothing about Win 8.1.

    2. I want to be able to do the following on an Acer S-3 with a 320 GB hard drive running Win 8.1: create an image of the 320 GB drive, install a 120 GB drive in the Acer, and install the image to the 120 GB drive. I am assuming that I can boot from the ShadowProtect CD, use an external USB-connected dock holding the 320 GB image, and successfully restore the image from the external dock to the 120 GB drive installed in the Acer.

    3. Does ShadowProtect take care of setting up the needed partition and format for the target drive (120 GB in this case)?

    I've looked at several of the alternatives to your product, posing the same questions above, and get vague or downright misleading answers to items 1, 2, and 3.

    If I purchase your product, will I be able to do what I want as stated in items 1, 2, and 3 above?

    I have done exactly what I described in items 1, 2, and 3 above for Win 7 using a product called EZGIG II and am pleased with the results. I am looking for the same capability for Win 8.1.

    Please advise,
    Joe O'Loughlin

  • Hello,

    I'm just wondering if any of you have actually tested this scenario in the end and come to any conclusion since this article was published.

    Thank you!

