Virtualization: A Look at the Past, Present and Future of IT’s Trusty Workhorse

Virtualization has become the trusted workhorse in IT environments of all sizes. It allows organizations to maximize the utilization of their hardware assets, reduce power consumption and migrate workloads from one system to another with little to no service disruption. Less widely known is that it also plays a critical role in cloud computing: virtual platforms are what allow the cloud to efficiently deliver the flexible, scalable, cost-effective services it is so well known for.

While the concept is currently at the pinnacle of its prominence, the story of virtualization was written long before VMware and Microsoft treated the IT world to their respective vSphere and Hyper-V products.

In the Beginning

In its article History of Virtualization, InfoBarrel explains that it was not VMware or Microsoft but IBM that kicked off the virtual computing trend. It all began in the 1960s, when the company was pouring its time and effort into mainframe systems that could share computing resources across large groups of users through a process called time-sharing. This was a big deal at a time when such robust systems were cost prohibitive even for businesses. More importantly, it demonstrated that users could leverage the resources of a computer without actually owning one.

We pick up the story with a Network World article featuring an interview with retired IBM programmer Jim Rymarczyk. An IBM Fellow and the company’s chief virtualization technologist, Rymarczyk was instrumental in moving the technology forward for the better part of four decades, making monumental contributions until retiring from the company in 2012. He explained that while IBM released its first commercially available hypervisor in 1972, the technology was already being used in the late 1960s as a tool for interactive computing and software development.

Virtual Infrastructures in the Modern Computing Era

In the 1990s, x86 servers joined PCs in the computing arena. Though robust and powerful, these computers presented the IT industry with a bit of a conundrum: they were so capable that using them to their full potential was almost impossible. Things began to change when VMware, founded in 1998, introduced VMware Workstation in 1999, a solution that turned a single x86 system into a computing infrastructure complete with its own isolated environments. The company entered the virtual server game in 2001 with VMware ESX, and the rest, you could say, is history.

VMware is currently the clear-cut industry leader, holding onto nearly 60 percent of the market. But its lead is being challenged by Microsoft’s Hyper-V, which claims more than 25 percent of that market, and by open source solutions such as KVM and Xen. Brand names aside, organizations are firmly hitched to the virtualization bandwagon, harnessing its power to optimize resource utilization and improve the efficiency of management operations across the data center environment.

Where do we go from here? Experts believe virtualization is moving toward a software-based infrastructure in which virtually every layer of the data center is virtualized and managed by hypervisors. In that environment, those components will be delivered as a service and automated by software, enabling businesses to be more flexible, agile and responsive to IT demands. Projections like these are always questionable, but one thing is for sure: the future of virtualization is very bright.
