What is Virtualization?
Imagine having a virtual computer, also known as a virtual machine (VM), running inside an actual computer and performing tasks alongside its host. This is precisely what virtualization allows users to do, on both the enterprise and the consumer side. The most common form is operating-system virtualization (sometimes called an Operating System Environment, or OSE), which allows multiple operating systems to run on a single piece of hardware. Virtualization can be categorized into layers such as desktop, server, file, storage, and network, and each layer has its own advantages and complexities. Across all of these layers, software simulates hardware functionality to create a virtual computer system.
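Because that simulation runs far more efficiently when the processor itself offers hardware support, a common first step before installing a hypervisor is to check for the relevant CPU feature flags. The sketch below is Linux-specific and assumes a `/proc/cpuinfo`-style flags file; the `check_virt` helper name is our own illustration, not a standard tool.

```shell
# check_virt: report whether a cpuinfo-style file advertises hardware
# virtualization extensions (Intel VT-x appears as the flag "vmx",
# AMD-V as "svm").
check_virt() {
  if grep -qEw 'vmx|svm' "$1"; then
    echo "supported"
  else
    echo "unsupported"
  fi
}

# On a real Linux host you would point it at the live CPU description:
#   check_virt /proc/cpuinfo
```

If the flags are absent (or hidden by firmware settings), a hypervisor may still run, but guests fall back to much slower software emulation.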
Several companies provide virtualization solutions specialized for data center tasks or end-user scenarios. Examples include VMware (which focuses on server, desktop, network, and storage virtualization), Citrix (which has a niche in application virtualization and also offers server and virtual desktop solutions), Microsoft (whose Hyper-V solution focuses on virtual versions of servers and desktop computers), and Kubernetes (aka K8s, an open-source platform for orchestrating containerized applications).
Many tech and IT companies have begun integrating virtualization. The resulting benefits include greater efficiency and economies of scale, but these are not the only potential gains. Others include minimized downtime, container portability (moving workloads from one hardware platform to another), increased IT productivity and responsiveness, simplified data center management, and reduced capital and operating costs. At its core, virtualization creates virtual instances of computers, and it is a convenient way to run other operating systems without needing another physical machine.
In the late 1950s, a group at the University of Manchester conceptualized virtual memory through automatic page replacement in the Atlas system. IBM popularized the term virtual machine in the 1960s when it introduced the System/360 Model 67, the first major IBM system with virtual memory (IBM's mainframe line would later formalize system partitioning as logical partitions, or LPARs), and in 1972 it shipped VM/370, the Virtual Machine Facility/370, for the System/370. The concept of hardware virtualization also emerged during this period, allowing a virtual machine monitor to run virtual machines in isolated, virtualized environments. By the mid-1970s virtualization was well accepted and solved many important problems of the day: virtual machines offered an efficient, cost-effective way to extract the maximum benefit from a company's sizable data center investment. During the 1980s and 1990s, demand for virtualization declined as low-cost minicomputers and personal computers became available to users like you and me. Virtualization re-emerged in the early 2000s, though many refinements were still needed before the software became as user-friendly and efficient as it is today.