Virtualization—it’s been around for years, so why is it suddenly so hot? Big changes are underway that will make it mainstream in servers and even client systems. What has changed?
First, the industry has begun the transition from single-core to multi-core processors, and similar trends are on the horizon for I/O. These trends will continue, so what do we do with all these extra processing units? Virtualization is a natural fit. One option is to partition a multi-core system and dedicate each processor core to a specific “guest” OS.
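To make the partitioning idea concrete, here is a minimal sketch of dedicating a host core to a process. The current process stands in for a guest's VM-monitor thread; on a real hypervisor you would pin each guest's vCPU threads instead. The function name `pin_to_cores` is my own, and `os.sched_setaffinity` is Linux-only.

```python
import os

def pin_to_cores(pid, cores):
    """Restrict the given process to a set of host core IDs (Linux-only)."""
    os.sched_setaffinity(pid, cores)       # set the allowed-core mask
    return os.sched_getaffinity(pid)       # read it back to confirm

# Pin this process (pid 0 means "self") to core 0, as if core 0
# were dedicated to one guest.
print(pin_to_cores(0, {0}))  # -> {0}
```

A hypervisor does the equivalent at a lower level, but the effect is the same: a guest's work is confined to the cores you hand it.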
Second, native support for virtualization at the processor level will become standard: both Intel and AMD are building it into their processors. This will accelerate adoption, drive common instruction sets, and improve memory management.
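On Linux you can already check whether a processor advertises these extensions. This is a small sketch, assuming the CPU flag names `vmx` (Intel's virtualization extensions) and `svm` (AMD's) as they appear in `/proc/cpuinfo`:

```python
def has_hw_virt(cpuinfo_text):
    """Return True if any 'flags' line in /proc/cpuinfo text lists vmx or svm."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # Format: "flags\t\t: fpu vme ... vmx ..."
            flags.update(line.split(":", 1)[1].split())
    return bool(flags & {"vmx", "svm"})

with open("/proc/cpuinfo") as f:
    supported = has_hw_virt(f.read())
print("hardware virtualization:", "reported" if supported else "not reported")
```

Whether the flag shows up depends on the processor and on whether the feature is enabled in firmware, so "not reported" does not always mean the silicon lacks it.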
Perhaps the biggest change will be the lower cost of implementing a virtualized system. Virtualization software is increasingly common, and some of it is free and open source. Moving forward, this means two things: a lower cost of implementation and more software options for customers… both of which are good for virtualization.
Virtualization will not be a passing fad. What will be interesting in the coming years is watching how software licensing will need to change, and how virtualization will lead to new ways to package and distribute software. With virtualization, a software developer can build some pretty creative applications inside the contained virtual environment.
This is all pretty exciting stuff, and it has us brainstorming new ways that virtualization can solve problems and advance the industry on both the client and the server. I’d like to hear where you think this is going. Expect to hear more from me on virtualization soon.