Virtualization has become something of a buzzword in cloud computing circles. But virtualization is not the same thing as cloud computing: it is the use of real, physical computers to create the illusion of many separate machines. It works like this:
- Take a real server and partition it into more than one component
- Load a separate operating system and its own software applications onto each partition
- Operate each partition as if it were a separate machine
Sounds easy, doesn’t it?
In actuality, it’s an easy concept to get your head around if you put it into simple terms. But it isn’t quite so easy to implement. If it were, everyone would be doing it, right?
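To make the concept concrete, here is a toy sketch of the three steps above. All the names (`Server`, `Partition`, `carve`) are hypothetical, made up for illustration; real virtualization is done by a hypervisor, not a few lines of Python, but the mental model is the same: one physical box, several independent "machines" inside it.

```python
from dataclasses import dataclass, field

@dataclass
class Partition:
    """One slice of the physical server, behaving like its own machine."""
    os: str
    apps: list

@dataclass
class Server:
    """The real, physical box being carved up."""
    cpus: int
    partitions: list = field(default_factory=list)

    def carve(self, os: str, apps: list) -> Partition:
        # Step 1: partition the server; steps 2-3: give the partition
        # its own OS and applications, then treat it as a separate machine.
        p = Partition(os, list(apps))
        self.partitions.append(p)
        return p

host = Server(cpus=16)
host.carve("Linux", ["web server"])
host.carve("Windows", ["database"])
# The one physical host now presents two independent "machines".
```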
It may seem as if companies have decided, out of nowhere, to go virtual, and some of them are doing it in the cloud. What it boils down to is that many companies that built their own data centers out of real-world servers are turning to virtual computing instead. They either convert their physical data centers into virtual ones or retire them altogether and move to the cloud. They benefit either way.
We have reached a tipping point for virtualization in America. It is much less expensive to own or lease a slice of a computer and use 90% of its available capacity than to own five servers and use only 50% or 60% of each machine. I say, let's go virtual, America.
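The cost argument above can be sanity-checked with a little arithmetic. This is a rough sketch, not a real capacity-planning tool: the 55% and 90% utilization figures are assumptions borrowed from the paragraph above, and the model ignores hypervisor overhead and headroom for load spikes.

```python
import math

def hosts_needed(workloads, target_utilization):
    """How many hosts are needed if the workloads are packed to the target level?

    workloads: fraction of one server each workload consumes (0.55 = 55%).
    target_utilization: how full we allow each consolidated host to run.
    """
    total = sum(workloads)                       # total server-equivalents of work
    return math.ceil(total / target_utilization)

# Five dedicated servers, each idling along at 55% utilization...
before = [0.55] * 5
# ...packed onto virtualized hosts run at 90% utilization:
after = hosts_needed(before, 0.90)
print(f"{len(before)} physical servers -> {after} virtualized hosts")
# → 5 physical servers -> 4 virtualized hosts
```

Even with these generous assumptions the fleet shrinks; at 50% starting utilization the same five servers fit on three hosts. That gap between capacity you pay for and capacity you use is the whole economic case.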