How to Overcome the Hidden Costs of Virtualization

Resource Centre

November 21, 2014


Virtualization is widely regarded as one of the most significant cost-saving server technologies to emerge in the last decade. Because virtual machines can spin up whole servers when they are needed and shut them down when they are not, general-purpose server hardware can, in theory, be readily re-allocated from one task to another as required.

In theory, no resources sit idle wasting money because one area of the infrastructure has been over-specified. In practice, however, it doesn’t always work out this way: the concept obscures a number of hidden costs. In this feature, we uncover some of these hidden costs and discuss the steps a network administrator can take to limit their impact.
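To make the idle-capacity argument concrete, here is a minimal sketch of the money spent on unused headroom. All the figures (server cost, average utilization) are hypothetical assumptions for illustration, not data from the article.

```python
# Hypothetical illustration: estimate the monthly spend on idle capacity
# for an over-specified dedicated server. All numbers are assumed.

def idle_cost(monthly_cost: float, avg_utilization: float) -> float:
    """Return the share of a server's monthly cost spent on idle capacity."""
    return monthly_cost * (1.0 - avg_utilization)

# A dedicated server costing $400/month that averages 15% utilization
# spends $340/month on capacity that does nothing.
wasted = idle_cost(400.0, 0.15)
print(f"Idle spend: ${wasted:.0f}/month")
```

Consolidating several such under-utilized servers onto one virtualized host is exactly the saving the paragraph above describes; the hidden costs discussed next are what can erode it.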

One of the key areas where virtualization has hidden costs stems from the very feature that makes it so useful. To achieve its much-vaunted flexibility, virtualization has to be a “jack of all trades, master of none”, so that optimizations for one task don’t cause a significant slow-down in another.

As a result, virtualization hardware is designed to give the best possible performance across a broad range of the most frequently used applications. But that means it cannot deliver best-in-class performance for any single application compared with hardware designed specifically for that task. So if you know which applications you will be running, and how much performance they will require, there is a good chance that dedicated hardware will be more cost-effective than full virtualization.
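The trade-off above can be expressed as cost per unit of delivered performance. The sketch below uses entirely hypothetical figures (prices and throughput numbers are assumptions, not benchmarks) to show how the comparison would be made.

```python
# Hypothetical sketch: compare cost per unit of delivered performance for
# purpose-built vs. general-purpose virtualized hardware. All numbers are
# assumptions for illustration only.

def cost_per_throughput(monthly_cost: float, throughput: float) -> float:
    """Monthly cost divided by application throughput (e.g. requests/sec)."""
    return monthly_cost / throughput

# Assume a purpose-built box delivers 1,200 req/s for $500/month, while a
# similarly priced virtualized host tuned for the general case delivers 800.
dedicated = cost_per_throughput(500.0, 1200.0)
virtualized = cost_per_throughput(500.0, 800.0)

if dedicated < virtualized:
    print("Dedicated hardware is more cost-effective for this workload")
```

The point is not the specific numbers but the method: once the target application and its required performance are known, this simple ratio is enough to test whether virtualization’s flexibility is worth its performance penalty.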

SOURCE: betanews.com
