Return on Investment (ROI) is typically associated with financial benchmarks of capital investments within a business. ROI has also made inroads as a way to describe the benefit, or return, an organization receives from technology expenditures, though efforts to show a true relation have often been tenuous. So much frustration has surrounded ROI for IT projects and purchases that a number of books, such as "How to Measure Anything," have been written to help demystify the often-confusing term.
The purpose of this discussion is to peel open the curtains just a little: to shed light on the ROI of typical server purchases and on how the ROI of core server equipment can be increased through virtualization. Along the way, I'll describe some examples of how to go about judging return on investment.
Return on investment for equipment purchases, specifically servers, can be viewed (simplistically) in one of two ways. The first is to declare 100% ROI once the server is purposed for a task, such as email. In this sense, because the server is dutifully doing its job of providing email to the organization, it has delivered its full return on investment. Calculating the ROI over time might then involve subtracting any downtime due to software or equipment failure, or time the server spent sitting in a closet waiting to be installed. The second way is to gather measurements of system performance to determine overall system utilization. Only with virtualization does the value of this second method of measuring become truly evident.
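To make the contrast concrete, here is a minimal sketch of the two calculations. All figures (downtime hours, deployment delay, utilization samples) are hypothetical, chosen only to illustrate the two philosophies:

```python
# Two simplistic ways to express server ROI, using hypothetical numbers.

# Method 1: availability-based. The server "earns" its return whenever it
# is up and doing its assigned job (e.g., serving email), so ROI is just
# the fraction of the year it was available.
hours_in_year = 365 * 24
downtime_hours = 20          # hypothetical outage time
deployment_delay_hours = 72  # hypothetical time spent waiting in the closet
availability_roi = (hours_in_year - downtime_hours - deployment_delay_hours) / hours_in_year
print(f"Availability-based ROI: {availability_roi:.1%}")

# Method 2: utilization-based. The server earns its return only in
# proportion to the resources actually consumed.
cpu_samples = [0.08, 0.12, 0.10, 0.09, 0.11]  # hypothetical CPU utilization samples
utilization_roi = sum(cpu_samples) / len(cpu_samples)
print(f"Utilization-based ROI: {utilization_roi:.1%}")
```

By the first yardstick this server looks nearly perfect; by the second, it is barely earning its keep. That gap is the whole argument for the second method.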
The way these two philosophies differ is subtle. At the risk of being cliché, I will illustrate with a car analogy. In one instance, I have the luxury of owning a Ferrari and take full gratification from owning the vehicle. I drive to and from work in my shiny, fast car, and therefore my ROI is 100%. Only if my Ferrari breaks down do I lose a return on my investment. In the other instance, I am the only person in the car, and I can only drive to work at 30 mph. Clearly, in the second case, owning the Ferrari doesn't make much sense. At the very least, if I were able to drive to work at 100 or 120 mph, it might be worth owning. If I could have another person or two in the car to share the cost of gas, I might really get some use out of it! However, because I commute in a crowded city, I will rarely, if ever, have the chance to operate at those speeds.
For many organizations, having a fast server chugging away on corporate email is like owning the Ferrari (a high-end server) but driving it at 30 mph with one person in the car. In technical parlance, it's equivalent to using about 10% of the server's resources (hard drive space, CPU, memory). Does it make sense to dedicate a server to a specific function when that function uses such a small share of the available resources?
Where virtualization comes into play is that the technology allows organizations to find underutilized servers and add functions to them without compromising the integrity of existing services. If two servers are each running at 20% utilization, more services can be added (SharePoint, anti-spam, anti-virus, backup, or other needs) until the utilization rate hits 70-80%. Now, at least, we're driving that Ferrari to work with two or more people in the car at 80 mph.
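As a rough sketch of the consolidation arithmetic, the snippet below packs a handful of hypothetical workloads (each expressed as a percentage of one physical server's capacity) onto as few virtualized hosts as possible while staying under the 80% ceiling. The workload sizes and the first-fit packing rule are illustrative assumptions, not a capacity-planning method from any particular vendor:

```python
# Hypothetical workloads, each a percentage of one physical server's
# capacity (e.g., email, SharePoint, anti-spam, anti-virus, backup).
workloads = [20, 20, 15, 10, 15]
target_ceiling = 80  # stay at or below ~80% to leave headroom

# Greedy first-fit packing of workloads onto virtualized hosts:
# place each workload on the first host with room, else start a new host.
hosts = []  # each entry is the combined utilization (%) of one host
for load in sorted(workloads, reverse=True):
    for i, used in enumerate(hosts):
        if used + load <= target_ceiling:
            hosts[i] = used + load
            break
    else:
        hosts.append(load)

print(f"{len(workloads)} workloads fit on {len(hosts)} host(s): {hosts}")
```

With these numbers, five single-purpose servers collapse onto one host running at 80%: the Ferrari finally carrying a full load.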
Maximum Return on Investment with VMWare
Dave Murphy Monday, August 06, 2007