Many IT departments run applications that are more than 20 years old. These programs provide a useful function, are stable, and just seem to keep running. Think for a moment about one of these programs. In the beginning, it was installed on a new medium-sized server. Three years later, the servers were refreshed. The new servers were more powerful than the old ones, and the application was installed on the weakest model used in that upgrade. It still ran fine because its CPU and RAM requirements had not changed. Every few years, the same application is moved to an ever more powerful replacement server. Gradually, an application that once used 60% of a server's capacity now uses 5% of it. Yet it requires roughly the same amount of electricity as any other server of its class (actually, a bit less, since it does not run very hard). How many of these underutilized servers do you have?
The solution in recent years has been server virtualization. Virtualization software allows one physical server to host many virtual servers; each application running in a virtual server believes it is still on its own physical device. Of course, there is a cost for this software and its support. However, consider the electrical savings alone: if 20 servers are consolidated onto one, the electrical footprint is dramatically reduced.
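As a rough back-of-the-envelope sketch of that savings claim, the snippet below estimates annual electricity use before and after consolidating 20 lightly loaded servers onto a single host. The wattage figures are illustrative assumptions, not measurements from the article; actual draw varies widely by hardware and load.

```python
# Rough estimate of the electrical savings from consolidating 20
# lightly loaded physical servers onto one virtualization host.
# IDLE_SERVER_WATTS and HOST_SERVER_WATTS are assumed values for
# illustration only.

IDLE_SERVER_WATTS = 250   # assumed draw of an underutilized server
HOST_SERVER_WATTS = 500   # assumed draw of a busier consolidation host
HOURS_PER_YEAR = 24 * 365

def annual_kwh(watts: float) -> float:
    """Convert a constant power draw in watts to kWh per year."""
    return watts * HOURS_PER_YEAR / 1000

before = 20 * annual_kwh(IDLE_SERVER_WATTS)  # 20 standalone servers
after = annual_kwh(HOST_SERVER_WATTS)        # 1 virtualization host

print(f"Before: {before:,.0f} kWh/yr")   # 43,800 kWh/yr
print(f"After:  {after:,.0f} kWh/yr")    # 4,380 kWh/yr
print(f"Saved:  {before - after:,.0f} kWh/yr ({1 - after / before:.0%})")
```

Even with these conservative figures, the consolidated host draws about a tenth of the electricity of the 20 standalone machines, before counting cooling, rack space, and hardware refresh costs.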