Virtual Machines: Usage & Performance


Virtualization is the technique of separating an operating system from the underlying physical hardware, providing greater utilization of IT resources and a more flexible system.

A virtual machine is a software representation of a physical computer, with its own set of virtual hardware into which an operating system and applications are loaded.


Developing a virtualized system has its benefits, but the hidden costs associated with such a system are seldom considered.

  • A company seeking to implement a virtualized system must be prepared to alter the current status quo, manage all performance activities, and take the scalability limitations into account.

The general goal for any virtualized system should be the reduction of costs and resources, without sacrificing service.

  • With virtualization a company may achieve several different goals, ranging from server consolidation and better security to eliminating legacy hardware, better backup management, and even reduced downtime in the most complex configurations.

Virtualization is a relatively new technology: it gained popularity on the x86 platform around 2001, and only in the last five years have companies fully adopted it.

Because most virtual machines don’t use a local disk, but rather take advantage of shared storage, they can make use of high-availability and load-balancing options.

  • In addition, they also increase the load on the NICs (network interface controllers), shared storage systems, and switches.

Handling scalability can also be a challenge, with a large share of a company’s IT budget going toward the efficient management tools and processes needed to handle it.

  • Most vendor systems on the market are limited when it comes to scalability, due to design flaws and limited flexibility to expand.

  • Scalability features are often only provided in enterprise versions of the virtualization software, which means additional costs.

Troubleshooting can be another challenge: with multiple virtual systems, administrators must troubleshoot both the underlying physical hardware and the virtual systems that run on it.

  • The virtual systems all share the same physical server, storage system, and network connection.

  • Troubleshooting a virtual system requires a long process of reviewing large amounts of logs and data from a number of different sources.

  • Finding a solution to the problem can be challenging for an administrator, who has to look at several screens and cross-reference the relevant data and logs.
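To illustrate that cross-referencing burden, logs from the host, storage, and guests can at least be merged into one time-ordered stream before review. A minimal sketch, assuming each source keeps timestamped entries (the sources and messages here are made up):

```python
import heapq

def merge_logs(*sources):
    """Merge per-source logs, each a list of (timestamp, message)
    already sorted by timestamp, into one time-ordered stream so an
    administrator can cross-reference events from a single view."""
    return list(heapq.merge(*sources, key=lambda entry: entry[0]))

# Hypothetical entries from two of the many sources involved:
host = [(1, "host: disk latency spike"), (5, "host: NIC reset")]
guest = [(3, "guest: app timeout")]
print(merge_logs(host, guest))  # events interleaved by timestamp
```

This does not replace a real monitoring product, but it shows why a single correlated timeline is what the administrator ultimately needs.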

Changes and upgrades can create bottlenecks and cause an increase in CPU utilization within a virtual system.

  • Simple activities, such as rolling out a service pack, anti-virus updates, or software patches, can put tremendous strain on a virtual system compared to a physical one.

Companies often have to take the route of developing around a vendor’s solution.

  • The marketplace, with a multitude of vendors in the virtualization industry, has yet to provide the ultimate solutions that companies are seeking to deal with their individual challenges.

  • Companies have to make the decision to either accept the problems or build a “work-around.”

Companies looking to make the leap to a 100% virtualized system must consider the challenges of such a system: the limited scalability, and the difficulty of having different systems work simultaneously as well as uniformly.


Protecting virtual machines in a disaster situation (system failure, power failure, etc.) has its challenges.

Keeping an exact backup copy of every virtual machine is the most reliable way to recover a system, but it can require a large amount of storage.

  • The backup must also be conducted while the virtual machines are running and serving clients 24/7.

Using third-party software to back up a virtual system often involves taking a snapshot prior to the backup and deleting it afterward.

  • Such software should be able to accomplish the following: communicate with the virtual machine manager during the backup process, gain access to each virtual machine’s disk files, and move data from the virtual machines as quickly and efficiently as possible without disrupting the workload.

One of the most important considerations when selecting a backup and recovery product is the product’s ability to interact with the virtualization vendor’s APIs.

Although some vendors may claim otherwise, when creating a snapshot of a heavily used machine there is a slight time lag (usually just a few seconds) during which the system becomes unresponsive while it processes the remaining I/Os.

  • Because of this lag time, it is best to schedule backups during off-peak business hours.

  • Another option is to install backup agents directly on the host servers, which enables the hosts to become the backup mechanism.
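The off-peak rule above reduces to a simple time-window check. A sketch, assuming a 22:00–06:00 quiet period (the window boundaries are an assumption, not a figure from the source):

```python
from datetime import time

# Assumed off-peak window; adjust to the business's actual quiet hours.
OFF_PEAK_START = time(22, 0)
OFF_PEAK_END = time(6, 0)

def is_off_peak(t: time) -> bool:
    """Return True if t falls inside the overnight off-peak window.
    The window wraps past midnight, so a time qualifies if it is
    after the start OR before the end."""
    return t >= OFF_PEAK_START or t < OFF_PEAK_END

print(is_off_peak(time(23, 30)))  # True: inside the window
print(is_off_peak(time(14, 0)))   # False: business hours
```

A backup scheduler would gate snapshot creation on this check so the few seconds of unresponsiveness land when the fewest clients are connected.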

Disk file backups should not take the place of traditional backup software agents running at the virtual machine’s guest operating system level.

  • Guest-based software agents can selectively back up only the data that has changed, rather than repeatedly backing up the entire operating system.
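At its core, a guest agent’s changed-data selection is a comparison of file modification times against the last backup. A minimal sketch with made-up paths and epoch timestamps:

```python
def changed_since(files, last_backup):
    """files: iterable of (path, mtime_epoch) pairs, as a guest agent
    might gather them. Return only the paths modified after the last
    backup, so unchanged OS files are not re-copied."""
    return [path for path, mtime in files if mtime > last_backup]

# Hypothetical guest files: a config edited recently, a system binary not.
files = [("/etc/app.conf", 1_700_000_500), ("/bin/ls", 1_600_000_000)]
print(changed_since(files, 1_700_000_000))  # only the changed config
```

Real agents track changes at the block or journal level rather than by mtime, but the payoff is the same: the daily backup set shrinks to what actually changed.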

It is highly recommended to use a mix of backup options (e.g., tape, disk) to accomplish the company’s backup and recovery objectives.

For an optimal backup strategy, the following should be implemented:

  • Once a week, back up the files on each virtual machine during off-peak hours.

  • Send all backups to a data deduplication storage network or a tape backup system.

  • Conduct a daily backup of the guest operating system’s applications and files, particularly those at the system level.

  • Store the daily backups, using an enterprise-class backup product, on a mixture of replicated storage, tape, or disk.
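The four points above amount to a calendar rule: one weekly full backup of the VM files plus daily guest-level backups. A sketch of that decision (the choice of Sunday and the target-media strings are illustrative assumptions, not from the source):

```python
from datetime import date

def backup_plan(d: date) -> str:
    """Pick the backup type for a given day: weekly full VM-file
    backup on Sunday (assumed), daily guest-level backup otherwise."""
    if d.weekday() == 6:  # Monday is 0, so Sunday is 6
        return "full VM files -> deduplication storage or tape"
    return "daily guest files -> replicated storage, tape, or disk"

print(backup_plan(date(2024, 1, 7)))  # 2024-01-07 was a Sunday
print(backup_plan(date(2024, 1, 8)))  # a Monday
```

In practice this rule would live in the backup product’s scheduler; the sketch only shows how the weekly and daily tiers divide the work.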


Virtual machines can provide great benefits to an organization, and deliver superior performance if optimized properly. There are a number of methods/tweaks to help ensure that a virtual machine is working at its peak performance:

Virtual machines should be stored on a separate drive from the boot drive.

  • One of the biggest performance challenges with virtual machines is the disk I/O rate.

Install virtual tools to ensure ease of operation within the virtual machine.

  • Virtual tools can often improve performance as they tend to provide special drivers for network, video and other devices.

Use virtual machines for any resource-intensive testing.

  • When heavily resource-intensive testing is needed (e.g., an Oracle database), a system’s performance can be affected immensely. The solution is to run such a testing environment as a virtual machine and only run it when needed, shutting the environment down when it is not in use.

Because virtual machines tend to consume more kernel memory, the host machine should not use the /3GB boot setting, as this will limit the number of virtual machines that can be launched.

Use Remote Desktop (RDP) to connect to remote virtual machines; this allows copy/paste operations, sharing of data between local and remote machines, and sharing of printers.

Ensure virtual hard drive files are defragmented to allow for faster I/O operations within the virtual machine.

Allocate sufficient memory to the guest machines, in order to reduce the I/O operations performed by memory swapping.


In any business environment, whether a start-up or a growth-oriented business, implementing a virtualized environment resolves a large number of IT challenges, reduces hardware costs, and reduces the size of the required data center.

There are a number of considerations to review before implementing a virtual environment in any organization:

  • Determine the virtualization goals of the company (e.g., provide greater flexibility in the data center’s operation, improve disaster recovery, reduce hardware/software costs, etc.)

  • Consider the applications to be used, and whether the line-of-business application vendors support virtualization.

  • Consider using a free hypervisor, provided it supports the applications to be installed.

  • Identify the hardware needs for the company, and consider going with a vendor that offers systems especially designed for virtualization.

  • Consider the number of servers needed, including acquiring a backup unit for spare parts or emergencies.

  • Determine whether desktop virtualization is optimal for the organization, as it can run all of the operating systems for the entire company.

  • Provide the ability for users to access a desktop and applications remotely.

On-demand applications

As an alternative to virtual machines, a new trend emerging in the corporate virtualization world is “application streaming,” which has become the solution for companies seeking the efficiency and affordability needed to conduct their business activities.

  • New versions of most popular productivity applications are launched every few years, with a large number of features and useful tools in each upgrade.

  • Companies wanting to take advantage of these upgrades have to spend an enormous amount of money whenever a new version comes to market.

  • Most popular programs include a large number of features and options that are seldom used by the average user.

  • Application streaming becomes the solution because it allows the distribution of applications to individual users with only the features that are needed.

The process is very simple: the main program is installed on a company’s local server.

  • The program is then streamed to users as they connect to the network.

  • Resources are released again as users disconnect from the network, making the system more efficient.
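The connect/stream/release cycle above can be modeled in a few lines. A toy sketch, where the class and method names are illustrative rather than any real streaming product’s API:

```python
class StreamingServer:
    """Toy model of application streaming: the application lives on
    the local server; connecting users stream it, and disconnecting
    users release their server-side resources again."""

    def __init__(self):
        self.active_users = set()

    def connect(self, user):
        self.active_users.add(user)      # application streamed on connect

    def disconnect(self, user):
        self.active_users.discard(user)  # resources released on disconnect

srv = StreamingServer()
srv.connect("alice")
srv.connect("bob")
srv.disconnect("alice")
print(len(srv.active_users))  # 1: only bob still holds resources
```

The point of the model is the invariant: server resources track currently connected users, not the total number of desktops the application could reach.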

Application streaming provides several benefits, including:

  • A company can save an enormous amount of bandwidth and achieve better memory utilization, because users load only a small portion of the application’s features.

  • The time consuming process of installing applications on each desktop is eliminated.

  • Companies can substantially lower the cost of End User License Agreements (EULAs), paying only for licenses used concurrently rather than for an installed instance of the application on every desktop.

  • Managing and updating software is quick and efficient.
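The concurrent-licensing saving in the list above is straightforward arithmetic; a sketch with hypothetical figures (desktop count, peak concurrency, and per-license cost are all made up):

```python
def license_savings(desktops, peak_concurrent, cost_per_license):
    """Compare per-desktop licensing (one license per installed copy)
    with concurrent-use licensing (licenses only for peak simultaneous
    users). All inputs are hypothetical illustration values."""
    per_desktop = desktops * cost_per_license
    concurrent = peak_concurrent * cost_per_license
    return per_desktop - concurrent

# 500 desktops, but at most 120 users run the app at once, at $200 per license:
print(license_savings(500, 120, 200))  # 76000
```

The saving scales with the gap between installed seats and peak concurrent use, which is exactly where seldom-used feature-heavy applications benefit most.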