Working with VMware Update Manager Server

Have you ever considered the best way to plan, design, and work with VMware Update Manager (VUM)? In the early days, on VMware Infrastructure 3.x when VUM was first released, I would end up installing VUM on the vCenter Server itself. After all, that was VMware's recommendation at the time. I propose that this is no longer the case, and I would like to present a list of best practices to follow when working with VMware Update Manager. This list came from VMware, but it should only be considered a guide. Each environment is different, and your mileage may (and will) vary.

VUM Best Practices

  • Separate the VUM database from the VirtualCenter database when there are 500+ virtual machines or 50+ hosts.
  • Separate both the VUM server and the VUM database from the VirtualCenter server and the VirtualCenter database when there are 1000+ virtual machines or 100+ hosts.
  • Make sure the VUM server host has at least 2GB of RAM to cache patch files in memory.
  • Allocate separate physical disks for the VUM patch store and the VUM database.
  • Because the Windows guest agent is installed in each virtual machine the first time a powered-on scan is run, the first powered-on scan command can take longer than subsequent scans. It may therefore be desirable to run the first scan command when this additional time will not be an issue.
  • For a large setup, powered-on virtual machine scans are preferred if VUM server resources are constrained or if more scan concurrency is needed.
  • Multiple vCPUs do not speed up VUM operations, because the VUM guest agent is single-threaded.
  • Configure each virtual machine with at least 1GB of RAM so large patch files can fit in the system cache.
  • Deploy the VUM server close to the ESX hosts if possible. This reduces network latency and packet drops.
  • On a high-latency network, powered-on virtual machine scans are preferred as they are not sensitive to network latency.
  • Check whether on-access virus scanning software is running on the VUM server host. If it is, exclude the disks that VUM mounts from the scan.
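The sizing thresholds in the first two bullets can be sketched as a simple decision helper. This is a minimal illustration (not a VMware tool); the cut-off numbers come straight from the list above, and you should tune them for your own environment.

```python
# Sketch of the VUM sizing guidance above (not a VMware tool).
# Thresholds are the 500/50 and 1000/100 figures from the best-practices list.
def vum_deployment_advice(vm_count: int, host_count: int) -> str:
    """Map inventory size to a VUM separation recommendation."""
    if vm_count >= 1000 or host_count >= 100:
        return "separate VUM server and VUM database from vCenter"
    if vm_count >= 500 or host_count >= 50:
        return "separate VUM database from the vCenter database"
    return "co-located VUM is acceptable"
```

For example, `vum_deployment_advice(600, 40)` falls into the middle tier and suggests splitting only the database, while `vum_deployment_advice(1200, 120)` suggests fully separating both the VUM server and its database.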

Again, your mileage will vary on the actual numbers. I was working in an environment about half the size listed above, and the vCenter Server would still respond sluggishly at times. This brings me to my point: knowing that every environment is different, whenever possible I recommend moving VMware Update Manager to a separate server right off the bat. That separation gives you one less thing to worry about, or consider, when troubleshooting vCenter Server performance issues.

I have found what I consider a limitation with VMware Update Manager: each VUM server is tied to a single instance of vCenter Server. That may not sound like too bad a limitation, and you would be correct, but consider this scenario. You are running multiple nested vSphere vCenter Servers. One nested vCenter might run a couple of ESX hosts for a lab, while another controls a VDI environment, for example. Because these nested groups are much smaller environments than production, it would be great if the VUM server attached to the primary vCenter Server could update the nested environments as well. For larger environments with large nested infrastructures, this is not really an issue, because you would want the extra capacity that multiple VUM servers provide when maintaining the environment.

On a positive note, when using multiple VUM servers in your environment, you can designate one VUM server as your primary, which downloads the updates and patches from VMware's repository; the other VUM servers in your infrastructure can then use the main VUM server as their repository to pull updates and patches from. This gives you a centralized way to make sure all of your VUM servers have the same updates available.
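The payoff of that shared-repository setup is consistency, which can be pictured as a simple set comparison. The sketch below is purely illustrative (it is not the VUM API), and the patch IDs are hypothetical examples.

```python
# Illustrative sketch (not the VUM API): with a shared repository, every
# downstream VUM server should end up carrying the same patch set as the
# primary server that downloads from VMware's repository.
def missing_updates(primary_patches: set, downstream_patches: set) -> set:
    """Return the patch IDs a downstream VUM server still needs to sync."""
    return primary_patches - downstream_patches

# Hypothetical patch IDs, for illustration only.
primary = {"ESX350-200812401-BG", "ESX350-200812404-SG"}
downstream = {"ESX350-200812401-BG"}
print(missing_updates(primary, downstream))  # the patch the downstream lacks
```

An empty result for every downstream server is exactly the state the shared-repository arrangement is meant to guarantee.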

The limitation itself is not a show stopper by any means, but it is something to consider when designing your infrastructure. One of the main points I wanted to make with this post is that I believe it prudent, and a best practice going forward, to keep the vCenter Server and the VUM server separated from each other. What are your thoughts?