Managing Virtualization

Organizations can take advantage of many benefits by implementing virtualization in their servers and storage networks. However, they also take on the challenge of managing virtualized resources in complex environments.

Managing Memory, CPUs, and Storage in Virtualized Data Centers

Server and storage virtualization have many benefits, including savings on electricity, heating, and cooling. However, large data centers face challenges. One such challenge is managing memory in the host physical servers. Operating systems installed on host servers now contain more code and require more memory to operate. IT staff members allocate memory to applications on virtual machines by using hypervisor software such as VMware's. Staff members need to monitor applications' memory usage so that servers can be upgraded, or applications moved to other hosts, if memory in the current host is inadequate. If this isn't done, applications will run slowly and response times on individual user computers will be degraded.

New servers are equipped with quad CPUs: four CPU chips on a single host server. However, as the amount of processing needed to run programs grows, even this is not always adequate for the virtual machines installed on them. Programs with sound and video are "fatter," containing more code, which requires additional CPU overhead. Under these conditions, it's difficult to estimate the amount of CPU power required for applications running on virtual machines.

In addition, disk storage is being consumed at a faster rate. Users now routinely store MP3 files in their e-mail inboxes. In data centers where user files and e-mail messages are archived, this can deplete spare storage in Storage Area Networks (SANs) faster than planned. Thus, storage, memory, and processing requirements should all be monitored. In large, complex data centers, it can sometimes be easier to monitor and manage single physical servers than virtual machines and storage.
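The memory-monitoring practice described above can be sketched in code. The following is a minimal illustration, not a real hypervisor API: the host inventory, per-VM memory figures, and the 90 percent threshold are all hypothetical.

```python
# Sketch: flag hosts whose virtual machines' combined memory allocation
# approaches the host's physical memory, so staff can upgrade the host
# or move applications elsewhere. All figures are hypothetical.

def hosts_needing_attention(hosts, threshold=0.9):
    """Return hosts where allocated VM memory exceeds threshold * physical memory."""
    flagged = []
    for host in hosts:
        allocated = sum(vm["memory_gb"] for vm in host["vms"])
        if allocated > threshold * host["physical_memory_gb"]:
            flagged.append((host["name"], allocated, host["physical_memory_gb"]))
    return flagged

inventory = [
    {"name": "host-a", "physical_memory_gb": 64,
     "vms": [{"name": "mail", "memory_gb": 32}, {"name": "web", "memory_gb": 28}]},
    {"name": "host-b", "physical_memory_gb": 128,
     "vms": [{"name": "db", "memory_gb": 48}]},
]

for name, used, total in hosts_needing_attention(inventory):
    print(f"{name}: {used} GB allocated of {total} GB physical")
```

In practice a hypervisor's own management tools report these figures, but the decision rule is the same: compare total allocation per host against physical capacity before response times degrade.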
In small organizations, managing memory, storage, and CPU usage is not as complex. With only one or two physical servers, it's not as difficult to track and manage resource allocation.

Server Sprawl

Server sprawl is the unchecked proliferation of virtual machines on physical host servers. (Virtual machines are also referred to as images.) Managing server sprawl in large data centers is a major challenge. Because it's easier to install applications on virtualized servers, the number of applications can escalate rapidly. Data centers that previously had 1,000 servers and 1,000 applications can now potentially have eight images per physical server and 8,000 applications to manage.

Another cause of server sprawl occurs when an application is upgraded. To test the upgrade, staff create an image of the application, upgrade the image, and then test it with a small group of users before making the upgrade available to everyone. However, the original and the upgraded application are often both left on the physical server, which further contributes to sprawl. To complicate matters, if either version is later moved to a different physical server, it can be difficult to determine that a duplicate exists.

Software to Monitor Virtualized Data Centers

Small data centers with only one or two physical computers hosting virtual machines are generally easier to manage than the same data centers with perhaps 20 applications installed on 20 physical servers. However, there is a learning curve in managing numerous physical servers, each running multiple operating systems and applications on virtual machines. It involves keeping track of which physical server each image resides on, as well as each application's address. For large data centers, it is often easier to manage physical servers, each of which can be physically observed, than to manage an array of applications on virtual machines.
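The duplicate-image problem described above is essentially an inventory question: which applications exist as images on more than one host? A minimal sketch, assuming a simple inventory of image records (the application names and host names here are made up):

```python
# Sketch: detect duplicate application images scattered across physical
# hosts -- e.g., an original and an upgraded test copy that drifted onto
# different servers. The inventory data below is hypothetical.

from collections import defaultdict

def find_duplicates(images):
    """Group images by application name; return apps present on more than one host."""
    by_app = defaultdict(list)
    for img in images:
        by_app[img["app"]].append(img["host"])
    return {app: hosts for app, hosts in by_app.items() if len(hosts) > 1}

images = [
    {"app": "payroll", "host": "host-1"},
    {"app": "payroll", "host": "host-7"},   # leftover copy from an upgrade test
    {"app": "budgeting", "host": "host-2"},
]

print(find_duplicates(images))
```

This only works if the inventory itself is kept current, which is exactly why the text stresses tracking which physical server each image resides on.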
To manage virtualized data centers properly, a good suite of management tools is required. Moreover, because problems can occur in switches, servers, and storage, the best management suites are those capable of diagnosing and monitoring all three areas. Visibility across these three sectors is complicated by the fact that in many data centers, the switches, servers, and storage are supplied by different manufacturers. One solution is to use a third-party suite of management software capable of monitoring equipment from more than one manufacturer. However, in economic downturns, organizations are often reluctant to invest in this type of software; new applications critical to an organization's core operations take priority in resource allocation.

A Nonprofit's Experiences with the Cloud

A nonprofit organization that works with developing countries has offices in the United States and remote sites around the world. The organization's vision is to move as many applications as possible to either hosted sites or cloud providers. (With hosting, the customer's own servers are located at the host's site.) Several applications, including budgeting and payroll, have already been outsourced to cloud or hosting sites. The organization is now moving its accounting packages to a hosting facility, where they will run on the nonprofit's own server and be managed by a software developer.

The nonprofit still has in-house applications, including e-mail and its IP phone and voicemail systems. It hasn't moved the phone and voicemail systems because it owns the hardware on which they reside and has multiyear commitments for the software licenses. When the organization first moved its budget application to the cloud, there were many snags: users had problems logging in, they couldn't see department charge codes, and some managers couldn't charge items to their own departments. The application still does not work satisfactorily.
A major problem is that not all of these remote applications are compatible with the organization's current browser. The nonprofit would like to see more standardization in this area. When it moves an application off-site, it moves the associated database as well; management feels that this creates a simpler configuration than moving the application alone. The IT director feels that the cloud works well for testing and developing new applications, but that it needs to improve. He has heard that many other nonprofits are also using cloud-based applications, particularly e-mail. And although the IT department might shrink, it will still be needed to manage the switches on the Ethernet LAN, which is critical for accessing and transmitting data on cloud-based applications.

The nonprofit uses Cisco's WebEx, a web-based audio and video conferencing service that it accesses from user desktops and camera-equipped laptops. Because the service is Internet-based, remote staff members can easily access it. However, the IT director would like WebEx to be easier to use and plans to integrate it with the Microsoft Office applications on users' desktops. The WebEx service is also used for collaboration and conferencing, where participants can write comments on a "whiteboard." This is part of the organization's concept of an office without walls, in which people work easily with staff members in other locations. As part of this concept, more employees, such as new mothers, now work from home and access their files remotely.

LAN Traffic

The volume of LAN traffic has grown exponentially, driven by powerful computers able to handle multimedia traffic. This traffic has a longer duration and shorter pauses between packets during which other devices can transmit. This is true whether applications are on desktops, centralized in data centers, or in the cloud, because at some point data must still be transmitted over the LAN to the user. The following are examples:
  • Large graphics file attachments, such as PowerPoint files.
  • Daily backups of entire corporate files to databases.
  • Web downloads of long, continuous streams of image, audio, and video files.
  • Growth of audio and video conferencing conducted over LANs.
  • Voicemail, call center, and voice traffic.
  • Access by remote and local employees to applications and personal files on centralized servers and storage devices.
  • Web access, during which users typically have four or more pages open concurrently. Each open web page is a session, an open communication with packets traveling between the user and the Internet.
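The last point lends itself to simple back-of-envelope arithmetic: if each open web page is one session, the number of concurrent sessions a LAN must carry scales with both the user count and the pages each user keeps open. A small sketch (the user counts are illustrative, and the four-pages-per-user figure comes from the list above):

```python
# Back-of-envelope: each open web page is one session, so total
# concurrent sessions = users * open pages per user. User counts
# below are illustrative.

def concurrent_sessions(users, pages_per_user=4):
    """Estimate concurrent web sessions crossing the LAN."""
    return users * pages_per_user

for users in (50, 200, 1000):
    print(users, "users ->", concurrent_sessions(users), "sessions")
```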