HP Integrity Servers with Microsoft Windows Server 2003 Cluster Installation and Configuration Guide
HP Part Number: 5992-0936
Published: September 2007
Table of Contents
About This Document
    Intended Audience
    New and Changed Information in This Edition
    Document Organization
    Typographic Conventions
    Related Information
    Publishing History
    HP Encourages Your Comments
1 Introduction
    Clustering Overview
    Server Cluster Versus Network Load Balancing
        Server Cluster
        NLB
    Cluster Terminology
        Nodes
        Cluster Service
List of Figures
    NLB Example
    Single Quorum Example
    MNS Quorum Example
    Example cluster hardware cabling scheme
List of Tables
    Server Cluster and NLB Features
    Installation and Configuration Input
About This Document
This document describes how to install and configure clustered computing solutions using HP Integrity servers running Microsoft® Windows® Server 2003.
The document printing date and part number indicate the document's current edition. The printing date changes when a new edition is printed. Minor changes may be made at reprint without changing the printing date.
Typographic Conventions
Ctrl+x   A key sequence. A sequence such as Ctrl+x indicates that you must hold down the key labeled Ctrl while you press another key or mouse button.
[ ]      The contents are optional in command line syntax. If the contents are a list separated by |, you must choose one of the items.
1 Introduction
This document describes how to install and configure clustered computing solutions using HP Integrity servers running Microsoft® Windows® Server 2003. The clustering improvements for Microsoft Windows Server 2003, 64-bit Edition (over Microsoft Windows 2000) include the following:
• Larger cluster sizes: The 64-bit Enterprise and Datacenter Editions now support up to eight nodes.
• Public network: One or more public networks can be used as a backup for the private network and can be used both for internal cluster communication and to host client applications. Network adapters, known to the cluster as network interfaces, attach nodes to networks.
Table 1-1 Server Cluster and NLB Features (continued)
Server Cluster: Supports clusters up to eight nodes; requires the use of shared or replicated storage.
NLB: Supports clusters up to 32 nodes; doesn't require any special hardware or software.

Server Cluster
Use a server cluster to provide high availability for mission-critical applications through failover. It uses a shared-nothing architecture, which means that a resource can be active on only one node in the cluster at any given time.
Figure 1-1 NLB Example

Cluster Terminology
A working knowledge of clustering begins with the definition of some common terms. The following terms are used throughout this document.

Nodes
Individual servers or members of a cluster are referred to as nodes or systems (the terms are used interchangeably).
If a resource cannot be brought online or taken offline within a specified amount of time, it is set to the failed state. You can specify how long the Cluster service waits before failing the resource by setting its pending timeout value in Cluster Administrator. Resource state changes can occur either manually (when you use Cluster Administrator to make a state transition) or automatically (during the failover process).
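If you prefer the command line to Cluster Administrator, the hedged sketch below assumes the cluster.exe tool included with Windows Server 2003 and its PendingTimeout resource property (expressed in milliseconds); the resource name is a placeholder, and the syntax shown reflects typical usage rather than text from this guide.

```python
import subprocess

# Placeholder resource name; substitute a resource shown in Cluster Administrator.
RESOURCE = "IIS Resource"

# List the resource's common properties, including PendingTimeout (milliseconds),
# using the cluster.exe command-line tool that ships with Windows Server 2003.
subprocess.run(["cluster", "resource", RESOURCE, "/prop"], check=True)

# Raise the pending timeout to 5 minutes (300,000 ms) so a slow-starting resource
# is not marked as failed prematurely.
subprocess.run(["cluster", "resource", RESOURCE, "/prop", "PendingTimeout=300000"],
               check=True)
```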
Arbitration
The quorum is used as the tie-breaker to avoid split-brain scenarios. A split-brain scenario occurs when all network communication links between two or more cluster nodes fail. In these cases, the cluster can split into two or more partitions that cannot communicate with each other. The quorum then guarantees that any cluster resource is brought online on one node only.
Stateful applications
Applications or Windows NT services that require only a single instance at any time and that must store state information typically use single quorums, because a single-quorum cluster already provides shared storage for that state. Connecting all nodes to a single storage device simplifies transferring control of the data to a backup node.
In the case of a failure or split brain, all partitions that do not contain an MNS quorum are terminated. This ensures that a partition containing a majority of the nodes can safely start any resources that are not already running in that partition, because it is guaranteed to be the only partition in the cluster that is running resources.
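To make the majority rule concrete, here is a minimal sketch (illustrative only, not taken from this guide) that checks whether a partition retains an MNS quorum; a partition survives only if it holds a strict majority of the configured nodes.

```python
def partition_survives_mns(nodes_in_partition: int, total_configured_nodes: int) -> bool:
    """Return True if a partition holds an MNS quorum (a strict majority of nodes)."""
    return 2 * nodes_in_partition > total_configured_nodes

# Example: an 8-node cluster splits 5/3 after a network failure.
# The 5-node partition keeps running; the 3-node partition is terminated.
print(partition_survives_mns(5, 8))  # True
print(partition_survives_mns(3, 8))  # False
print(partition_survives_mns(4, 8))  # False: exactly half is not a majority
```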
Failback
Failback is the process of returning a resource or group of resources to the node on which it was running before it failed over. For example, when node A comes back online, IIS can fail back from node B to node A.
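If you prefer the command line to Cluster Administrator, the hedged sketch below assumes the cluster.exe tool available on Windows Server 2003 and its AutoFailbackType group property to allow automatic failback; the group name is a placeholder, and the property usage reflects typical MSCS behavior rather than text from this guide.

```python
import subprocess

# Placeholder group name; substitute a group shown in Cluster Administrator.
GROUP = "IIS Group"

# Inspect the group's common properties; AutoFailbackType controls whether the group
# automatically fails back to its preferred owner (0 = prevent, 1 = allow).
subprocess.run(["cluster", "group", GROUP, "/prop"], check=True)

# Allow automatic failback so the group returns to its original node (node A in the
# example above) once that node is online again.
subprocess.run(["cluster", "group", GROUP, "/prop", "AutoFailbackType=1"], check=True)
```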
2 Administering the cluster
This chapter provides step-by-step installation and configuration directions for HP Integrity clustered systems running Microsoft Windows Server 2003, 64-bit Edition.

Verifying Minimum System Requirements
To verify that you have all of the required software and firmware and have completed all the necessary setup tasks before beginning your cluster installation, complete the following steps:
1. Before installation, see the HP Cluster Configuration Support website for details about the components that make up a valid cluster configuration.
9. Verify that you have sufficient administrative rights to install the OS and other software onto each node.
10. Verify that all of the required hardware is properly installed and cabled (see Figure 2-1). For information about best practices for this step, go to:
http://www.microsoft.com/technet/prodtechnol/windowsserver2003/library/ServerHelp/f5abf1f9-1d84-4088-ae54-06da05ac9cb4.mspx
Table 2-1 Installation and Configuration Input

Input Description                                        Value
Node name                                                Node 1:   Node 2:   Node 3:   Node 4:
                                                         Node 5:   Node 6:   Node 7:   Node 8:
Public network connection, IP address, and               Node 1:  IP address:   Subnet mask:
subnet mask for each node                                Node 2:  IP address:   Subnet mask:
                                                         ...
Configuring the Public and Private Networks
NOTE: Private and public NICs must be configured in different subnets; otherwise, the cluster service and the Cluster Administrator utility cannot detect the second NIC.
In clustered systems, node-to-node communication occurs across a private network, while client-to-cluster communication occurs across one or more public networks.
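As a quick sanity check of the addressing plan, the following sketch (example addresses only, not values from this guide) uses Python's ipaddress module to confirm that the private and public NICs fall in different subnets.

```python
import ipaddress

# Example addresses only; substitute the values you recorded in Table 2-1.
private_nic = ipaddress.ip_interface("10.0.0.1/255.255.255.0")      # private (heartbeat) NIC
public_nic = ipaddress.ip_interface("192.168.10.21/255.255.255.0")  # public NIC

# The cluster service requires the two NICs to sit in different subnets.
if private_nic.network == public_nic.network:
    raise SystemExit("Private and public NICs are in the same subnet; reassign one of them.")
print("OK: private subnet", private_nic.network, "and public subnet", public_nic.network, "differ.")
```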
Click the General tab. Be sure that only the Internet Protocol (TCP/IP) checkbox is selected. If you have a network adapter that transmits at multiple speeds, manually specify a speed and duplex mode. Do not use an autoselect setting for speed, because some adapters can drop packets while determining the speed.
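After the adapters are configured, you can verify basic private-network connectivity from one node to the others; the sketch below is illustrative, the addresses are placeholders, and it simply wraps the standard Windows ping command.

```python
import subprocess

# Placeholder private-network addresses for the other cluster nodes; replace them with
# the values you assigned when configuring the private NICs.
PRIVATE_ADDRESSES = ["10.0.0.2", "10.0.0.3"]

# Send a single ICMP echo to each node over the private network (Windows ping uses -n
# for the packet count). A failure here usually points to cabling, speed/duplex, or
# subnet mistakes in the steps above.
for address in PRIVATE_ADDRESSES:
    result = subprocess.run(["ping", "-n", "1", address], capture_output=True, text=True)
    status = "reachable" if result.returncode == 0 else "NOT reachable"
    print(f"{address}: {status}")
```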
NOTE: If your public network paths are teamed, you must put your teamed connection at the top of the list (instead of the external public network). Repeat Step 1 through Step 7 for each node in the cluster. Be sure to assign a unique IP address to each node while keeping the subnet mask the same for all nodes.
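Before repeating the steps on the remaining nodes, it can help to write the public addressing plan down and check it; the sketch below uses example node names and addresses (not values from this guide) to confirm that every node has a unique IP address and that all nodes share one subnet mask.

```python
# Illustrative per-node public addressing plan; fill this in from Table 2-1.
public_plan = {
    "NODE-1": ("192.168.10.21", "255.255.255.0"),
    "NODE-2": ("192.168.10.22", "255.255.255.0"),
    "NODE-3": ("192.168.10.23", "255.255.255.0"),
}

addresses = [ip for ip, _mask in public_plan.values()]
masks = {mask for _ip, mask in public_plan.values()}

# Every node needs its own IP address...
assert len(addresses) == len(set(addresses)), "Duplicate public IP address in the plan."
# ...but all nodes must share the same subnet mask.
assert len(masks) == 1, "All nodes must use the same public subnet mask."
print("Public addressing plan looks consistent.")
```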
Install and configure your HP StorageWorks MultiPath for Windows software. For an overview and general discussion of the MultiPath software, go to:
http://h18006.www1.hp.com/products/sanworks/secure-path/spwin.html
HP MultiPathing IO (MPIO) Device Specific Module software can be used as an alternative to HP StorageWorks Secure Path to provide multipath support.
NOTE: You must use MultiPath software if more than one host bus adapter (HBA) is installed in each cluster.
Click the Computer Name tab, and click Change. Select Domain Name and enter the domain name determined by your network administrator. Reboot when prompted and log into the new domain.
Install the MultiPath software on this node. All other nodes should be powered off before completing this step.
Click Start > Programs > Administrative Tools > Computer Management, and select Disk Management.
In the Action menu list, select Add Nodes to Cluster and click OK. In the Welcome to Add Nodes wizard, click Next. Enter the name of the node you want to add under Computer Name, click Add, then click Next. Cluster analysis begins.
NOTE: You can list all the nodes at the same time by entering the name of each one and clicking Add.
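Because the Add Nodes wizard must reach each node by name, a quick pre-check that every planned node name resolves can save a failed analysis pass; the node names in this sketch are placeholders.

```python
import socket

# Placeholder computer names for the nodes you plan to add; use your own node names.
NODES_TO_ADD = ["NODE-2", "NODE-3"]

# Confirm that every name resolves from the node where you are running the wizard.
for name in NODES_TO_ADD:
    try:
        address = socket.gethostbyname(name)
        print(f"{name} resolves to {address}")
    except socket.gaierror as err:
        print(f"{name} does not resolve: {err}; fix name resolution before adding this node")
```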
Validating Cluster Operation
To validate your cluster installation, use one or both of the following methods from any node in the cluster.

Method 1: Simulate a Failover
To simulate a failover, complete the following steps:
1. Select Start > Programs > Administrative Tools > Cluster Administrator and connect to the cluster.
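As an alternative to the Cluster Administrator steps, the hedged sketch below drives the same failover test with the cluster.exe command-line tool; the group and node names are placeholders, and the commands reflect typical cluster.exe usage rather than text from this guide.

```python
import subprocess

# Placeholder names; use a group and a standby node from your own cluster.
GROUP = "Cluster Group"
TARGET_NODE = "NODE-2"

# Show where the group is currently online.
subprocess.run(["cluster", "group", GROUP, "/status"], check=True)

# Move the group to another node; this exercises the same failover path that a
# node failure would trigger, without taking a node down.
subprocess.run(["cluster", "group", GROUP, f"/moveto:{TARGET_NODE}"], check=True)

# Confirm the group came online on the target node.
subprocess.run(["cluster", "group", GROUP, "/status"], check=True)
```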
With clustered systems, you can do maintenance even when users are online. Wait until a convenient, off-peak time when one of the nodes in the cluster can be taken offline for maintenance and its workload distributed among the remaining nodes. Before the upgrade, however, you must evaluate the entire cluster to verify that the remaining nodes can handle the increased workload.