E Agent Deploy Application - Installation Prerequisites

Ensure the prerequisites discussed in this appendix are met before proceeding with an installation using Agent Deploy. This appendix contains the following sections:

  • Prerequisites

  • Prerequisite Checks Executed by Agent Deploy

  • Troubleshooting Failed Prerequisite Checks

Prerequisites

The following prerequisites must be met before proceeding with the installation:

Set Up SSH (Secure Shell) User Equivalence

You must set up SSH (Secure Shell) before deploying the Management Agent with the Agent Deploy application. This is required because Agent Deploy uses SSH and SCP to communicate between nodes. Setting up user equivalence avoids SSH authentication prompts during subsequent Agent Deploy operations.


IMPORTANT:

The SSH User Equivalence must always be set between the target hosts and the OMS, and never among the target hosts.

To set up SSH, execute the sshUserSetup.sh script, which is available at the following location:

OMS_HOME/sysman/prov/resources/scripts/sshUserSetup.sh

Usage of this script is as follows:

sshUserSetup.sh -hosts "<hostlist>" -user <user name> [-verify] [-confirm] [-shared]

For example, sshUserSetup.sh -hosts "host1 host2" -user sjohn -advanced

Description

This script is used to set up SSH user equivalence from the host on which it is run to the specified remote hosts. After this script is run, you can use SSH to execute commands on the remote hosts, or copy files between the local host and the remote hosts without being prompted for passwords or confirmations.
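For instance, once equivalence is in place, a quick manual check might look like this (the host and user names are illustrative):

    ssh foo@host1 date                   # runs date on host1 with no password prompt
    scp /tmp/localfile foo@host1:/tmp/   # copies a file with no password prompt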

The list of remote hosts and their user names are specified as command line parameters to the script.

  • -shared

    If the home directory is NFS-mounted or otherwise shared across the remote hosts, run the script with the -shared option.

    To determine whether an Oracle home directory is shared, consider a scenario where you want to find out whether the home directory of user1 is shared across hosts A, B, and C. You can determine this by following the steps below (a command sketch follows these steps):

    1. On host A, create a marker file by executing touch ~user1/checkSharedHome.tmp.

    2. On hosts B and C, execute ls -al ~user1/checkSharedHome.tmp.

      If the file is present on hosts B and C in the ~user1 directory and is identical on all nodes, the user's home directory is shared.

    3. On host A, remove the marker file by executing rm -f ~user1/checkSharedHome.tmp.
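    A minimal sketch of the same check, run from the local host (user and host names are illustrative):

        ssh user1@A 'touch ~/checkSharedHome.tmp'   # create a marker file on host A
        ssh user1@B 'ls -al ~/checkSharedHome.tmp'  # present and identical => shared home
        ssh user1@C 'ls -al ~/checkSharedHome.tmp'
        ssh user1@A 'rm -f ~/checkSharedHome.tmp'   # clean up the marker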


    Note:

    In the event that you accidentally pass the -shared option for non-shared homes or vice versa, the SSH equivalence is only set up for a subset of the hosts. You will have to rerun the setup script with the correct option to rectify this issue.

  • -verify

    The -verify option allows you to check whether SSH has been set up. In this case, the script does not set up SSH, but only checks whether SSH user equivalence has been set up from the local host to the remote hosts. It then runs the date command on each remote host using SSH. If you are prompted for a password or see a warning message for a particular host, SSH user equivalence has not been set up correctly for that host. If the -verify option is not specified, the script sets up SSH and then performs the verification as well.

  • -confirm

    The -confirm option allows you to set up SSH user equivalence with a forced change in the permissions on the remote hosts. That is, if you pass the -confirm option, the script does not prompt you to confirm the change in permissions.

  • -help

    Use this option to view the Readme file for the sshUserSetup.sh script. The usage is as follows:

    sshUserSetup.sh -help 
    
    

The following examples illustrate the use of the above-mentioned options. They assume the following setup:

Local host = Z
Remote hosts = A, B, and C
Local user = sjohn
Remote users = foo (non-shared)
        aime (shared)

Example E-1 Setting Up SSH User Equivalence and Verifying the Setup

sshUserSetup.sh -hosts "A B C" -user foo

This script sets up user equivalence from:

  • Z to A

  • Z to B

  • Z to C

Example E-2 Setting Up SSH User Equivalence Without a Confirmation Prompt

sshUserSetup.sh -hosts "A B C" -user foo -confirm

This sets up SSH between the local host and A, B, and C, and also verifies the setup. However, because the -confirm option is used, the script assumes that you are aware of the changes that will be made on the systems, and it does not ask for any confirmation.

Example E-3 Verifying an Existing SSH User Equivalence Setup

./sshUserSetup.sh -hosts "A B C" -user foo -verify

Because the -verify option is specified, the script does not set up SSH, but only verifies the existing setup.

Setting Up SSH Daemon for the Timezone Variable on Remote Hosts

This section lists the steps you must follow to set up the SSH daemon (sshd) on remote hosts so that it can access the time zone environment variable.

To verify whether the time zone environment variable (TZ) is accessible by the SSH server on the remote hosts, execute the following command from the OMS host:

ssh -l <user_name> -n <remote_node> 'echo $TZ'

If this command does not return the TZ environment variable value, follow the instructions below to set the TZ variable and ensure it is accessible by the SSH server:

  1. Get the process ID of the SSH daemon (sshd) by executing the following:

    $ ps -aef | grep sshd
    root      4838     1  0 Sep26 ?        00:00:06 /usr/sbin/sshd
    <user name> 23954 26603  0 03:48 pts/5    00:00:00 grep sshd
    
    
    
  2. Shut down sshd by executing:

    sudo kill -9 4838 
    
    
  3. Restart the SSH Daemon by executing:

    sudo /etc/init.d/sshd restart
    
    
  4. Now, execute the following command from the OMS host to verify whether the SSH server can access the TZ variable:

    ssh -l <user_name> -n <node_name> 'echo $TZ'
    
    

Note:

If sshd on a remote host is not set up to provide the TZ variable, you can pass this variable in the Additional Parameters text box, using the -z option for the default software source location (for install or upgrade) and the s_timezone=<timezone> option for a non-default software location.

Note that this installs the agents on all remote nodes with the same time zone value that you specify in the Additional Parameters text box. See Appendix F, "Additional Parameters for Agent Deploy" for more information.
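One common way to make TZ visible to SSH sessions is through the OpenSSH user environment file. The following is a minimal sketch, assuming OpenSSH and the init-script layout shown above (the time zone value is illustrative):

    # On each remote host: set TZ for incoming SSH sessions
    echo 'TZ=US/Pacific' >> ~/.ssh/environment
    # Allow sshd to read user environment files (requires root)
    sudo sh -c 'echo "PermitUserEnvironment yes" >> /etc/ssh/sshd_config'
    sudo /etc/init.d/sshd restart
    # Re-verify from the OMS host
    ssh -l <user_name> -n <remote_node> 'echo $TZ'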


Validate All Command Locations

The properties files located at <OMS_HOME>/sysman/prov/resources/ contain the default locations of the commands that are required for successful execution of certain APIs, for example, the ping executable.

Such command locations can vary between machines and platforms. Run the validatePaths script to verify whether the command locations in the properties files are correct. This script lists the commands that are not found in the default locations.

Run the following command to execute this script:

./validatePaths -dirloc oms/sysman/prov/resources/

Taking the example of the ping executable: if the executable is present at /usr/sbin/ping, which is not the default location, you must specify this value in the userPaths.properties file by setting PING_PATH=/usr/sbin/ping, as shown below.
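A minimal userPaths.properties excerpt for this override (one property per line, in variable=value format, as noted later in this section):

    PING_PATH=/usr/sbin/ping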

The properties files that are loaded by the Agent Deploy application are listed below:

  • platformInfo.properties

    Contains a list of files that need to be loaded for each platform. These files specify the paths for the commands. For example, /bin/ping.

    • Paths.properties

    • sPaths.properties

    • ssPaths.properties

    • userPaths.properties

    • ssPaths_sol.properties (for Solaris only)


    IMPORTANT:

    The files listed in platformInfo.properties are loaded in order, and later files take precedence: a value you specify in the last file that is loaded overrides values for the same property in the files loaded before it.

    For example, the platformInfo.properties file lists Paths.properties, sPaths.properties, ssPaths.properties, and userPaths.properties.

    If the default location for the ping executable in the sPaths.properties file is /usr/bin/ping, and you specified an alternative location in the ssPaths.properties file as /usr/sbin/ping, the value in the latter file takes precedence over the others.


  • userPaths.properties

    This file lists all the variables that are used to specify the command paths. You must uncomment the variables that you want to use, and specify appropriate values.


    Note:

    If you want to include other command variables, you can either specify these variables in any of the s*Paths.properties or userPaths.properties files, or create another properties file and specify its name in platformInfo.properties.

    Ensure these files are listed in the platformInfo.properties file. If they are not, Agent Deploy ignores the paths to the executables that you have specified in these files and attempts to run the executables from their default locations.


  • system.properties

    This file contains properties that help you control the behavior and performance of the application. For example:

    • oracle.sysman.prov.threadpoolsize

      Number of threads that get created in the application and work in parallel to execute commands on remote hosts. The default threadpool size value that is set for Agent Deploy is 32. You can specify an appropriate value for the threadpool size in this property.

      For example, oracle.sysman.prov.threadpoolsize=128.

    • oracle.sysman.prov.threadpoolmaxsize

      The number of threads can increase dynamically depending on the workload, up to this maximum.

      The default value used in the application is 256 (oracle.sysman.prov.threadpoolmaxsize=256). You can specify an appropriate maximum value for the threadpool size in this property.
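      A hypothetical system.properties excerpt that tunes both values (the numbers are illustrative):

          oracle.sysman.prov.threadpoolsize=128
          oracle.sysman.prov.threadpoolmaxsize=256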

  • ignore_messages.txt

    If there are error messages displayed in the error stream that you know can be ignored in the setup, you can update these messages in the ignore_messages.txt file.

    Generally, if the error stream contains data when you execute any command, it is assumed that the command failed. But the data in the error stream may not always correspond to the error. So, to ignore such error messages, you must add these messages to the ignore_messages.txt file.

    Consider the following example: when you run /usr/local/bin/sudo on a remote machine, it writes the following message to the error stream:

    We trust you have received the usual lecture from the local System Administrator. It usually boils down to these two things: #1) Respect the privacy of others. #2) Think before you type. Password:

    This is essentially just a warning to the user and does not constitute a failure of the executed command. Such error messages can be added to the ignore_messages.txt file.
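    A hedged sketch of adding such a message on the OMS host (this assumes, though the source does not state it, that ignore_messages.txt takes one message pattern per line):

        echo 'Password:' >> <OMS_HOME>/sysman/prov/resources/ignore_messages.txt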
    

Note:

The data format for these files mandates only one property per line. You must specify the property values in the format variable=value.

Location of Properties File

You can view the following properties files at <OMS_HOME>/sysman/prov/resources/:

  • platformInfo.properties

  • Paths.properties

  • sPaths.properties

  • ssPaths_sol.properties

  • userPaths.properties

  • system.properties

  • ignore_messages.txt

Location of Installation Logs

The following prerequisite check and installation logs are available at these locations:

  • <OMS_HOME>/sysman/prov/agentpush/<TIMESTAMP>/prereqs

    1. Connectivity logs: The following logs are available at

      $OMS_HOME/sysman/prov/agentpush/<time-stamp>/prereqs/local:

      • prereq<time_stamp>.log

      • prereq<time_stamp>.out

      • prereq<time_stamp>.err

    2. Prerequisite logs: The following prerequisite logs for <node1> will be available at $OMS_HOME/sysman/prov/agentpush/<time-stamp>/prereqs/<node1>:

      • prereq<time_stamp>.log

      • prereq<time_stamp>.out

      • prereq<time_stamp>.err


    Note:

    The time stamp in the log files of the prereq/install/upgrade operations may not be the same as the time stamp in $OMS_HOME/sysman/prov/agentpush/<time-stamp>/. These time stamps can differ considerably from those on the OMS host because the logs are generated on the remote nodes and are collected back to the OMS after the agent installation/upgrade.

  • <OMS_HOME>/sysman/prov/agentpush/logs/

    1. EMAgentPush<TIMESTAMP>.log: Agent Deploy application logs.

    2. remoteInterfaces<TIMESTAMP>.log: Logs of the remote interfaces layer.

  • <OMS_HOME>/sysman/prov/agentpush/<TIMESTAMP>/logs/<HOSTNAME>/

    1. install.log/.err: Log/error of a fresh agent install or fresh cluster agent install.

    2. upgrade.log/.err: Log/error of the upgrade operation using Agent Deploy.

    3. nfsinstall.log/.err: Log/error of an agent installation using the Shared Agent Home option in Agent Deploy.

    4. clusterUpgrade.log/.err: Log/error of the cluster upgrade operation using Agent Deploy.

    5. sharedClusterUpgradeConfig.log/.err: Log/error of the config operation in an upgrade on a shared cluster.

    6. config.log/.err: Log/error of the configuration of a shared cluster in an agent installation on a shared cluster.

    7. preinstallscript.log/.err: Log/error of running the preinstallation script, if specified.

    8. rootsh.log/.err: Log/error of running root.sh.

    9. postinstallscript.log/.err: Log/error of running the postinstallation script, if specified.

    10. installActions<timestamp>.log, oraInstall<timestamp>.err/.out: Logs of Oracle Universal Installer.

    11. agentStatus.log: Status of the agent after running emctl status agent from the agent home.
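    To reproduce what agentStatus.log captures, you can run the same status command manually from the agent home (the path is illustrative):

        <install_base_dir>/agent10g/bin/emctl status agent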

Modify Response File for Big IP Host and Port

If the Management Service is using a load balancer, you must modify the s_omsHost and s_OMSPort values in the <OMS_HOME>/sysman/agent_download/10.2.0.1.0/agent_download.rsp file to reflect the load balancer's host and port before using the Agent Deploy application.
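A hypothetical excerpt of the edited agent_download.rsp (the host and port values are illustrative; substitute your load balancer's values):

    s_omsHost="slb.example.com"
    s_OMSPort="1159"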

Verify oraInventory Permissions on Remote Hosts

Ensure you (or the user performing the agent installation) have read, write, and execute permissions on oraInventory on all remote hosts. If you do not have these permissions on the default inventory (whose location is given in /etc/oraInst.loc) on any remote host, you can specify an alternative inventory location by using the -i <location> option in the Additional Parameters section, as shown below.
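For example, a hypothetical Additional Parameters entry that points to an alternative inventory pointer file:

    -i /scratch/oracle/oraInst.loc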


Note:

If this is the first installation on a remote host, the Oracle Universal Installer automatically creates the oraInventory in the user's home directory with read, write, and execute permissions for that user, as well as the OS group that the user belongs to.

Verify User Credentials

Ensure the user installing the agent is the same user that installed Oracle Application Server and/or Oracle Collaboration Suite. Also ensure the user has the SUDO privileges that are required to execute the root.sh script.

You can either select the Run Root.sh option in Agent Deploy, which automatically executes root.sh at the end of the installation, or leave this option unselected and execute the script manually at the end of the installation, as sketched below.

This script must be run after the installation is complete in order to discover all the targets.
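A minimal sketch of the manual run (the agent home path is illustrative; root.sh requires root privileges, here via sudo):

    sudo <install_base_dir>/agent10g/root.sh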

Prerequisite Checks Executed by Agent Deploy

The Agent Deploy application runs a local prerequisite check (on the machine running the Management Service) and remote prerequisite checks on all the remote hosts before proceeding with the installation process.

Prerequisite Checks Executed on the Local Host

Table E-1 lists the connectivity prerequisite checks that are run on the local (Oracle Management Service) host.

Table E-1 Connectivity Prerequisite Checks

  • Nodes are alive: Verifies whether the remote nodes are accessible.

  • SSH server is up: Verifies whether an SSH server daemon is running on all remote hosts, since the installation process requires SSH.

  • SSH user equivalence is set: Verifies whether the user name specified on the installation details page has SSH user equivalence on all the remote hosts.

  • Installation directory is writable on the remote hosts: Verifies whether the installation base directory that you specified is writable.


Prerequisite Checks Executed on Remote Hosts

Table E-2 lists the prerequisite checks that are executed by Agent Deploy for each installation type.

Table E-2 Prerequisite Checks for Each Installation Type

  • Certified Versions: Checks whether the operating system on the remote hosts is certified.
    Fresh Install: Yes. Shared Agent Install: Yes. Upgrade: Yes.

  • Packages: Checks whether the minimum required packages are available on the remote hosts.
    Fresh Install: Yes. Shared Agent Install: No. Upgrade: Yes.

  • Disk Space: Checks whether the minimum required disk space is available.
    Fresh Install: Yes. Shared Agent Install: No. Upgrade: Yes.

  • Agent Targets: Checks for targets on the remote hosts that cannot be monitored by the agent. Targets that were installed by another user cannot be monitored by the agent that you are going to install.
    Fresh Install: Yes. Shared Agent Install: Yes. Upgrade: Yes.

  • Port: Determines whether the specified port is available. If you have not specified a value, this check looks for a free port within the 1830 - 1849 range, or 3872, and assigns it.
    Fresh Install: Yes. Shared Agent Install: Yes. Upgrade: Yes.

  • Oracle Home Location: Verifies whether the specified Oracle home (<install_base_dir>/agent10g) is empty.
    Fresh Install: Yes. Shared Agent Install: Yes. Upgrade: Yes.

  • Existing Agent Installations: Checks for any existing agent installations on the remote hosts.
    Fresh Install: Yes. Shared Agent Install: No. Upgrade: No.

  • Write Permissions for Base Directory: Checks whether the installation base directory is writable on all remote hosts.
    Fresh Install: Yes. Shared Agent Install: No. Upgrade: No.

  • Inventory Check: Checks whether the user credentials that you specified have write permissions on the central inventory of each remote host.
    Fresh Install: Yes. Shared Agent Install: Yes. Upgrade: Yes.

  • Upgrade Agent Existence Check: Determines whether an agent (10.1) that can be upgraded exists on the remote hosts.
    Fresh Install: No. Shared Agent Install: No. Upgrade: Yes.

  • Write Permissions for Upgrade Agent: Checks whether the installation base directory is writable on all remote hosts.
    Fresh Install: No. Shared Agent Install: No. Upgrade: Yes.

  • NFS Agent Existence Check: Checks for any existing agent installations on the remote hosts.
    Fresh Install: No. Shared Agent Install: Yes. Upgrade: No.

  • Write Permissions for NFS Agent: Checks whether the installation base directory, the EMSTATE directory, and the NFS location are writable from all the remote hosts.
    Fresh Install: No. Shared Agent Install: Yes. Upgrade: No.

  • Time Zone ENV Check: Checks whether the time zone (TZ) environment variable is set on the remote hosts.
    Fresh Install: Yes. Shared Agent Install: Yes. Upgrade: Yes.

  • Software Existence Check: Ensures the alternative software location that you specified is valid. Note: This check is executed only if you selected a non-default (Another Location) software source for the agent installation.
    Fresh Install: Yes.


Troubleshooting Failed Prerequisite Checks

This section details the possible errors that you may encounter when the prerequisite checks are executed, and the appropriate user actions for resolving them.

Prerequisite Check Errors and Resolutions on Local Host

Table E-3 lists the most common reasons for prerequisite check failures, and the corresponding user actions to be performed to resolve them.

Table E-3 Prerequisite Check Errors and Resolutions on Local Host

For each check, the reason for failure is listed first, followed by the user actions (see Footnote 1).

Nodes are alive

Reason: Nodes are not accessible.

  • Ensure all the nodes are active.

  • Remove the nodes that are not accessible from the nodes list.

SSH server is up

Reason: The SSH daemon on one or more nodes is not running.

  • Try to start the SSH daemon on the failed nodes.

  • Remove the failed nodes from the node list.

SSH user equivalence is set

Reason: SSH user equivalence is not set up from the local host to the failed nodes for the specified user credentials.

  • Set up SSH user equivalence from the local host to the failed nodes for the specified user, for example by rerunning the sshUserSetup.sh script.

Installation directory is writable on the remote hosts

Reason: The installation base directory that you specified is not writable, or cannot be created, on the failed nodes.

  • Grant write permissions on the failed nodes by executing the following command from the local (OMS) host:

    ssh -l <user> <host> "chmod -R +w <dir>"
    
  • Remove the failed nodes from the nodes list.


Footnote 1: Where there are multiple user actions listed, you can choose to perform the action that is most appropriate.

Prerequisite Check Errors and Resolutions on Remote Hosts

Table E-4 lists the most common reasons for prerequisite check failures on remote hosts, and the corresponding user actions to be performed to resolve them.

Table E-4 Reasons for Prerequisite Check Failure and Corresponding User Actions

For each check, the reason for failure is listed first, followed by the user actions (see Footnote 1).

Certified Versions

Reason: The failed host may have an operating system or version that is not certified for deploying the agent on that machine.

  • Exit the current installation and retry the agent installation without the failed hosts.

  • Upgrade the failed node to a certified operating system or version before proceeding with the installation.

Packages

Reason: The failed hosts may not have the recommended minimum packages required to deploy the agent.

  • Click Fix and Retry on the Prerequisite Details page. Agent Deploy performs an automatic package fix using YUM or RPMs. During the process, it returns to the Installation Details page and prompts you to specify valid/alternative values where required, and then reruns the prerequisite checks.

Disk Space

Reason: The required minimum disk space for the installation is not found on the remote hosts.

  • Increase the disk space on the failed hosts.

  • Remove the failed nodes from the nodes list.

Agent Targets

Reason: The failed nodes may have some targets that were installed by a different user, and hence cannot be monitored by the agent.

  • Remove the targets that cannot be monitored from the failed hosts.

  • Continue with the installation, since the failure message is only a warning (though this is not recommended).

Port

Reason: The specified port is not valid or is not available, or you have not specified any port and there is no available port in the default range.

  • Ensure the specified port is not blocked on the failed hosts.

  • On the Installation Details page, leave the Port value blank.

  • If the default ports are either blocked or not available, remove the failed nodes from the nodes list.

Oracle Home Location

Reason: The <install_base_dir>/agent10g directory already exists and is not empty.

  • Clean up the <install_base_dir>/agent10g directory.

  • Specify an alternative installation base directory.

  • Remove the failed nodes from the nodes list.

Existing Agent Installations

Reason: An agent that is registered with the central inventory already exists on the failed remote hosts.

  • Uninstall the existing agent and retry the prerequisite checks.

  • Continue with the installation, since the failure message is only a warning (though this is not recommended).

Write Permissions for Base Directory

Reason: The installation base directory is not writable.

  • Grant write permissions on the failed nodes by executing the following command from the local (OMS) host:

    ssh -l <user> <host> "chmod -R +w <dir>"
    
  • Remove the failed nodes from the nodes list.

Inventory Check

Reason: The specified user credentials do not have write permissions on the central inventory.

Change the central inventory permission settings to render the central inventory and its subdirectories writable. Complete the following steps to resolve this issue:

  1. Log in to the local host (the machine running the Oracle Management Service).

  2. Change the directory to:

    <HOME>/sysman/prov/agentpush/resources/fixup
    
  3. For each failed host, run the following script:

    ./fixOraInvPermissions.sh <install user> <install group> <failed host name> <inventory location>
    
    Because this script must be run as root (using sudo) on the failed remote host, you are prompted to specify the sudo password.

Upgrade Agent Existence Check

Reason: An Oracle Management Agent of version 10.1 is not present on the remote hosts on which you want to perform the agent upgrade.

  • Exit the upgrade process.

Write Permissions for Upgrade Agent

Reason: The installation base directory is not writable.

  • Grant write permissions on the failed nodes by executing the following command from the local (OMS) host:

    ssh -l <user> <host> "chmod -R +w <dir>"
    
  • Remove the failed nodes from the nodes list.

NFS Agent Existence Check

Reason: An agent that is registered with the central inventory already exists on the remote hosts.

  • Uninstall the existing agent and retry the prerequisite checks.

  • Continue with the installation, since the failure message is only a warning (though this is not recommended).

Write Permissions for NFS Agent

Reason: The installation base directory is not writable, the NFS location is not accessible, or the EMSTATE directory is not writable.

  • Grant write permissions on the failed nodes by executing the following command from the local (OMS) host:

    ssh -l <user> <host> "chmod -R +w <dir>"
    
  • Remove the failed nodes from the nodes list.

Time Zone ENV Check

Reason: The TZ environment variable is not set on the remote hosts.

Recommended:

  • Specify the time zone in the Additional Parameters section (using the -z option) of the Installation Details page.

Optional:

  • Set the TZ environment variable, then shut down and restart SSH on all remote hosts.

  • Update the SSH daemon configuration with the TZ environment variable on all remote hosts.

Software Existence Check

Reason: The alternative software location that you specified is not valid.

  • Revert to the default software source location.

  • Change the alternative software location to a valid location (one containing ./stage/product.xml).


Footnote 1: Where there are multiple user actions listed, you can choose to perform the action that is most appropriate.