© Copyright IBM® Corporation 2008. All rights reserved. May only be used pursuant to a Tivoli® Systems Software License Agreement, an IBM Software License Agreement, or Addendum for Tivoli Products to IBM Customer or License Agreement. No part of this publication may be reproduced, transmitted, transcribed, stored in a retrieval system, or translated into any computer language, in any form or by any means, electronic, mechanical, magnetic, optical, chemical, manual, or otherwise, without prior written permission of IBM Corporation. IBM Corporation grants you limited permission to make hardcopy or other reproductions of any machine-readable documentation for your own use, provided that each such reproduction shall carry the IBM Corporation copyright notice.
No other rights under copyright are granted without prior written permission of IBM Corporation. U.S. Government Users Restricted Rights - Use, duplication or disclosure restricted by GSA ADP Schedule Contract with IBM Corp.
This fix pack only applies to a regular installation of Tivoli Provisioning Manager (an installation with IBM WebSphere® Application Server as the application server). Tivoli Provisioning Manager must be at version 5.1.1.1. A fix pack is not available for a Fast Start installation.
The enhancements and changes included in this fix pack are:
Included in this release is basic Red Hat Enterprise Linux 5 (RHEL 5) support for the Tivoli common agent on the endpoint. RHEL 5 is not supported on depots or the management console. This support does not include patch management or the new security enhancements provided in RHEL 5.
You can now define maintenance windows: defined periods of time within which data downloads using Dynamic Content Delivery (DCD) are enabled or paused. The maintenance windows are specified in a text file that follows the iCalendar syntax (RFC 2445). The file containing the maintenance window information must be installed in the <agent installation directory>/runtime/agent/subagents directory and must be named maintenance.ics. To install the schedule onto the agent, use a typical Software Package Block (SPB) distribution: the SPB processing installs the schedule file maintenance.ics into the previously mentioned directory. The SPB that contains the maintenance calendar is created using the Software Package Editor (SPE). To facilitate this process, you can use the sample software package description (SPD) file found on the provisioning server after the fix pack is installed, at $TIO_HOME/repository/tivoli/TCA/schedule.spd. The maintenance.ics file can be created using a text editor or any iCalendar-compliant calendar file creation tool.
The software module definition for the maintenance schedule SPB should have a software capability defined with type TIVOLI_COMMON_AGENT, name operational.schedule, and value maintenance.window. When you install the software module, Tivoli Provisioning Manager interprets this capability as a "system" task. As a result, the priority of the Scalable Distribution Infrastructure (SDI) job is decreased so that it runs in the background, giving maintenance windows higher preference. After the SPB is installed, the new file is on the endpoint and is ready to be used by the agent. The agent includes a scheduler that reads and interprets the maintenance.ics file. The scheduler scans the maintenance.ics file every 20 seconds to check whether it has been updated; if it has, the scheduler loads and processes the file. If there are any problems processing the maintenance.ics file, such as syntax errors, the scheduler defaults to having no service windows. Updating the last modified time stamp on the maintenance.ics file causes the scheduler to load and process the file, regardless of its contents. In this implementation of maintenance windows, only DCD file transfers (downloads) are managed, so only job steps related to file transfers from DCD are paused and resumed by the scheduler, as determined by the maintenance schedule.
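The scheduler behavior just described (poll every 20 seconds, reload only when the modification time changes, fall back to no service windows on a syntax error) can be sketched in a few lines. This is an illustrative model with hypothetical function names, not the agent's actual implementation:

```python
import os
import time

def load_schedule(path):
    """Parse the maintenance.ics file; raise ValueError on syntax errors.
    (Stand-in for the agent's real iCalendar parser.)"""
    with open(path) as f:
        text = f.read()
    if "BEGIN:VCALENDAR" not in text:
        raise ValueError("not an iCalendar file")
    return text  # a real parser would return event objects

def poll_schedule(path, last_mtime, current_windows):
    """One polling cycle: reload only if the file's time stamp changed."""
    try:
        mtime = os.path.getmtime(path)
    except OSError:
        return last_mtime, []          # file missing: no service windows
    if mtime == last_mtime:
        return last_mtime, current_windows
    try:
        return mtime, load_schedule(path)
    except ValueError:
        return mtime, []               # syntax error: default to no windows
```

A real agent would call `poll_schedule` on its 20-second timer and pause or resume DCD downloads based on the loaded windows.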
The maintenance.ics file format is compliant with the syntax specified in the RFC 2445 document. Only a subset of the RFC is used. The following section demonstrates the supported tags.
maintenance.ics file format
BEGIN:VCALENDAR
DESCRIPTION:Sample 10pm to 5am maintenance window
PRODID:IBM
VERSION:1.0
BEGIN:VTIMEZONE
TZID:EST
END:VTIMEZONE
BEGIN:VEVENT
DESCRIPTION:Sample 10pm to midnight window
DTSTART:20080901T220000
DTEND:20080901T235959
RRULE:FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR,SA,SU;
END:VEVENT
BEGIN:VEVENT
DESCRIPTION:midnight to 5am window
DTSTART:20080901T000000
DTEND:20080901T050000
RRULE:FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR,SA,SU;
END:VEVENT
END:VCALENDAR
Extension mechanism
Time Scheduling Services (TSS) uses the standard vCalendar extension mechanism. Custom properties have a standard prefix, X-, and are processed by specific applications and ignored by others.
Using this extension, some AvailabilityCalendar properties that are not standard in vCalendar can be set using X- properties.
To maintain compatibility with the vCalendar format, the X- properties are not mandatory. If they are not set, the AvailabilityCalendarFactory option automatically assigns them default values.
Formal Definition
A TSS vCalendar definition consists of a header followed by a sequence of validity time intervals mapped into VEVENT components.
The following vCalendar properties are supported:
VTIMEZONE is a vCalendar component; only its TZID property is supported.
PRODID
VERSION
UID
DESCRIPTION
CALSCALE
METHOD
VTIMEZONE
X-CALVALIDFROM
X-CALVALIDTO
X-CALAVAILABILITYVALUE
X-CALFREESUNDAY
X-CALFREESATURDAY
X-CALTIMEINTERVAL
After the header, you must specify a sequence of validity time intervals. At least one validity time interval is required.
The following validity time interval attributes are available:
UID
DESCRIPTION
DTSTART
DTEND
RRULE
RDATE
EXDATE
RRULE and RDATE are mutually exclusive.
TSS vCalendar only supports events that do not span across different days. In other words, DTSTART and DTEND must belong to the same day.
Property Name | Mult. | Default | Type | Description |
---|---|---|---|---|
PRODID | 1 | -- | String | Identifier for the product that created the vCalendar object. This is a mandatory property of the vCalendar standard but TSS ignores it. |
VERSION | 1 | -- | String | This is a mandatory property of the vCalendar standard but TSS ignores it. |
UID | 0:1 | (automatically generated) | String | Unique name. |
DESCRIPTION | 0:1 | -- | String | Description |
VTIMEZONE | 0:1 | Local timezone | Component | Time zone in which the time is expressed. |
X-CALVALIDFROM | 0:1 | Today | String | |
X-CALVALIDTO | 0:1 | Today + 10 years | String | |
X-CALAVAILABILITYVALUE | 0:1 | 0 | String | |
X-CALFREESUNDAY | 0:1 | True | boolean | Consider Sunday as free day. |
X-CALFREESATURDAY | 0:1 | True | boolean | Consider Saturday as free day. |
X-CALTIMEINTERVAL | 0:n | -- | String | Start and end time for each day into the calendar validity interval. |
Property Name | Mult | Default | Type | Description |
---|---|---|---|---|
UID | 0:1 | (automatically generated unique id) | String | |
DESCRIPTION | 0:1 | -- | String | |
DTSTART | 1 | -- | String | Validity interval start date and time. |
DTEND | 1 | -- | String | Validity interval end date and time. |
RRULE | 0:1 | -- | String | Recurring rule (mutually exclusive with RDATE). UNTIL property can be specified: validity interval end date. |
RDATE | 0:1 | -- | String | List of punctual dates to include (mutually exclusive with RRULE). |
EXDATE | 0:1 | -- | String | List of punctual dates to exclude. |
The following code is an example of a maintenance.ics file:
BEGIN:VCALENDAR
BEGIN:VEVENT
DTSTART:20080901T100000
DTEND:20080901T140000
RRULE:FREQ=WEEKLY;INTERVAL=1;BYDAY=MO,FR;UNTIL=20200901T140000
END:VEVENT
END:VCALENDAR
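The restrictions described above (RRULE and RDATE are mutually exclusive; DTSTART and DTEND must belong to the same day) can be checked mechanically. The following sketch uses simplified parsing and illustrative names, not the TSS implementation:

```python
def check_vevent(props):
    """Validate one VEVENT against the TSS vCalendar restrictions.
    `props` maps property names to values,
    e.g. {"DTSTART": "20080901T220000"}."""
    errors = []
    # RRULE and RDATE are mutually exclusive.
    if "RRULE" in props and "RDATE" in props:
        errors.append("RRULE and RDATE are mutually exclusive")
    # DTSTART and DTEND must fall on the same day: in an iCalendar
    # DATE-TIME value, the date is the part before 'T'.
    start, end = props.get("DTSTART", ""), props.get("DTEND", "")
    if start.split("T")[0] != end.split("T")[0]:
        errors.append("DTSTART and DTEND must belong to the same day")
    return errors
```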
Custom inventory extensions are used to extend the data model inventory schema with additional server attributes. The endpoint scanning system retrieves and processes these extended attributes generated by endpoint scripts, making them accessible for queries (for example, for dynamically grouping endpoints) and reports.
To create a custom inventory extension, first create custom tables with the columns needed to hold the collected data. The custom tables must conform to the following rules:
The following rules are suggested when creating the custom tables:
For example:
CREATE TABLE IPCONFIG(
    parameter VARCHAR(255),
    endpoint VARCHAR(255),
    network_interface VARCHAR(255),
    value VARCHAR(255)
);
ALTER TABLE IPCONFIG ADD COLUMN SERVER_ID BIGINT NOT NULL;
ALTER TABLE IPCONFIG ADD COLUMN DISCOVERY_ID BIGINT NOT NULL;
ALTER TABLE IPCONFIG ADD FOREIGN KEY (discovery_id) REFERENCES DISCOVERY(discovery_id) ON DELETE CASCADE;
ALTER TABLE IPCONFIG ADD FOREIGN KEY (server_id) REFERENCES SERVER(server_id) ON DELETE CASCADE;
COMMIT;
The extensions are managed from a command line application found at:
$TIO_HOME/tools/inventoryExtension.sh (UNIX) or $TIO_HOME/tools/inventoryExtension.cmd (Windows).
##name of the extension
extName=IPCONFIG_Test
#description of the extension
extDescription=UNIT TEST
#Custom Tables Names
TABLE_1.NAME=ipconfig
#files for the AIX platform
AIX=yes
pre_aix=/home/test/prescript.pl
out_aix=/home/test/ipconfig1.mif
post_aix=/home/test/postscript.pl
#files for the HPUX platform
HPUX=yes
pre_hpux=/home/test/prescript.sh
out_hpux=/home/test/ipconfig1.mif
post_hpux=/home/test/postscript.sh
#files for the Solaris platform
SOLARIS=yes
pre_solaris=/export/home/test/prescript.sh
out_solaris=/export/home/test/ipconfig1.mif
post_solaris=/export/home/test/postscript.sh
#files for the Linux platform red hat 4 or 5
LINUX=yes
pre_linux=/home/test/prescript.sh
out_linux=/home/test/ipconfig1.mif
post_linux=/home/test/postscript.sh
#files for the windows platform win xp
WINDOWS=yes
pre_windows=c:\\test\\prescript.windows.bat
out_windows=c:\\test\\ipconfig1.windows1.mif
post_windows=c:\\test\\postscript.windows.bat
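A properties file like the one above is plain key=value lines with # comments. A minimal reader can be sketched as follows; this is a simplification, and the real inventoryExtension tool's parsing rules may differ:

```python
def read_extension_properties(text):
    """Parse key=value lines, skipping blank lines and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props
```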
The extension name is a required parameter. If the extension description is not passed in, the extension is created without a description.
The following restrictions apply to the table section:
The fully qualified paths of the pre- and post-script files on the endpoints are specified in the properties file, along with the location of the generated output file, which can be an XML or MIF file.
The pre- and post-script files, which are specified in the properties file and run on the target endpoint, can be used to gather custom data. The pre-script file generates an output file, in either XML or MIF format, that describes the information used to populate the custom tables. The following are examples of pre-script and post-script files, and of the XML and MIF output files that can be generated.
Sample Pre-script:
#!/bin/bash
echo copying
cp /tmp/ipconfig.mif.backup /tmp/ipconfig.mif
exit 0;
Sample Post-script:
#!/bin/bash
echo deleting backup
# rm /tmp/ipconfig.mif.backup
exit 0;
Sample XML output:
<custom-data>
    <custom-table name="ipconfig">
        <string-column name="Parameter" length="255" value="parameter1" />
        <string-column name="endpoint" length="255" value="endpoint_test" />
        <string-column name="network_interface" length="255" value="eth0" />
        <string-column name="value" length="255" value="9.168.100.80" />
    </custom-table>
</custom-data>
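XML output in this shape can be turned into rows for the custom tables with standard XML parsing. This sketch uses Python's standard library and is illustrative only, not the product's actual loader:

```python
import xml.etree.ElementTree as ET

def rows_from_custom_data(xml_text):
    """Return {table_name: {column: value}} for each <custom-table>."""
    root = ET.fromstring(xml_text)
    rows = {}
    for table in root.findall("custom-table"):
        rows[table.get("name")] = {
            col.get("name"): col.get("value")
            for col in table.findall("string-column")
        }
    return rows
```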
Sample MIF output:
Start Component
Name = "ipconfig.mif"
Start Group
id = 1
Name = "ipconfig"
Class = "test_class"
Start Attribute
Name = "Parameter"
Id = 1
Type = String(255)
value = "parameter1"
End Attribute
Start Attribute
Name = "endpoint"
Id = 2
Type = String(255)
value = "endpoint_test"
End Attribute
Start Attribute
Name = "network_interface"
Id = 3
Type = String(255)
value = "eth0"
End Attribute
Start Attribute
Name = "value"
Id = 4
Type = String(255)
value = "9.168.100.80"
End Attribute
End Group
End Component
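The MIF format shown is line-oriented, so the attribute blocks are straightforward to read programmatically. The following is a rough sketch of a reader for the Start Attribute ... End Attribute blocks, a simplification rather than the product's MIF parser:

```python
def mif_attributes(mif_text):
    """Collect Name/value pairs from Start Attribute ... End Attribute blocks."""
    attrs, current = {}, None
    for raw in mif_text.splitlines():
        line = raw.strip()
        if line == "Start Attribute":
            current = {}                       # open a new attribute block
        elif line == "End Attribute" and current is not None:
            attrs[current.get("Name")] = current.get("value")
            current = None                     # close the block
        elif current is not None and "=" in line:
            key, _, val = line.partition("=")
            current[key.strip()] = val.strip().strip('"')
    return attrs
```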
The inventory extension defined is also used in the discovery configuration. Custom inventory scans can be configured to run using the discovery configuration UI. Custom inventory extensions can be selected in the user interface. The deployment engine can also be used to run an agentless extended scan.
To create and run an inventory extension, follow these steps:
To run multiple Custom Inventory Extensions that have been created from the command line under the same Discovery instance:
For the Report:
Endpoints with multiple NICs can now be correctly managed from Tivoli Provisioning Manager. Configure the appropriate host name for that NIC and the route to be used to the provisioning server. When the common agent is installed, Tivoli Provisioning Manager will use the IP address used by the common agent to register itself as the management IP address, and use the defined host name as seen by the endpoint on the management IP as the computer name.
This feature provides an option for agent discovery configurations (both 'IBM Tivoli Agent Manager Discovery' and 'IBM Tivoli Agent Manager Discovery Device' discovery configurations) that will use the observed IP address as the management IP for each endpoint. When this parameter is set the agent discovery will not attempt to verify that the observed management IP is in the list of supported IPs for that endpoint, as in the Network Address Translation (NAT) case that IP would not be present.
This will ensure that any operations initiated using the deployment engine will discover the endpoint correctly using the NAT addressing.
The basic flow of events is as follows:
Now the user will be able to run the workflow using the deployment engine on the endpoints discovered using the NAT addressing.
The agent upgrade process now leverages the Software Infrastructure Distribution and the Software Installation Engine (SID/SIE) and therefore provides a more robust mechanism for agent upgrade and installation. The SPBHandler subagent is required on the endpoint.
The agent installer will now install all of its files as read-only.
There are several components of this integration: Tivoli Provisioning Manager, TotalStorage® Productivity Center Standard Edition and the IBMTPC automation package.
The features included in this release are:
The IBMTPC automation package also provides workflows and Java™ helpers that enable Tivoli Provisioning Manager to use the basic storage device functions provided by TotalStorage Productivity Center.
Install IBMTPC automation package
To install the tcdriver, perform the following steps:
For Windows, run the following command:
%TIO_HOME%/tools/tc-driver-manager.cmd forceInstallDriver IBMTPC
For Unix, run the following command:
$TIO_HOME/tools/tc-driver-manager.sh forceInstallDriver IBMTPC
To configure and initialize your system, follow these steps:
The provisioning server can now be installed and used on the 64-bit zSeries® Linux platform.
Version 5.1.1.2 (Fix Pack 2) includes fixes addressed in version 5.1.1.1 and customer Authorized Program Analysis Reports (APARs) from previous interim fixes.
There is no refresh installation included as part of this fix pack. Only an existing version 5.1.1.1 installation, at any interim fix level, can be upgraded to version 5.1.1.2. Migration from version 5.1 and version 5.1.1 to version 5.1.1.2 is not supported.
You must install and configure all components of Tivoli Provisioning Manager version 5.1.1.1 before attempting the installation of 5.1.1.2. Thoroughly review the following list of installation prerequisites before you proceed with the installation.
For Linux on IBM iSeries®, IBM pSeries®, or IBM zSeries and Linux on AMD, you must perform the installation manually. Refer to the Tivoli Provisioning Manager Installation Guide for additional platforms for your operating system.
If Tivoli Provisioning Manager is already installed, you will upgrade from version 5.1.1.1 to 5.1.1.2.
The fix pack can be applied to an existing regular installation of Tivoli Provisioning Manager. After verifying all the prerequisites described in this section, you can install the fix pack. A fix pack is not available for a Fast Start installation.

Location | Disk space requirements |
---|---|
Disk space for installation images | 2 GB |
Disk space to extract files from installation images | 2.5 GB |
/ | 250 MB |
Temporary files: /tmp | 600 MB |
/usr | 50 MB |
/var | 1 MB |
/home | 100 MB |
Tivoli Provisioning Manager installation directory (default: /opt/IBM/tivoli/tpm) | |
Agent manager installation directory (default: /usr/IBM/AgentManager) | 50 MB |
Ensure that you can run the workflow named no_operation. For instructions on how to run a workflow, refer to the Running workflows from the Web interface topic in the Tivoli Provisioning Manager version 5.1.1.1 information center.
To determine the Tivoli Provisioning Manager version, in the Web interface, click the About link at the top right corner of the Welcome page.
Verify that the sample:all-objects access domain exists. If you removed this access domain after installing Tivoli Provisioning Manager, you must recreate it before installing the fix pack.
db2 connect to database_name user database_user using password

where:
db2 connect to tc user tioadmin using pa55w0rd
sqlplus database_user/password@database_name
sqlplus tiodb/pa55w0rd@tc
db2 "select * from access_domain where name = 'sample:all-objects'"
select * from access_domain where name = 'sample:all-objects'
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE datacenter PUBLIC "-//Think Dynamics//DTD XML Import//EN" "xmlimport.dtd">
<datacenter>
    <access-domain name="sample:all-objects">
    </access-domain>
</datacenter>
$TIO_HOME/tools/xmlimport.sh file:path/sampleAccessDomain.xml
For each automation package that you have removed, ensure that the data model objects that were created by the automation package are also removed. For example, if you removed an automation package for a piece of software, any software catalog entries created by the automation package must also be removed. Information about the objects created by an automation package is in the xml subfolder of the automation package.
Alternatively, you can reinstall all provided automation packages that you removed before you begin the fix pack installation.
./cancel-all-in-progress.sh
./clean-up-deployment-requests.sh
which java
If this command is not available on your system, run the following command instead:
type java
echo $PATH
export JITC_COMPILEOPT=NALL{org/eclipse/osgi/framework/internal/core/PackageAdminImpl}{doResolveBundles}
export JAVA_COMPILER=nojit
The default_automation_package automation package can cause problems during the fix pack installation and is therefore deleted during the installation. If you want to save a workflow that you created in the Web interface, perform the following steps:
Click Automation > Workflows
After the fix pack installation, you can add the workflow back into Tivoli Provisioning Manager by opening it in the workflow composer and compiling the workflow. If you have multiple workflows to import, you can add them to an automation package and install the automation package.
Operating System | Tivoli Provisioning Manager | Components |
---|---|---|
Windows (upgrading from 5.1.1.1) | 5.1.1.1-TIV-TPM-Win32-FP0002.zip | 5.1.1.1-TIV-Components-Win32-FP0002.zip, which includes: CAS\AM_V13_WIN.zip, CDS\Win32\setup.exe, DMS\DMS_patch.zip, DMS\ProductUpdateInstaller61.zip |
AIX (upgrading from 5.1.1.1) | 5.1.1.1-TIV-TPM-AIXPPC32-FP0002.zip | 5.1.1.1-TIV-Components-AIXPPC32-FP0002.zip, which includes: CAS/AM_V13_AIX.tar, CDS/AIXPPC32/setup.bin, DMS/DMS_patch.zip, DMS/ProductUpdateInstaller61.zip |
Solaris (upgrading from 5.1.1.1) | 5.1.1.1-TIV-TPM-SolarisSparc-FP0002.zip | 5.1.1.1-TIV-Components-SolarisSparc-FP0002.zip, which includes: CAS/AM_V13_SUN.tar, CDS/SolarisSparc/setup.bin, DMS/DMS_patch.zip, DMS/ProductUpdateInstaller61.zip |
Linux (upgrading from 5.1.1.1) | 5.1.1.1-TIV-TPM-Linux-FP0002.zip | 5.1.1.1-TIV-Components-Linux-FP0002.zip, which includes: CAS/AM_V13_LIN.tar, CAS/AM_V13_LIN_PPC.tar, CAS/AM_V13_LIN_zSeries.tar, CDS/LinuxIA32/setup.bin, CDS/LinuxPPC64/setup.bin, CDS/LinuxS390/setup.bin, DMS/DMS_patch.zip, DMS/ProductUpdateInstaller61.zip |
HP-UX (upgrading from 5.1.1.1) | 5.1.1.1-TIV-TPMFSW-HPUXIA64-FP0002.zip | 5.1.1.1-TIV-Components-HPUXIA64-FP0002.zip, which includes: CAS/AM_V13_HPUX_IA64.tar, CDS/HPUXIA64/setup.bin, DMS/DMS_patch.zip, DMS/ProductUpdateInstaller61.zip |
where filename.md5 is the name of the file that contains the md5 sum and the name of the filename.zip file. The md5 file must include only the .zip file name and not its full path.
The command calculates the md5 sum for the filename.zip file, automatically compares it with the value listed in the .md5 file, and returns OK if the values are the same. For this to work, the .zip and .md5 files should be in the same directory when the md5sum check is run.
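The check described above can be reproduced with a short script. This is an illustrative equivalent of running md5sum against the .md5 file, not the documented procedure itself:

```python
import hashlib
import os

def verify_md5(md5_file):
    """Check a '<hex digest>  <filename>' line against the named file,
    which must sit in the same directory as the .md5 file."""
    with open(md5_file) as f:
        expected, name = f.read().split(None, 1)
    target = os.path.join(os.path.dirname(md5_file) or ".", name.strip())
    digest = hashlib.md5()
    with open(target, "rb") as f:
        # Read in chunks so large fix pack images do not exhaust memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected.lower()
```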
For example, to validate the file 5.1.0.2-TIV-TIO-Linux-FP0002.zip, run the command:
To verify ITDS on Windows:
ibmdirctl -D cn=root -w password status
ibmdirctl -D cn=root -w password -h hostname status
To verify that DB2 Universal Database is running:
su - db2inst1
db2start

DB2 Universal Database is started if it is not already running. If the database is already running, the following message is displayed:
SQL1026N The database manager is already active
TNS-12541: TNS:no listener
Connected to: Oracle Database 10g Enterprise Edition Release

If the database is stopped, the following message is displayed:
Connected to an idle instance.
This readme file uses the following variables to represent directory paths.
Path variables | Definition | Default directory |
---|---|---|
AM_installdir | Installation directory for the agent manager | /opt/IBM/AgentManager |
$DMS_HOME | Installation directory for the device manager service | /opt/IBM/tivoli/tpm/DeviceManager |
DB2_installdir | Installation directory for DB2 Universal Database | AIX: /usr/opt/db2_08_01; Solaris and Linux: /opt/IBM/db2/V8.1 |
IDS_installdir | Installation directory for Tivoli Directory Server | /opt/IBM/ldap/V6.0 |
$ORACLE_HOME | Installation directory for Oracle | $ORACLE_BASE/product/10.2.0.1.0, where $ORACLE_BASE is the directory in which all Oracle software is installed, for example /u01/app/oracle |
$WAS_HOME | Installation directory for WebSphere Application Server | AIX: /usr/IBM/WebSphere/AppServer; Solaris and Linux: /opt/IBM/WebSphere/AppServer |
$TIO_HOME | Installation directory for Tivoli Provisioning Manager | /opt/IBM/tivoli/tpm |
$TIO_LOGS | Log file directory for Tivoli Provisioning Manager | /var/ibm/tivoli/common/COP/logs |
export TERM=ansi

or

export TERM=xterm

or

export TERM=vt100
For DB2 Universal Database
unixUpgrade.sh -WASadmin was_adminID -WASadminPWD was_admin_pwd -DBAdmin db_adminID -DBAdminPWD db_admin_pwd
For Oracle database
unixUpgrade.sh -WASadmin was_adminID -WASadminPWD was_admin_pwd -DBAdmin db_adminID -DBAdminPWD db_admin_pwd -CDSOraclePWD cds_schema_pwd

where:
$TIO_HOME/.tools/tpmJspCompiler.sh
$JAVA_HOME/bin/java -Xms256m -Xmx1024m

with

$JAVA_HOME/bin/java -Xms512M -Xmx512M

then save and close the file.
The first time that you start the product after installing the fix pack, the automation package migration process runs and updates the automation packages provided with Tivoli Provisioning Manager:
The following main steps occur during the installation of the fix pack:
To recover from an installation error:
/opt/ibm/tivoli/common/ctgde/logs
$DMS_HOME/log
$TIO_LOGS/tcdrivermanager.log
$TIO_LOGS/deploymentengine/console.log
After you have completed the installation of the fix pack, you must upgrade older versions of the common agent. The common agent upgrade is from version 1.3.2.25, provided with Tivoli Provisioning Manager 5.1.1.1 (any interim fix), to version 1.3.2.29, provided with Tivoli Provisioning Manager 5.1.1.2. The common agent must be upgraded on both the depot servers and the endpoints that are participating in the software distribution process.
You can upgrade the agents using the Scalable Distribution Infrastructure (SDI) or the deployment engine. The steps to upgrade using SDI are:
Name | Version | Installation Files |
---|---|---|
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 AIX Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 HP-UX on Itanium® Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 HP-UX Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 Linux390 Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 Linux86 Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 LinuxPPC Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 Solaris Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 Solaris x86 Upgrade Installable |
TCA-1.3.2.29 Upgrade | 1.3.2.29 | TCA-1.3.2.29 Windows Upgrade Installable |
You can run the workflow as a Favorite Task for multiple computers or a group.
Component name | Component version |
---|---|
Tivoli Common Agent Services | 1.3.2.29 |
Common Inventory Technology (CIT) | 2.5.1026 |
Dynamic Content Delivery Client | 1.3.2102 |
Dynamic Content Delivery Depot | 1.3.2102 |
Dynamic Content Delivery Axis Export | 1.3.2102 |
Tivoli Provisioning Manager OMA/DM OSGi Agent Extension | 1.8.2 |
IBM Tivoli Provisioning Manager Job Execution Service | 5.1.2000 |
Event Administration Service | 5.1.1000 |
IBM Tivoli Intelligent Orchestrator CIT Scanner | 5.1.5 |
SPBHandler | 5.1.1000 |
IBM Tivoli Provisioning Manager Subagent | 5.1.2000 |
If the Automation Package Developer Environment is installed in Eclipse on a separate computer, perform the following steps to upgrade it.
./eclipse -clean
Before you replicate the existing Tivoli Management Framework data in the Tivoli Provisioning Manager DB2 data model installed on AIX 5.3, ensure that AIX 5.3 is at Maintenance Level 3.
Additional Web Replay scenarios are available for you to use in Tivoli Provisioning Manager. To download these Web Replay scenarios, go to the IBM Open Process Automation Library at http://catalog.lotus.com/tpm?NavCode=1TW10105F.
After downloading the packages, see the Web Replay online help for instructions on how to import and use the Web Replay scenarios in your environment. The Web Replay online help is available from the Tivoli Provisioning Manager Web interface.
Patch management for the HP-UX operating system is not supported
If the non-root user SAP is being used for AIX patch discovery, the permissions of the following files must be updated on the satellite server as shown:
chmod 505 /usr/suma/bin/suma
chmod 404 /usr/suma/lib/SUMA/DBMStanzaDir.pm
chmod 404 /usr/suma/lib/SUMA/Download.pm
chmod 404 /usr/suma/lib/SUMA/Policy.pm
chmod 404 /usr/suma/lib/SUMA/PolicyFactory.pm
chmod 404 /usr/suma/lib/SUMA/StanzaDB.pm
chmod 666 /var/adm/ras/suma.log
chmod 707 /var/suma/data/policy.suma/
chmod 606 /var/suma/data/policy.suma/default.SDBM_File.pag
chmod 606 /var/suma/data/policy.suma/default.SDBM_File.dir
chmod 604 /var/suma/data/config.suma
If you are applying the fix pack to a Tivoli Provisioning Manager version 5.1.1.1 installation, refer to the following sources of documentation:
The following updates apply to the information center.
In the Tivoli Provisioning Manager version 5.1.1 information center, disregard the document at http://publib.boulder.ibm.com/infocenter/tivihelp/v20r1/index.jsp?topic=/com.ibm.support.tpm.doc/agent/trdcds_authentic.html. This is not the supported resolution for publishing problems caused by issues logging on to the dynamic content delivery console. Contact IBM Software Support for assistance if you are experiencing this problem.
The default installation path for the common agent is C:\Program Files\Tivoli\ep (Windows®), /opt/tivoli/ep (Solaris or Linux®), and /usr/tivoli/ep (AIX® or HP-UX). The default installation path can be customized as required.
The installation path variable that is defined in the configuration template associated with the common agent software module specifies the default common agent installation path. The following installation path variables can be used to override the default common agent installation path:
To override the default common agent installation path:
The default common agent installation path for the selected software definition has now changed to the specified new location.
There are two ways to manually update the Oracle driver from 10.2.0.1.0 to 10.2.0.2.0:
Method 1:
Using this method, all components that were using driver 10.2.0.1.0 will use driver 10.2.0.2.0.
Method 2:
Using this method, only DMS will use driver 10.2.0.2.0. All other components will use driver 10.2.0.1.0.
The JDBC driver 10.2.0.2.0 can be downloaded from the Oracle site: http://www.oracle.com/technology/software/tech/java/sqlj_jdbc/index.htm
Discovery supports connections with no password; it is the local security policy on the endpoint being discovered that restricts the login. Tivoli Provisioning Manager cannot discover the endpoint when there is no password because, by default, the local security policy on the system does not allow an SMB connection with no password. To allow a connection with no password, disable this policy so that the machine can be discovered.
Steps to configure local security policy on the endpoint:
When using SDI, the job is submitted to the device manager service (DMS) federator agent running on the provisioning server and DMS publishes this job. The endpoint polls the DMS server until it sees the job available for execution. It fetches all the required files, and executes the job. The detailed logs about the task are available on the endpoint rather than the server.
Below are the steps to change the SSL security certificate:
Step 1: Generate the keystore for the provisioning server user interface (log on as root or administrator).
Key database type: JKS
File Name: tpmKeyStore.jks
Location: <TIO_HOME>/cert
Key Label: tpmUICert
Common Name: your system hostname
Organization: your organization name
Data Type: Base64-encoded ASCII data
Certificate file name: tpmuicert.arm
Location: <TIO_HOME>/cert
Step 2: Create a new virtual host TPMUIVH and add 9049 to this virtual host (if 9049 is used for another application, you can use another available port)
http://<your hostname>:9061/admin
Step 3: Create new SSL configuration repertoires for the provisioning server
Step 4: Create 9049 transport chain
Step 5: Map AlphabloxPlatform and TCEAR web modules to TPM_UI_VH (9049)
Step 6: Restart the provisioning server
Publishing patches to a Tivoli Management Framework (TMF) depot is not supported. Only software package block (SPB) packages can be published on a TMF depot.
Multiple network interface cards (NICs) were not supported in version 5.1.1.1 (Tivoli Common Agent 1.3.2.25). The computer representation found by discovery in version 5.1.1.1 will not change after an agent upgrade to version 5.1.1.2 (TCA 1.3.2.29). Only the interface name will be updated and this is by design. To change the computer name in the data center model (DCM), delete the DCM object representing the computer and rediscover the computer using Tivoli common agent discovery.
In Managing computers, inventory, and compliance > Reporting > Creating a new report, the statement in step 10, "Review the summary of your selections for the report.", should be updated with the following information:
The fields selected on the Constraints and Layout page of the wizard are used for the generated report and are different from the fields displayed on the Summary page of the wizard.
On the Hardware page, all interfaces are shown with an endpoint host name rather than the actual name of each interface.
There are two steps to change a password for a user in the Lightweight Directory Access Protocol (LDAP):
At time of publication, the following problems and limitations were known.
As limitations and problems are discovered and resolved, the IBM Support team updates the online knowledge base. You can search the online support knowledge base to quickly find workarounds or solutions to problems that you experience.
Occasionally a workflow might end unexpectedly with java.rmi.MarshalException: org.xml.sax.SAXParseException.
This is caused by a problem with error handling for the common agent version 1.3.2.25.
See the endpoint log ep/logs/rcp.log.0 to determine the cause of the error.
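To locate the underlying error quickly, the log can be scanned for the exception names mentioned above. A minimal sketch; the log path and the marker strings are taken from this section, so adjust them for your installation:

```python
# Scan a common agent log for lines that mention known error markers.
from pathlib import Path

def find_errors(log_path, markers=("SAXParseException", "MarshalException")):
    """Return the log lines that mention any of the given error markers."""
    text = Path(log_path).read_text(errors="replace")
    return [line for line in text.splitlines()
            if any(marker in line for marker in markers)]

# Example: find_errors("ep/logs/rcp.log.0")
```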
COPDEX123E The workflow threw a AgentInstallException exception. The message is Agent installation failure. Agent install return code log file contents InstallStatus=0.
The possible causes of this error are:
On the endpoint, check the installation log for error details at <agent_install_dir>/ep/runtime/agent/logs. If the target computer has a firewall enabled, ensure that the provisioning server can ping the common agent. Check that the provisioning server can access the endpoint. Install the common agent again.
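A simple way to check that the provisioning server can reach the endpoint is a TCP probe against the agent's port, sketched below. The default port of 9510 is an assumption; substitute whatever port your agents are configured to use:

```python
# Probe whether a TCP connection to the common agent port succeeds.
import socket

def agent_reachable(host, port=9510, timeout=5.0):
    """Return True if the host accepts a TCP connection on the given port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```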
The Tivoli common agent installation fails with a Jython error in the TCA_Installable_Install workflow.
The password of the user installing the common agent has an invalid character for Jython. The double quotation mark (") is an invalid character.
The user password for the default credential must not include a double quotation mark.
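A pre-flight check like the following sketch can catch the restriction before an installation is attempted; the only rejected character here is the double quotation mark named above:

```python
# Reject credential passwords containing characters known to break
# the Jython-driven common agent installation.
INVALID_CHARS = '"'

def password_ok(password):
    """Return True if the password avoids the characters in INVALID_CHARS."""
    return not any(ch in password for ch in INVALID_CHARS)
```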
Columns overlap in PDF reports when the data in a column is long.
This is a known problem in Alphablox version 8.4, which has been fixed in Alphablox version 9.5.
Export the report to a spreadsheet to allow adjustment of the columns.
The common agent upgrade fails.
The # character in the password generates a corrupted URL, because the Dynamic Content Delivery component does not validate the URL.
Do not use the # character in the password for the administrator user of the provisioning server.
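The corruption is easy to see with standard URL parsing: an unencoded # begins the URL fragment, so everything after it is dropped from the query. A small illustration (the host, port, and parameter names in the URL are placeholders, not actual Dynamic Content Delivery addresses):

```python
# Demonstrate why '#' in a password corrupts a download URL.
from urllib.parse import quote, urlsplit

raw = "http://depot.example.com:2100/files?user=tioappadmin&pw=se#cret"
# Everything after '#' is treated as a fragment, truncating the password.
print(urlsplit(raw).query)

# Percent-encoding the password keeps it intact in the query string.
encoded_pw = quote("se#cret", safe="")
safe = "http://depot.example.com:2100/files?user=tioappadmin&pw=" + encoded_pw
print(urlsplit(safe).query)
```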
After the installation of version 5.1.1.2, the custom automation packages (created by customers) are missing from the %TIO_HOME%\drivers folder.
During the fix pack installation, the old automation packages are backed up in %TIO_HOME%\drivers_5.1.1.1 and the 5.1.1.2 automation packages are copied to %TIO_HOME%\drivers.
Copy the custom automation packages (.tcdriver files) from %TIO_HOME%\drivers_5.1.1.1 to %TIO_HOME%\drivers after the upgrade is completed.
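The copy step can be scripted so that only custom packages missing from the new folder are restored, leaving the upgraded 5.1.1.2 packages untouched. A sketch; the two directory paths follow the readme:

```python
# Restore custom automation packages (.tcdriver files) that exist in
# the backup folder but not in the upgraded drivers folder.
import shutil
from pathlib import Path

def restore_custom_drivers(backup_dir, drivers_dir):
    """Copy missing .tcdriver files and return the names that were restored."""
    backup, drivers = Path(backup_dir), Path(drivers_dir)
    restored = []
    for pkg in sorted(backup.glob("*.tcdriver")):
        target = drivers / pkg.name
        if not target.exists():  # do not overwrite the upgraded stock packages
            shutil.copy2(pkg, target)
            restored.append(pkg.name)
    return restored

# Example (expand %TIO_HOME% yourself before calling):
# restore_custom_drivers("C:/tio/drivers_5.1.1.1", "C:/tio/drivers")
```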
The discovery and common agent installation did not succeed.
Support for Windows Vista and Windows 2008 was added after version 5.1.1.1.
Install Interim Fix 5 on version 5.1.1.1.
To install the common agent on Red Hat Enterprise Linux 5 (RHEL 5) endpoints, disable SELinux.
The provisioning server is unable to ping the endpoints and the operation fails after 15 attempts with InstallStatus=0.
SELinux or a firewall might be enabled on the endpoints.
To verify and disable SELinux on Red Hat Enterprise Linux 5, check the current mode with the getenforce command, set SELINUX=disabled in the /etc/selinux/config file, restart the computer, and then install the common agent again.
Running initial discovery on endpoints using network address translation (NAT) sets the management IP to unknown.
TADDMDiscovery, TPMNetworkDiscovery, and MSADDiscovery do not support NAT. Therefore, after running initial discovery on endpoints using NAT, the management IP is set to unknown.
Agent installation fails with error code 237.
This generally occurs when a common agent is already running on the same endpoint.
Ensure the old agent is correctly uninstalled from the endpoint.
Stand-alone Tivoli common agent installation fails on Windows 2008 64-bit platforms if the default installation location is used.
The C:\Program Files (x86) folder is used by 32-bit emulation as the standard program folder for 32-bit applications. Even though the default installation specifies C:\Program Files as the destination folder, Windows redirects all the files and installs them to C:\Program Files (x86), but the installation script verifies whether the installation completed in C:\Program Files and therefore fails.
Update the response file caInstall.rsp and specify an installation location. The installation location must be different from C:\Program Files.
For example:
-P installLocation="C:\Program Files (x86)\tivoli\ep"
or
-P installLocation="C:\tivoli\ep"
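Updating the response file can be automated with a small line rewrite. This sketch assumes caInstall.rsp holds one option per line in the -P name="value" form shown above:

```python
# Set or replace the installLocation option in a common agent response file.
from pathlib import Path

def set_install_location(rsp_path, location):
    """Rewrite the -P installLocation line, appending it if absent."""
    path = Path(rsp_path)
    entry = f'-P installLocation="{location}"'
    lines = path.read_text().splitlines()
    replaced = False
    for i, line in enumerate(lines):
        if line.strip().startswith("-P installLocation"):
            lines[i] = entry
            replaced = True
    if not replaced:
        lines.append(entry)
    path.write_text("\n".join(lines) + "\n")
```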
While performing Windows patch remediation or download, the following error message is generated:
COPDEX123E The workflow threw a Download Failure exception. The message is "Download of Microsoft .NET Framework 2.0 Service Pack 1 (KB110806) ar zh-cn zh-tw cs da nl en fi fr de el hu it ko no pl pt-br pt ru tr es sv has failed! Verify the WSUS server or proxy is reachable".
This is a problem with the selected patch. There is no issue with the code used to download the patch; the .NET Framework 2.0 Service Pack 1 package itself has known download issues.
Refer to the Microsoft® Web site at http://support.microsoft.com/kb/923100.
COPCOM123E A shell command error occurred: Exit code=253, Error stream="** ERR:A problem occurred while attempting to download the UPDATE file: A connection with the server can not be established
or
2008-11-14 16:08:54,578 INFO [Deployment Request 12802] (InvCfgHelper.java:69) inv.InvCfgHelper: A profile with this name does not exists, create it
2008-11-14 16:08:54,703 ERROR [Deployment Request 12802] (TmfInventoryService.java:124) inv.TmfInventoryService: COPTMF001E Object: TPM_Windows_Patch_Scan not found.
The provisioning server is not able to log in to Tivoli Management Framework (TMF).
On Solaris, Web pages do not load and an Out of Memory error is seen in the j2ee/console.log file.
This is caused by an issue with the Sun Java Virtual Machine. For more information, see http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=5077277
Open the WebSphere Administrative Console and click Servers > Application servers > server1 > Process Definition > Java Virtual Machine. Change the Maximum Heap Size to 1024 and save the configuration change. Then restart the provisioning server.
The registration fails when attempting to register the same Solaris Operating System (Solaris) endpoint again by running the Sun_Update_Connection_Register workflow.
If a Solaris endpoint is already registered with the SunSolve account, it is not possible to register the same Solaris endpoint again with that account.
If an endpoint is already registered with SunSolve, unregister the endpoint and wait at least 24 hours for the unregistration to take effect before attempting to register the endpoint again.
When you add a new access group in a one-node Microsoft Active Directory (MSAD) setup, the group is not displayed immediately.
The refresh interval for the MSAD server is one minute, therefore it can take up to one minute to refresh the page.
To see the new access group immediately, refresh the page manually.
Cannot log in to the Web interface after upgrading to version 5.1.1.1.
The base DN value in the user-factory.xml file contains an extra space between the comma and "dc", which breaks the authentication code.
In the user-factory.xml file, remove any spaces in the DN and ensure that the base DN exactly matches the real base DN. For example, the value should be dc=co,dc=in with no space between the comma and dc.
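The cleanup can also be checked programmatically; a sketch that strips whitespace around the component separators of a base DN:

```python
# Remove stray spaces around the commas that separate DN components.
def normalize_dn(dn):
    """Return the DN with whitespace trimmed from each component."""
    return ",".join(part.strip() for part in dn.split(","))
```

For example, normalize_dn("dc=co, dc=in") returns "dc=co,dc=in", the form the authentication code expects.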
Cannot use the new WebSphere Application Server (WAS) user (unless it is the default WAS user) in the WAS Administrative console on UNIX® platforms.
Logging in using a non-default WAS user is not allowed.
Only the default WAS user login is supported on UNIX platforms.
Installing a software stack fails if more than one status update is in the status update window.
This is a problem in the Oracle JDBC driver 10.2.0.1.0 on the Solaris Operating System.
Update the Oracle JDBC driver to version 10.2.0.2.0. Instructions are provided in the Information center updates section of this readme file.
The port displayed when creating a Remote Execution and Access (RXA) service access point (SAP) for Server Message Block (SMB) using the Add Credential wizard is not valid.
When creating an RXA SAP for SMB, the port shown is 0 where it should be 445 or 139.
The provisioning server does not reference the assigned port; instead, it uses port 445 or 139.
No action is required.
During the manual uninstallation of Tivoli common agent, the displayed agent version is incorrect.
When Tivoli common agent is upgraded, the uninstaller is not being upgraded from a previous version.
Ignore the agent level displayed during a manual uninstallation.
COPDEX123E A PatchInstallError exception occurred. The exception was caused by the following problem: Error installing patches. Please refer /tmp/temp11428.log file for more details. RETURNCODE:1 ERROR:ERROR: Dependency resolution failed: Resolvable id 116289 does not exist.
This problem is related to a Novell rug issue. If the ZENworks daemon service is idle for a long time, then the first time that rug install calls ZENworks to retrieve SUSE catalog database data, the error "Resolved id can not be retrieved." can occur.
A Software Distribution Infrastructure (SDI) job stays in a submitted state for a long time and there is an error in the rcp.log.0 regarding loading libawt.
The Java XMLEncoder/Decoder classes depend on libawt. There might be situations when the operating system environment does not have the particular package that includes all of the libraries needed by libawt.
Verify that you have the packages documented in the following link and install any as needed: http://publib.boulder.ibm.com/infocenter/tivihelp/v3r1/index.jsp?topic=/com.ibm.itpmdcd.doc/install_guide/cdcd_prereqs_jvmdependencies.html
If you have any questions about this fix pack, call the IBM Support Center for your country. For example, in the USA call 1-800-IBM-SERV. For specific contact numbers for all countries, refer to the following Web site:
http://techsupport.services.ibm.com/guides/contacts.html
If you find a problem or have a suggestion about the APDE (Automation Package Development Environment) features or the documentation in general, contact IBM through the Tivoli Provisioning Manager and Intelligent Orchestrator Automation Package Development Environment forum. The forum is a technical discussion focused on installing, configuring and using the APDE (Automation Package Development Environment) for writing workflows and creating automation packages for the Tivoli Provisioning Manager products.
To access the forum:
The notices file in the license subdirectory has been updated for this release.
The following section includes important information about this document and its use.
Date of Creation/Update | Summary of Changes
---|---
January 15, 2009 | Documentation APAR IZ41797, APAR IZ40728
This information was developed for products and services offered in the U.S.A.
IBM may not offer the products, services, or features discussed in this document in other countries. Consult your local IBM representative for information about the products and services currently available in your area. Any reference to an IBM product, program, or service is not intended to state or imply that only that IBM product, program, or service may be used. Any functionally equivalent product, program, or service that does not infringe any IBM intellectual property right may be used instead. However, it is the user's responsibility to evaluate and verify the operation of any non-IBM product, program, or service.
IBM may have patents or pending patent applications covering subject matter described in this document. The furnishing of this document does not grant you any license to these patents. You can send license inquiries, in writing, to:
For license inquiries regarding double-byte (DBCS) information, contact the IBM Intellectual Property Department in your country or send inquiries, in writing, to:
The following paragraph does not apply to the United Kingdom or any other country where such provisions are inconsistent with local law:
INTERNATIONAL BUSINESS MACHINES CORPORATION PROVIDES THIS PUBLICATION "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.
Some states do not allow disclaimer of express or implied warranties in certain transactions, therefore, this statement might not apply to you.
This information might include technical inaccuracies or typographical errors. Changes are periodically made to the information herein; these changes will be incorporated in new editions of the publication. IBM may make improvements or changes in the product(s) or in the program(s) described in this publication at any time without notice.
Any references in this information to non-IBM Web sites are provided for convenience only and do not in any manner serve as an endorsement of those Web sites. The materials at those Web sites are not part of the materials for this IBM product and use of those Web sites is at your own risk.
IBM may use or distribute any of the information you supply in any way it believes appropriate without incurring any obligation to you.
Licensees of this program who want to have information about it for the purpose of enabling: (i) the exchange of information between independently created programs and other programs (including this one) and (ii) the mutual use of the information which has been exchanged, should contact:
Such information may be available, subject to appropriate terms and conditions, including in some cases, payment of a fee.
The licensed program described in this document and all licensed material available for it are provided by IBM under terms of the IBM Customer Agreement, IBM International Program License Agreement or any equivalent agreement between us.
IBM, the IBM logo, and ibm.com® are trademarks or registered trademarks of International Business Machines Corporation in the United States, other countries, or both. If these and other IBM trademarked terms are marked on their first occurrence in this information with a trademark symbol (® or ™), these symbols indicate U.S. registered or common law trademarks owned by IBM at the time this information was published. Such trademarks may also be registered or common law trademarks in other countries. A current list of IBM trademarks is available on the Web at "Copyright and trademark information" at www.ibm.com/legal/copytrade.shtml
Intel® and Itanium are trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States, other countries, or both.
Linux is a trademark of Linus Torvalds in the United States, other countries, or both.
Microsoft and Windows are registered trademarks of Microsoft Corporation in the United States, other countries, or both.
Java and all Java-based trademarks and logos are trademarks or registered trademarks of Sun Microsystems, Inc. in the United States, other countries, or both.
UNIX is a registered trademark of The Open Group in the United States and other countries.
Other company, product, and service names may be trademarks or service marks of others.