Patch Name: PHSS_25539

Patch Description: s700_800 10.20 OV ITO6.0X HP-UX 11.x Agent Patch A.06.10

Creation Date: 02/02/05

Post Date:  02/02/25

Warning: 02/03/22 - This Non-Critical Warning has been issued by HP.

	- PHSS_25539 introduced behavior that can cause programs using
	  the OVO agent APIs to fail during initialization when running
	  in a pure Japanese environment.  This affects customer-written
	  software that integrates with the OVO agent, as well as some
	  OVO executables.
	- The default monitor template "proc_util" will fail with error:

		OpC30-608

	  followed by the associated Japanese error message, which
	  in English means:

		Can't retrieve value for monitor 'proc_util'.
		Suppressing further error messages.

	- The error only occurs if the locale is completely set to
	  "ja_JP.SJIS" or "ja_JP.eucJP" and can be avoided by setting
	  LC_MESSAGES="C".
	- If setting LC_MESSAGES="C" is not an acceptable workaround,
	  then to avoid this behavior HP recommends that patch
	  PHSS_25539 be removed from all systems that run pure Japanese
	  environments.  PHSS_25539 should also be removed from software
	  depots used to install patches on these systems.
	- The previous patch, PHSS_24917, does not exhibit this same
	  behavior.  To ensure as many known issues as possible are
	  addressed, HP recommends that PHSS_24917 be installed after
	  PHSS_25539 is removed.  If PHSS_24917 was installed prior to
	  PHSS_25539, it will automatically be restored when PHSS_25539
	  is removed and it will not need to be re-installed.
	- This behavior will be corrected in patch PHSS_26726 which is
	  expected to be released by the end of April 2002.
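	The LC_MESSAGES workaround above can be applied in the shell
	that starts the agent. A minimal sketch, assuming a ja_JP.eucJP
	environment; the commented-out opcagt line is illustrative only:

```shell
# Keep the Japanese locale for data, but switch message catalogs to "C"
# to avoid the OpC30-608 initialization failure described above.
LANG=ja_JP.eucJP
LC_MESSAGES=C
export LANG LC_MESSAGES

# Confirm the override before starting the agent:
echo "LC_MESSAGES=$LC_MESSAGES"
# /opt/OV/bin/OpC/opcagt -start
```

	Note that an exported LC_ALL would override LC_MESSAGES and must
	not be set to a Japanese locale for this workaround to take effect.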


Hardware Platforms - OS Releases:
	s700: 10.20
	s800: 10.20

Products:
	OpenView IT/Operations 6.0

Filesets:
	OVOPC-CLT.OVOPC-UX11-CLT,A.06.00

Automatic Reboot?: No

Status: General Superseded With Warnings

Critical: No

Path Name: /hp-ux_patches/s700_800/10.X/PHSS_25539

Symptoms:
	PHSS_25539:
	- SR: H555006719
	  If the agent is running as non-root user and the
	  management server processes are restarted, the agent has
	  to be restarted as well, otherwise all messages are
	  buffered.
	- SR: 8606213476
	  The distribution to nodes may hang or fail. This is more
	  likely to happen while distributing to Windows NT/2000
	  nodes than to UNIX nodes.
	- SR: B555007980
	  Local automatic actions are started immediately, even
	  though agent MSI is enabled in divert mode and the
	  'Immediate Local Automatic Action' box is not checked.
	- SR: B555008220
	  The <$MSG_TIME_CREATED> variable is not substituted in
	  the message template.
	- SR: B555008838
	  The event correlation engine creates a 'Time cannot go
	  backwards' error if the system is very busy.
	- SR: B555009745
	  The template default of the object field of a monitor
	  template is not used.
	- SR: B555010620
	  Some messages are missing in the Japanese message
	  catalog. You get a 'Cannot generate message' error.
	- SR: B555010955
	  Even if you used opcswitchuser.sh to specify a non-root
	  user which should run the ITO agent, it will still be
	  started as user root after a system reboot.
	- SR: B555010966
	  A message key relation containing <*> does not always
	  match message keys correctly. This results in messages
	  not being acknowledged when they should.
	- SR: B555011184
	  opcagt fails to start opcctla if it is started as
	  ./opcagt and /opt/OV/bin/OpC is not in the search PATH.
	- SR: B555011505
	  1. opcecm/opceca might run into a deadlock while
	     processing many ECS annotate nodes.
	  2. opcecm/opceca might leak memory when ECS annotate
	     nodes are used.
	- SR: B555011594
	  The original message text of a logfile encapsulator
	  message is wrong if <$LOGPATH> or <$LOGFILE> is used.
	- SR: B555011638
	  Pattern matching cannot match the new line character
	  of multi-line messages.
	- SR: B555011979
	  Pattern matching hangs if only single byte Japanese
	  HANKAKU KANA characters are used.
	- SR: B555011990
	  The ECS event log (ecevilg) contains an invalid time
	  difference to the next message, which can cause the ECS
	  simulator to hang, or appear to hang, when loading an
	  event log file with such values.
	- SR: B553000162
	  After opcagt -stop, opcagt -status reports that the
	  control agent is not running although it is, and sometimes
	  you get the following error in the message browser:
	  'Output of kill -0 differs from internal pids-table for
	  index <number> (OpC30-1094)'

	PHSS_24917:
	- SR: B555010879
	  opctrapi aborts during template distribution if
	  conditions with the 'Suppress Identical Output Messages'
	  features are used.
	- SR: B555010899
	  opcdista requests distribution data from a wrong manager
	  if there is a secondary manager with the same short
	  hostname as the corresponding primary manager.
	- SR: B555010948
	  Nested alternatives were not handled correctly in the
	  pattern matching algorithm, e.g. the pattern '[a|b]c|d'
	  was handled like '[a|b|d]c'.
	- SR: B555010980
	  Traps without an SNMP variable are not matched because
	  the server patch adds an extra attribute to the template.
	- SR: B555011126
	  Agent distribution using the new Secure Shell (SSH)
	  method introduced with the A.06.08 server patches
	  doesn't work for HP-UX agents. Nothing is installed
	  but you get no error message about it. The only hint is
	  that the "Unpacking truck file /tmp/opc_tmp/opc_pkg.Z"
	  message is not displayed during the installation.

	PHSS_23988:
	- SR: 8606180583
	  When the VPO agent was started manually from an MC/SG
	  shared volume, the agent was killed upon package stop.
	  This was because the agent used this volume as the
	  current directory. Now the agent always starts in /tmp.
	  This also has the side effect that any core file for the
	  agent is written into /tmp.
	- SR: 8606180891
	  The template default for the service name is not used.
	- SR: 8606181988
	  The event interceptor doesn't forward on
	  "forward unmatched" if a "suppress unmatched" condition
	  is used in a second template.
	- SR: 8606182250
	  opcfwtmp doesn't trap bad login from CDE login.
	- SR: 8606182981
	  The ITO agent is not started after system reboot if the
	  default runlevel is lower than 3 and you don't get any
	  warning about that fact.
	- SR: B555010341
	  Agent sometimes does not start automatically after reboot
	  while manual start works fine.

	PHSS_23821:
	- The event correlation process opceca (agent) / opcecm
	  (server) might crash after processing several annotation
	  nodes.
	- The VPO A.06.03 patches for HP-UX and Solaris do
	  not work as expected in firewall environments:
	  While server port restrictions are still regarded,
	  client-side port restrictions are ignored.

	PHSS_22881:
	- Changes were required for the security add-on product
	  VantagePoint Advanced Security.
	- The agent installation configure script fails to convert
	  ITO 4 queue files; an awk syntax error appears in the
	  swagent.log file.

	PHSS_22012:
	- disk_mon.sh returns invalid values if the bdf command
	  returns more than one line output for a filesystem
	  (e.g. if the filesystem name exceeds its column width)
	- Several changes for firewall environments. For detailed
	  information refer to the VPO Firewall Configuration
	  White Paper version 3.0
	- When executing large numbers of autoactions, some of them
	  were staying in 'running' state.
	- opctrapi aborts after getting traps with unresolvable IP
	  address.
	- The handling of '\' was different in the pattern
	  definition and the "matching pattern".
	- If buffer file size limitation is enabled, the agent may
	  discard low-severity messages even if there is still
	  space in the buffer file.

Defect Description:
	PHSS_25539:
	- SR: H555006719
	  When a communication to the message receiver fails, the
	  agent starts buffering messages. It periodically checks
	  if the server is alive by sending it ICMP packets. If the
	  server can't be reached with ICMP packets, no RPC
	  communication is attempted. This doesn't work when the
	  agent is running as non-root (only root is allowed to
	  send ICMP packets); the sending function returns an OK
	  value but does not send anything. Therefore we also never
	  receive any replies and the message agent never goes out
	  of the "Checking node" mode.
	  Fix:
	  If the agent is running as a non-root user, opcmsga
	  immediately tries to contact the management server using
	  RPC communication.
	- SR: 8606213476
	  While the agent receives several RPC calls, like "Start
	  Distribution", "Execute Action" or "Set Primary Manager",
	  in parallel, the calls may conflict within the control
	  agent, which causes the NT control agent to crash with a
	  Dr. Watson window.
	  This conflict can also occur on UNIX; there the control
	  agent does not die, but the RPC request may fail.
	  With this version, the RPC calls which could cause
	  conflicts are serialized.
	- SR: B555010955
	  The non-root user was added to the startup configuration
	  file but not used.
	- SR: B555010966
	  The processing of the key relation is wrong for the log
	  file encapsulator. The problem is that all unresolved
	  entries followed by a resolved entry are removed.
	  Other unresolved entries are kept as they are.
	- SR: B555011184
	  The working directory for the ITO agent was changed
	  from /opt/OV/bin/OpC to /tmp to avoid problems if
	  the agent is running in an MC/SG environment.
	- SR: B555011638
	  VPO could not match the new line character of multi-line
	  messages. The following changes have been made to allow
	  this:
	  It is now possible to use ^M (\r) as a field separator.
	  New patterns are introduced: </> to match any number of
	  line breaks (UNIX style \n or NT style \r\n) and <n/> to
	  match exactly n line breaks; for example, <1/> will match
	  exactly one line break.
	  This change works only for sources that can already
	  create multi-line messages (for example opcmsg or the NT
	  event log); it does not allow multi-line logfile
	  encapsulation.
	  This change requires a fix on both the management server
	  and the agent. Therefore a patch on the management server
	  and a patch for the agent are required to use the new
	  functionality.

	For SRs not listed in this section, please see the list of
	symptoms.
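	The </> and <1/> tokens described above are VPO pattern
	syntax, not shell syntax, but the two line-break styles they
	match can be pictured with standard tools. A sketch; the
	message text is made up:

```shell
# A two-line message as a source like opcmsg might produce it (UNIX style \n).
unix_msg="$(printf 'filesystem full\non /var')"
# The same text with an NT-style break (\r\n); </> matches either style.
nt_msg="$(printf 'filesystem full\r\non /var')"

# Count the line breaks in a message; a message with exactly one
# break is what the <1/> pattern would match.
count_breaks() { printf '%s' "$1" | awk 'END { print NR - 1 }'; }
count_breaks "$unix_msg"   # 1
```

	Both messages contain exactly one line break, regardless of
	style, which is why a single </> or <1/> covers UNIX and NT
	sources alike.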

	PHSS_24917:
	- SR: B555010879
	  When freeing the allocated memory, a wrong function was
	  used.
	- SR: B555010899
	  opcdista requests distribution data from a wrong manager
	  if there is a secondary manager with the same short
	  hostname as the corresponding primary manager. When
	  searching the list, opcdista tries to match first the
	  long and then the short name for each entry. Instead, it
	  should try the long names for all systems first and only
	  then try to match using the short names.
	- SR: B555010948
	  The grammar was changed to allow nested alternatives and
	  process it correctly.
	- SR: B555011126
	  The SSH agent installation method is not known to the
	  opcrinst script, which should unpack the agent on the
	  target node. Thus the opcrinst script simply does nothing.

	For SRs not listed in this section, please see the list of
	symptoms.

	PHSS_23988:
	- SR: 8606182250
	  opcfwtmp didn't handle the LOGIN_PROCESS value of the
	  wtmprec.ut_type field of the WTMP structure, so the bad
	  logins from CDE haven't been detected.
	- SR: 8606182981
	  The ITO agent was integrated into the system startup
	  process at runlevel 3 and higher, but the default
	  runlevel from /etc/inittab was not checked. Now there is
	  a check and you'll get a warning if the default runlevel
	  is lower than 3.
	- SR: B555010341
	  When the process ID of 'opcctla -start' is the same as
	  that of the opcctla running before the shutdown, the
	  internal logic concluded that the agent was already
	  running and did not start up the subprocesses.

	For all other defects not listed in this section, please
	see the list of symptoms.

	PHSS_23821:
	see the list of symptoms

	PHSS_22881:
	see the list of symptoms

	PHSS_22012:
	see the list of symptoms

SR:
	H555006719 B555011990 B555011979 B555011638 B555011594
	B555011505 B555011184 B555011126 B555010980 B555010966
	B555010955 B555010948 B555010899 B555010879 B555010620
	B555010341 B555010079 B555009745 B555009155 B555009152
	B555008838 B555008613 B555008314 B555008220 B555007980
	B555007752 B555007709 B555007602 B555007426 B555006890
	B555006267 B553000162 8606213476 8606182981 8606182250
	8606181988 8606180891 8606180583 8606137088

Patch Files:
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/opc_pkg.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrclchk
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrdschk
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrndchk
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcroschk
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrverchk
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrinst
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/ana_disk.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/cpu_mon.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/disk_mon.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/last_logs.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/mailq_l.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/proc_mon.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/sh_procs.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/swap_mon.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/vp_chk.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/dist_mon.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/mondbfile.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/ssp_chk.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/opcfwtmp.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/opcnprcs.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/
		opc_get_ems_resource.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/mailq_pr.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/st_inetd.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/st_syslogd.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/st_mail.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/dist_del.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opcdf.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opclpst.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opcps.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/E10000Log.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/ssp_config.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/AgentPlatform

what(1) Output:
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/opc_pkg.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrclchk:
		HP OpenView VantagePoint A.06.10 (12/07/01)
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrdschk:
		HP OpenView VantagePoint A.06.10 (12/07/01)
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrndchk:
		HP OpenView VantagePoint A.06.10 (12/07/01)
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcroschk:
		HP OpenView VantagePoint A.06.10 (12/07/01)
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrverchk:
		HP OpenView VantagePoint A.06.10 (12/07/01)
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/install/opcrinst:
		HP OpenView VantagePoint A.06.10 (12/07/01)
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/ana_disk.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/cpu_mon.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/disk_mon.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/last_logs.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/mailq_l.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/proc_mon.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/sh_procs.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/swap_mon.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/vp_chk.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/dist_mon.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/mondbfile.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/ssp_chk.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/opcfwtmp.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/opcnprcs.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/monitor/
		opc_get_ems_resource.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/mailq_pr.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/st_inetd.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/st_syslogd.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/st_mail.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/actions/dist_del.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opcdf.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opclpst.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opcps.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/E10000Log.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/ssp_config.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/A.06.10/RPC_DCE_TCP/cmds/opc_sec_v.sh.Z:
		None
	/var/opt/OV/share/databases/OpC/mgd_node/vendor/hp/pa-risc/
		hp-ux11/AgentPlatform:
		None

cksum(1) Output:
	515593636 7512367 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		opc_pkg.Z
	2811062488 6708 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		install/opcrclchk
	3234780691 28923 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		install/opcrdschk
	1991425910 6720 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		install/opcrndchk
	1436561863 6287 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		install/opcroschk
	565567375 31983 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		install/opcrverchk
	102020819 105394 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		install/opcrinst
	1720774433 2729 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/ana_disk.sh.Z
	789947932 5974 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/cpu_mon.sh.Z
	1675723719 6070 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/disk_mon.sh.Z
	3788993245 5858 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/last_logs.sh.Z
	212276500 5834 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/mailq_l.sh.Z
	3321132962 5982 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/proc_mon.sh.Z
	1823315012 5433 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/sh_procs.sh.Z
	1889288589 5875 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/swap_mon.sh.Z
	3976740190 5725 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/vp_chk.sh.Z
	1326584801 6125 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/dist_mon.sh.Z
	785190745 14332 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/mondbfile.sh.Z
	3466004174 5967 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/ssp_chk.sh.Z
	3352246397 14446 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/opcfwtmp.Z
	3905428724 10958 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/opcnprcs.Z
	1744953214 19571 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		monitor/opc_get_ems_resource.Z
	2720990625 2534 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		actions/mailq_pr.sh.Z
	27925490 2579 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		actions/st_inetd.sh.Z
	260271102 2588 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		actions/st_syslogd.sh.Z
	526161161 2581 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		actions/st_mail.sh.Z
	1085921250 6104 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/
		actions/dist_del.sh.Z
	1781671426 326 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/cmds/
		opcdf.Z
	1985877331 388 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/cmds/
		opclpst.Z
	4196424067 403 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/cmds/
		opcps.Z
	4156213397 3319 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/cmds/
		E10000Log.sh.Z
	3531502974 3097 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/cmds/
		ssp_config.sh.Z
	2837799807 13170 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/A.06.10/RPC_DCE_TCP/cmds/
		opc_sec_v.sh.Z
	233551493 6324 /var/opt/OV/share/databases/OpC/mgd_node/
		vendor/hp/pa-risc/hp-ux11/AgentPlatform

Patch Conflicts: None

Patch Dependencies: None

Hardware Dependencies: None

Other Dependencies: None

Supersedes:
	PHSS_22012 PHSS_22881 PHSS_23821 PHSS_23988 PHSS_24917

Equivalent Patches:
	ITOSOL_00128:
	sparcSOL: 2.6 2.7 2.8

	PHSS_25540:
	s700: 11.00
	s800: 11.00

Patch Package Size: 7810 KBytes

Installation Instructions:
	Please review all instructions and the Hewlett-Packard
	SupportLine User Guide or your Hewlett-Packard support terms
	and conditions for precautions, scope of license,
	restrictions, and limitation of liability and warranties,
	before installing this patch.
	------------------------------------------------------------
	1. Back up your system before installing a patch.

	2. Login as root.

	3. Copy the patch to the /tmp directory.

	4. Move to the /tmp directory and unshar the patch:

		cd /tmp
		sh PHSS_25539

	5a. For a standalone system, run swinstall to install the
	    patch:

		swinstall -x autoreboot=true -x match_target=true \
			-s /tmp/PHSS_25539.depot

	By default swinstall will archive the original software in
	/var/adm/sw/patch/PHSS_25539.  If you do not wish to retain a
	copy of the original software, you can create an empty file
	named /var/adm/sw/patch/PATCH_NOSAVE.

	WARNING: If this file exists when a patch is installed, the
	         patch cannot be deinstalled.  Please be careful
	         when using this feature.

	It is recommended that you move the PHSS_25539.text file to
	/var/adm/sw/patch for future reference.

	To put this patch on a magnetic tape and install from the
	tape drive, use the command:

		dd if=/tmp/PHSS_25539.depot of=/dev/rmt/0m bs=2k

Special Installation Instructions:
	BEFORE LOADING THIS PATCH...

	    o It provides bug fixes and enhancements for the
	      VPO A.06.00 Management Server system.

	    o DO NOT use this patch with older releases of ITO,
	      for example versions A.05.00, A.05.11 or A.05.30.

	(A) Patch Installation Instructions
	    -------------------------------

	(A1)  Install the patch, following the standard
	      installation instructions.

	      For backing up the system before installing
	      a patch, you may use opc_backup(1m)

	NOTE: MAKE SURE THAT NO AGENT OF THE PLATFORM
	      ADDRESSED BY THIS PATCH IS DISTRIBUTED
	      (either from the VPO Administrator's GUI
	      or from command line using inst.sh) WHILE
	      RUNNING SWINSTALL.

	      Don't be afraid of the '-x autoreboot=true'
	      option above. There won't be a reboot due
	      to this VPO patch.
	      You can skip this option if you like.

	      If you are running VPO in a MC/ServiceGuard
	      installation:

	      - Note that only files on the shared disk volume
	        at /var/opt/OV/share will be patched. Therefore
	        install the patch on one cluster node while the
	        shared disks are mounted. The server processes
	        may be running during patch installation.

	      - It is not necessary to install this patch on all
	        cluster nodes. Although the software inventory on
	        the other cluster nodes will not be updated, the
	        patched files will be available there when the
	        shared disk is switched to them.

	NOTE: This patch must be installed on the VPO Management
	      Server system, NOT on a VPO Managed Node directly.
	      Changes will take effect on managed nodes by means of
	      VPO Software Distribution (using 'Force Update' if
	      there is already an agent installed on the managed
	      node). See chapter 2 of the VPO Administrator's
	      Reference manual for more information.

	(B) Patch Deinstallation Instructions
	    ---------------------------------

	(B1)  To deinstall the patch PHSS_25539 run swremove:

	      NOTE: MAKE SURE THAT NO AGENT OF THE PLATFORM
	            ADDRESSED BY THIS PATCH IS DISTRIBUTED (either
	            from the ITO Administrator's GUI or from
	            command line using inst.sh) WHILE RUNNING
	            SWREMOVE.

	      If you are running VPO in a MC/ServiceGuard
	      installation make sure to mount the shared
	      disks at the node and only at the node that
	      had them mounted during patch installation.
	      Otherwise restoration of the original files
	      onto the shared disk will fail.

	      # swremove PHSS_25539