Clon01

SUN Blade 2000, 2x900MHz Ultra SPARC-IIIi, 2GB RAM. Primary use - EPICS applications.

After Solaris installation and customization, start the cron jobs from the epics account. The crontab file is ''~epics/cron/crontab.txt'', so the cron jobs can be installed with the command

crontab crontab.txt

The up-to-date cron job file is kept by the system in ''/var/spool/cron/crontabs/epics'', so if the ''/var'' directory was saved before reinstalling Solaris, you can check the current jobs there.
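For example, if the saved copy of ''/var'' is mounted somewhere, the old crontab can be compared with the file in ''~epics/cron'' and reinstalled directly; the mount point ''/mnt/oldvar'' below is only an illustration, not part of the actual setup:

# run as the epics user; /mnt/oldvar is a hypothetical mount point for the saved /var
diff /mnt/oldvar/spool/cron/crontabs/epics ~epics/cron/crontab.txt
crontab /mnt/oldvar/spool/cron/crontabs/epics
crontab -l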


== old info - not valid ==

Data Mover (move2silo, EPICS)

=== CLON01 ===
SUN Blade 2000, 2x900MHz Ultra SPARC-IIIi, 2GB RAM

This is a '''Sun Blade 2000''' dual-CPU (SPARC IIIi) server whose primary use in the '''CLAS DAQ''' is as the '''Data Mover'''. Its secondary use is as an '''EPICS''' visualisation workstation.

This server also runs the following services (quick status checks are sketched after the list):

* VERITAS Cluster Server
* NIS Slave Server
* INGRES Server
* CLON Printer Server (for the CLONHP and CLONHP2 printers)
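The checks below use standard client tools (VERITAS ''hastatus'', NIS ''ypwhich'', the ''lpstat'' print client); they are illustrative and not part of the original setup:

/opt/VRTSvcs/bin/hastatus -sum    # VERITAS Cluster Server group and system summary
ypwhich                           # current NIS binding of this host
lpstat -p                         # status of the configured print queues (CLONHP, CLONHP2)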

==== Data Mover ====

Location: /usr/local/system/raid

Program: move2silo

Logs: logs/

To start it manually:

 
[root@clon01]$ cd /ssa
[root@clon01]$ mv presilo1 gopresilo
[root@clon01]$ move2silo >& /dev/null & 
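To confirm that the mover is running, check the process table and the log directory; the path below assumes the ''Logs: logs/'' entry above is relative to /usr/local/system/raid:

[root@clon01]$ ps -ef | grep move2silo
[root@clon01]$ ls -lt /usr/local/system/raid/logs | head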

To move the Data Mover to '''CLON10''', edit the file '''/usr/local/system/raid/checkraid''', uncommenting the CLON10 line and commenting out the CLON01 line:

...
# for CLON10
#  /usr/local/system/raid/move2silo >& /usr/local/system/raid/log &

# for CLON01
  rsh clon01 /usr/local/system/raid/move2silo ">&" /usr/local/system/raid/log "&"
...
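For reference, after the switch to CLON10 the same fragment of ''checkraid'' would look roughly like this (a sketch only; the rest of the file is unchanged):

...
# for CLON10
  /usr/local/system/raid/move2silo >& /usr/local/system/raid/log &

# for CLON01
#  rsh clon01 /usr/local/system/raid/move2silo ">&" /usr/local/system/raid/log "&"
...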

==== Memory Monitoring Tool CEDIAG ====

Location: /opt/SUNWcest/bin/cediag

[root@clon01 ~]$ cediag
cediag: Revision: 1.78 @ 2005/02/11 15:54:29 UTC
cediag: Analysed System: SunOS 5.8 with KUP 117350-06 (MPR active)
cediag: Pages Retired: 0 (0.00%)
cediag: findings: 0 datapath fault message(s) found
cediag: findings: 0 UE(s) found - there is no rule#3 match
cediag: findings: 0 DIMMs with a failure pattern matching rule#4
cediag: findings: 0 DIMMs with a failure pattern matching rule#5