Pbsjob template.sh

#! /usr/bin/zsh
#
# Template for Torque or PBS (Portable Batch System) script
#
# Remarks: a line beginning with # is a comment;
#          a line beginning with #PBS is a pbs command;
#          commands are case-sensitive.
#
# Usage:   submit job with
#             qsub pbsjob.sh
#          this template cannot be submitted directly!
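#          after submission the job status can be checked with
#             qstat
#          and a queued or running job can be deleted with
#             qdel job_id
#          (qstat and qdel are the standard Torque/PBS client commands)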
#
#===========================================================================
#                     PBS Job Parameters
# All parameters start with #PBS, so that the shell ignores them,
# but the PBS server reads them! The parameters have to be defined before
# any executable shell line appears.
#===========================================================================
#
# Job name (default is the name of pbs script file)
#PBS -N job_name
#
# The standard and error output of your job are normally written to special
# files in the job working directory and are delivered back to the directory
# from which the job was submitted after it has finished. You can change the
# destinations with the following two settings. You can provide absolute
# path names or paths relative to the submission directory.
#
# Path/filename for standard output. Default: [job_name].o[job_id]
#PBS -o myjob.out
#
# Path/filename for error output. Default: [job_name].e[job_id]
#PBS -e myjob.err
#
# If you submit jobs from an external host, PBS will try to deliver the output
# and error files back using scp from the root account. Hence, you would have
# to allow root on stud to log in as root on your machine without a password.
# This can be avoided by requesting that the files be put on stud:
#PBS -o stud.kip.uni-heidelberg.de:/data/users/einstein/jobs/lastjob.out
#PBS -e stud.kip.uni-heidelberg.de:/data/users/einstein/jobs/lastjob.err
#
# Queue name (e.g. Instant, Short, Medium, Long, Eternal). Default: Medium
#PBS -q queue_name
# If the default PBS server is not set on your machine, then type instead
#PBS -q queue_name@stud.kip.uni-heidelberg.de
#
# There are two different ways to set the default server on your machine:
# 1. Set the shell environment variable PBS_DEFAULT, like:
#         export PBS_DEFAULT=stud.kip.uni-heidelberg.de
#    Put this line into your ~/.profile or /etc/profile.d/pbs.sh
# 2. Put the server name into the file: /var/spool/pbs/server_name, like:
#    echo stud.kip.uni-heidelberg.de > /var/spool/pbs/server_name
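#
# Either way, the setting can be verified afterwards with the standard
# PBS client command qstat, which should then list the queues of stud:
#         qstat -q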
#
# Execution host (this option increases the job's waiting time in the queue, so it is better to leave it out!)
#PBS -l host=mare05.atlas-farm.kip.uni-heidelberg.de
#
# Send me e-mail when job begins - rarely needed
#PBS -m b
# Send me e-mail when job ends - usually desirable
#PBS -m e
# Send me e-mail when job aborts with an error - usually desirable
#PBS -m a
# Or collect several e-mail options together:
#PBS -m ae
#
# Send e-mail not to me at the submitting host but to some other address(es):
# Always specify this option, as correct mail delivery on stud is not guaranteed!
#PBS -M einstein@kip.uni-heidelberg.de,bohr@cern.ch
#
# Do not rerun this job if it fails
#PBS -r n
#
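# Put together, a typical job header built from the options above could
# look like this (the job name, queue and address are placeholders):
#
#         #PBS -N my_analysis
#         #PBS -q Medium
#         #PBS -m ae
#         #PBS -M einstein@kip.uni-heidelberg.de
#         #PBS -r n
#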
#===========================================================================
#               PBS Environment Variables
#===========================================================================
#
# When a batch job starts execution, a number of environment variables are
# predefined, which include:
#
#      Variables defined on the execution host.
#      Variables exported from the submission host with
#                -v (selected variables) and -V (all variables).
#      Variables defined by PBS.
#
# The following reflect the environment where the user ran qsub:
# PBS_O_HOST      - the host from which you ran the qsub command;
# PBS_O_LOGNAME   - your user ID where you ran qsub;
# PBS_O_HOME      - your home directory where you ran qsub;
# PBS_O_PATH      - the PATH environment variable where you ran qsub;
# PBS_O_SHELL     - your SHELL environment variable where you ran qsub;
# PBS_O_MAIL      - the MAIL environment variable of the submitter;
# PBS_O_WORKDIR   - the working directory, from which you ran qsub;
# PBS_O_QUEUE     - the original queue you submitted to;
# PBS_QUEUE       - the queue the job is executing from;
# PBS_JOBID       - the job's PBS identifier;
# PBS_JOBNAME     - the job's name;
# PBS_NNODES      - the submitter's "size" resource request;
# PBS_ENVIRONMENT - is set to PBS_INTERACTIVE or PBS_BATCH.
#
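# For example, a few of these variables can be printed at the start of the
# job; this often helps to find out later where and how a job ran:
#
echo "Job ${PBS_JOBID} (${PBS_JOBNAME}) running on $(hostname), queue ${PBS_QUEUE}"
echo "Submitted from ${PBS_O_HOST}, submit directory ${PBS_O_WORKDIR}"
#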
#===========================================================================
#               Unique Directory on the Scratch Disk
#===========================================================================
#
# The unique directory for a job is created automatically
# in /scratch/pbstmp.${PBS_JOBID} on the execution host.
# The subdirectory is removed after the job has finished.
# The total amount of data in this directory must not exceed 20 GB!
#
cd /scratch/pbstmp.${PBS_JOBID}
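#
# A slightly more defensive variant (optional) aborts the job if the
# directory could not be entered, e.g. when the script is started by hand
# outside of PBS:
#
#         cd /scratch/pbstmp.${PBS_JOBID} || exit 1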
#
#===========================================================================
#               Fetch Your Files if Necessary
#===========================================================================
#
# This can be anything: sources, executables, input, steering etc.
#
scp stud:path_name/filename .
#
# Or from an NFS-mounted directory:
#
cp /data/x01/users/${LOGNAME}/filename .
#
# It is better to copy big files to the local directory than to access
# the remotely mounted directory during execution. This speeds up the
# execution, reduces the network traffic, and protects the job against
# possible network problems during its run.
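#
# If the input is essential, it is a good idea (optional) to abort early
# when the copy fails, instead of letting the job run without its input:
#
#         scp stud:path_name/filename . || exit 1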
#
#===========================================================================
#               Make the Executable if Necessary
#===========================================================================
#
/usr/bin/make exec_file_name
#
#===========================================================================
#               Initialise ATLAS ATHENA Environment if Necessary
#===========================================================================
#
source /atlas/athena/11.2.0/setup.sh
source /atlas/athena/11.2.0/dist/11.2.0/Control/AthenaRunTime/AthenaRunTime-00-00-06/cmt/setup.sh
#
#===========================================================================
#               Run the Job
#===========================================================================
#
./exec_file_name parameters < steer_file > output_file 2> error_file
#
# If you do not redirect the output and the error output,
# they will be collected in the job output and error files
# specified by the PBS options -o and -e (see above).
# If you write 2>&1, the error output is written to the same
# file as the standard output.
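# For example, to merge the error output into output_file:
#
#         ./exec_file_name parameters < steer_file > output_file 2>&1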
#
# The input file steer_file is read at the moment the executable starts,
# i.e. when the job starts running. This can happen long after the job
# was submitted. If you change steer_file in the meantime, the changed
# version will be read. Alternative: read the standard input not from
# a file but from this script:
#
./exec_file_name parameters <<EOF > output_file 2>&1
steering parameters
more steering parameters, whatever the executable needs
etc.
EOF
#
# Instead of EOF, any other word can be used to mark the beginning
# and the end of the input text block. The advantage of this method is
# that the text is saved at the time the job is submitted.
#
# For ATHENA jobs the job options file can be given as a shell parameter.
# It is convenient to make a copy of this file for each job in the scratch
# directory and to move it to the output directory at the end of the job:
#
cp /data/x01/users/${LOGNAME}/my_job_options.py my_job_options_${PBS_JOBID}.py
athena.py my_job_options_${PBS_JOBID}.py > athena_${PBS_JOBID}.out 2>&1
#
#===========================================================================
#               Deliver the Output
#===========================================================================
#
# Here you copy back the results: the output_file, error_file, etc.
#
scp resulting_file_name host:destination_path_name
#
# Or to the NFS-mounted directory:
#
cp ./output_file /data/x01/users/${LOGNAME}/
#
# For the ATHENA example, move the job steering and output files:
#
cp my_job_options_${PBS_JOBID}.py athena_${PBS_JOBID}.out /data/x01/users/${LOGNAME}/
#
#===========================================================================
#               Clean-up
#===========================================================================
#
rm -f ./mylocalfiles
#
