High Performance Computing - School of Chemistry

General

University of Glasgow IT services provides a general-purpose high-performance computing cluster for researchers within the University. It can be used by staff members and postgraduate students to support their research.

General information about the cluster, such as

  • obtaining an account
  • configuring your account before first use
  • support
  • hardware and software specifications

can be found here.

To get access to chemistry software, members of the School of Chemistry should add the note “add me to compchem group” to their registration form.
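Once you have been added to the compchem group, a quick way to check that it has taken effect is to list the shared chemistry software area after logging in to the cluster:

ls /export/projects/compchem/software

If this fails with a permission error, the group change has probably not been applied yet.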

If you have no Linux or High Performance Computing experience, you may find this Linux tutorial and the HPC help documents useful. IT services regularly organise training courses for people new to HPC.

If you have problems configuring your HPC account before first use, contact our chemistry IT admin, Stuart Mackay, who will be able to help you.

Chemistry software

We have installed a number of software packages on the University cluster that are available only to staff and research students of the School of Chemistry:

  • Gaussian16 rev C.01
  • Turbomole 6.4
  • Orca 5.0.4
  • Gromacs 5.0.5
  • NAMD 2.9
  • Autodock

To help our staff use the chemistry software, we have prepared a system configuration file and sample scripts for submitting Gaussian, Gromacs, Orca, and Turbomole jobs.

Configuring your account to use chemistry software

After you have set up SSH and email forwarding for your new cluster account (as detailed here), you need to configure your account to allow easy access to chemistry software. You only have to do this once.

  1. Log in to the University cluster with your username and password.
  2. Type the following command:

/export/projects/compchem/software/chem_config.sh

You are all set now! Again, if you have problems with this, contact Stuart Mackay.
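For reference, the complete first-time setup is just the following (the hostname is a placeholder; use the cluster address given in the IT services documentation):

ssh username@<cluster-address>
/export/projects/compchem/software/chem_config.sh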

Submitting Gaussian / Turbomole / Orca / Gromacs jobs

Sample scripts for submitting jobs with chemistry software are located in: /export/projects/compchem/software/SETTINGS_AND_SCRIPTS/bash

  • for Gaussian 16, use /export/projects/compchem/software/run-g16
  • gromacs.run is for Gromacs 5.0.5
  • turbomole.run is for Turbomole 6.4
  • orca3.run is for Orca 3.0.3
  • orca4.run is for Orca 4.0.0.2

Copy the file(s) you need to your home directory. Each script has a header that explains how to use it and a customizable section that lets you change the CPU, memory, and time requirements of the job. If you have problems using these scripts, contact us.
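For example, to set up the Turbomole script (the same pattern works for the others):

cp /export/projects/compchem/software/SETTINGS_AND_SCRIPTS/bash/turbomole.run ~/
less ~/turbomole.run    # read the usage header before editing anything

The header at the top of the file explains the usage and which lines you may edit.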

Gaussian notes:

Except for very specialised situations, old G09 input files should run “as is” with G16, so you should be able to use existing input files without further changes.

However, if you rely on exact reproducibility or continuity, you may want to rerun an old input with G16 and compare the results 1:1. In most cases there should be no difference, but G16 uses slightly different defaults in some cases, and these can affect the results. If this matters for your work, the easiest solution is probably to rerun the whole set of calculations with G16, or to check the G16 manual for whether and how the old defaults can be restored.

To submit G16 jobs, use the script /export/projects/compchem/software/run-g16.
Usage: /export/projects/compchem/software/run-g16 <Gaussian 16 input file> [PBS options]
Specify requested wall time with: -l walltime=hh:mm:ss [default is 24 h]
CPU time is automatically set to walltime * #cores, as specified in the G16 input.

The script reads your Gaussian input file, detects the number of cores requested, and submits the job to the queuing system. The requested CPU time is walltime × cores; for example, with %NProcShared=8 and the default 24-hour walltime, the job asks for 8 × 24 = 192 hours of CPU time. If you do not specify the expected runtime (walltime) of the calculation, it defaults to 24 hours and the job is submitted to the short queue.

Example: run-g16 myinput.com -l walltime=72:0:0

and your input file might begin with lines like:

%mem=8GB
%chk=test1.chk
%NProcShared=8
#P M062X SVP opt=(tight) int(grid=ultrafine)
scrf=(pcm,solvent=acetonitrile)

and so on....
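Once a job has been submitted, you can monitor or cancel it with the standard PBS commands (this assumes a PBS-style scheduler, as the [PBS options] in the usage line above indicates):

qstat -u $USER     # list your queued and running jobs
qdel <job-id>      # cancel a job you no longer need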


Disk space for Chemistry

The School of Chemistry has its own disk space on the University cluster at:

/export/projects/compchem

This means that /export/scratch/username is a link to /export/projects/compchem/username. Because this is a slow network drive, you should not use it as scratch space for your computations. Instead, use the local storage available on the compute nodes. If you use one of the scripts we have prepared, the script takes care of this for you.
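If you write your own submission scripts, a minimal sketch of this local-scratch pattern looks like the following (it assumes the scheduler sets $TMPDIR to a directory on the node's local disk, and the file names are placeholders):

#!/bin/bash
# Hypothetical job body: run on node-local scratch, then copy results home.
cd "$TMPDIR"                   # node-local scratch (assumed to be set by the scheduler)
cp "$HOME/myjob/input.com" .   # stage the input from the slow network drive
g16 input.com                  # run the calculation on fast local disk
cp input.log "$HOME/myjob/"    # copy results back before the job ends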

Be aware that the University cluster has very limited disk space and no backup policy at all. Therefore, you should not store large amounts of data on the cluster, and you should back up all important data somewhere else.
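For example, you can periodically pull your results to your own machine with rsync (run this from your own machine; the hostname and paths are placeholders):

rsync -av username@<cluster-address>:/export/projects/compchem/username/results/ ~/cluster-backup/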