We offer tightly integrated OpenMPI and MPICH2 on the HPCC.
MPI Selection (module)
The default MPI module is `mpi/openmpi-x86_64` (GNU CC/C++ & Fortran). If you wish to use the Intel compilers or MPICH2, edit your `~/.bashrc` file, find the ‘MPI SELECTION’ section, and uncomment ONLY the MPI module that you wish to use, before any MPI commands are run on the HPCC.
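For illustration only, the ‘MPI SELECTION’ section of `~/.bashrc` might look like the sketch below. Only the default module name comes from this page; the other module names are placeholders, so check `module avail` on the HPCC for the exact names.

# MPI SELECTION -- uncomment ONLY the one module you wish to use
module load mpi/openmpi-x86_64     # default: OpenMPI, GNU compilers
#module load mpi/mpich2-x86_64     # MPICH2 (placeholder module name)
#module load mpi/openmpi-intel     # OpenMPI, Intel compilers (placeholder module name)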
C MPI on the Grid
More information: C/C++ on the HPCC (including MPI)
Fortran MPI on the Grid
Submitting a Fortran MPI Job
Remember the job/slot limits on the job queues described under Job Management.
Create Fortran MPI Commands File
Create a .f file with your commands, for example: Fortran/Sample MPI Commands File
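As a minimal sketch (not the linked sample), a fixed-form Fortran MPI program that prints one line per rank could look like this:

      program hello
c     Fortran 77 MPI bindings
      include 'mpif.h'
      integer ierr, rank, nprocs
c     Start MPI, then query this process's rank and the job size
      call MPI_INIT(ierr)
      call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
      call MPI_COMM_SIZE(MPI_COMM_WORLD, nprocs, ierr)
      print *, 'Hello from rank', rank, 'of', nprocs
c     Shut MPI down cleanly
      call MPI_FINALIZE(ierr)
      end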
Compile Fortran MPI Code
Compile the .f commands file with the appropriate compiler:
qrsh mpif90 -o your-commands-file your-commands-file.f
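For example, with the sketch above saved as hello.f (a filename chosen here for illustration):

qrsh mpif90 -o hello hello.f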
Create Fortran MPI Job Script
Create a .sh file with at least the following contents:
#!/bin/bash
mpiexec your-commands-file
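If you prefer to keep the scheduler options inside the script, Grid Engine also reads embedded `#$` directives; a sketch (both directives are optional and mirror the qsub flags shown below):

#!/bin/bash
#$ -cwd                # run from the directory the job was submitted from
#$ -pe openmpi 4       # request 4 slots in the openmpi parallel environment
mpiexec your-commands-file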
Submit Fortran MPI Job
qsub -pe openmpi 4 your-job-script.sh
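For example, if your script were saved as run-hello.sh (a name chosen here for illustration) and ran the hello program above, you could submit it and then watch the queue with qstat, the standard Grid Engine status command:

qsub -pe openmpi 4 run-hello.sh
qstat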
More information: Job Management