High Performance Compute Cluster - hpc.uea.ac.uk

 


Prior to using the HPC service, you will need to register for an account. We aim to set up all new accounts within two working days.

IMPORTANT NOTE

If you wish to connect to the HPC system from offsite, you will need to connect to the UEA VPN service (vpn.uea.ac.uk) first, and then follow the normal SSH connection procedure (PuTTY or a terminal) to connect to the cluster.

Further guidance about the UEA VPN service is available from the online wiki.
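As a rough illustration, once connected to the UEA VPN, an SSH session from a terminal might look like the following. The hostname hpc.uea.ac.uk is taken from the title of this page; replace your-username with your own HPC account name:

    # Connect to the UEA VPN first, then open an SSH session to the cluster login node
    ssh your-username@hpc.uea.ac.uk

PuTTY users can enter the same hostname in the Host Name field and connect on the default SSH port (22).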

If you require specific software to be installed on the HPC cluster, please complete the following form.

Please note that whilst we attempt to complete all new application installations within two working weeks, this is not always possible.

The HPC cluster consists of the following components:
 

Node Names     | Node Type       | CPU            | Cores (Total) | Clock Speed | Memory     | Interconnect                 | IB Fabric | Queues      | Usage
e0001 to e0092 | Standard        | Haswell (v3)   | 16            | 2.6GHz      | 64GB DDR3  | 1Gb/s Eth                    | NA        | *-eth       | Standard workflows
e0093 to e0152 | Standard        | Broadwell (v4) | 16            | 2.1GHz      | 64GB DDR4  | 1Gb/s Eth                    | NA        | *-eth       | Standard workflows
i0001 to i0040 | Infiniband (IB) | Haswell (v3)   | 24            | 2.3GHz      | 128GB DDR3 | 1 x 40Gb/s IB, 1 x 1Gb/s Eth | QLogic    | *-ib        | Parallel and/or I/O-intensive/threaded jobs
i0041 to i0081 | Infiniband (IB) | Broadwell (v4) | 28            | 2.4GHz      | 128GB DDR4 | 1 x 56Gb/s IB, 1 x 1Gb/s Eth | Mellanox  | mellanox-ib | Parallel and/or I/O-intensive/threaded jobs
g0001 to g0008 | GPU (Tesla K40) | Haswell        | 16            | 2.6GHz      | 64GB DDR4  | 1 x 1Gb/s Eth                | NA        | gpu         | GPU-aware jobs
hm001          | Huge memory     | Broadwell      | 16            | 2.1GHz      | 128GB      | 1 x 1Gb/s Eth                | NA        | huge-mem    | Large memory jobs
hm002          | Huge memory     | Broadwell      | 16            | 2.1GHz      | 512GB      | 1 x 1Gb/s Eth                | NA        | huge-mem    | Large memory jobs
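The Queues column determines where a job runs. As a rough sketch only, assuming an LSF-style batch scheduler (the scheduler in use is not stated on this page) and a hypothetical queue name matching the *-ib pattern, a parallel job submission might look like this:

    # Hypothetical example: submit a 24-core MPI job to an Infiniband queue
    # (the queue name "long-ib" is illustrative; check the actual queue list on the cluster)
    bsub -q long-ib -n 24 -J my_mpi_job -o output.%J mpirun ./my_application

If the cluster runs a different scheduler (for example Slurm), the equivalent submission would use sbatch with a -p partition option instead.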