IMPORTANT NOTE FOR OFF-CAMPUS CONNECTIONS

Both Grace and the HPC clusters require that all logins are routed through the UEA campus network. To achieve this from off campus, you must first connect to the UEA VPN.

Further guidance on VPN connections for your device is available on the online wiki.
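Once the VPN is up, logging in is a standard SSH session. A minimal sketch, assuming the login hostname is grace.uea.ac.uk and using a placeholder username (check the online wiki for the actual address and your account details):

```shell
# 1. Connect to the UEA VPN first (see the wiki for client setup).
# 2. Then open an SSH session using your UEA username.
#    "abc12xyz" and "grace.uea.ac.uk" are illustrative placeholders.
ssh abc12xyz@grace.uea.ac.uk
```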

Grace - UEA's High Performance Compute Cluster 


If you would like to use Grace, please complete our online account application.

Following a 2014 update, the university's research community can now benefit from an enhanced High Performance Computing (HPC) cluster. Grace, the UEA HPC cluster, is now a 334-node system providing a total of 4,784 cores, including parallel and large-memory resources (full specifications below). Named in recognition of the contribution of female IT pioneers such as Grace Hopper, the name also provides a relevant and appropriate acronym: 'Greener Research Computing Environment'.

Building on several years of successful High Performance Computing provision to the research community at UEA, in 2010 the Research and Specialist Computing Services set out to develop an ongoing partnership with an HPC provider. Viglen Ltd shares UEA's goals for developing High Performance Computing, and we are excited to build a truly collaborative partnership that keeps Research Computing at UEA at the forefront of the new hi-tech Britain. This meant meeting a number of challenges: providing an effective and reliable HPC resource that fits the research community's requirements, with a focus on sustainable IT as well as accessibility and usability.

Grace Technical Specifications

A total of 4,784 CPU cores and 896 GPU cores

Running the Red Hat-compatible CentOS 5.8 and the Platform LSF workload manager, the cluster consists of:

  • 32 Ivy Bridge dual 10-core Intel E5-2670V2 2.5GHz processor systems (20 cores) with 64GB of RAM

  • 132 Sandy Bridge dual 8-core Intel E5-2670 2.6GHz processor systems (16 cores) with 32GB of RAM

  • 160 Westmere dual 6-core Intel X5650 2.66GHz processor systems (12 cores) with 24GB of RAM

  • Westmere dual 6-core Intel X5650 2.66GHz processor systems (12 cores) with 48GB of RAM

  • 1 Westmere quad 4-core Intel E7440 processor system (16 cores) with 128GB of RAM

  • A large Quad Data Rate (QDR) InfiniBand communication network of 160 nodes (2,464 parallel cores)
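Since the cluster uses the Platform LSF workload manager, work is submitted as batch jobs rather than run on the login node. The following is a minimal sketch of an LSF job script; the job name, core count, file names, and wall-clock limit are illustrative assumptions, and local queue names and limits should be taken from Grace's own documentation:

```shell
#!/bin/bash
# Minimal example LSF job script (values here are placeholders).
#BSUB -J myjob            # job name
#BSUB -n 12               # number of cores requested
#BSUB -o myjob.%J.out     # standard output file (%J expands to the job ID)
#BSUB -e myjob.%J.err     # standard error file
#BSUB -W 01:00            # wall-clock limit (hh:mm)

# Replace with the actual program you want to run:
./my_program
```

The script is submitted to the scheduler with `bsub < myjob.sh`, and its status can be checked with `bjobs`.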