GPU and cluster computing
This page lists the GPU and other compute clusters available to the School of Informatics.
Why use GPUs for computing? See the GPGPU Computing page. To learn how to do it right, see the GPU cluster tips page.
1. GPGPU desktops
Some computers in the Informatics student labs are equipped with GPUs for GPGPU computing. They're in rooms 5.05 and 9.02 of Appleton Tower. Room 9.02 is restricted to final-year undergraduates until the end of semester 2, when MSc students are granted access. Look for the bigger computers with "This is a GPGPU desktop" on their login screens. You can test GPU code on these machines before running it for real on a GPU cluster. Currently these machines include:
AT-5.05: ariane, epoch, link, soyuz, turpie
AT-9.02: atlas, glenn, kubrick, nolan, russo, starship, stronsay, tarantino, waititi, wumpus
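Before testing GPU code on one of these desktops, it is worth confirming that the machine you are logged into actually has a usable GPU. A minimal sketch, assuming the NVIDIA driver tools are installed (as they are on the GPGPU desktops, but not on the other lab machines):

```shell
# List the GPUs in this machine, if any.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi -L    # prints one line per detected NVIDIA GPU
else
    echo "nvidia-smi not found - this does not look like a GPGPU desktop"
fi
```

`nvidia-smi` (with no arguments) also shows driver version, memory use and any processes currently running on the GPU, which is useful for checking that a desktop is idle before starting a test run.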
2. Informatics student clusters
The Teaching cluster is for MSc and Machine Learning Practical students, but it can be used by others when it is not needed for those courses. It has GPUs. If you do not automatically get access, please submit an RT ticket.
The Research cluster is for PGR students. It has GPUs. If you do not automatically get access, please submit an RT ticket.
The James and Charles cluster is for Pervasive Parallelism CDT and Data Science CDT students. It has GPUs.
The Hadoop cluster is for Extreme Computing students, but it can be used by others when not needed for that module. It is dedicated to Hadoop jobs and has no GPUs.
3. Informatics-affiliated research units
The Institute for Language, Cognition and Computation has a CPU/GPU cluster. The GPU nodes barre, greider, levi, mcclintock, moser, nuesslein and ostrom each have four GTX 1080 Tis, and youyou has Maxwell Titan Xs. There is also specific provision in this cluster for CDT in NLP students.
Access to the cluster is moving to Slurm; there is some initial documentation at the link below.
If you're an ILCC member or an NLP CDT student, you should have access to the cluster automatically. If you have problems accessing the head node, submit an RT ticket mentioning that you're an ILCC/CDT member.
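On a Slurm-managed cluster, GPU jobs are submitted as batch scripts with `sbatch`. The following is a hypothetical sketch only: the `--gres` syntax is standard Slurm, but any partition names, accounts or GPU type strings on the ILCC cluster are assumptions, so check the cluster's own documentation for the real values.

```shell
#!/bin/bash
# Hypothetical Slurm batch script for a small GPU test job.
#SBATCH --job-name=gpu-test
#SBATCH --gres=gpu:1          # request one GPU (type string may be required locally)
#SBATCH --time=00:10:00       # short time limit for a quick test
#SBATCH --output=gpu-test-%j.out

nvidia-smi -L                 # confirm which GPU the job was allocated
```

Save this as e.g. `gpu-test.sh` and submit it from the head node with `sbatch gpu-test.sh`; `squeue -u $USER` shows its place in the queue.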
The Centre for Speech Technology Research has GPU, multicore and large memory compute servers. The CSTR computer support page has details.
4. The University of Edinburgh
The Edinburgh Compute and Data Facility (ECDF / Eddie) has a large scale (2000+ cores) Linux cluster, a multi-terabyte parallel filesystem and some GPGPU provision (Nvidia Tesla K80 and Nvidia Titan X). It's free, or you can pay for priority access. For large scale HPC/HTC, ECDF should be your first choice.
Staff and research students get access by asking IS Helpline. Taught MSc students may get access, but only with the agreement of their supervisor, who should contact IS Helpline directly.
The Cloud Computing Service is two OpenStack services: a Research Cloud (called "Eleanor") for academic staff and research students, and an Enterprise Cloud (called "Grace") which can be used by anyone. Each cloud has limited free access and a more extensive paid service.
The EPCC in the Bayes Centre has supercomputers and HPC clusters - the EPCC's HPC page has details. In particular, the EPCC hosts ARCHER2, the UK's premier academic research supercomputer, to which University of Edinburgh users have generous access, and CIRRUS, a UK Tier-2 national HPC facility with hundreds of CPU and GPU nodes.
5. HPC beyond Edinburgh
Time on HPC facilities beyond Edinburgh is available via EPSRC calls. Current calls are listed on the UKRI website.