Posted: 23 Jun 2021 | 10:05
For over a decade our community has enjoyed significant performance benefits by leveraging heterogeneous supercomputers. Whilst GPUs are the most common form of accelerator, other hardware technologies can complement them.
Field Programmable Gate Arrays (FPGAs) enable developers to configure the chip directly, effectively allowing their application to run at the electronics level. Tailoring code execution and avoiding the general-purpose architecture imposed by CPUs and GPUs offers potential performance and power benefits, and as such FPGAs have been popular in embedded computing for many years, but they have yet to see significant uptake in HPC.
Posted: 23 Jun 2021 | 09:15
Computing doesn’t come much more novel than quantum computing.
Classical computers rely on the manipulation of bits: physical systems (usually transistors) that can be found in one of two states, which we label 0 and 1. Quantum computers, on the other hand, use quantum bits (qubits). A qubit is still measured in one of two physical states, labelled 0 and 1, but it can exist in any linear superposition of the two: in a sense, it can be in 0 and 1 at the same time.
Such a fundamental change at such a low level means that programming a quantum computer, or even developing quantum algorithms, is very different to what we’re used to.
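As a rough illustration of the idea (a minimal sketch using plain NumPy, not any particular quantum programming framework), a qubit's state can be modelled as a normalised pair of complex amplitudes, and a gate as a unitary matrix acting on it:

```python
import numpy as np

# A qubit's state is a normalised vector of two complex amplitudes
# (alpha, beta): a measurement yields 0 with probability |alpha|^2
# and 1 with probability |beta|^2.
zero = np.array([1.0, 0.0], dtype=complex)   # the "0" state

# The Hadamard gate maps the "0" state into an equal superposition
# of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]: equal chance of measuring 0 or 1
```

Classically simulating qubits like this scales exponentially in the number of qubits, which is precisely why real quantum hardware is interesting.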
Posted: 7 Jun 2021 | 10:05
The European Commission has launched a network of National Centres of Excellence in High-Performance Computing (HPC), High-Performance Data Analytics (HPDA) and Artificial Intelligence (AI). The UK centre will be delivered jointly by EPCC and the Hartree Centre.
Posted: 24 May 2021 | 11:11
Back in February, I reviewed the usage of different research software on ARCHER over much of its lifetime. Now that we have come to the end of the first month of charged use on the ARCHER2 4-cabinet system, I thought it would be interesting to take an initial look at how people are using the new system in terms of research software and job sizes.
Novel ammonia-hydrogen sulphide mixtures under extreme conditions: Implications for ice-giant planets
Author: Guest blogger
Posted: 12 Feb 2021 | 13:14
Sudip Kumar Mondal from Jadavpur University in Kolkata visited the University of Edinburgh through the HPC-Europa3 Transnational Access programme. Running from 11 October to 29 January, Sudip's visit was unlike most others because the Covid pandemic was ongoing. In this blog post Sudip describes his experiences of coming to and working in Edinburgh.
My doctoral research focuses on the physical behaviour of naturally occurring mineral phases at high temperatures and pressures, employing quantum chemical simulations of conditions that are hard to realise in the laboratory. Dr Andreas Hermann, my supervisor at the University of Edinburgh, was ideally placed to supervise this project, being an expert on materials at extreme conditions.
Posted: 13 Jan 2021 | 10:35
Work on the Edinburgh International Data Facility has passed three key milestones, bringing the infrastructure that will underpin the £600m Data-Driven Innovation Programme significantly closer to reality.
Firstly, and perhaps most importantly, EIDF’s home, Computer Room 4 (cr4) at the University’s Advanced Computing Facility, completed its main construction phase at the end of the third quarter of 2020 and has now entered its commissioning and fit-out phase. If everything goes to plan, we will start to build infrastructure in the room from January 2021.
Posted: 2 Dec 2020 | 14:20
The release of Fujitsu’s A64FX CPU has been a high point in an otherwise disappointing year. This next-generation CPU is the brain in Fugaku, the supercomputer at RIKEN in Japan, which was number one in the June 2020 TOP500 list.
Since February, Fujitsu has given EPCC access to a development A64FX machine as part of an early-access programme. We have been exploring the performance of this technology applied to numerous HPC workloads.
Posted: 16 Nov 2020 | 14:07
EuroCC will raise participating countries to a common high level in high-performance computing (HPC), high-performance data analytics (HPDA), and artificial intelligence (AI).
National Competence Centres (NCCs) will be responsible for surveying and documenting the core HPC, HPDA, and AI activities and expertise in each participating country. The ultimate goal is to make HPC available to different users from science, industry, public administration, and society.
Posted: 12 Nov 2020 | 16:41
Friday 13 November, 11:05–11:35 am ET
This Friday I am presenting work on accelerating Nekbone on FPGAs at the SC20 H2RC workshop. This is driven by our involvement in the EXCELLERAT CoE, which is looking to port specific engineering codes to future exascale machines.
One technology of interest is the FPGA, and a question for us was whether porting the most computationally intensive kernel to FPGAs, and enabling the continual streaming of data, could provide performance and/or energy benefits compared with a modern CPU or GPU. We are focusing on the AX kernel, which applies the Poisson operator and accounts for approximately 75% of the overall code runtime.
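To give a flavour of what "applying the Poisson operator" means (Nekbone's actual AX kernel works on higher-order spectral elements, so this is only a loose, hypothetical sketch of the same matrix-free pattern, using a simple 5-point finite-difference stencil in NumPy):

```python
import numpy as np

# Illustrative only: a matrix-free discrete Poisson (Laplace) operator
# on a 2D grid via the classic 5-point stencil. The operator is applied
# repeatedly to a field, e.g. inside a conjugate-gradient solve, which
# is why it dominates runtime and why streaming it through an FPGA
# pipeline is attractive.
def apply_poisson(u):
    """Return A*u for the 5-point Laplacian on interior points."""
    out = np.zeros_like(u)
    out[1:-1, 1:-1] = (4.0 * u[1:-1, 1:-1]
                       - u[:-2, 1:-1] - u[2:, 1:-1]
                       - u[1:-1, :-2] - u[1:-1, 2:])
    return out

u = np.ones((6, 6))        # a constant field
v = apply_poisson(u)
print(v[2, 2])             # 0.0: the Laplacian of a constant is zero
```

On an FPGA, a kernel with this structure can be laid out as a deep pipeline so that one grid point is processed per clock cycle, with data streamed through rather than repeatedly fetched from memory.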
Posted: 10 Nov 2020 | 09:44
Join the workshop: Friday 13 November, 2:30pm–6:30pm ET
At SC20 this year we are chairing UrgentHPC, the second international workshop on the use of HPC for urgent decision making. The idea of the workshop is to explore the fusion of HPC, big data, and other technologies in responding to disasters such as global pandemics, wildfires, hurricanes, extreme flooding, earthquakes, tsunamis, severe winter weather, and accidents. Whilst HPC has a long history of simulating disasters, we believe that technological advances are creating exciting new opportunities to support urgent decision making in real time during emergencies.