Nick Brown's blog
Posted: 11 Dec 2019 | 15:54
The Met Office relies on some of the world's most computationally intensive codes and works to very tight time constraints. It is therefore important to explore and understand any technology that could accelerate those codes, ultimately allowing it to model the atmosphere and forecast the weather more rapidly.
Field Programmable Gate Arrays (FPGAs) provide a large number of configurable logic blocks sitting within a sea of configurable interconnect. It has recently become easier for developers to convert their algorithms to configure these fundamental components and so execute their HPC codes in hardware rather than software. This has significant potential benefits for both performance and energy usage, but as FPGAs are so different from CPUs or GPUs, a key challenge is how we design our algorithms to leverage them.
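As a minimal sketch of what "executing code in hardware" looks like from the developer's side, the snippet below shows a simple 1D stencil kernel written in HLS-style C (assuming a Xilinx-style high-level synthesis toolchain; the pragma syntax is vendor-specific and the kernel itself is purely illustrative, not a Met Office code):

```c
#include <stddef.h>

/* A simple 1D averaging stencil in HLS-style C. When run through a
   high-level synthesis tool, the pragma asks for a pipelined datapath
   that starts a new loop iteration every clock cycle (initiation
   interval of 1); compiled for a CPU, the pragma is simply ignored
   and this is ordinary C. */
void stencil(const float *in, float *out, size_t n)
{
    for (size_t i = 1; i < n - 1; i++) {
#pragma HLS PIPELINE II=1
        out[i] = 0.25f * in[i - 1] + 0.5f * in[i] + 0.25f * in[i + 1];
    }
}
```

The point is that the same source describes both a software loop and, after synthesis, a dedicated hardware pipeline; the design challenge is structuring the algorithm so that such pipelines keep data flowing every cycle.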
Posted: 9 Dec 2019 | 12:48
MVAPICH is a high-performance implementation of MPI, specialised for InfiniBand, Omni-Path, Ethernet/iWARP, and RoCE communication technologies, yet people generally use whatever default module is loaded on their system. This matters because, as HPC programmers, we often optimise our codes but overlook the potential performance gains from a better choice of MPI implementation.
Posted: 12 Nov 2019 | 11:11
Here in EPCC we lead a work package of the VESTEC EU FET project, which is working on the fusion of real-time data and HPC for urgent decision-making in disaster response. While HPC has a long history of simulating disasters, what's missing to support urgent, emergency decision-making is fast, real-time data acquisition and the ability to guarantee time constraints.
Posted: 30 Aug 2019 | 11:32
The recent installation of Fulhame, the ARM HPC machine based here in EPCC as part of the Catalyst UK programme, opens up plenty of interesting opportunities for exploring the HPC software ecosystem for ARM. One such aspect is the relative performance of different MPI implementations on these machines, which is what I spoke about last week at the MVAPICH User Group (MUG) workshop.
Posted: 5 Jul 2019 | 11:13
The EU VESTEC research project is focused on the use of HPC for urgent decision-making and the project team will be running a workshop at SC’19.
VESTEC will build a flexible toolchain to combine multiple data sources, efficiently extract essential features, enable flexible scheduling and interactive supercomputing, and realise 3D visualisation environments for interactive explorations.
Posted: 17 Jun 2019 | 14:24
When is enough, enough? With so many parallel programming technologies, should we focus on consolidating them?
At the ISC conference in June I will moderate a panel discussion on whether it is time to focus on the consolidation and interoperability of existing parallel programming technologies, rather than the development of new ones.
Posted: 24 Apr 2019 | 11:51
The blog post below is based on the abstract of a talk at the PASC mini-symposium 'Modelling Cloud Physics: Preparing for Exascale' (Zurich, 13 June 2019).
The Met Office NERC Cloud model (MONC) is an atmospheric model used throughout the weather and climate community to study clouds and turbulent flows. It is often coupled with the CASIM microphysics model, which provides the capability to investigate interactions at the millimetre scale and to study the formation and development of moisture. One of the main targets of these models is the problem of fog, which is very hard to model due to the high resolution required: for context, the main UK weather forecast runs at roughly 1 km resolution, whereas the fog problem requires 1 metre or less.
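To see why that resolution gap is so punishing, a back-of-the-envelope calculation helps (assuming, purely for illustration, uniform refinement in all three dimensions; real models treat the vertical differently, and the function name here is hypothetical):

```c
/* Back-of-the-envelope scaling: refining a 3D grid uniformly from
   spacing `coarse` to spacing `fine` multiplies the number of grid
   points per unit volume by (coarse / fine) cubed. */
double grid_point_factor(double coarse, double fine)
{
    double ratio = coarse / fine;
    return ratio * ratio * ratio;
}

/* Going from a 1 km forecast grid to the 1 m grid needed for fog:
   grid_point_factor(1000.0, 1.0) gives 1e9, i.e. a billion times
   more grid points per unit volume, before even considering the
   smaller timestep a finer grid demands. */
```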
Posted: 15 Nov 2018 | 16:21
With jobs submitted to a batch system, supercomputing has traditionally been centred on an offline, non-interactive approach to running codes such as simulations. However, it is our belief that there is great potential in fusing HPC with real-time data as part of urgent decision-making processes in response to natural disasters and crises.
Posted: 27 Jul 2018 | 15:25
We are working on a machine-learning project with Rock Solid Images (RSI), a geoscience consulting firm that provides borehole characterisation with the goal of reducing exploration drilling risk for oil and gas companies.
RSI is one of the main players in interpreting seismic data alongside well-log data, and it has built its business on combining advanced rock-physics methods with sophisticated geologic models to deliver highly reliable predictions of where oil and gas might be found.
Posted: 19 Jun 2018 | 18:39
“There is not a moment to lose” – I don’t know if you have ever read any of the Aubrey-Maturin books by the late Patrick O’Brian, set at the turn of the eighteenth to nineteenth centuries and describing life in the Royal Navy. Even if you have only flicked through one of the books, you will probably have picked up an almost constant sense of urgency (a realistic representation of what pervaded the Navy at that time) in the books, much to the annoyance of the decidedly un-Navy-like Dr Maturin!
Considering the modern pace of change, I think this sentiment is truer today, especially in scientific fields, than it has ever been before. Certainly from my perspective there is an urgency to push forward the state of the art in HPC and share it, before other people's activities supersede my work. However, I think this same sense of urgency also applies to other, non-technical, aspects of our community. Diversity is a prime example here and, whilst there are some excellent initiatives being adopted by the likes of the Supercomputing (SC) and ISC conferences, we still have a long way to go.