Collaboration with UK Met Office
Posted: 7 Nov 2014 | 15:39
Very high resolution modelling of clouds and precipitation is one of the tools used to further our understanding of many important aspects of weather and climate. Apart from validating and supporting the UM forecast model, such models can be applied to scientific problems like the simulation of fog, where working at a scale of around one metre is often needed to capture the behaviour accurately. Fog also illustrates how these models are not purely for furthering scientific understanding but have real-world impact: accurately simulating such weather is very important to activities such as commercial aviation, where airports need to give as detailed information as possible about the likelihood of foggy conditions improving enough to permit aircraft operations.
In the late 1980s the Met Office developed its own LES model, the Large Eddy Model (LEM), and over the past two decades meteorologists have used and developed it very successfully to conduct fundamental research. The LEM was initially written for a generation of serial machines and later ported to the T3D, Cray's first massively parallel computer. Because of this retrofitting of parallelism, and the general pace of change, some limitations in scale and performance have become apparent. Scientifically the LEM remains very useful, and there is considerable demand for larger, more detailed simulations at the peta-scale on machines such as ARCHER and the Met Office's own systems. Whilst the LEM runs on these latest-generation machines, it cannot effectively exploit the massive number of processor cores available, and this fundamentally limits the scientific problems it can model.
The Met Office has recently announced the purchase of a new £97m supercomputer, and all eyes will be on the capability this machine can deliver. But as well as the hardware, it is important to have models that can make maximum use of the new technology, so we are working with the Met Office and NERC to write a next-generation LES model called the Met Office NERC Cloud model (MONC). The aim is to create, from the ground up, a model incorporating all of the advanced scientific functionality of the LEM but which scales very well on thousands, if not hundreds of thousands, of cores. Apart from developing parallel code using modern HPC technologies and practices, we also place great emphasis on modern software engineering. This matters not only for the longevity of the model, but also because end-user scientists often need to change the code itself to customise a simulation for their needs. One example is the pluggable architecture we have adopted, in which the majority of the model is made up of independent components. Not only is it trivial to switch these in and out to customise the model, but scientists can also add and modify code in isolation inside a component, without worrying that it might have unintended side effects elsewhere.
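To give a flavour of what such a pluggable architecture looks like, here is a minimal sketch in Python. The component names, the registry API, and the placeholder physics are illustrative assumptions for this post, not MONC's actual interfaces (MONC itself is an HPC code, not Python):

```python
# Minimal sketch of a pluggable-component model: each piece of
# functionality lives in its own component, and components can be
# switched in or out without touching the rest of the model.

class Component:
    """A self-contained piece of model functionality (hypothetical API)."""
    name = "base"

    def timestep(self, state):
        raise NotImplementedError


class CoriolisComponent(Component):
    name = "coriolis"

    def timestep(self, state):
        state["u"] += 0.1  # placeholder physics, not a real scheme
        return state


class DiffusionComponent(Component):
    name = "diffusion"

    def timestep(self, state):
        state["u"] *= 0.5  # placeholder physics, not a real scheme
        return state


class Model:
    """Runs whichever components are enabled, in registration order."""

    def __init__(self):
        self.components = []

    def register(self, component, enabled=True):
        # Switching a component in or out is a one-line configuration change.
        if enabled:
            self.components.append(component)

    def run_timestep(self, state):
        for component in self.components:
            state = component.timestep(state)
        return state


model = Model()
model.register(CoriolisComponent())
model.register(DiffusionComponent(), enabled=False)  # switched out
state = model.run_timestep({"u": 1.0})
```

Because each component only sees the shared state it is handed, a scientist can modify one component in isolation, which is the side-effect containment the paragraph above describes.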
We are currently about halfway through the project and have already demonstrated some early successes. One of the test cases we have modelled in MONC so far is a cold bubble of gas dropping through the atmosphere (because its density is greater than that of the surrounding air). This case is interesting because, as the bubble falls, it leaves behind a region of reduced pressure, so air rushes in to fill the void. We have run this on ARCHER and demonstrated that, compared to the same run in the LEM, MONC can scale many times beyond the LEM's current capabilities and reach a solution much faster.
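The setup of such a cold-bubble test can be sketched very simply: place a cold (negatively buoyant) anomaly in an otherwise neutral atmosphere and let the model evolve it. The grid size, bubble radius, and temperature amplitude below are illustrative assumptions, not the parameters used in the MONC or LEM runs:

```python
import math

# Toy initial condition for a cold-bubble test: a cosine-shaped cold
# potential-temperature anomaly. All parameter values are illustrative.
NX, NZ = 64, 32                   # horizontal and vertical grid points
DX = 100.0                        # grid spacing (m)
CX, CZ = NX // 2 * DX, 24 * DX    # bubble centre (m)
RADIUS = 800.0                    # bubble radius (m)
DTHETA = -5.0                     # peak perturbation (K); negative = cold

def theta_perturbation(x, z):
    """Cosine-shaped cold anomaly inside RADIUS, zero outside."""
    r = math.hypot(x - CX, z - CZ)
    if r >= RADIUS:
        return 0.0
    return 0.5 * DTHETA * (1.0 + math.cos(math.pi * r / RADIUS))

# The buoyancy b = g * theta' / theta0 is negative inside the bubble,
# which is what drives its descent through the surrounding air.
G, THETA0 = 9.81, 300.0

def buoyancy(dtheta):
    return G * dtheta / THETA0

field = [[theta_perturbation(i * DX, k * DX) for k in range(NZ)]
         for i in range(NX)]
```

The negative buoyancy at the bubble centre is what makes it fall, exactly as described above; outside the bubble the perturbation, and hence the buoyancy forcing, is zero.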