VESTEC: saving the world one byte at a time

Author: Nick Brown
Posted: 15 Nov 2018 | 16:21

With jobs submitted to a batch system, supercomputing has traditionally been centred around an offline, non-interactive approach to running codes such as simulations. However, it is our belief that there is great potential in fusing HPC with real-time data for use as part of urgent decision-making processes in response to natural disasters and crises.

It is not just HPC that has benefited from phenomenal developments in hardware: our ability to physically collect data, for example via high-velocity sensors, has also undergone a revolution in recent years.

Until now, the role that HPC can play in complementing this and turning streaming data into valuable insight has been overlooked. But this is not a simple job and entails much more than hooking up some data sources to HPC machines. Instead, to fuse HPC with real-time data, a large number of challenges need to be tackled, from the low-level software stack up to interactive visualisation tools.

The VESTEC project will tackle these challenges to make HPC more interactive and capable of processing raw data arriving in real time, so creating a tool for use in urgent decision-making. It is our hypothesis that combining HPC computational models with real-time data will significantly aid in urgent decision-making, ultimately saving lives and reducing economic loss.

VESTEC is focusing on three use cases: forest fires, the impact of space weather (specifically the disruption caused by geomagnetic storms), and mosquito-borne diseases. Not only do these areas entail risk to life, they also have a significant economic impact (estimated at billions of dollars a year). We hope that by combining traditional simulations with high-velocity sensor data, it will be possible to make correlations between simulations and observations such that much more precise and reliable predictions can be generated. These would feed into disaster recovery or even prevention.

At EPCC we are leading the work package on interactive supercomputing. This is especially interesting for us as it combines our expertise in traditional HPC with our expertise in data, challenging some of the assumptions that the current generation of HPC machines is built upon. I think it is going to be fascinating to see how, over the next three years of the project, the technology and techniques that we develop as part of VESTEC contribute to solving the challenges we have identified, and to see their resulting wider societal impact.

VESTEC is funded by the EU’s Horizon 2020 programme. It started in September 2018 and will run for three years. The project has eleven partners, each with a different area of expertise.

The image above shows Wildfire Analyst, developed by one of the VESTEC partners and central to one of the three project use cases. It models the spread of forest fires and is used to assist in disaster planning and mitigation. The VESTEC project will develop infrastructure that enables this application to be fed with real-time sensor data and run in ensemble fashion on supercomputers. The end result will be a step change in capability for disaster recovery teams, allowing them to advise firefighting teams on the ground much more accurately, contain fires, and ultimately save lives.

Author

Nick Brown, EPCC

Top image: © iStock.com/JPhilipson