Modelling & simulation
Posted: 16 Nov 2014 | 23:11
The Nu-FuSE (Nuclear Fusion Simulations at Exascale) project was a three-year, G8-funded international research project investigating the challenges and requirements of fusion simulation at Exascale. The project's aim was to significantly improve computational modelling capabilities for fusion and fusion-related sciences, enhancing the predictive capabilities needed to address key physics challenges of a new generation of fusion systems.
Posted: 10 Sep 2014 | 12:30
Every year the British Science Festival (BSF) visits a city in the UK and engages the public with the latest and greatest science, engineering and technology. It is a fantastic opportunity for people to get involved in science and the programme contains a wide variety of activities to ensure the festival appeals to all ages.
This was the third year that EPCC had been involved with the BSF. We travelled down to Birmingham, where on the Saturday we held an exhibition entitled “Supercomputing: From dinosaurs to particle physics”, aimed primarily at families. We were based in the Library of Birmingham alongside a number of other highly engaging exhibits, all of which aimed to introduce the public to HPC and to encourage the next generation of computational scientists.
Posted: 5 Sep 2014 | 19:50
I spent a couple of days last week at Imperial College's Department of Chemistry running a two-day training course on CP2K. The course was hosted by NSCCS (the National Service for Computational Chemistry Software) and ARCHER (the national HPC service we manage here in Edinburgh).
I've blogged before about the CP2K-UK project, a direct result of EPSRC's recognition of the growing community of CP2K users in the UK. Currently around £25,000 worth of CPU time is used per month on ARCHER for CP2K calculations, so it is important that users have access to the latest information on how to make the best use of the code.
Posted: 18 Aug 2014 | 13:10
The Fortissimo project, which is coordinated here at EPCC, gives companies a low-risk opportunity to try out HPC. By combining it with cloud computing, they can gain the benefits without buying and running their own systems. Here are three examples of HPC in action under Fortissimo.
Posted: 3 Jul 2014 | 12:12
Heterogeneous HPC architectures are becoming increasingly prevalent in the Top500 list, with CPU-based nodes being enhanced by accelerators or coprocessors optimised for floating-point calculations. This trend is likely to increase as we move towards Exascale (10¹⁸ flops) capable systems, and it is vital that the relevant HPC applications are able to exploit this heterogeneity [1, 2].
Posted: 3 Jul 2014 | 11:42
Do you use scientific codes in your research? Is what you can do with them limited by execution time? Has your code been parallelised, but it doesn't scale well? How should you go about improving its performance, and what can you do when you don't fully understand the code? There are some general steps that can be taken to improve the performance of parallelised codes. In this article I will briefly describe the process I undertook to optimise the parallel performance of a computational chemistry package, TINKER, as part of the EPCC/SSI APES project.