Modelling & simulation

Data and Software Carpentry combo at Edinburgh

Author: Mario Antonioletti
Posted: 29 Aug 2016 | 10:35

Software Carpentry attendees during the shell session. Pic Credit: Martin Callaghan.

With my Software Sustainability Institute hat on, I recently participated in a back-to-back Data Carpentry and Software Carpentry course sponsored by the Research Data Service here at the University of Edinburgh. The courses were held in the main University library in a gorgeous room with a glass wall, providing a rather distracting view of the Meadows parkland.

Early experiences with KNL

Author: Adrian Jackson
Posted: 29 Jul 2016 | 16:45

Initial experiences on early KNL

Updated 1st August 2016 to add a sentence describing the MPI configurations of the benchmarks run.
Updated 30th August 2016 to add CASTEP performance numbers on Broadwell, with some discussion.

EPCC was lucky enough to be allowed access to Intel's early KNL (Knights Landing, Intel's new Xeon Phi processor) cluster through our IPCC project.

KNL Processor Die

KNL is a many-core processor, successor to the KNC, with up to 72 cores, each of which can run 4 threads, and 16 GB of high-bandwidth memory stacked directly onto the chip.
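To give a sense of the on-node parallelism this implies, here is a minimal C/OpenMP sketch (my own illustration, not from the post) that queries how much thread parallelism a node exposes; on a 72-core KNL with 4 hardware threads per core it would report up to 288 logical processors.

    /* Hedged illustration: query on-node parallelism via OpenMP.
     * Build with an OpenMP-enabled compiler, e.g. gcc -fopenmp or icc -qopenmp. */
    #include <omp.h>
    #include <stdio.h>

    int main(void)
    {
        printf("Logical processors: %d\n", omp_get_num_procs());
        printf("Max OpenMP threads: %d\n", omp_get_max_threads());
        return 0;
    }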

ExTASY: a flexible and scalable approach to biomolecular simulation

Author: Iain Bethune
Posted: 18 Jul 2016 | 12:20

Over the last 10 years, the growth in performance of HPC systems has come largely from increasing core counts, which poses a question for application developers and users – how best to make use of the parallelism on offer?

Collaboration, collaboration, collaboration

Author: Adrian Jackson
Posted: 15 Jun 2016 | 13:35

This week sees our annual collaboration workshop with Tsukuba University, Japan (more details are available here).  This is a great chance to get a flavour of the kind of research another HPC centre is undertaking, how they work, and what platforms they are investing in.

The Centre for Computational Sciences (CCS) at Tsukuba is a department very like EPCC, in that it is responsible for high performance and parallel computing at the university, runs and supports large-scale computers for researchers, and undertakes parallel computing research.

Fortissimo Marketplace: a one-stop-shop for modelling and simulation services

Author: Mark Sawyer
Posted: 3 Jun 2016 | 14:34

A consortium of Europe’s leading supercomputing centres and HPC experts is developing the Fortissimo Marketplace, a one-stop-shop where end-users will access modelling and simulation services, plus high-performance data analytics.

Building a strong CP2K user community

Author: Iain Bethune
Posted: 4 Mar 2016 | 12:42

Last week we held the 3rd annual CP2K users group meeting down at King's College London.  Amazingly, we are already half-way through the 5-year 'CP2K-UK' project - the EPSRC-funded community support effort that I'm leading - how time flies!  It was great to see these meetings continue to go from strength to strength.  This year we had over 50 people there on the day, mostly from around the UK but with a significant proportion from overseas too!  While the primary aim of our activities is to support the UK research community, if we have a wider impact that's of course a bonus.

Debugging in 5D

Author: Adrian Jackson
Posted: 24 Feb 2016 | 16:41

Or why debugging is hard and parallel debugging doubly so

Computing bug: Grace Hopper's famous bug, found in 1947 in a relay of the Mark II computer and taped into the operations logbook with the annotation "First actual case of bug being found". Image courtesy of the Naval Surface Warfare Center, Dahlgren, VA., 1988. U.S. Naval Historical Center Online Library Photograph.

Debugging programs is hard. I give a lecture on debugging for the Programming Skills module of EPCC's MScs in HPC and HPC with Data Science, where we try to point out common programming mistakes, strategies for making bugs less likely, and the skills and tools required for investigating, identifying, and fixing bugs.
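To illustrate why parallel debugging is doubly hard, here is a deliberately broken C/OpenMP sketch (an example of the kind of mistake such a lecture might discuss, not an excerpt from the course material): the unsynchronised update of sum is a data race, so the program can print a different wrong answer on every run while looking perfectly reasonable.

    /* Intentionally buggy: each thread updates 'sum' without
     * synchronisation, so the result varies from run to run. */
    #include <stdio.h>

    int main(void)
    {
        long sum = 0;
        #pragma omp parallel for   /* BUG: needs reduction(+:sum) */
        for (long i = 0; i < 1000000; i++)
            sum += i;
        printf("sum = %ld\n", sum);   /* correct answer: 499999500000 */
        return 0;
    }

Adding reduction(+:sum) to the pragma fixes this toy case; finding the equivalent race buried in a large parallel application is far harder, which is where systematic debugging skills and tools earn their keep.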

Making the most of ARCHER for Materials Chemistry

Author: Iain Bethune
Posted: 15 Feb 2016 | 15:38

Ab initio modelling of oil formation in clay minerals

In early December we added a visualisation of the most heavily used application codes to the ARCHER website.  At the moment it only shows data for the current month, but we've been recording the data since the ARCHER service began back in 2013 (table below).

HPC-CORE simulation software workshop

Author: Adrian Jackson
Posted: 31 Jan 2016 | 23:07
 
Tidal simulation

Lancaster University, 7-8 April 2016

The programme for the HPC-CORE (High Performance Computing-based Computational fluid dynamics for Offshore Renewable Energy) workshop has now been published. This event brings together scientific specialists from Engineering, Atmospheric and Environmental Sciences, and HPC experts to discuss the state of the art for simulation software, and the leading-edge simulations being undertaken with such software.

Numerical modelling of clouds and atmospheric flows

Author: Guest blogger
Posted: 11 Dec 2015 | 14:39

The Met Office/NERC Cloud model (MONC) has been developed in a collaboration between EPCC and the Met Office. MONC delivers a highly scalable and flexible Large Eddy Simulation (LES) model capable of simulating clouds and other turbulent flows at resolutions of tens of metres on very large domains.  
