Posted: 24 May 2013 | 16:00
What sort of research is the HECToR supercomputing facility used for and what simulation software does it make use of?
EPCC measures the use of different simulation codes on the HECToR facility to get an idea of which codes are used most and what size of jobs different codes are used for. In this post I will take a look at which codes are used most on the facility and speculate on whether we can infer anything from the patterns we see.
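The sort of aggregation involved can be sketched in a few lines of Python. The records and application names here are invented for illustration; the real accounting data and analysis pipeline are, of course, EPCC's own.

```python
from collections import defaultdict

# Hypothetical accounting records: (application code, cores used) per job.
jobs = [
    ("VASP", 1024), ("VASP", 2048), ("CP2K", 512),
    ("GROMACS", 256), ("VASP", 4096), ("CP2K", 1024),
]

usage = defaultdict(lambda: {"jobs": 0, "cores": []})
for code, cores in jobs:
    usage[code]["jobs"] += 1
    usage[code]["cores"].append(cores)

# Rank codes by job count and report a typical (median) job size.
for code, stats in sorted(usage.items(), key=lambda kv: kv[1]["jobs"], reverse=True):
    sizes = sorted(stats["cores"])
    median = sizes[len(sizes) // 2]
    print(f"{code}: {stats['jobs']} jobs, median size {median} cores")
```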
Posted: 23 May 2013 | 17:00
I've just finished working on two papers for this year's Supercomputing conference, SC13, which will be held in Denver, Colorado, from 17-22 November. EPCC will have an official presence, with an EPCC booth on the exhibition floor and a number of staff participating in the technical and education programmes. I thought this a good opportunity to write up some recent work I've been undertaking.
Posted: 3 May 2013 | 15:18
This article originally appeared on Jeff Squyres' Cisco blog. It was written while I was undertaking my PhD, before I joined EPCC as a member of staff. I thought it would be of interest to folks reading this blog.
My PhD involved building a message-passing library using C#: not accessing an existing MPI library from C# code, but creating a brand new MPI library written entirely in pure C#. The result is McMPI (Managed-code MPI), which is compliant with MPI-1 (as far as it can be, given that there are no C# language bindings in the MPI Standard). It also shows reasonably good latency and bandwidth in micro-benchmarks, in both shared memory and distributed memory.
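Latency micro-benchmarks of this kind are usually a simple ping-pong: one process sends a small message, the other echoes it back, and the one-way latency is estimated as half the round-trip time. Here is a toy sketch of that pattern in Python with threads and queues standing in for McMPI's send and receive (the names and structure are illustrative, not McMPI's API):

```python
import threading
import time
from queue import Queue

# Two "ranks" exchange a small message REPS times; half the average
# round-trip time approximates the one-way latency.
REPS = 1000
to_b, to_a = Queue(), Queue()

def rank_b():
    for _ in range(REPS):
        msg = to_b.get()      # receive "ping"
        to_a.put(msg)         # reply with "pong"

t = threading.Thread(target=rank_b)
t.start()
start = time.perf_counter()
for _ in range(REPS):
    to_b.put(b"x")            # send
    to_a.get()                # wait for the echo
elapsed = time.perf_counter() - start
t.join()

latency_us = elapsed / REPS / 2 * 1e6
print(f"estimated one-way latency: {latency_us:.1f} us")
```

A bandwidth benchmark has the same shape but sends progressively larger messages and divides bytes moved by time taken.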
Posted: 2 May 2013 | 13:32
I'm currently working on a small library to support decomposition changes in parallel programs.
It turns out that a fairly simple interface can be used to describe a very large space of possible data decompositions. I'm therefore writing a library that can redistribute data between any two such decompositions.
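To make the idea concrete, here is a minimal sketch of what "redistribution between any two decompositions" might look like, assuming a decomposition is described as a map from rank to the global indices it owns. The function names and interface are hypothetical, invented for this sketch; they are not the library's actual API.

```python
def block(n, p):
    """Contiguous block decomposition of n elements over p ranks."""
    size, rem = divmod(n, p)
    out, start = {}, 0
    for r in range(p):
        end = start + size + (1 if r < rem else 0)
        out[r] = list(range(start, end))
        start = end
    return out

def cyclic(n, p):
    """Round-robin (cyclic) decomposition of n elements over p ranks."""
    return {r: list(range(r, n, p)) for r in range(p)}

def redistribute(data, old, new):
    """data: {rank: [values]} laid out per `old`; return layout per `new`."""
    by_index = {}
    for r, idxs in old.items():
        for i, v in zip(idxs, data[r]):
            by_index[i] = v
    return {r: [by_index[i] for i in idxs] for r, idxs in new.items()}

n, p = 10, 3
old = block(n, p)                 # rank 0 owns indices 0-3, rank 1 owns 4-6, ...
data = {r: [i * i for i in idxs] for r, idxs in old.items()}
new_data = redistribute(data, old, cyclic(n, p))
print(new_data[0])                # values at global indices 0, 3, 6, 9
```

In a real parallel setting each rank would compute only the intersections of its old and new index sets with those of other ranks, and exchange just those elements, but the interface (two decomposition descriptions in, redistributed data out) is the same.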
Posted: 29 Apr 2013 | 07:09
Materials science - understanding how the microscopic structure of matter gives rise to macroscopic properties of materials - is one of EPSRC's key research areas, with applications in fields as diverse as energy storage, electronics, fabrics and nanotechnology. EPCC helps develop a number of important simulation codes in this area such as CP2K, GROMACS, and in this project GULP, the General Utility Lattice Program.
Posted: 12 Apr 2013 | 11:31
The vast majority of applications running on HECToR today are designed around the Single Program Multiple Data (SPMD) parallel programming paradigm. Each processing element (PE), i.e. MPI rank or Fortran Coarray image, runs the same program and performs the same operations in parallel on the same or a similar amount of data. Usually these applications are launched on the compute nodes homogeneously, with the same number of processes spawned on each node and each with the same number of threads (if required).
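The essence of the paradigm is that every PE executes the same code and uses its rank to select its own share of the data. Here is that pattern in miniature, simulated in Python with a loop over ranks; on HECToR each "rank" would be a separate process running the same binary, and the final reduction would be a collective operation rather than a serial sum:

```python
def pe_work(rank, nprocs, data):
    """The same code runs on every PE; the rank picks the slice."""
    size, rem = divmod(len(data), nprocs)
    start = rank * size + min(rank, rem)
    end = start + size + (1 if rank < rem else 0)
    return sum(data[start:end])   # each PE works only on its own slice

data = list(range(100))
nprocs = 4
partials = [pe_work(r, nprocs, data) for r in range(nprocs)]
print(sum(partials))              # same result as the serial sum
```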
Posted: 11 Apr 2013 | 12:34
Posted: 10 Apr 2013 | 10:20
Next week the fifth PRACE seminar for industrial users will be held at Bad Boll, near Stuttgart in Germany.
The seminar targets industrial users of HPC, who will have an opportunity to learn how PRACE resources, expertise, education and training can contribute to their business success. The seminar will act as a forum to develop or strengthen links with CEOs, CTOs, CIOs and R&D managers coming from large corporations and SMEs, as well as Independent Software Vendors.
Posted: 9 Apr 2013 | 10:35
Today, April 9th 2013, sees the first day of the EASC2013 conference. Over 130 delegates have registered for this conference, which aims to bring together those involved in solving the software challenges associated with programming exascale systems.