Posted: 5 Apr 2018 | 10:28
The INTERTWinE project has spent a lot of effort addressing interoperability challenges raised by task-based models. By rethinking parallelism in the paradigm of tasks, one reduces synchronisation and decouples the management of parallelism from computation.
This is really attractive, but existing models typically rely on shared memory, where the programmer expresses the input and output dependencies of tasks in terms of variables, which in turn limits the technology to a single memory space – often a single node of an HPC machine. To scale beyond a single memory space, we must therefore combine a task model with a distributed-memory technology such as MPI or GASPI, and this has been the focus of a number of activities in INTERTWinE.
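As a loose illustration of the idea (a hypothetical Python sketch, not INTERTWinE, OmpSs or OpenMP code), a task declares which values it reads and which it writes, and a runtime executes each task only once all of its inputs have been produced – so the programmer never writes explicit synchronisation:

```python
# Minimal sketch of a dependency-driven task model (illustrative only):
# each task declares the values it reads (inputs) and the value it
# writes (output); the runtime runs a task as soon as its inputs exist.

def run_tasks(tasks):
    """Execute tasks in dependency order.

    tasks is a list of (fn, inputs, output) tuples, where inputs and
    output are variable names; submission order does not matter.
    """
    values = {}            # variables produced so far
    pending = list(tasks)
    while pending:
        progressed = False
        for task in list(pending):
            fn, inputs, output = task
            if all(dep in values for dep in inputs):
                values[output] = fn(*(values[d] for d in inputs))
                pending.remove(task)
                progressed = True
        if not progressed:
            raise RuntimeError("cyclic or unsatisfiable dependencies")
    return values

# "c" depends on "a" and "b", so it runs last even though it was
# submitted first.
result = run_tasks([
    (lambda x, y: x + y, ["a", "b"], "c"),
    (lambda: 2, [], "a"),
    (lambda: 3, [], "b"),
])
print(result["c"])  # 5
```

Real task runtimes infer this ordering from the declared dependencies and run independent tasks concurrently; the point of the sketch is only that the dependencies are expressed on variables, which is why they stop at the boundary of a shared memory space.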
Posted: 22 Mar 2018 | 11:46
I recently looked into whether the Python package PyQt5 could be installed on Cirrus, a Tier-2 national service, on behalf of one of our HPC Europa visitors. The Cirrus documentation recommends that you do this using virtual environments and provides a helpful example. However, the problem is that if you subsequently use
easy_install to install additional Python packages within the virtual environment, you will get a "permission denied" error, because easy_install tries to install the package centrally, to directories you do not have write access to. I eventually managed to find a solution.
Posted: 8 Nov 2017 | 10:23
This year’s MPI Birds-of-a-Feather meeting at SC17 will be held on Wednesday 15th November. I’ll be talking about the Sessions proposal – and explaining why it’s no longer called Sessions!
Spoiler: the working group has been looking at how Teams might interact with Endpoints.
Posted: 10 Aug 2017 | 20:22
Distributed ledgers, the core technology underlying digital currencies such as Bitcoin, offer some interesting functionality for constructing distributed data infrastructures.
Ledgers can be considered to be simple data stores. They are styled on accounting ledgers, books where transactions are recorded one after the other, and the overall state of the accounts can be evaluated by working through the recorded transactions to calculate how much money has flowed in and out of the accounts.
Posted: 15 Jun 2017 | 13:33
Earthquakes have caused over three-quarters of a million deaths already this century, and economic losses of over a quarter of a trillion US dollars since 1980, making them by far the most destructive of the natural hazards. EPCC has been involved in developing a new app that will help lessen the danger posed by aftershocks.
Posted: 15 Jun 2017 | 13:19
EPCC is one of 15 core partners in CompBioMed. This user-driven Centre of Excellence in Computational Biomedicine will nurture and promote the uptake and exploitation of high performance computing within the biomedical modelling community. Its users come from academia, industry and clinical practice.
Posted: 24 May 2017 | 19:30
When we parallelise and optimise computational simulation codes we always have choices to make: the type of parallel model to use (distributed memory, shared memory, PGAS, single-sided, etc), whether the algorithm itself needs to be changed, and what parallel functionality to use (loop parallelisation, blocking or non-blocking communications, collective or point-to-point messages, etc).
Posted: 24 Apr 2017 | 11:21
Recent devastating earthquakes in Nepal and Italy have illustrated the need for better understanding and more accurate operational forecasting of aftershock sequences to assist emergency response. This project is a multi-disciplinary collaboration to develop risk assessments for earthquake aftershocks using dense networks of traditional seismometers, and to explore the use of mobile phones as sensors and for community engagement.
Posted: 10 Apr 2017 | 16:06
Guest blogger Kara Moraw is an undergraduate Informatics student in Bonn, Germany. Here she writes about her 4-week internship with EPCC, spent working with EPCC's Nick Brown on the ARCHER outreach project.