Posted: 4 Mar 2019 | 09:42
In our role as members of the Research Engineering Group of the Alan Turing Institute, Anna Roubickova and I worked with Efi Tsamoura and Benjamin Spencer (Department of Computer Science, University of Oxford) on PDQ, a proof-driven query planner with great potential for data science in medical research.
Posted: 3 Mar 2019 | 17:28
For the fourth year running, from Wednesday 13th March to Saturday 16th March, EPCC will be attending the Big Bang Fair (BBF) at the NEC in Birmingham to demonstrate the wonders of supercomputing. The BBF encourages young people to adopt STEM-based subjects at school and later as a career – not only through universities but also apprenticeships and other career choices. This is also a great opportunity for our colleagues to undertake some outreach.
Posted: 1 Mar 2019 | 12:33
Former HPC-Europa3 visitor Dr Mats Simmermacher, Dr Adam Kirrander (Mats' host from the University of Edinburgh's School of Chemistry), and their collaborators from Edinburgh and Copenhagen have recently published a paper in the prestigious Physical Review Letters where they discuss a new effect in ultrafast X-ray scattering.
Posted: 27 Feb 2019 | 15:53
The MPI Standard states that nonblocking communication operations can be used to “improve performance… by overlapping communication with computation”. This is an important performance optimisation in many parallel programs, especially when scaling up to large systems with lots of inter-process communication.
However, nonblocking operations can also help with making a code correct – without introducing additional dependencies that can degrade performance.
Posted: 22 Feb 2019 | 16:09
Peeved with Python? Revolted by R? SAS make you sad? The Julia Language may be for you. Recently reaching version 1.0, Julia claims to be more than just another data science language.
In this post I’ll give a tour of some of the more interesting features of Julia, and its implementation.
Posted: 16 Jan 2019 | 11:06
Analysing genomics data is a complex and compute-intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation.
Typically in a cancer genomics analysis, both a tumour sample and a “normal” sample from the same individual are first sequenced using NGS systems and compared using a series of quality control stages. The first stage, ‘Sequence Quality Control’ (which is optional), checks sequence quality and performs some trimming. The second, ‘Alignment’, involves a number of steps, such as alignment, indexing, and recalibration, to ensure that the alignment files produced are of the highest quality, as well as several more to guarantee that variants are called correctly. Both stages comprise a series of intermediate compute- and data-intensive steps that are very often handcrafted by researchers and/or analysts.
Posted: 15 Jan 2019 | 11:52
The Fortissimo 2 project ended on 31 December 2018. Together with its predecessor (the plain old 'Fortissimo project') it has helped over 100 SMEs and mid-caps to run experiments that demonstrate the effectiveness of providing HPC services using a business model derived from cloud computing, thereby making it much lower risk for small companies to use HPC.
Posted: 8 Jan 2019 | 15:08
Earlier this year, HPE announced the Catalyst UK programme: a collaboration with Arm, SUSE and three UK universities to deploy one of the largest Arm-based high performance computing (HPC) installations in the world. EPCC was chosen as the site for one of these systems; the other two are the Universities of Bristol and Leicester.
EPCC's system (called 'Fulhame' after pioneering chemist Elizabeth Fulhame) was delivered and installed in early December. This HPE Apollo 70-based system consists of 64 compute nodes, each with two 32-core Cavium ThunderX2 processors (ie 4096 cores in total) and 128GB of memory composed of 16 DDR4 DIMMs, connected by Mellanox InfiniBand. It will be made available to both industry and academia, with the aim of building applications that drive economic growth and productivity as outlined in the UK government’s Industrial Strategy.
Posted: 7 Jan 2019 | 15:14
The Advanced Computing Facility (ACF) on the outskirts of Edinburgh is the high performance computing data centre of EPCC.
Built in the 1970s and operated by EPCC since the turn of the millennium, the ACF site has had significant investment over the years. At present, there are three Computer Rooms, imaginatively called: Computer Room 1 (CR1), Computer Room 2 (CR2), and Computer Room 3 (CR3).
Posted: 14 Dec 2018 | 17:33
PickCells is image analysis software developed by the Centre for Regenerative Medicine (CRM) at The University of Edinburgh. PickCells allows biologists to explore multidimensional biological images of stem cell niches, organoids, and embryos. In late October, with the assistance of six researchers, we evaluated the usability of PickCells to help guide its future development.
To run our usability evaluation, we followed Steve Krug's highly recommended and very readable book "Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems". This book describes a practical way to carry out usability evaluations with minimal overhead.