Posted: 19 Mar 2019 | 10:16
The Software Sustainability Institute's (SSI) Collaborations Workshop 2019 (CW19) will be held at the West Park Teaching Hub, Loughborough University, from 1-3 April 2019. This year the workshop will be themed around interoperability, documentation, training, and sustainability. Keynote speakers will include Catherine Stihler, CEO of Open Knowledge International, and Franziska Heine, Head of Software & Development at Wikimedia Deutschland. They will open the event on 1st April.
Posted: 6 Mar 2019 | 10:13
EPCC is at the heart of the Data Driven Innovation programme of the Edinburgh and South East Scotland City Region Deal. This £661m programme will lead to the creation of major new data services for Scottish business, with EPCC providing the World Class Data Infrastructure (WCDI) that will underpin it.
Last week we hosted an event to explain the enormous potential of data-driven innovation for industry and to show how companies are already using data technologies to enhance commercial performance.
Posted: 4 Mar 2019 | 09:42
In our role as members of the Research Engineering Group of the Alan Turing Institute, Anna Roubickova and I worked with Efi Tsamoura and Benjamin Spencer (Department of Computer Science at the University of Oxford) on PDQ, a proof-driven query planner that has great potential within the realm of data science for medical research.
Posted: 3 Mar 2019 | 17:28
For the fourth year running, from Wednesday 13th March to Saturday 16th March, EPCC will be attending the Big Bang Fair (BBF) at the NEC in Birmingham to demonstrate the wonders of supercomputing. The BBF encourages young people to take up STEM subjects at school and later as a career – not only through universities but also through apprenticeships and other routes. This is also a great opportunity for our colleagues to undertake some outreach.
Posted: 1 Mar 2019 | 12:33
Former HPC-EUROPA3 visitor Dr Mats Simmermacher, Dr Adam Kirrander (Mats' host from the University of Edinburgh's School of Chemistry), and their collaborators from Edinburgh and Copenhagen have recently published a paper in the prestigious Physical Review Letters where they discuss a new effect in ultrafast X-ray scattering.
Posted: 27 Feb 2019 | 15:53
The MPI Standard states that nonblocking communication operations can be used to “improve performance… by overlapping communication with computation”. This is an important performance optimisation in many parallel programs, especially when scaling up to large systems with lots of inter-process communication.
However, nonblocking operations can also help with making a code correct – without introducing additional dependencies that can degrade performance.
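As an illustrative sketch of the overlap pattern (using the mpi4py bindings, which mirror the C API described in the MPI Standard; the buffer sizes, tags, and two-rank layout here are hypothetical, and the script assumes mpi4py is installed and launched under `mpirun -n 2`):

```python
# Sketch: overlapping communication with computation using nonblocking MPI.
# Hypothetical two-rank exchange; buffer sizes and tags are illustrative.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

local = np.random.rand(1000)   # data this rank computes on
halo = np.zeros(1000)          # buffer for data arriving from a neighbour

if comm.Get_size() >= 2:
    if rank == 0:
        # Start the send, but do not wait for it yet.
        req = comm.Isend(local, dest=1, tag=0)
    else:
        # Likewise, start the receive without blocking.
        req = comm.Irecv(halo, source=0, tag=0)

    # ... computation that does not depend on the in-flight message
    # proceeds here, overlapping with the communication ...
    interior_result = local.sum()

    # Only now block until the communication has completed.
    req.Wait()
```

The key point is that `Isend`/`Irecv` return immediately, so useful work can be done between initiating the transfer and the matching `Wait` – the same pattern also lets a code avoid the ordering dependencies that blocking calls would impose.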
Posted: 22 Feb 2019 | 16:09
Peeved with Python? Revolted by R? Does SAS make you sad? The Julia Language may be for you. Julia, which recently reached version 1.0, claims to be more than just another data science language.
In this post I’ll give a tour of some of the more interesting features of Julia, and its implementation.
Posted: 16 Jan 2019 | 11:06
Analysing genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation.
Typically in a cancer genomics analysis, both a tumour sample and a “normal” sample from the same individual are first sequenced using next-generation sequencing (NGS) systems, then compared through a series of quality control stages. The first stage, ‘Sequence Quality Control’ (which is optional), checks sequence quality and performs some trimming. The second, ‘Alignment’, involves a number of steps, such as alignment, indexing, and recalibration, to ensure that the alignment files produced are of the highest quality, as well as several more to guarantee that variants are called correctly. Both stages comprise a series of intermediate compute- and data-intensive steps that are very often handcrafted by researchers and/or analysts.
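The two stages described above can be sketched as a simple chain of transformations. The stage names below follow the text, but the functions and the dictionary representation of a sample are purely illustrative placeholders, not any real genomics tool or format:

```python
# Illustrative sketch of the two-stage QC pipeline described in the text.
# The stage functions are hypothetical stand-ins for real tools
# (trimmers, aligners, recalibrators); they just tag the sample.

def sequence_quality_control(sample):
    """Optional first stage: check sequence quality and trim."""
    return {**sample, "trimmed": True}

def alignment(sample):
    """Second stage: align, index and recalibrate."""
    return {**sample, "aligned": True, "recalibrated": True}

def run_pipeline(sample, skip_qc=False):
    """Run the stages in order; QC may be skipped, as the text notes."""
    stages = ([] if skip_qc else [sequence_quality_control]) + [alignment]
    for stage in stages:
        sample = stage(sample)
    return sample

# Tumour and "normal" samples from the same individual are each
# processed, then compared downstream.
tumour = run_pipeline({"sample": "tumour"})
normal = run_pipeline({"sample": "normal"})
```

In a real analysis each stage would of course invoke external tools and move large files between them, which is exactly the handcrafted glue the text refers to.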
Posted: 15 Jan 2019 | 11:52
The Fortissimo 2 project ended on 31 December 2018. Together with its predecessor (the plain old 'Fortissimo project') it has helped over 100 SMEs and mid-caps to run experiments that demonstrate the effectiveness of providing HPC services using a business model derived from cloud computing, thereby making it much lower risk for small companies to use HPC.
Posted: 8 Jan 2019 | 15:08
Earlier this year, HPE announced the Catalyst UK programme: a collaboration with Arm, SUSE and three UK universities to deploy one of the largest Arm-based high performance computing (HPC) installations in the world. EPCC was chosen as the site for one of these systems; the other two are the Universities of Bristol and Leicester.
EPCC's system (called 'Fulhame' after pioneering chemist Elizabeth Fulhame) was delivered and installed in early December. This HPE Apollo 70-based system consists of 64 compute nodes, each with two 32-core Cavium ThunderX2 processors (ie 4096 cores in total) and 128GB of memory (16 DDR4 DIMMs), connected by Mellanox InfiniBand. It will be made available to both industry and academia, with the aim of building applications that drive economic growth and productivity as outlined in the UK government’s Industrial Strategy.
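The core count quoted above follows directly from the node configuration; a quick check (node, socket, and core figures are taken from the text, while the total-memory line rests on the assumption that the 128GB is per node):

```python
# Fulhame arithmetic, using the figures quoted in the text.
nodes = 64
sockets_per_node = 2        # two Cavium ThunderX2 processors per node
cores_per_socket = 32
mem_per_node_gb = 128       # 16 DDR4 DIMMs; assumed to be per node

total_cores = nodes * sockets_per_node * cores_per_socket
total_mem_gb = nodes * mem_per_node_gb  # derived under that assumption

print(total_cores)   # 4096, matching the figure in the text
print(total_mem_gb)  # 8192 GB, if the memory figure is indeed per node
```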