Posted: 15 Jun 2021 | 12:26
Illuminate, a commercial collaborator of EPCC, has a mission to accelerate informed decisions by giving its customers the means to find specific "needle in a haystack" data points in the mass of their network traffic. As networks evolve, Illuminate continuously innovates to ensure that its data pipelines operate at maximum efficiency.
Data analytics pipelines have three main functions: data ingestion, processing, and analysis. The data ingestion stage focuses on obtaining or extracting data from a range of data sources and importing it into systems that enable processing and analysis. This is followed by data processing to validate, transform and enrich the data, and load it into some form of queryable data storage, such as a data lake. Finally, we utilise Artificial Intelligence/Machine Learning processes to build models from the data in the lake and use them to gain insight.
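The three stages above can be sketched in miniature. This is an illustrative toy, not Illuminate's actual pipeline: the record structure, the enrichment step, and the "analysis" (a simple mean standing in for an ML model) are all hypothetical.

```python
# Minimal sketch of the ingest -> process -> analyse pipeline described above.
# All names and transformations here are illustrative assumptions.

def ingest(source):
    """Extract raw records from a data source (here, an in-memory list)."""
    return list(source)

def process(records):
    """Validate, transform and enrich records before loading to storage."""
    valid = [r for r in records if r.get("value") is not None]  # validation
    for r in valid:
        r["value_squared"] = r["value"] ** 2  # illustrative enrichment
    return valid  # stands in for loading into a queryable data lake

def analyse(lake):
    """Derive a simple insight from the processed data (a stand-in for ML)."""
    return sum(r["value"] for r in lake) / len(lake)

raw = [{"value": 2}, {"value": None}, {"value": 4}]
lake = process(ingest(raw))
print(analyse(lake))  # mean of the valid values
```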
Posted: 11 May 2021 | 10:36
EPCC held its first GPU hackathon this April in partnership with NVIDIA, hosting 28 participants and 20 mentors across seven teams. The event was held virtually due to the ongoing Covid-19 pandemic. Using Zoom and Slack, individual teams were able to work alongside mentors in separate breakout rooms and channels.
Posted: 21 Apr 2021 | 15:38
The Software Sustainability Institute's (SSI) Research Software Camp on research accessibility took place from 21 February to 5 March 2021. The Camp focussed on different aspects of research accessibility, ranging from making tools, datasets and software reproducible, open and sustainable, to exploring formats and tools that facilitate and improve accessibility for disabled people. In this post, we will share how we used relevant sections of the SSI Event Organisation Guide to plan, organise, and deliver the Camp.
Posted: 14 Apr 2021 | 10:46
The HPC Systems Team provides the System Development and System Operations functions for ARCHER2 - but who are we and what do we do?
We are a team of fifteen System Administrators and Developers who work to deploy, manage, and maintain the services and systems offered by EPCC, as well as the infrastructure required to host and support all of EPCC's services and systems.
Posted: 1 Mar 2021 | 14:03
Choosing the right software for use in a research software project can be challenging. How do we know which software is both fit for purpose and provides a sound basis for our project for the foreseeable future? And, how do we make such a choice given that the time and effort to explore what could be myriad alternatives may be limited?
This was a challenge we faced in the RiboViz project, a multi-disciplinary team of biologists, bioinformaticians and research software engineers based at EPCC and The Wallace Lab at University of Edinburgh, The Shah Lab at Rutgers University, and The Lareau Lab at University of California, Berkeley. RiboViz is an open source package to help us develop our understanding of protein synthesis via analysis of ribosome profiling data. At the heart of RiboViz is an analysis workflow, implemented in a Python script. To conform to best practices for scientific computing, which recommend the use of build tools to automate workflows and to reuse code instead of rewriting it, we sought to reimplement this workflow within a bioinformatics workflow management system.
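The kind of step-chaining that a workflow management system automates can be sketched in plain Python. This is a hedged illustration of the general idea, not the RiboViz workflow itself: the step names (`trim_adapters`, `align`) and their behaviour are hypothetical placeholders for real bioinformatics stages.

```python
# Sketch of a linear workflow: each step consumes the previous step's output.
# A real workflow manager adds dependency tracking, resumption, and logging
# on top of this basic pattern; the steps below are hypothetical.

def run_workflow(steps, data):
    """Run each step in order, passing the output of one as input to the next."""
    for step in steps:
        data = step(data)
    return data

def trim_adapters(reads):
    """Hypothetical step: strip trailing adapter bases from each read."""
    return [r.rstrip("A") for r in reads]

def align(reads):
    """Hypothetical step: map each trimmed read to a summary statistic."""
    return {r: len(r) for r in reads}

result = run_workflow([trim_adapters, align], ["ACGTAA", "GGC"])
```

Reimplementing a monolithic script as explicit steps like these is what lets a workflow manager rerun only the stages whose inputs have changed.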
Posted: 11 Nov 2020 | 10:51
I am delighted to announce that EPCC has passed our annual external ISO 27001 Information Security and ISO 9001 Quality audits with flying colours.
EPCC recognises the importance of a process-led approach to the delivery of National Tier-1 and Tier-2 services such as ARCHER, ARCHER2 and Cirrus. The implementation of an ISO 9001-based Quality Management System has provided a framework to ensure that the services delivered meet user and customer requirements. It requires that service improvements are identified and the impacts of their implementation tracked, ensuring that each improvement is effective.
Posted: 31 Aug 2020 | 15:33
SC, the world's largest high-performance computing conference, will be held virtually during the week of November 9-13. It is a disappointment not to be going to SC in person, but the flip side is that the conference will be open to a wider audience than would have been possible had it been held in Atlanta as originally planned.
This year I am organising the second run of the Urgent HPC workshop, which is an event aimed at bringing together those who are researching the role of HPC and data science in making urgent decisions to tackle disasters. The event first ran last year at SC19 and comprised a keynote talk by the founder of Technosylva – the world’s leading wildfire simulation code development company – six technical papers, and a panel. Based upon that success we decided to run the workshop again this year, and given all that has happened since then, exploring this topic is more timely than ever before!
Posted: 24 Aug 2020 | 18:11
Edoardo Coronado visited EPCC from 18th May–15th August 2019 under the HPC-Europa3 programme. Here he gives us an update on his research since his previous blog article.
During my visit to EPCC in 2019 under the auspices of the HPC-Europa3 programme, I worked on optimising routines to perform the Sparse Matrix-Vector (SpMV) product, a common operation in lots of scientific applications, on Intel Xeon Gold processors and Graphics Processing Units (GPUs).
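The SpMV product at the heart of this work can be illustrated with the widely used Compressed Sparse Row (CSR) storage format. This is a generic reference implementation for clarity, not the optimised routines developed during the visit (which targeted Xeon Gold CPUs and GPUs).

```python
# Reference SpMV (y = A @ x) with A stored in CSR format:
#   values  - the non-zero entries, row by row
#   col_idx - the column index of each non-zero entry
#   row_ptr - row_ptr[i]:row_ptr[i+1] delimits row i's entries in values
import numpy as np

def spmv_csr(values, col_idx, row_ptr, x):
    """Sparse matrix-vector product for a CSR matrix."""
    n_rows = len(row_ptr) - 1
    y = np.zeros(n_rows)
    for i in range(n_rows):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# The 2x3 matrix [[1, 0, 2], [0, 3, 0]] in CSR form:
vals = np.array([1.0, 2.0, 3.0])
cols = np.array([0, 2, 1])
rptr = np.array([0, 2, 3])
y = spmv_csr(vals, cols, rptr, np.array([1.0, 1.0, 1.0]))  # -> [3.0, 3.0]
```

The outer loop over rows is independent, which is why SpMV parallelises naturally across CPU threads or GPU thread blocks; the optimisation challenge lies in the irregular, indirect accesses to `x`.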
Posted: 29 Jun 2020 | 14:38
We currently have applications open for PhD studentships, starting in late 2020 or early 2021 depending upon the candidate's situation. Four of these have been advertised on FindAPhD, with a closing date at the end of the month.
In addition to our MSc programmes, EPCC also hosts a number of PhD students who are researching a diverse set of areas: from traditional performance optimisation for HPC, to new data science technologies and novel computing architectures. Supervised by EPCC members of staff and housed in the Bayes Centre, not only do these students benefit from being part of the UK's leading HPC centre, but they also have access to the wider University's large range of resources.
Posted: 23 Jun 2020 | 15:10
This week ISC, one of the largest conferences in the supercomputing calendar, should have been running in Frankfurt. It’s a funny feeling because, as I type, I realise that if it wasn’t for COVID-19 then, instead of being stuck at home, I would have been busy navigating the Messe Frankfurt, going from one session to another.