Posted: 11 May 2021 | 10:36
EPCC held its first GPU hackathon this April in partnership with NVIDIA, hosting 28 participants and 20 mentors across seven teams. The event was held virtually due to the ongoing Covid-19 pandemic. Using Zoom and Slack, individual teams were able to work alongside mentors in separate breakout rooms and channels.
Posted: 5 May 2021 | 10:21
EPCC is a partner in the EXCELLERAT project, a single point of access for expertise on how data management, data analytics, visualisation, simulation-driven design and co-design with high-performance computing (HPC) can benefit engineering, especially in the aeronautics, automotive, energy and manufacturing sectors.
Posted: 21 Apr 2021 | 15:38
The Software Sustainability Institute’s (SSI) Research Software Camp on research accessibility took place from 21 February to 5 March 2021. The Camp focussed on different aspects of research accessibility, ranging from making tools, datasets and software reproducible, open and sustainable, to exploring formats and tools that facilitate and improve accessibility for disabled people. In this post, we will share how we used relevant sections of the SSI Event Organisation Guide to plan, organise, and deliver the Camp.
Posted: 14 Apr 2021 | 10:46
The HPC Systems Team provides the System Development and System Operations functions for ARCHER2 - but who are we and what do we do?
We are a team of fifteen System Administrators and Developers who work to deploy, manage, and maintain the services and systems offered by EPCC, as well as the infrastructure required to host and support all of EPCC’s services and systems.
Posted: 1 Mar 2021 | 14:03
Choosing the right software for use in a research software project can be challenging. How do we know which software is both fit for purpose and provides a sound basis for our project for the foreseeable future? And, how do we make such a choice given that the time and effort to explore what could be myriad alternatives may be limited?
This was a challenge we faced in the RiboViz project, a multi-disciplinary team of biologists, bioinformaticians and research software engineers based at EPCC and The Wallace Lab at University of Edinburgh, The Shah Lab at Rutgers University, and The Lareau Lab at University of California, Berkeley. RiboViz is an open source package to help us develop our understanding of protein synthesis via analysis of ribosome profiling data. At the heart of RiboViz is an analysis workflow, implemented in a Python script. To conform to best practices for scientific computing, which recommend the use of build tools to automate workflows and to reuse code instead of rewriting it, we sought to reimplement this workflow within a bioinformatics workflow management system.
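As an illustration of the automation such best practices recommend (a hypothetical Python sketch, not RiboViz's actual code or the API of any particular workflow manager), a pipeline step can declare its inputs and outputs so that a simple driver only re-runs steps whose outputs are missing or stale, in the spirit of a build tool:

```python
import os

def needs_run(inputs, outputs):
    """Return True if any output is missing or older than any input."""
    if not all(os.path.exists(o) for o in outputs):
        return True
    newest_in = max(os.path.getmtime(i) for i in inputs)
    oldest_out = min(os.path.getmtime(o) for o in outputs)
    return newest_in > oldest_out

def run_step(name, inputs, outputs, action):
    """Run a pipeline step only when its outputs are out of date."""
    if needs_run(inputs, outputs):
        print(f"Running {name}")
        action()
    else:
        print(f"Skipping {name} (outputs up to date)")

# Two chained steps with hypothetical file names: trim the raw reads,
# then count them. Each step's output feeds the next step's input.
def trim():
    with open("reads.trimmed.txt", "w") as f:
        f.write(open("reads.txt").read().strip())

def count():
    with open("counts.txt", "w") as f:
        f.write(str(len(open("reads.trimmed.txt").read().split())))

with open("reads.txt", "w") as f:
    f.write("AAA CCC GGG\n")

run_step("trim", ["reads.txt"], ["reads.trimmed.txt"], trim)
run_step("count", ["reads.trimmed.txt"], ["counts.txt"], count)
```

A dedicated workflow management system adds much more on top of this idea, such as parallel execution, container support and resumption after failure, which is why reimplementing the bespoke script inside one was attractive.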
Novel ammonia-hydrogen sulphide mixtures under extreme conditions: Implications for Ice-giant planets
Author: Guest blogger
Posted: 12 Feb 2021 | 13:14
Sudip Kumar Mondal from Jadavpur University in Kolkata visited the University of Edinburgh through the HPC-Europa3 Transnational Access programme. Running from 11 October to 29 January, Sudip's visit was unlike most others because the Covid pandemic was ongoing. In this blog post Sudip describes his experiences in coming to and working in Edinburgh.
My doctoral research employs quantum chemical simulations to study the physical behaviour of naturally occurring mineral phases at high temperatures and pressures that are hard to realise in laboratory experiments. Dr Andreas Hermann, my supervisor at the University of Edinburgh, was undoubtedly perfect for supervising this project, being an expert on materials at extreme conditions.
Posted: 1 Feb 2021 | 12:10
Since 2005, the Advanced Computing Facility (ACF) has housed all the major systems managed by EPCC. It has expanded and evolved since its creation, becoming one of the most innovative and efficient facilities of its kind in the world.
The building and its internals have changed greatly since I started in February 2018, as part of a drive to ensure that our wider master planning for the site is reflected in what visitors see. This includes a video wall using Raspberry Pis and PiWall software to allow us to demonstrate HPC visualisations to visitors.
We are developing a site-wide Data Centre Infrastructure Management (DCIM) approach which allows us to view real-time data on room and system performance on screens outside the different rooms and on our video wall.
In addition, the ACF has had significant investment over the years, most recently with the creation of Computer Room 4, the home of the new Edinburgh International Data Facility (EIDF). We also host and support a number of other HPC systems at the ACF, such as the National Tier-2 system, Cirrus. The first phase of the next UK national supercomputing service, ARCHER2, has also been installed.
Posted: 29 Jan 2021 | 11:03
EPCC's Advanced Computing Facility (ACF) was born through necessity – but the opportunity was taken to create an expandable and highly-efficient facility that attracted future business including national computing services. Mike Brown, EPCC’s Director of HPC Operations until 2017, gives an overview of the development of this unique building.
Posted: 13 Jan 2021 | 10:35
Work on the Edinburgh International Data Facility has passed three key milestones, bringing the infrastructure that will underpin the £600m Data-Driven Innovation Programme significantly closer to reality.
Firstly, and perhaps most importantly, EIDF’s home, Computer Room 4 (cr4) at the University’s Advanced Computing Facility, completed its main construction phase at the end of the third quarter of 2020 and has now entered its commissioning and fit-out phase. If everything goes to plan, we will start to build infrastructure in the room from January 2021.
Posted: 11 Dec 2020 | 08:50
By Edward Wallace, School of Biological Sciences, University of Edinburgh.
Why I need sustainable software for my research
I run a lab, or research group, in the School of Biological Sciences at Edinburgh. My group is funded by the Wellcome Trust, the Royal Society and BBSRC. We're interested in how cells decide which proteins to make and when, and in how cells change which proteins they make when they learn something from the environment and need to change what they're doing. In the 21st century we collect some very large datasets to measure this: datasets based on sequencing the RNA, which encodes protein, and datasets that measure all the proteins in cells at the same time. These datasets measure thousands of different things across many samples, often dozens of samples. Each dataset is gigabytes in size, so it's quite hard work to dig into them and get the simplest and most relevant answers about what cells are doing.