Blog

Administrative Data Research (ADR)

Author: Mark Sawyer
Posted: 30 Jul 2021 | 10:01

ADR Scotland (Administrative Data Research Scotland) is a partnership combining specialists in the Scottish Government’s Data Sharing and Linkage Unit with the expertise of academic researchers at the Scottish Centre for Administrative Data Research. Together they are transforming how public sector data in Scotland are curated, accessed and explored, so they can deliver their full potential for policymakers and for the public.

Developing an outbreak analysis platform

Author: Lucy Norris
Posted: 27 Jul 2021 | 11:49

EPCC is working with ISARIC4C, the Coronavirus Clinical Characterisation Consortium, to develop an integrated analysis platform that will be hosted in part on the Edinburgh International Data Facility.

The data analysis platform provides a unique combination of linked, curated data from UK sovereign data assets, together with a flexible high performance compute space. Created for COVID-19 research, the ISARIC4C data analysis platform combines the data safeguards of an NHS trusted research environment with more than £100M of new exabyte-scale computational capacity at the home of the UK national supercomputer. This creates a unique opportunity to combine clinical, biological, genomics and virology research in a secure, openly accessible framework.

CompBioMed: creating virtual humans

Author: Gavin Pringle
Posted: 22 Jul 2021 | 14:09

EPCC is a Core Partner of CompBioMed, a European Centre of Excellence focused on the use and development of computational methods for biomedical applications. We prepare these applications for future exascale systems, to create the first virtual humans: digital twins to enable personalised medicine.

Quantum computing at EPCC

Author: Oliver Brown
Posted: 23 Jun 2021 | 09:15

Computing doesn’t come much more novel than quantum computing. 

Classical computers rely on the manipulation of bits, physical systems (usually transistors) that can be found in one of two states, which we label 0 and 1. Quantum computers, on the other hand, use quantum bits (qubits), which are still measured in one of two physical states, labelled 0 and 1, but can exist in any linear superposition of the two – they can be in 0 and 1 at the same time.
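The superposition idea above can be sketched in a few lines of Python. This is a minimal simulation, not real quantum hardware: a qubit state is represented as a two-component complex vector of amplitudes, and the squared magnitudes give the probabilities of measuring 0 or 1.

```python
import numpy as np

# Basis states: a qubit measured in state 0 or state 1.
zero = np.array([1, 0], dtype=complex)  # the |0> state
one = np.array([0, 1], dtype=complex)   # the |1> state

# An equal superposition of 0 and 1 (the state a Hadamard gate
# produces from |0>). The 1/sqrt(2) factor keeps the state normalised.
plus = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared amplitude magnitudes:
# here, a 50/50 chance of reading 0 or 1.
probs = np.abs(plus) ** 2
print(probs)  # [0.5 0.5]
```

Until it is measured, the qubit genuinely occupies both basis states; measurement collapses it to 0 or 1 with these probabilities.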

Such a fundamental change at such a low level means that programming a quantum computer, or even developing quantum algorithms, is very different from what we’re used to.

Training & education provided by EPCC

Author: Tracy Peet
Posted: 21 Jun 2021 | 11:13

Through our links with national services, international collaborations and other projects, we deliver a wide range of training courses in high performance computing and data science.
http://www.epcc.ed.ac.uk/education-training

National HPC Services hosted by EPCC

Author: Tracy Peet
Posted: 21 Jun 2021 | 11:10

At our Advanced Computing Facility (ACF), we host a remarkable collection of high-performance computing and data services. These include Public Health Scotland’s National Safe Haven and UK national research systems such as ARCHER2, Cirrus and Tesseract, as well as smaller scale systems designed to explore new technologies.

Here are some highlights from our current work in this area.

Data-driven innovation at EPCC

Author: Tracy Peet
Posted: 21 Jun 2021 | 10:59

Here are some highlights from our current work in this area.

Novel architectures and hardware hosted by EPCC

Author: Tracy Peet
Posted: 21 Jun 2021 | 10:55

We provide world-class computing systems, data storage and support services. Here are some highlights from our work in this area.

ExCALIBUR FPGA testbed

ExCALIBUR is a £45.7m programme to address the challenges and opportunities offered by computing at the exascale (high performance computing at 10^18 floating point operations per second). The programme will address problems of strategic importance, and how to approach them in an efficient, effective, and productive fashion on the world’s largest computers.

Performance optimisation and application scaling activities at EPCC

Author: Tracy Peet
Posted: 21 Jun 2021 | 10:32

EPCC has a history of working with some of the most powerful supercomputers in the world. As such, we have decades of experience optimising and fine-tuning applications for the best performance. One of our key areas of research is HPC software optimisation to achieve maximum efficiency with the aim of delivering scientific results quickly and accurately. Here are some highlights from our current work in this area.

Accelerating high performance data pipelines using EPCC's supercomputing facilities

Author: Guest blogger
Posted: 15 Jun 2021 | 12:26

Illuminate, a commercial collaborator of EPCC, has a mission to accelerate informed decisions by providing its customers with the means to find specific “needle in a haystack” data points in the mass of their network traffic. As networks evolve, Illuminate is continuously innovating to ensure that its data pipelines operate at maximum efficiency.

Data analytics pipelines have three main functions: data ingestion, processing, and analysis. The data ingestion stage focuses on obtaining or extracting data from a range of data sources and importing it into systems that can enable processing and analysis. This is followed by data processing to validate, transform and enrich it, and load it into some form of queryable data storage, such as a data lake. Finally, we utilise Artificial Intelligence/Machine Learning processes to build models from the data in the lake and use them to gain insight.
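The three stages described above can be sketched as a toy pipeline. This is a minimal illustration, not Illuminate's implementation: the record format, the enrichment rule, and the final aggregate (standing in for an ML model) are all invented for the example.

```python
def ingest(raw_lines):
    """Ingestion: extract records from a raw source (here, CSV-like text)."""
    return [line.split(",") for line in raw_lines if line.strip()]

def process(records):
    """Processing: validate, transform and enrich each record, then load
    it into a queryable store (a list of dicts stands in for a data lake)."""
    lake = []
    for host, byte_count in records:
        row = {"host": host.strip(), "bytes": int(byte_count)}
        row["large_transfer"] = row["bytes"] > 1000  # enrichment step
        lake.append(row)
    return lake

def analyse(lake):
    """Analysis: derive insight from the lake. A simple aggregate stands
    in for model building and inference."""
    return sum(row["bytes"] for row in lake) / len(lake)

raw = ["hostA,500", "hostB,2000", "hostC,800"]
lake = process(ingest(raw))
print(analyse(lake))  # mean bytes per host: 1100.0
```

In a production pipeline each stage would be a distributed system in its own right, but the ingest → process → analyse structure is the same.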


Contact

Tracy Peet
+44 (0) 131 650 5362
t.peet@epcc.ed.ac.uk