Data

Investigating high-performance data engineering

Author: Rob Baxter
Posted: 8 Jan 2018 | 16:03

Big data has always been a part of high-performance computing and the science it supports, but new open-source technologies are now being applied to a wider range of scientific and business problems. We've recently spent time testing some of these big data toolkits.

Making Scotland's medical images available for research

Author: Ally Hume
Posted: 20 Dec 2017 | 13:37

I've worked on many data analysis projects in my career, and two common themes recur: obtaining the data can be a significant challenge, and once you do obtain it, it is usually very messy.

Building earthquake resilience in at-risk communities

Author: Rob Baxter
Posted: 15 Jun 2017 | 13:33

Earthquakes have caused over three-quarters of a million deaths already this century, and economic losses of over a quarter of a trillion US dollars since 1980, making them by far the most destructive of the natural hazards. EPCC has been involved in developing a new app that will help lessen the danger posed by aftershocks.

Building a scaleable, extensible data infrastructure

Author: Amy Krause
Posted: 8 Jul 2016 | 14:48

Modern genome-sequencing technologies are easily capable of producing data volumes that can swamp a genetic researcher’s existing computing infrastructure. EPCC is working with the breeding company Aviagen to build a system that allows such researchers to scale up their data infrastructures to handle these increases in volume without compromising their analytical pipelines.

Creating a safe haven for health data

Author: Donald Scobbie
Posted: 6 Jul 2016 | 14:36

Safe havens allow data from electronic records to be used to support research, where it is not practicable to obtain individual patient consent, while still protecting patient identity and privacy. EPCC is now the operator of the new NHS National Services Scotland (NSS) national safe haven, in collaboration with the Farr Institute of Health Informatics Research, which provides the infrastructure.

Large Synoptic Survey Telescope: data-intensive research in action

Author: George Beckett
Posted: 24 May 2016 | 09:49

This is an exciting time for astronomy in the UK, a fact reflected in our involvement in, and leadership of, some amazingly ambitious new telescope projects.

A number of recent, significant discoveries have propelled astronomy research into the spotlight. The discoveries of dark matter and dark energy overturned our understanding of how the Universe works. And the first direct observation of a gravitational wave earlier this year confirmed a long-standing prediction of Albert Einstein's, almost exactly 100 years after he published his general theory of relativity.

Get into SHAPE! Removing barriers to HPC adoption for SMEs

Author: Paul Graham
Posted: 13 Apr 2016 | 14:47

SHAPE (SME HPC Adoption Programme in Europe) is a pan-European initiative supported by PRACE (Partnership for Advanced Computing in Europe). The Programme aims to raise awareness and provide European SMEs with the expertise necessary to take advantage of the innovation possibilities created by high-performance computing (HPC), thus increasing their competitiveness. SHAPE allows SMEs to benefit from the expertise and knowledge developed within the top-class PRACE Research Infrastructure.

Data infrastructure: highlights of the EUDAT Conference 2013

Author: Rob Baxter
Posted: 13 Nov 2013 | 14:27

EUDAT, the European Data Infrastructure project, has reached the end of its second year and has, with some success, distilled the first version of a common, collaborative, horizontal data infrastructure from among the vertical stacks of its various partners.

Data interoperability is a state of mind

Author: Rob Baxter
Posted: 9 Sep 2013 | 09:58

The research data tsunami is firmly upon us. Open access to data is very much on the agenda. One of the hopes for capturing and preserving all these data is that reuse and recombination may yield new science. Improving the interoperability of data from different domains is key to making this a reality.

Now, data interoperability is not technically hard, so why are we not further on?

Digging deeper into digital preservation

Author: Alistair Grant
Posted: 2 Sep 2013 | 19:20

A few months have gone by on the Pericles project (see my earlier post). More meetings have passed and more are coming up, but in between meetings we do actually get some work done as well!

Preserving art, records and other items has been a challenge throughout history: not just how to store them, but how to help future generations understand them. Even in the short time that digital art and records have existed, this problem has become increasingly apparent, and it is exacerbated by the rapid cycles of change that technology follows. Pericles is attempting to define and develop a framework to manage how digital data is stored in archives and how to keep those archives relevant and accessible. A small challenge it is not.
