Posted: 28 Apr 2019 | 16:07
One of the benefits of teaching a Carpentry course is that it can increase or deepen your understanding of a subject. A recent instance for me was using OpenRefine, a tool that runs locally on your machine (you do not have to export your data to a third-party service).
OpenRefine can help you:
• Explore and clean or transform your data
• Reconcile your data with external data sources, i.e. enrich your data using external data
• Create a new dataset: it does not modify your original data, and it keeps the provenance of all the steps
Depending on the capabilities of your local machine, it can deal with datasets of up to about 100k rows.
Watch the videos on the OpenRefine website for a good overview. If you want to know more, follow the Carpentry OpenRefine for Ecologists lesson. In this example, I am going to show how easy it is to generate a new dataset from the EPCC website. Follow along after you have installed OpenRefine on your system.
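To give a flavour of the kind of clean/transform step OpenRefine makes easy, here is a minimal sketch in plain Python (not OpenRefine itself, and with hypothetical sample data): it trims whitespace and normalises case, much as OpenRefine's "trim leading/trailing whitespace" and case transforms would, then aggregates the cleaned values.

```python
import csv
import io

# Hypothetical messy records, the kind of inconsistencies OpenRefine's
# facets and transforms are typically used to fix.
raw = """species,count
 Mus musculus ,3
MUS MUSCULUS,5
Rattus norvegicus,2
"""

def clean(rows):
    # Trim whitespace and normalise case, so variant spellings of the
    # same species collapse onto one value.
    cleaned = []
    for row in rows:
        name = row["species"].strip().capitalize()
        cleaned.append({"species": name, "count": int(row["count"])})
    return cleaned

rows = clean(csv.DictReader(io.StringIO(raw)))

# Aggregate counts per cleaned species name (OpenRefine's facets would
# now show two species instead of three).
totals = {}
for r in rows:
    totals[r["species"]] = totals.get(r["species"], 0) + r["count"]
print(totals)  # {'Mus musculus': 8, 'Rattus norvegicus': 2}
```

In OpenRefine you would do the same interactively, and the tool records each transform in its undo/redo history, which is where the provenance mentioned above comes from.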
Posted: 26 Apr 2019 | 13:31
We are embarking on a survey of the HPC environment and would appreciate your input! Jean-Thomas Acquaviva, DDN Storage, explains the background below.
Supercomputers are helping us to solve the world's greatest challenges, but who is helping us to harness the computational power of supercomputers?
The HPC Certification Forum is a community effort that aims to provide an international certification programme for the skills practitioners need to harness supercomputers. It aims to clearly categorise, define, and examine competencies, which will be beneficial to all stakeholders involved in training and education. To make this effort relevant to the whole HPC community, a number of online surveys and polls will be created to capture and estimate the importance of the various tools, technologies, and skills expected of HPC practitioners.
Posted: 24 Apr 2019 | 11:51
The blog post below is based on the abstract of a talk at the PASC mini-symposium 'Modelling Cloud Physics: Preparing for Exascale' (Zurich, 13 June 2019).
The Met Office NERC Cloud model (MONC) is an atmospheric model used throughout the weather and climate community to study clouds and turbulent flows. It is often coupled with the CASIM microphysics model, which provides the capability to investigate interactions at the millimetre scale and to study the formation and development of moisture. One of the main targets of these models is the problem of fog, which is very hard to model due to the high resolution required: for context, the main UK weather forecast resolves to 1 km, whereas the fog problem requires 1 metre or less.
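A back-of-the-envelope calculation shows why the jump from 1 km to 1 metre resolution is so demanding. The domain size below is a hypothetical figure chosen only for illustration; the point is the scaling factor.

```python
def horizontal_points(domain_km: int, resolution_m: int) -> int:
    """Number of horizontal grid cells for a square domain
    at the given grid spacing."""
    cells_per_side = (domain_km * 1000) // resolution_m
    return cells_per_side ** 2

# Hypothetical 100 km x 100 km domain
forecast = horizontal_points(100, 1000)  # ~1 km, as in the main UK forecast
fog = horizontal_points(100, 1)          # ~1 m, as the fog problem needs

factor = fog // forecast
print(factor)  # 1000000: a million times more horizontal grid points
```

Refining each horizontal dimension by 1000x multiplies the cell count by 1000 x 1000, before even accounting for extra vertical levels or the smaller timestep a finer grid requires.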
Posted: 16 Apr 2019 | 11:27
Funded by EXDCI-2, EPCC is organising a two-day workshop in Edinburgh on 8-9 July 2019 to present ideas on how science and engineering can be supported on everything from Raspberry Pis to large-scale supercomputers. We are looking to investigate how to introduce these topics to school-age audiences, including linking to existing school curricula. The workshop will include examples of existing education materials and activities, as well as a look ahead at current plans.
Posted: 15 Apr 2019 | 10:41
This year's Software Sustainability Institute Collaborations Workshop, CW19, was held from 1–3 April at Loughborough University. There were almost 70 attendees from all over the UK and further afield too: Germany, the Netherlands and the US. I think this is one of the best networking workshops I have been to, possibly equalled only by the UK RSE conferences.