Deep Learning

Planning for high performance in MPI

Author: Daniel Holmes
Posted: 25 Jan 2018 | 14:36

Many HPC applications are built around some sort of iterative algorithm: they repeat the same steps over and over, with the data gradually converging to a stable solution. There are examples of this archetype in structural engineering, fluid flow, and all manner of other physical simulation codes.
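The excerpt above does not include code, but as a rough illustration of that archetype, here is a minimal mpi4py sketch of an iterative 1D Jacobi-style solver with halo exchange. The use of persistent requests (Send_init/Recv_init) is my own choice, to exploit the fact that the communication pattern is identical in every iteration; the array size, tolerance, and iteration cap are likewise assumptions made for the example.

```python
# A sketch (not from the post) of the iterative archetype: a 1D Jacobi-style
# update where each rank repeats the same halo exchange and compute step
# until the data converges.  Array size, tolerance and iteration cap are
# illustrative assumptions.  Run with e.g.: mpirun -n 4 python iterate.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

n = 1000                                   # local points per rank (assumed)
u = np.zeros(n + 2)                        # local data plus two halo cells
u[1:-1] = np.random.rand(n)

# The communication pattern is the same every iteration, so set it up once
# with persistent requests and just (re)start it inside the loop.
send_l, send_r = np.zeros(1), np.zeros(1)
recv_l, recv_r = np.zeros(1), np.zeros(1)
reqs = [comm.Send_init(send_l, dest=left,  tag=0),
        comm.Send_init(send_r, dest=right, tag=1),
        comm.Recv_init(recv_l, source=left,  tag=1),
        comm.Recv_init(recv_r, source=right, tag=0)]

for it in range(10000):                    # iterate until converged
    send_l[0], send_r[0] = u[1], u[-2]     # refresh the fixed send buffers
    MPI.Prequest.Startall(reqs)
    MPI.Request.Waitall(reqs)
    u[0], u[-1] = recv_l[0], recv_r[0]     # copy received halo values in

    new = 0.5 * (u[:-2] + u[2:])           # the same compute step, every time
    diff = comm.allreduce(np.abs(new - u[1:-1]).max(), op=MPI.MAX)
    u[1:-1] = new
    if diff < 1e-6:                        # data has converged to a stable solution
        break
```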

Deep Learning at scale: SC17 talk

Author: Daniel Holmes
Posted: 8 Nov 2017 | 10:13

Are you interested in using machine learning for something big enough to need supercomputing resources?

Have you worked on, or with, one of the Deep Learning frameworks, like TensorFlow or Caffe?

Are you just curious about the state of the art at the crossover between AI and HPC?

Demystifying data input to TensorFlow for deep learning

Author: Alan Gray
Posted: 29 Nov 2016 | 10:07

[Image: Shape Sorter]

View this post on GitHub

TensorFlow is an incredibly powerful new framework for deep learning. The “MNIST For ML Beginners” and “Deep MNIST for Experts” TensorFlow tutorials give an excellent introduction to the framework. This article acts as a follow-on tutorial which addresses the following issues:

  1. The above tutorials use the MNIST dataset of handwritten digits, which is already available in TensorFlow's TFRecord format and is loaded automatically. This can seem a bit mysterious if you have no experience of data format manipulation in TensorFlow (a sketch of the TFRecord round trip is given after this list).
  2. Because the MNIST dataset is fixed, there is little scope for experimenting with the images and network to get a feel for how to handle particular aspects of real data.
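
As a rough sketch of what point 1 refers to (not code from the article, which predates the tf.data API), the example below writes a handful of fake image/label pairs to a TFRecord file with tf.io.TFRecordWriter and reads them back with tf.data.TFRecordDataset. The file name, image shape, and feature keys are assumptions made for illustration.

```python
# A minimal sketch (assuming TensorFlow 2.x, not the API used in the 2016
# article) of writing image/label pairs to a TFRecord file and reading them
# back with tf.data.  File name, image shape and feature keys are illustrative.
import numpy as np
import tensorflow as tf

images = np.random.randint(0, 256, size=(10, 28, 28), dtype=np.uint8)  # fake data
labels = np.random.randint(0, 10, size=(10,), dtype=np.int64)

# --- Write: serialise each example as a tf.train.Example protobuf ----------
with tf.io.TFRecordWriter("digits.tfrecord") as writer:
    for img, lab in zip(images, labels):
        example = tf.train.Example(features=tf.train.Features(feature={
            "image": tf.train.Feature(
                bytes_list=tf.train.BytesList(value=[img.tobytes()])),
            "label": tf.train.Feature(
                int64_list=tf.train.Int64List(value=[int(lab)])),
        }))
        writer.write(example.SerializeToString())

# --- Read: parse each record back into tensors -----------------------------
feature_spec = {
    "image": tf.io.FixedLenFeature([], tf.string),
    "label": tf.io.FixedLenFeature([], tf.int64),
}

def parse(record):
    parsed = tf.io.parse_single_example(record, feature_spec)
    image = tf.io.decode_raw(parsed["image"], tf.uint8)
    image = tf.reshape(image, (28, 28))
    return image, parsed["label"]

dataset = tf.data.TFRecordDataset("digits.tfrecord").map(parse).batch(4)
for batch_images, batch_labels in dataset:
    print(batch_images.shape, batch_labels.numpy())
```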
