Below is a brief synopsis of my research interests in the listed topics. No citations are given for the moment - sorry! - as I've only had time to write down some broad brushstrokes. I've highlighted a few potential projects, but there are many more possibilities.
Links: Efficient algorithms for Markov chain Monte Carlo simulation, Virial series, Enumeration of self-avoiding walks and related objects, Endless walk models, Multi-gas modelling for climate change, Polymer physics and Hamiltonian paths, Mathematical visualisation, Backtracking algorithms implemented on GPUs.
Efficient algorithms for Markov chain Monte Carlo simulation
In 2010 I developed a new implementation of the pivot algorithm for self-avoiding walks which used a binary tree representation called a SAW-tree, building on earlier work of Tom Kennedy, Neal Madras and Alan Sokal. This radically efficient implementation allows for global changes to be made to the system in logarithmic time, and so opens up the possibility of studying truly large polymer systems with millions or even billions of monomers.
The underlying principle can be applied to any system for which simple sampling is the state of the art. I am pursuing applications to percolation, Hamiltonian paths (mentioned in more detail below), hard spheres, and the pricing of exotic options in quantitative finance, but there must be many more possibilities.
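To make the pivot move concrete, here is a deliberately naive sketch for square-lattice self-avoiding walks, using a plain list of points and a subset of the lattice symmetries (rotations only). The function names are mine, for illustration; the self-avoidance scan here costs O(n) per move, which is exactly the cost the SAW-tree reduces to O(log n).

```python
import random

# Square-lattice symmetries (a subset): rotations by +90, 180, -90 degrees.
ROTATIONS = [
    lambda x, y: (-y, x),   # +90
    lambda x, y: (-x, -y),  # 180
    lambda x, y: (y, -x),   # -90
]

def pivot_step(walk):
    """One attempted pivot move on a self-avoiding walk.

    `walk` is a list of (x, y) lattice points starting at (0, 0).
    Returns the new walk if the move keeps the walk self-avoiding,
    otherwise returns the original walk (move rejected).
    """
    n = len(walk)
    p = random.randrange(1, n - 1)          # pivot site (not an endpoint)
    rot = random.choice(ROTATIONS)
    px, py = walk[p]
    # Apply the symmetry to the tail of the walk about the pivot site.
    new_tail = [(px + dx, py + dy)
                for dx, dy in (rot(x - px, y - py) for x, y in walk[p + 1:])]
    proposed = walk[:p + 1] + new_tail
    # Self-avoidance check: all sites distinct.  This scan is O(n); the
    # SAW-tree replaces it with an O(log n) intersection test.
    if len(set(proposed)) == n:
        return proposed
    return walk

# Start from a straight rod and equilibrate with pivot moves.
walk = [(i, 0) for i in range(50)]
for _ in range(2000):
    walk = pivot_step(walk)
assert len(set(walk)) == len(walk)  # still self-avoiding
```

The point of the sketch is only the structure of the move: a single pivot changes the walk globally, which is why an efficient acceptance test pays off so handsomely.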
Jason Whyte has written up an excellent explanation of the basic idea at the SAW feature page.
Virial series for hard spheres and related models
The virial expansion is an old technique for understanding the low density behaviour of classical and quantum systems, which comes down to the evaluation of certain integrals that can be represented graphically. For the hard sphere model, also known as the billiard ball model of the gas, this leads to interesting research questions around the evaluation of graph polynomials, geometry of packings/sphere overlaps, high performance computing, and series analysis. The resulting series are interesting in and of themselves, but the key question is: can coefficients of the virial series be evaluated to sufficiently high order, and high accuracy, so that the series can tell us something about the fluid-solid phase transition?
The answer is completely unclear! However, it may be possible to make a great stride forward in this area by combining a high-powered computer experiment with advanced algorithms.
Other related problems which are interesting, and possibly more tractable than hard spheres, are lattice hard cubes (amenable to lattice enumeration methods) and parallel hard squares and cubes (for which the integrals become rational numbers).
There are lots of fun, open problems in this area.
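To give a flavour of the integrals involved: the second virial coefficient of hard spheres reduces to the Mayer f-function integral, which has a closed form but can also be estimated by hit-or-miss Monte Carlo. This is a toy sketch of mine (real calculations target far higher orders, where no closed forms are known):

```python
import random
import math

def b2_hard_spheres(sigma=1.0, samples=200_000, seed=1):
    """Monte Carlo estimate of the second virial coefficient B2 for hard
    spheres of diameter sigma.

    B2 = -(1/2) * integral of the Mayer f-function, which for hard spheres
    is -1 inside the exclusion sphere (r < sigma) and 0 outside, so
    B2 = (2/3) * pi * sigma**3 exactly.  Here we estimate the integral by
    hit-or-miss sampling in the bounding cube [-sigma, sigma]^3.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        x, y, z = (rng.uniform(-sigma, sigma) for _ in range(3))
        if x * x + y * y + z * z < sigma * sigma:
            hits += 1
    cube_volume = (2 * sigma) ** 3
    excluded_volume = cube_volume * hits / samples
    return excluded_volume / 2

estimate = b2_hard_spheres()
exact = 2 * math.pi / 3
assert abs(estimate - exact) / exact < 0.02
```

For parallel hard squares and cubes, the analogous integrals collapse to rational numbers, which is part of what makes those models attractive test cases.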
Enumeration of self-avoiding walks and related objects
Lattice enumeration is a powerful method for the study of lattice models in statistical mechanics. I recently developed, in collaboration with Iwan Jensen, an improved variant of the finite lattice method for the enumeration of self-avoiding walks and polygons, which opens up the study of these models in three dimensions. The improved method can also be applied to various 2d models, but there the relative gain is more modest.
The length-doubling algorithm developed by Schram, Barkema and Bisseling is another fascinating recent development in the enumeration of 3d self-avoiding walks.
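For orientation, here is the brute-force baseline these methods improve upon: direct backtracking enumeration of square-lattice walks. This sketch is mine; the finite lattice method and the length-doubling algorithm reach far longer walks than naive backtracking ever could.

```python
def count_saws(n):
    """Count n-step self-avoiding walks on the square lattice by direct
    backtracking.  This brute-force method is exponential in n; the finite
    lattice method and the length-doubling algorithm go much further.
    """
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    visited = {(0, 0)}

    def extend(x, y, remaining):
        if remaining == 0:
            return 1
        total = 0
        for dx, dy in steps:
            site = (x + dx, y + dy)
            if site not in visited:
                visited.add(site)
                total += extend(site[0], site[1], remaining - 1)
                visited.remove(site)
        return total

    return extend(0, 0, n)

# The first few counts on the square lattice.
assert [count_saws(n) for n in range(1, 5)] == [4, 12, 36, 100]
```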
Endless walk models
Recently, I introduced a variant of self-avoiding walks with no ends, which are roughly equivalent to necklaces in the field of combinatorics.
This simple idea can be applied to any walk model, and so opens up a whole host of fun problems, e.g. the derivation of exact results for endless variants of partially directed walks.
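To illustrate the combinatorial analogy only (endless walks of course also carry the walk geometry): necklaces are strings counted up to cyclic rotation, and Burnside's lemma counts them. A small sketch of mine:

```python
from math import gcd

def euler_phi(d):
    """Euler's totient function, by direct count (fine for small d)."""
    return sum(1 for k in range(1, d + 1) if gcd(k, d) == 1)

def necklaces(n, colours=2):
    """Number of necklaces of n beads in `colours` colours, counted up to
    cyclic rotation via Burnside's lemma:
        N(n) = (1/n) * sum over d | n of phi(d) * colours**(n/d).
    Quotienting configurations by a cyclic relabelling of steps is the
    same kind of identification that turns walks into endless walks.
    """
    return sum(euler_phi(d) * colours ** (n // d)
               for d in range(1, n + 1) if n % d == 0) // n

assert [necklaces(n) for n in range(1, 7)] == [2, 3, 4, 6, 8, 14]
```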
Multi-gas modelling for better policy decisions regarding climate change
Over the course of the coming century it is absolutely essential that humanity completely eliminate its dependence on fossil fuels, as long-lived greenhouse gases would otherwise cause sufficient warming and climate change that civilisation as we know it could no longer function.
There is much science still to be done in this field, but equally important are questions of economics, so that the transition to a low-carbon economy can occur along the pathway of least cost. One key issue here is the trade-off between mitigating the two most significant anthropogenic greenhouse gases, carbon dioxide and methane. It is often stated that methane is 20 or 100 times as potent as carbon dioxide, but such statements usually gloss over a key point: the relative importance, as measured by the Global Warming Potential (GWP) metric, depends on the time frame considered. Methane is relatively short-lived in the atmosphere (of the order of a decade), while atmospheric carbon dioxide decays on a number of different time scales; crucially, of the order of 5-10% of the carbon dioxide emitted now will remain in the atmosphere for centuries or even millennia.
A key consequence of this, which is not widely understood, is that net emission of carbon dioxide from fossil fuels must go to zero, while methane emissions from anthropogenic but non-fossil fuel sources (think emissions from ruminants in agriculture, i.e. farting cows) can continue forever. (They make a steady-state contribution to the increase in radiative forcing which results in global warming.)
There is an important space here for rigorous economic modelling combined with scientific understanding of the underlying processes. It is possible to approximately decouple the economics from the mix of greenhouse gases by introducing notions of equivalence which do a (much) better job than GWP on a wide variety of timescales, from a decade or two to multi-century.
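The time-horizon dependence of GWP is easy to demonstrate with a toy pulse-response calculation. The parameters below are illustrative stand-ins of mine (a single-exponential methane decay and a Bern-style multi-exponential for carbon dioxide), not authoritative values; the qualitative conclusion is robust to the details.

```python
import math

# Illustrative parameters only -- not authoritative assessed values.
TAU_CH4 = 12.0      # methane atmospheric lifetime, years (approx.)
RE_RATIO = 120.0    # instantaneous forcing per kg, CH4 relative to CO2 (illustrative)

# Bern-style multi-exponential impulse response for CO2: a fraction a0
# effectively never decays on these timescales.
CO2_FRACTIONS = [(0.217, None), (0.259, 172.9), (0.338, 18.51), (0.186, 1.186)]

def co2_remaining(t):
    """Fraction of a CO2 pulse still airborne after t years."""
    return sum(a * (1.0 if tau is None else math.exp(-t / tau))
               for a, tau in CO2_FRACTIONS)

def gwp_ch4(horizon, dt=0.05):
    """GWP of methane over a given time horizon: the ratio of the
    time-integrated forcing of a 1 kg CH4 pulse to that of a 1 kg CO2
    pulse, integrated numerically."""
    ts = [i * dt for i in range(int(horizon / dt))]
    agwp_ch4 = sum(RE_RATIO * math.exp(-t / TAU_CH4) * dt for t in ts)
    agwp_co2 = sum(co2_remaining(t) * dt for t in ts)
    return agwp_ch4 / agwp_co2

# The ranking of the two gases depends on the chosen horizon:
assert gwp_ch4(20) > gwp_ch4(100) > gwp_ch4(500)
```

Because methane's integrated forcing saturates after a few decades while carbon dioxide's keeps accumulating, any single "X times as potent" figure necessarily hides a choice of horizon.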
Polymer physics and Hamiltonian paths
Monte Carlo algorithms which utilise so-called connectivity changing moves may be very effective for simulation of dense polymer systems, such as confined / crystallised polymers which can be represented by Hamiltonian paths, or polymer melts.
These moves can be performed extremely efficiently by using data structures like the SAW-tree.
It may also be possible to extend variants of the efficient Monte Carlo algorithms for self-avoiding walks (e.g. moves similar to the pivot) to other polymer systems, such as confined polymers and the polymer theta transition (the collapse transition for polymers as they go from a good solvent to a poor solvent), and semi-dilute polymer systems.
It is also possible to efficiently calculate virial coefficients for polymers (the second virial coefficient corresponds to the so-called interpenetration ratio) via a combination of the SAW-tree data structure and scale-free moves which allow for a radically effective form of importance sampling.
Mathematical visualisation
Simulations and visualisations can be fantastic tools for both research and education, as they allow us to gain intuition into mathematical objects which is very hard to arrive at through pure thought (for me at least!).
Videos, especially in 3D, can be a very powerful way to convey mathematical objects.
Other fun tools which will allow for completely new experiences in the future are virtual reality, and 3d printing to create tangible realisations of mathematical objects.
I have a number of ideas for mathematically inspired games which arise out of algorithms for walks.
Now I will outline one concrete, realistically achievable Masters-level project which is at the interface of computer science and mathematical physics.
The SAW-tree implementation of the pivot algorithm I have developed makes it feasible to sample uniformly at random extremely long self-avoiding walks. Recently, I have pushed this to the point where these walks may have up to 1 billion steps. Now, these are sparse, random fractal objects, with non-trivial structure from the largest to the smallest length scales. In two dimensions the self-avoiding constraint is strongest, and hence the walks are more spread out in this dimension. Walks with one billion steps have linear dimensions of roughly 5 million by 5 million lattice units, which could be represented in a 25 terapixel image at full resolution. This is of the same order as the largest images generated to date. (N.B., the walks are sparse, so this is admittedly an atypical case.)
Now, the same SAW-tree data structure that allows for efficient Monte Carlo simulation also allows for dramatic compression, and makes it possible to produce an interactive, zoomable image of such long SAWs as a web application. I think that if completed this would be a tour de force in mathematical visualisation! If you're interested in doing this let me know. (I'd like to, but have other things to do for now.)
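As a toy version of the rendering step (a sketch of mine, with an ordinary random walk standing in for a SAW, and a flat scan standing in for the tree's bounding-box queries):

```python
import random

def render_coarse(points, width=40, height=20):
    """Bin a walk's sites into a character grid: the essence of a zoomed-out
    view.  With a SAW-tree, each node stores a bounding box, so subtrees
    entirely outside the current viewport can be skipped without being
    decompressed -- here we simply scan all points.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    grid = [[' '] * width for _ in range(height)]
    for x, y in points:
        col = min(int((x - x0) / (x1 - x0 + 1) * width), width - 1)
        row = min(int((y - y0) / (y1 - y0 + 1) * height), height - 1)
        grid[row][col] = '#'
    return '\n'.join(''.join(row) for row in grid)

# Stand-in for a long SAW: an ordinary random walk on the square lattice.
rng = random.Random(0)
walk = [(0, 0)]
for _ in range(5000):
    dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    walk.append((walk[-1][0] + dx, walk[-1][1] + dy))
print(render_coarse(walk))
```

Zooming then amounts to re-binning within a smaller viewport, which is where the tree's logarithmic queries would do the real work.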
Please visit the projects page if you are interested in seeing visualisations of my work.
Backtracking algorithms implemented on GPUs
GPUs are extremely powerful for some applications - namely those which are easily parallelisable, especially so-called embarrassingly parallel applications - but are not a general purpose replacement for CPUs.
One class of problems for which it is not obvious that efficient GPU algorithms exist is backtracking, which is ubiquitous in combinatorial search and graph algorithms. These problems can in fact be cast in a form that is appropriate for GPUs, and this leads to a fun and challenging set of algorithmic questions in computer science, especially given how quickly the technology is changing.
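One standard first step, sketched here on the n-queens problem (an illustrative stand-in of mine, not drawn from the text above): replace recursion with an explicit iteration over fixed-size state, and split the search tree across workers by fixing a prefix of the choices.

```python
def count_nqueens(n, prefix=()):
    """Count n-queens solutions iteratively, without recursion.

    Recursion-free iteration over fixed-size state is the form that maps
    onto a GPU kernel: each thread could be handed a different `prefix`
    (a partial placement fixing the first rows) and run this same loop
    over its own subtree.
    """
    def safe(placed, col):
        row = len(placed)
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(placed))

    placed = list(prefix)
    if not all(safe(placed[:i], c) for i, c in enumerate(placed)):
        return 0  # the given prefix is already in conflict
    solutions = 0
    col = 0
    while True:
        if col < n and safe(placed, col):
            if len(placed) + 1 == n:
                solutions += 1      # full placement found
                col += 1
            else:
                placed.append(col)  # descend into the subtree
                col = 0
        else:
            col += 1                # try the next column in this row
        while col >= n:             # row exhausted: backtrack
            if len(placed) == len(prefix):
                return solutions
            col = placed.pop() + 1

# Splitting by first-row column partitions the search exactly:
assert count_nqueens(6) == sum(count_nqueens(6, (c,)) for c in range(6))
```

The same prefix-splitting idea distributes any backtracking search over many threads, though load balancing the wildly uneven subtrees is where the real difficulty (and fun) lies.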