Many of our academic staff are directly involved with sustainability research, in areas like energy, biodiversity and food supply. Read below for a snapshot of just some of the sustainability-related research in the School, in the researchers' own words.

Energy and Resilience

Miguel Anjos

My research is concerned with using mathematical optimization to provide guaranteed optimal, or near-optimal, solutions for important classes of large-scale discrete nonlinear optimization problems arising in engineering applications. In particular, mathematical optimization can help to improve the overall performance of electric power systems, which are of critical importance to sustainability. I’m working to support the development of smart grids. These combine a traditional electrical power production, transmission, and distribution system with a two-way flow of information and energy between suppliers and consumers. This combination is expected to deliver energy savings and cost reductions, which are both key in keeping energy sustainable. I was also the Founding Director of the “Trottier Institute for Energy” at Polytechnique Montreal. The Institute was inaugurated in May 2013 and promotes sustainable solutions to secure the future of energy; I also served as an Expert Reviewer on its Energy Futures Project.

Read more about energy networks, sustainable sources and the National Grid.

I work in operational research, developing algorithms to solve optimization problems which are then applied to real-world issues. For the last ten years I’ve developed models to solve problems concerning energy and electricity: to use existing networks more efficiently, incorporate renewable energy, such as wind and solar, and store energy until it is needed. I focus on both long-term and short-term problems that, while asking different questions, use the same mathematics.

So how does this relate to sustainability? We want to move to a system where we use gas and coal less, but sometimes a lack of sustainable energy sources (for example if there is no wind) means that energy needs to be obtained urgently from other, less sustainable sources. This is very expensive, as the short notice means that other power stations or large-scale energy storage units need to be on standby constantly, which drives up the cost. If prices continue to rise, many could resort to generating their own electricity at home, which would further increase carbon emissions. It’s therefore very important to keep the price of energy down, and optimization models play a large part in this.

I’m currently involved in a project with National Grid about how to plan the maintenance of the transmission system, as to maintain the lines you sometimes have to switch the power off! There is the possibility that this will prevent wind-generated power from entering the system, which would likely be replaced with fossil fuel generation. Currently, the problem is about making sure the power keeps flowing, not about distinguishing between different types of power. In a future phase I hope to co-ordinate these actions to prioritise the entry of wind-generated energy.

Chris Dent

My main focus these days is on the National Digital Twin programme’s Climate Resilience Demonstrator (CReDo), where I’m the technical lead. The Climate Resilience Demonstrator is looking at climate resilience of local area infrastructure systems (water, electric grid, data networks), particularly their exposure to the risk of flooding.
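To give a flavour of why interdependence between infrastructure systems matters here, the sketch below is a toy illustration with hypothetical asset names and dependencies (not CReDo's actual model or data): it shows how an outage caused by flooding at one asset can cascade through the assets that depend on it.

```python
# Toy sketch of interdependent infrastructure. Asset names and dependencies
# are hypothetical, purely for illustration; "A depends on B" means A fails
# if B fails.
depends_on = {
    "electric_substation": [],
    "water_pumping_station": ["electric_substation"],
    "data_centre": ["electric_substation", "water_pumping_station"],
    "mobile_mast": ["data_centre", "electric_substation"],
}

def cascade(initially_flooded):
    """Return the full set of assets knocked out, directly or indirectly."""
    failed = set(initially_flooded)
    changed = True
    while changed:
        changed = False
        for asset, deps in depends_on.items():
            if asset not in failed and any(d in failed for d in deps):
                failed.add(asset)
                changed = True
    return failed

# Flooding the substation takes every other asset with it.
print(sorted(cascade({"electric_substation"})))
```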
This brings together the important societal application with the overall goals of the Digital Twin programme, which are about interoperability of data and modelling between organisations.

Read more about how data and modelling can be brought together to develop climate applications.

Climate applications often provide a particularly complex environment for managing data from different sources, and that of course makes them a very good demonstrator for the interlocking of mathematics, data science, climate prediction and energy forecasting, because it is a very rich environment for studying how different kinds of data and modelling need to be brought together. Perhaps the most interesting question for us, both as researchers and in our work supporting policy and decision-making, is how climate model outputs can best be used in decision making, and what the ultimate best practice here might be. That is, how could the next generation of climate models, and the next generation of the UK climate projections studies, be designed to support decision making in a better way across the huge range of applications for which they are used?

Ilaria Salerno

I focus my PhD research on the heating and cooling consumption of the built environment by looking at two key areas: daily operations and the design process of buildings. Currently, we use approximately 30 kWh per day per dwelling for heating and cooling our homes. In order to optimise daily operations, our aim is to control the energy behaviour of a building by using the building itself.

Read more about the benefits of thermal energy management systems, and their impact on decarbonising cities.

The structure of our buildings acts as a sort of thermal storage, and its charging cycles can be optimised towards a given goal. There are many factors which affect heating demand; therefore, we propose an optimisation framework that acts as the brain of a smart building. We call this TEMS (Thermal Energy Management System), and it aims to be an affordable solution to reduce energy consumption and decarbonise cities by providing flexibility to the grid.

We also study building refurbishment; this is important because of the need to reduce our energy consumption. We have developed a framework that outputs the optimal actions, such as optimal values of insulation, to renovate a building. This is intended to be a tool used by both architects and engineers. I was able to test my knowledge in real projects whilst working as an intern with the University of Oxford on Project LEO (Local Energy Oxfordshire). The aim of Project LEO is to find evidence of the technological, market and social conditions needed for a more sustainable and affordable energy system. Alongside my PhD, I also work as a sustainability engineering intern at IES (Integrated Environmental Solutions), a software and consultancy company that works to make our cities more sustainable.

Lars Schewe

I'm working in mathematical optimization. The goal is to support people in making decisions, and I build mathematical models to derive optimal decisions. I have experience with a variety of problems in the optimization of energy systems and energy markets. Currently, my main project is the Optimal Outage Planning System (see the University of Edinburgh Research Explorer) that we're building with NESO.

Read more about energy system operations and renewable generation.
With more and more renewable electricity generation, the task of the energy system operators becomes much more complicated. They must control more generation units than before, and renewable generation is less controllable and more unpredictable. This means that lots of tasks that could be done manually before now need to be automated to support the engineers. I've also started to use my expertise on general pipeline networks to study the design of hydrogen networks.

Amy Wilson

My work is centred on statistics and uncertainty. A large part of tackling the climate crisis is (a) understanding what the current situation is, in terms of the climate and the use of energy, and understanding how these might evolve through time; and (b) understanding what can be done through policy to change the current trajectories. Mathematics has a lot to contribute to both of those points. For example, large computer simulators are often used to study the evolution of energy systems and to study the effect of different policy choices, but these simulators are not the real world. Linking the outputs of the simulators to the real world so that we can make robust decisions is a statistical problem.

Read more about data-driven modelling, probabilistic structures and uncertainties.

There are lots of different models and data sets that we can use to learn about the climate and energy use, but to relate those models and datasets to the real world we need to understand and quantify the underlying probabilistic structure. That is something that statisticians and mathematicians can contribute to, because there are lots of different uncertainties that need to be combined in a principled way: for example, uncertainties about the inputs to the models, about the models themselves, and about the data sets (e.g. measurement error).

As an applied statistician I often work with large computer simulators (or models) developed by other disciplines. These are physical models and they are based on very good science, but at the end of the day they are models, simplified versions of reality. If you want to use them to say anything about reality, then you need probability, because you need to understand the link between the two. You can do that if you have real-world data as well as a model, because you can compare the two, and that tells you something about how close your model is to the real world. But that measure of closeness is a probabilistic thing, and we need statistics and uncertainty to understand it.

Biodiversity in a changing climate

Nicole Augustin

I have worked on developing statistical methodology using Generalised Additive Mixed Models (GAMMs) for spatio-temporal trend estimation of natural resources, namely forests and fish stocks. For both, the methodology has been used to ensure sustainable management. The methodology relating to forests has been developed in close collaboration with forestry experts in Germany and has been adopted for official reporting in the national survey of German forest soil. Additionally, it has been used in reports on forest health for the states of Baden-Württemberg, Saxony and North Rhine-Westphalia, improving the monitoring of forest health in Germany.

Read more about Nicole's use of Generalised Additive Mixed Models and the resulting outcomes.

The model was also used for a simulation study on the optimal grid resolution of the German monitoring survey; these reports have a major impact on regional and national strategic policy decisions on forest maintenance.
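As a rough schematic of this kind of spatio-temporal GAMM (an illustrative sketch only, with notation chosen for this article rather than the exact specification used in the German forest studies), a forest-health indicator y observed at monitoring site i in year t might be modelled as

```latex
% Illustrative spatio-temporal GAMM; not the exact model of Augustin et al. (2009)
y_{it} = f_1(\mathrm{east}_i, \mathrm{north}_i) + f_2(t)
       + f_3(\mathrm{east}_i, \mathrm{north}_i, t)
       + \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + b_i + \varepsilon_{it},
\qquad b_i \sim N(0, \sigma_b^2), \quad \varepsilon_{it} \sim N(0, \sigma^2),
```

where f1, f2 and f3 are smooth spatial, temporal and space-time interaction terms, x are covariates with linear effects, and b is a site-level random effect.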
My work on this (Augustin et al., 2009) has been successful in the sense that the proposed model produced results which secured the continuation of the monitoring programme, and the methodology is used for official yearly reporting on the forest health status of Baden-Württemberg. In addition, the methodology has recently been applied to forest health data for the whole of Germany for official reporting.

My work on space-time modelling of blue ling, a deep-water fish related to cod, for fisheries stock management was a collaboration with scientists at the French Institute for Exploration of the Sea (IFREMER) and resulted in Augustin et al. (2013). The method uses Generalised Additive Mixed Models (GAMMs) similar to the above, but with a soap film smooth, which allows us to take account of the boundaries of the spatial domain. The methodology has been taken up in fisheries and other areas, and I was invited to give a keynote speech on this at a workshop organised by the Center for the Advancement of Population Assessment Methodology (CAPAM) in California. In addition, the results were used for setting fisheries quotas on blue ling.

Augustin, N., Musio, M., von Wilpert, K., Kublin, E., Wood, S. and Schumacher, M., 2009. Modeling spatiotemporal forest health monitoring data. Journal of the American Statistical Association, 104(487), pp. 899-911.

Augustin, N., Trenkel, V., Wood, S. and Lorance, P., 2013. Space-time modelling of blue ling for fisheries stock management. Environmetrics, 24(2), pp. 109-119.

Ruth King

I am a statistical ecologist with interests in developing new statistical methods and models and their application to different ecosystems. Particular areas of interest include capture-recapture-type models, state-space models and hidden Markov models. These methods are applied to different ecological systems to address questions such as estimating population abundance (or trends); identifying underlying factors that drive ecological populations; and developing methods for improved inference via integrated modelling approaches.

Read more about the wide variety of population estimation techniques and how these are developing with the assistance of new technology.

Estimating the population sizes of wildlife species within a given area can be very important for conservation and management purposes. To estimate abundance, a variety of different survey techniques may be employed, including capture-recapture-type approaches whereby the population is repeatedly sampled over a series of capture occasions. Models can then be developed, dependent on the specific survey protocol, to analyse the data and obtain an estimate of the total population size, amongst other interpretable quantities. The ability to estimate such quantities can be particularly important for endangered species, not only to provide an estimate of absolute abundance but also to monitor relative abundance over time. Further, estimating the demographic parameters that influence population dynamics over time (such as birth and death rates and the factors that may influence these) may help with conservation management and/or the assessment of policies. With advancing technology, new survey techniques and associated data are becoming more available, such as motion-sensor cameras, drones and even Earth observation data from satellites.
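As a concrete illustration of the simplest capture-recapture idea described above, consider the classical two-sample Lincoln-Petersen estimator (a textbook sketch with made-up numbers, not a method specific to this research): mark and release n1 animals, then take a second sample of n2 animals of which m2 carry marks.

```latex
% Classical two-sample (Lincoln-Petersen) abundance estimator; textbook example
\hat{N} = \frac{n_1 \, n_2}{m_2},
\qquad \text{e.g. } n_1 = 100,\; n_2 = 80,\; m_2 = 20
\;\Rightarrow\; \hat{N} = \frac{100 \times 80}{20} = 400.
```

Modern capture-recapture models generalise this basic idea to many sampling occasions, imperfect and heterogeneous detection, and populations that change over time.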
These new forms of data bring new statistical challenges, but also the potential for greater insight into populations that are particularly difficult to observe and that are under threat from challenges such as habitat loss and climate change.

Stuart King

My research area is applied computational mathematics and data science. In particular, this encompasses using remote-sensing Earth observation data to answer questions of environmental or ecological significance. I have interests in applications of mathematics to detect forest loss and heat stress, natural hazard monitoring, and wildlife population detection and monitoring.

Read more about the collection of satellite data, and how it is used to monitor tropical forest loss and assess populations of endangered species.

I am interested in the various mathematical ways of dealing with images and image sequences to extract information about the scene being imaged. For Earth observation data, the image sequences are usually taken from a satellite either working with reflected light (visible or infra-red light, like a normal sort of photo) or using radar, where the satellite beams radio waves at the ground and images the reflection. Satellite data of these types is being collected more frequently, and higher resolutions are becoming available more routinely. This leads to a proliferation of data to analyse, and enables new types of questions to be answered. It requires expertise from across computing, applied mathematics, statistics and geosciences, and sometimes also from an application-area specialist. These interdisciplinary teams are exciting environments for mathematicians to work in and contribute to.

Applications of this data that I am interested in and have worked on include using satellite data to detect tropical forest loss and degradation, identifying areas of tropical rainforest where deforestation is happening. I am also interested in temperature measurements and imaging to determine heat stress on forests and plants, which is being exacerbated by global warming. I am also exploring ways of assessing animal populations (such as penguin colonies, for example) using satellite imaging, and linking these to other types of data to monitor the numbers of endangered species.

Tom Leinster

Within sustainability, my main focus is biodiversity. Realistic measures of biodiversity should reflect not only the relative abundances of species, but also the differences between them; in my work I have created a natural family of diversity measures taking both factors into account. This is not just another addition to the already long list of diversity indices. Instead, it is a single formula that subsumes many of the most popular indices (like Shannon's, Simpson's, species richness, and Rao's quadratic entropy), which can then be used and understood in a unified way, with the relationships between them made plain. I’m also an advocate for the use of diversity profiles, which provide a graphical representation of the shape of an ecological community; they show how the perceived diversity changes as the emphasis shifts from rare to common species. Communities can usefully be compared by considering their diversity profiles, which is a far more subtle method than relying on any single statistic.

Read more about the theory of diversity measurement, and how the abstract nature of mathematics complements biological reality.

What does it mean to quantify diversity?
Briefly, it is to take a biological community and extract from it a numerical measure of its “diversity”. There are major practical and statistical challenges here, but my focus is on the fundamental conceptual problem, assuming that we have complete and perfect data.

In both the news media and the scientific literature, the most common meaning given to the word “diversity” is simply the number of species present. This measure of diversity makes rare species count for as much as common ones: every species is precious. Certainly, this is an important quantity. However, it is not always very informative. For instance, the number of species of great ape on the planet is 8, but 99.99% of all great apes belong to just one species: us. In terms of global ecology, it is arguably more accurate to say that there is effectively only one species of great ape. But there is an opposing viewpoint that prioritizes the balance of communities. Common species are important; they are the ones that exert the most influence on the community.

The theory of diversity measurement is driven by both abstract mathematical questions and biological reality. It is the subject of my book “Entropy and Diversity: The Axiomatic Approach”, which builds on the Ecology paper “Measuring Diversity: the importance of species similarity” (joint with Christina Cobbold). In all of this, abstract mathematical aesthetics and faithfulness to biological reality are seen to work in concert, rather than being in conflict.

Gail Robertson

I am a statistical consultant in the School of Mathematics’ Statistical Consultancy Unit. I work with academics and industry clients on various research projects, providing statistical advice, carrying out data analysis, and developing new statistical methodologies. My background in ecology and spatial analysis means that many of the projects I work on have an environmental focus, such as modelling the distribution of environmental contaminants across space, designing sampling regimes for the detection of contaminants, and developing appropriate modelling techniques to understand seabird behaviour at sea.

Read more about statistical model validation and its use in projects with ad hoc data sets.

The kind of data produced by environmental and ecological studies can pose challenges for statisticians. Such data often show a high degree of spatial and temporal autocorrelation, and are subject to various limitations due to difficulties in sampling and data collection, such as ad hoc sampling, missing data, and reliance on proxy variables. Working with ad hoc data collected using different sampling methods was a problem I encountered when working with environmental researchers on a project for the Scottish Government on estimating the number of houses in Scotland with internal lead piping. Lead is an important environmental contaminant which can be detected in tap water across Scotland due to the presence of internal lead piping in people’s homes. The Scottish Government would like to estimate the cost of replacing internal lead piping to reduce the population’s exposure to lead. We developed a modelling framework to predict the number of houses with lead pipes per postcode using data collated from various sources. We are currently examining ways of improving our predictions by accounting for spatial autocorrelation in a Poisson hurdle model and developing a new sampling regime to validate the model.
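For readers unfamiliar with it, a Poisson hurdle model handles count data with many zeros by modelling the zeros and the positive counts separately. A generic formulation is shown below; this is the standard textbook version, not necessarily the exact specification used in the lead-piping project.

```latex
% Generic Poisson hurdle model for a count Y_i (for example, houses with lead
% pipes in postcode i); standard formulation, not the project's specification.
P(Y_i = 0) = \pi_i, \qquad
P(Y_i = y) = (1 - \pi_i)\,
             \frac{e^{-\lambda_i}\,\lambda_i^{y}}{y!\,\left(1 - e^{-\lambda_i}\right)},
\quad y = 1, 2, \dots
```

Here the zero probability and the positive-count rate are typically modelled via covariates, and spatial autocorrelation can be introduced through spatially structured random effects in either component.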
Model validation is also the focus of an ecological research project that I am working on with colleagues in the School of Mathematics and at Biomathematics and Statistics Scotland. In this project, we have located unique seabird tracking datasets that have been validated using observation data. We are using these data to examine the accuracy of hidden Markov models (HMMs) fitted to GPS data recording seabird movement during foraging trips. This work has important implications for marine conservation, as HMMs fitted to GPS animal tracking data are used to identify important offshore foraging areas and to advise on the location of marine protected areas and offshore developments.

Sustainable food supply

Ben Goddard

I am an industrial coordinator for MAC-MIGS, helping to organise the work that students and academics do with the industrial partners. MAC-MIGS is a PhD programme in Mathematical Modelling, Analysis and Computation run jointly by the Edinburgh and Heriot-Watt universities, under the banner of the Maxwell Institute. Many of the PhD students undertake projects relating to sustainability, for example studying how wind turbines affect bird migration, flight patterns and nesting behaviour.

Read more about the use of bioreactors as a solution to reduce food waste in the home.

Food waste is a huge problem related to sustainability. Recycling is currently the main solution to this, and is done on a global scale. IntelliDigest are developing a bioreactor to convert food waste into useful chemicals. People could have this bioreactor at home and then sell these chemicals to those who could use them, creating an individual solution to the problem. Ideally for this process, the food waste needs to be broken down into smaller pieces to increase surface area and speed up the reaction. IntelliDigest want to understand how fluids with suspended particles of a variety of different shapes and sizes (i.e. the food waste) behave under different physical conditions. Existing mathematical models assume that all these particles are spherical and of the same size; however, differently shaped particles may behave differently. For example, it is relatively easy to create mathematical models for spherical particles, as their geometry remains the same as they spin, and you can easily model spheres bouncing off one another. However, if we consider two lozenge-shaped particles (ellipsoids), their geometry changes as they rotate, so the angle at which they collide is important. I am working to extend the existing models to consider a variety of different particles. These models have other potential applications, for example in different bioreactors, to separate materials, or even in the brewing of beer.

Julian Hall

I work in the field of optimization, which can be used for many practical purposes. My main interest is in solving large linear programming problems and developing optimization software, which is a fundamental tool that people can use in various applications, such as minimizing cost or minimizing impact on the environment. From my work in the latter, I've found out firsthand that one of the interesting things about mathematics, in a sustainability context, is that it can lead you to rather surprising and counter-intuitive results. I was involved in a collaboration with the Roslin Institute that led to such a conclusion when researching the sustainability of beef production in the Brazilian Cerrado.
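To give a flavour of what a linear programming problem looks like, here is a deliberately tiny example solved with SciPy's linprog interface, whose "highs" methods call the open-source HiGHS solver developed in the School. The variables, numbers and constraints are entirely made up for illustration and are not taken from the Cerrado study.

```python
# Toy linear programme: choose intensified pasture x1 and restored land x2
# to minimise net emissions, subject to a beef-demand target and a land
# budget. All numbers are hypothetical.
from scipy.optimize import linprog

c = [2.0, -5.0]            # net emissions per unit of x1 and x2 (x2 sequesters)
A_ub = [[-3.0, 0.0],       # -3*x1 <= -30, i.e. produce at least 30 units of beef
        [1.0, 1.0]]        #  x1 + x2 <= 20, i.e. total land budget
b_ub = [-30.0, 20.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print(res.x, res.fun)      # optimal allocation [10, 10] with net emissions -30
```

Realistic models have the same basic shape but many thousands of variables and constraints, which is where high-performance solvers come in.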
Read more about how intensive beef production and pasture management could lead to greater carbon sequestration and reduced deforestation.

Raising cattle is commonly viewed as being bad for the environment, due to emissions like methane released by cows, as well as the deforestation associated with raising them. However, our model of beef production in the Cerrado demonstrated that better pasture management could lead to greater carbon sequestration and reduced deforestation in the context of more intensive beef production. This is because as demand for beef increases, so does the incentive for farmers to take better care of their pastureland. In the Brazilian Cerrado, most pastureland is made up of grasses of the genus Brachiaria, which are especially effective at absorbing carbon dioxide from the air and storing it in their long, deep roots. This means that well-kept pastureland is more effective at absorbing carbon dioxide.

We carried out the study by using our model to simulate different potential scenarios of increased or decreased demand for beef by 2030, and found that if deforestation rates are kept fixed by effective policy and demand for beef increases by 30%, then net emissions could decrease by 10% by 2030. Articles about our model were published in Nature Climate Change in 2016 and in Agricultural Systems in 2017, and were afterwards covered by various media outlets. At the time, the media measure for the article was the second highest that the School had achieved!

If you would like to read more about our study, here are some links to the media pages that covered it:

Carbon Brief (Clear on Climate) article: "Higher beef production could lower Brazil's emissions, study says"
Phys.org article: "Eating less meat might not be the way to go green, say researchers"
University of Edinburgh Research: "Rafael Silva finds beef-producing areas could boost greenhouse gas emissions"

This article was published on 2025-04-22