Super Speedy Reef Modelling

Coral reefs and the islands that they protect from flooding are in big trouble. This is a recurring theme on this blog, and now it’s time for the latest update. We are currently working towards an early flood warning system for low-lying tropical islands fronted by coral reefs. Our previous work on this topic has focused on finding ways to do this accurately for a wide variety of coral reef shapes and sizes, as well as different wave and sea level conditions. However, it’s not enough to be accurate: to deliver timely early warnings, you also need to be fast.

That’s where the latest research of Vesna Bertoncelj comes in.

I am extremely proud to announce that Vesna Bertoncelj has successfully defended her MSc thesis, “Efficient and accurate modeling of wave-driven flooding on coral reef-lined coasts: On the interpolation of parameterized boundary conditions”. I had the great privilege of sitting on her graduation committee and working with her over the past year or so.

Vesna’s research provides us with new approaches for making highly accurate predictions of coastal flooding, at limited computational expense. The numerical models that we use to estimate flooding often take a long time to simulate, since they resolve many complex physical processes at high resolution in space and time. However, by paring down these models to only the most essential components for the task at hand, we can do this much faster. My colleagues at Deltares recently developed the SFINCS model, which has been successfully used to predict flooding in a fraction of the time that our standard models take. But how do we put all these different pieces together?


A schematic overview of Vesna’s research methodology. [Source].

First, Vesna established a baseline for model performance by running a computationally intensive XBeach Non-Hydrostatic model (XB-NH+) and a much faster SFINCS model. Both models provide an estimate of wave runup (R2%, the runup level exceeded by 2% of incoming waves), which can be taken as a proxy for coastal flooding. In the second step, she used a lookup table (LUT) of pre-computed XBeach model output to derive the input for the SFINCS model. The crucial task is doing this quickly and accurately, so she experimented with different interpolation techniques for deriving that input. She then compared her new approach with the standard models to find the fastest and most accurate combination.
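To make the LUT idea concrete, here is a minimal sketch of what interpolating pre-computed model output might look like in Python. The axes, values, and variable names are hypothetical placeholders, not the parameterization from the thesis; choosing the LUT parameters and the interpolation scheme is precisely what Vesna’s research investigates:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical LUT axes: offshore significant wave height Hs [m] and
# still water level SWL [m]. In practice, each node would correspond to a
# pre-computed XBeach Non-Hydrostatic (XB-NH+) run.
hs_axis = np.linspace(1.0, 5.0, 9)    # m
swl_axis = np.linspace(0.0, 2.0, 5)   # m

# Placeholder boundary-condition parameter for SFINCS at each (Hs, SWL) node.
lut_values = np.random.default_rng(0).random((hs_axis.size, swl_axis.size))

# Linear interpolation over the LUT; the thesis compares several such schemes.
interp = RegularGridInterpolator((hs_axis, swl_axis), lut_values, method="linear")

# Query the LUT for an arbitrary offshore condition instead of re-running XBeach.
hs, swl = 3.2, 0.7
bc_for_sfincs = interp([[hs, swl]])[0]
print(f"Interpolated SFINCS input at Hs={hs} m, SWL={swl} m: {bc_for_sfincs:.3f}")
```

The payoff of this approach is that the expensive XBeach runs are done once, offline; at forecast time, only a cheap interpolation and a fast SFINCS run are needed.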

Her research gives us a useful methodology that we can implement to speed up our early flood warning system, saving time and hopefully someday saving lives.

The quality of Vesna’s work is excellent, and she has a fantastic attitude towards research and collaboration. Her curiosity, professionalism, and diligence will undoubtedly serve her well in the years to come. I hope that we will have other opportunities to collaborate in the future. If anybody out there needs a bright young coastal researcher and/or modeller, hire her!

Rolling the Dice: Dealing with Uncertainty in Coastal Flood Predictions for Small Island Developing States

Small island developing states around the world are especially vulnerable to the hazards posed by sea level rise and climate change. As engineers, we have a number of tools in our toolbox for reducing the risk posed by coastal flooding and for planning adaptation measures. We often rely on predictive models which combine information about expected wave and sea level conditions, the topography of the coast, and vulnerable buildings and population to estimate potential flooding and expected damage.
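As a rough illustration of how such a model chains these pieces together (hazard, exposure, and vulnerability), here is a minimal sketch in Python. The depth-damage curve and all the numbers are hypothetical placeholders, not values from any real study:

```python
import numpy as np

def depth_damage_fraction(depth_m: np.ndarray) -> np.ndarray:
    """Hypothetical depth-damage curve: fraction of building value lost as a
    function of flood depth, saturating at total loss around 3 m."""
    return np.clip(depth_m / 3.0, 0.0, 1.0)

# Toy exposure data: flood depth at each building [m] and its value [USD],
# as would come from a hydrodynamic model plus a building inventory.
flood_depth = np.array([0.0, 0.4, 1.2, 2.5])
building_value = np.array([50_000, 80_000, 60_000, 120_000])

# Expected damage = building value * damage fraction, summed over buildings.
expected_damage = np.sum(building_value * depth_damage_fraction(flood_depth))
print(f"Estimated flood damage: ${expected_damage:,.0f}")
```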

However, to use these types of models, we first need to answer a lot of questions: What exactly are the expected wave and sea level conditions? What if detailed topographic measurements are unavailable? What if the population of a given coastal area increases? How are the local buildings constructed, and what are the consequences of that for estimating damage from flooding?

If our information is imperfect (which it almost always is), all is not lost: we can still make educated guesses or test the sensitivity of our models to a range of values. However, these uncertainties can multiply out of control rather quickly, so we need to be able to quantify them. There is no sense in spending the time to develop a detailed hydrodynamic model if your bathymetry data is crap. Can we get a better handle on which variables are the most important to quantify properly? Can we prioritize which data is the most important to collect? This would help us make better predictions and make better use of scarce resources (data collection is expensive, especially on remote islands!).
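To make this concrete, here is a small Monte Carlo sketch of the kind of sensitivity test described above: sample the uncertain inputs, propagate them through a toy damage model, and see which input’s uncertainty drives the output. The distributions, the damage model, and the crude one-at-a-time variance comparison are all illustrative assumptions, not the method actually used in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Illustrative uncertain inputs: topographic (DEM) error [m], sea level [m],
# and a multiplier on the depth-damage curve.
dem_error = rng.normal(0.0, 0.3, N)
sea_level = rng.normal(0.5, 0.15, N)
damage_scale = rng.lognormal(0.0, 0.25, N)

def flood_damage(dem_error, sea_level, damage_scale):
    """Toy model: flood depth = sea level minus ground elevation, distorted
    by DEM error; damage fraction grows with depth up to total loss at 3 m."""
    ground_elevation = 0.8  # m above datum, assumed known here
    depth = np.maximum(sea_level - (ground_elevation + dem_error), 0.0)
    return damage_scale * np.clip(depth / 3.0, 0.0, 1.0)

damage = flood_damage(dem_error, sea_level, damage_scale)
total_var = damage.var()

# Crude one-at-a-time sensitivity: fix one input at a nominal value and see
# how much the output variance drops. A larger drop means that input's
# uncertainty matters more, so it deserves better data.
for name, fixed in [("DEM error", (np.zeros(N), sea_level, damage_scale)),
                    ("sea level", (dem_error, np.full(N, 0.5), damage_scale)),
                    ("damage curve", (dem_error, sea_level, np.ones(N)))]:
    reduced = flood_damage(*fixed).var()
    print(f"{name}: variance reduction {100 * (1 - reduced / total_var):.0f}%")
```

Ranking inputs this way is what lets us answer the prioritization question: the input whose removal shrinks the output variance the most is the one worth measuring first.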

Matteo Parodi investigated these questions in his master’s thesis, and just published his first paper, “Uncertainties in coastal flood risk assessments in small island developing states”. I had the great privilege and joy of co-supervising Matteo during his thesis, and I am immensely proud of him and his work!

Based on a study of the islands of São Tomé and Príncipe, in the Gulf of Guinea off the coast of Central Africa, Matteo found that topographic measurements and the relationship between flood depth and damage to buildings were the biggest uncertainties for predicting present-day flood damage. This means that measuring the topography of vulnerable coastal areas at high resolution, and performing better post-disaster damage surveys, will provide the best “bang for your buck” right now. However, for longer time horizons (e.g., the year 2100), uncertainty in sea level rise estimates becomes the most important.

Matteo’s work will help coastal managers on vulnerable islands to better prioritize limited financial resources, and will improve the trustworthiness of our predictive models. Great job, Matteo!