Ice thickness inversion

This example shows how to run the OGGM ice thickness inversion model with various ice parameters: the deformation parameter A and a sliding parameter (fs).

There is currently no “best” set of parameters for the ice thickness inversion model. As shown in Maussion et al. (2019), the default parameter set results in global volume estimates which are a bit larger than previous values. For the consensus estimate of Farinotti et al. (2019), OGGM participated with a deformation parameter A that is 1.5 times larger than the generally accepted default value.

There is no reason to think that the ice parameters are the same between neighboring glaciers. There is currently no “good” way to calibrate them, or at least no generally accepted one. We won’t discuss the details here, but we provide a script to illustrate the sensitivity of the model to this choice.

New in version 1.4: we demonstrate how to apply a new global task in OGGM, workflow.calibrate_inversion_from_consensus, to calibrate the A parameter so that the inverted volume matches the consensus estimate from Farinotti et al. (2019).

At the end of this tutorial, we show how to distribute the “flowline thicknesses” on a glacier map.


# Libs
import geopandas as gpd

# Locals
import oggm.cfg as cfg
from oggm import utils, workflow, tasks, graphics

# Initialize OGGM and set up the default run parameters
rgi_region = '11'  # Region Central Europe

# Local working directory (where OGGM will write its output)
WORKING_DIR = utils.gettempdir('OGGM_Inversion')
cfg.PATHS['working_dir'] = WORKING_DIR

# This is useful here
cfg.PARAMS['use_multiprocessing'] = True

# RGI file
path = utils.get_rgi_region_file(rgi_region)
rgidf = gpd.read_file(path)

# Select the glaciers in the Pyrenees
rgidf = rgidf.loc[rgidf['O2Region'] == '2']

# Sort for more efficient parallel computing
rgidf = rgidf.sort_values('Area', ascending=False)

# Go - get the pre-processed glacier directories
# We start at level 3, because we need all data for the inversion
# Note: starting with OGGM v1.6, a `prepro_base_url` must be passed explicitly
# (see the OGGM documentation for the available pre-processed directories)
gdirs = workflow.init_glacier_directories(rgidf, from_prepro_level=3, prepro_border=10)

# Because of recent changes in OGGM not yet available in the preprocessed directories, 
# we re-run this task:
workflow.execute_entity_task(tasks.prepare_for_inversion, gdirs)

# Default parameters
# Deformation: from Cuffey and Patterson 2010
glen_a = 2.4e-24
# Sliding: from Oerlemans 1997
fs = 5.7e-20
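To build intuition for how A matters, here is a standard shallow-ice scaling argument (a back-of-the-envelope sketch, not OGGM code): with Glen's n = 3 and no sliding, the ice flux scales as A·h^(n+2), so for a fixed flux the inverted thickness scales as A^(-1/(n+2)):

```python
import numpy as np

def thickness_scaling(a_factor, n=3):
    """Relative change in inverted thickness when A is multiplied by a_factor
    (shallow-ice approximation, no sliding)."""
    return a_factor ** (-1.0 / (n + 2))

# Doubling A makes the ice "softer" and thins the inverted glacier by ~13%
print(thickness_scaling(2.0))  # ~0.87
```

This weak (fifth-root) dependence is why very large changes in A are needed to change the inverted volume substantially, as the sensitivity analysis below illustrates.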
with utils.DisableLogger():  # this suppresses log output - to use with caution!
    # Correction factors
    factors = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1]
    factors += [1.1, 1.2, 1.3, 1.5, 1.7, 2, 2.5, 3, 4, 5]
    factors += [6, 7, 8, 9, 10]

    # Run the inversions tasks with the given factors
    for f in factors:
        # Without sliding
        suf = '_{:03d}_without_fs'.format(int(f * 10))
        workflow.execute_entity_task(tasks.mass_conservation_inversion, gdirs,
                                     glen_a=glen_a*f, fs=0)
        # Store the results of the inversion only
        utils.compile_glacier_statistics(gdirs, filesuffix=suf,
                                         inversion_only=True)

        # With sliding
        suf = '_{:03d}_with_fs'.format(int(f * 10))
        workflow.execute_entity_task(tasks.mass_conservation_inversion, gdirs,
                                     glen_a=glen_a*f, fs=fs)
        # Store the results of the inversion only
        utils.compile_glacier_statistics(gdirs, filesuffix=suf,
                                         inversion_only=True)

Read the data

The data are stored as csv files in the working directory. The easiest way to read them is to use pandas!

import matplotlib.pyplot as plt
import pandas as pd
import numpy as np
from scipy import stats
import os

Let’s read the output of the inversion with the default OGGM parameters first:

df = pd.read_csv(os.path.join(WORKING_DIR, 'glacier_statistics_011_without_fs.csv'), index_col=0)

There are only 35 glaciers in the Pyrenees! That’s why the run was relatively fast.
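If you want to check this kind of thing yourself, here is a minimal pandas sketch; the values below are made up for illustration, only the column names match the glacier_statistics file:

```python
import pandas as pd

# Toy stand-in for the compiled statistics table (invented values,
# column names as in OGGM's glacier_statistics output)
df_demo = pd.DataFrame({
    'rgi_area_km2': [0.55, 0.12, 0.31],
    'inv_volume_km3': [1.6e-2, 1.1e-3, 5.2e-3],
}, index=['RGI60-11.03208', 'RGI60-11.03209', 'RGI60-11.03210'])

print(len(df_demo))                     # number of glaciers in the table
print(df_demo['inv_volume_km3'].sum())  # regional volume in km3
```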


One way to visualize the output is to plot the volume as a function of area in a log-log plot, illustrating the well known volume-area relationship of mountain glaciers:

ax = df.plot(kind='scatter', x='rgi_area_km2', y='inv_volume_km3')
ax.semilogx(); ax.semilogy()
xlim, ylim = [1e-2, 0.7], [1e-5, 0.05]
ax.set_xlim(xlim); ax.set_ylim(ylim);

As we can see, there is a clear relationship, but it is not perfect. Let’s fit a line to these data (the “volume-area scaling law”):

# Fit in log space 
dfl = np.log(df[['inv_volume_km3', 'rgi_area_km2']])
slope, intercept, r_value, p_value, std_err = stats.linregress(dfl.rgi_area_km2.values, dfl.inv_volume_km3.values)
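As a sanity check of this fitting approach, here is a quick synthetic test (with arbitrarily chosen scaling parameters): generate areas, apply an exact power law, and verify that the linear fit in log-log space recovers both the exponent and the factor:

```python
import numpy as np
from scipy import stats

# Synthetic glaciers following an exact power law V = alpha * S**gamma
rng = np.random.default_rng(0)
areas = rng.uniform(0.01, 1.0, size=100)   # km2
volumes = 0.034 * areas ** 1.375           # km3

# Fit in log space, as above
res = stats.linregress(np.log(areas), np.log(volumes))
print(res.slope)              # recovers gamma = 1.375
print(np.exp(res.intercept))  # recovers alpha = 0.034
```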

In their seminal paper, Bahr et al. (1997) describe this relationship as:

\[V = \alpha S^{\gamma}\]

With V the volume in km\(^3\), S the area in km\(^2\) and \(\alpha\) and \(\gamma\) the scaling parameters (0.034 and 1.375, respectively). How does OGGM compare to these in the Pyrenees?
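As a worked example of the formula (plugging in the Bahr et al. parameters; the area value is chosen arbitrarily):

```python
# Bahr et al. (1997) scaling parameters
alpha, gamma = 0.034, 1.375

S = 0.5                  # glacier area in km2 (arbitrary example)
V = alpha * S ** gamma   # predicted volume in km3
print(V)  # ~0.0131 km3, i.e. a mean thickness of ~26 m
```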

print('power (gamma): {:.3f}'.format(slope))
print('factor (alpha): {:.3f}'.format(np.exp(intercept)))
ax = df.plot(kind='scatter', x='rgi_area_km2', y='inv_volume_km3', label='OGGM glaciers')
ax.plot(xlim, np.exp(intercept) * (np.asarray(xlim) ** slope), color='C3', label='Fitted line')
ax.semilogx(); ax.semilogy()
ax.set_xlim(xlim); ax.set_ylim(ylim);

Sensitivity analysis

Now, let’s read the output files of each run separately, and compute the regional volume out of them:

dftot = pd.DataFrame(index=factors)
for f in factors:
    # Without sliding
    suf = '_{:03d}_without_fs'.format(int(f * 10))
    fpath = os.path.join(WORKING_DIR, 'glacier_statistics{}.csv'.format(suf))
    _df = pd.read_csv(fpath, index_col=0, low_memory=False)
    dftot.loc[f, 'without_sliding'] = _df.inv_volume_km3.sum()
    # With sliding
    suf = '_{:03d}_with_fs'.format(int(f * 10))
    fpath = os.path.join(WORKING_DIR, 'glacier_statistics{}.csv'.format(suf))
    _df = pd.read_csv(fpath, index_col=0, low_memory=False)
    dftot.loc[f, 'with_sliding'] = _df.inv_volume_km3.sum()

And plot them:

dftot.plot();
plt.xlabel('Factor of Glen A (default 1)'); plt.ylabel('Regional volume (km$^3$)');

As you can see, there is quite a difference between the solutions. In particular, close to the default value for Glen A, the regional estimates are very sensitive to small changes in A. The calibration of A is a problem that has yet to be resolved by global glacier models…

New in version 1.4: calibrate to match the consensus estimate

Here, a “best” Glen A is found so that the total inverted volume of the glaciers in gdirs matches the 2019 consensus estimate.

cdf = workflow.calibrate_inversion_from_consensus(gdirs, filter_inversion_output=False)

Note that here we calibrate the Glen A parameter to a value that is the same for all glaciers in gdirs (here \(A \sim 9.504\cdot A_0\)), i.e. we calibrate to match the total volume of all glaciers, not each glacier individually.


Just as a side note: “vol_bsl_itmix_m3” means volume below sea level, and is therefore zero for these Pyrenean glaciers!
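Conceptually, this calibration is a one-dimensional root-finding problem: the inverted regional volume decreases monotonically as A increases, so one can search for the factor on A that matches a target volume. Here is a toy sketch of that idea (the volume model and all numbers are assumptions for illustration, not OGGM internals):

```python
from scipy.optimize import brentq

v_default = 1.0   # regional volume with the default A (arbitrary units)
v_target = 0.5    # "consensus" volume we want to match

def regional_volume(a_factor):
    # Toy model: volume scales as A**(-1/5), the shallow-ice thickness scaling
    return v_default * a_factor ** -0.2

# Root-find the factor on A for which the inverted volume hits the target
best_factor = brentq(lambda f: regional_volume(f) - v_target, 0.01, 1000)
print(best_factor)  # -> 32.0, since 32**-0.2 == 0.5
```

The weak A-dependence of the volume explains why the calibrated factor (here 32 in the toy model, ~9.5 in the real run above) can be so far from 1.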

Distributed ice thickness

The OGGM inversion and dynamical models use the “1D” flowline assumption: for some applications, you might want to use OGGM to create distributed ice thickness maps. Currently, OGGM implements two ways to “distribute” the flowline thicknesses, but only the simplest one works robustly:

# Distribute
workflow.execute_entity_task(tasks.distribute_thickness_per_altitude, gdirs);
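The “per altitude” idea behind this task can be illustrated with a toy numpy sketch (a deliberate simplification, not OGGM's actual implementation): each grid cell receives the thickness of the flowline point whose elevation is closest to the cell's elevation:

```python
import numpy as np

# Toy flowline: elevations (m) and inverted thicknesses (m) along the line
flowline_elev = np.array([3000., 2800., 2600., 2400.])
flowline_thick = np.array([10., 60., 90., 40.])

# Toy 2x2 glacier map of cell elevations (m)
grid_elev = np.array([[2950., 2750.],
                      [2550., 2450.]])

# For each grid cell, pick the nearest flowline elevation and take its thickness
idx = np.abs(grid_elev[..., None] - flowline_elev).argmin(axis=-1)
distributed = flowline_thick[idx]
print(distributed)  # [[10. 60.] [90. 40.]]
```

The real task additionally smooths the result and conserves the total inverted volume.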

We just created a new output of the model, which we can access in the gridded_data file:

# xarray is an awesome library! Did you know about it?
import xarray as xr
import rioxarray as rioxr
ds = xr.open_dataset(gdirs[0].get_filepath('gridded_data'))

Since some people find geotiff data easier to read than netCDF, OGGM also provides a tool to convert the variables in this file to geotiff files:

# save the distributed ice thickness into a geotiff file
workflow.execute_entity_task(tasks.gridded_data_var_to_geotiff, gdirs, varname='distributed_thickness')

# The default path of the geotiff file is in the glacier directory with the name "distributed_thickness.tif"
# Let's check if the file exists
for gdir in gdirs:
    path = os.path.join(gdir.dir, 'distributed_thickness.tif')
    assert os.path.exists(path)
# Open the last file with rioxarray's open_rasterio
da = rioxr.open_rasterio(path)

In fact, tasks.gridded_data_var_to_geotiff() can save any variable in the file. The geotiff is then named after the variable, with a .tif suffix. Have a try by yourself ;-)

Plot many glaciers on a map

Let’s select a group of glaciers close to each other:

rgi_ids = ['RGI60-11.0{}'.format(i) for i in range(3205, 3211)]
sel_gdirs = [gdir for gdir in gdirs if gdir.rgi_id in rgi_ids]
# you might need to install motionless if it is not yet in your environment

Using OGGM

Since a recent PR (21.05.2020), OGGM can plot the thickness of a group of glaciers on a map:

graphics.plot_distributed_thickness(sel_gdirs)
This is however not very useful because OGGM can only plot on a map as large as the local glacier map of the first glacier in the list. See this issue for a discussion about why.

Using salem

Under the hood, OGGM uses salem to make the plots. Let’s do that for our case: it requires some manual tweaking, but it should be possible to automatize this better in the future.

Note: this also requires a very recent version of salem to work (21.05.2020)

import salem
# Make a grid covering the desired map extent
g = salem.mercator_grid(center_ll=(0.65, 42.64), extent=(4000, 4000))
# Create a map out of it
smap = salem.Map(g, countries=False)
# Add the glaciers outlines
for gdir in sel_gdirs:
    crs = gdir.grid.center_grid
    geom = gdir.read_pickle('geometries')
    poly_pix = geom['polygon_pix']
    smap.set_geometry(poly_pix, crs=crs, fc='none', zorder=2, linewidth=.2)
    for l in poly_pix.interiors:
        smap.set_geometry(l, crs=crs, color='black', linewidth=0.5)
# Now add the thickness data
for gdir in sel_gdirs:
    grids_file = gdir.get_filepath('gridded_data')
    with utils.ncDataset(grids_file) as nc:
        vn = 'distributed_thickness'
        thick = nc.variables[vn][:]
        mask = nc.variables['glacier_mask'][:]
    thick = np.where(mask, thick, np.nan)
    # The "overplot=True" is key here
    # this needs a recent version of salem to run properly
    smap.set_data(thick, crs=gdir.grid.center_grid, overplot=True)
# Set colorscale and other things
# Plot
f, ax = plt.subplots(figsize=(6, 6))
smap.visualize(ax=ax, cbar_title='Glacier thickness (m)');