Filter the glacier length and area time series

In this short tutorial, we show how to deal with unwanted “spikes” in the length and area time series of individual glaciers. These spikes happen because OGGM currently does not differentiate between snow and ice: occasional years with heavy snowfall can artificially increase the glacier area.

Ideally, OGGM would handle this internally, but we do not yet have a generally applicable way to do so. In the meantime, we recommend a simple workaround.


import matplotlib.pyplot as plt
import xarray as xr
import os
from oggm import cfg, utils, workflow, tasks
Downloading salem-sample-data...
2021-09-15 09:53:57: oggm.cfg: Reading default parameters from the OGGM `params.cfg` configuration file.
2021-09-15 09:53:57: oggm.cfg: Multiprocessing switched OFF according to the parameter file.
2021-09-15 09:53:57: oggm.cfg: Multiprocessing: using all available processors (N=2)
2021-09-15 09:53:57: oggm.utils: Downloading to /github/home/OGGM/download_cache/
2021-09-15 09:53:58: oggm.utils: Checking the download verification file checksum...
2021-09-15 09:53:59: oggm.utils: Downloading to /github/home/.oggm/downloads.sha256.hdf...
2021-09-15 09:54:03: oggm.utils: Done downloading.
2021-09-15 09:54:03: oggm.utils: Checking the download verification file checksum...
2021-09-15 09:54:04: oggm.utils: No known hash for
cfg.initialize()
cfg.PATHS['working_dir'] = utils.gettempdir(dirname='OGGM-Filter')

Define the glaciers for the run

We take the Kesselwandferner in the Austrian Alps:

rgi_ids = ['RGI60-11.00787']

Glacier directories

gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=4, prepro_border=80)
2021-09-15 09:54:05: oggm.workflow: init_glacier_directories from prepro level 4 on 1 glaciers.
2021-09-15 09:54:05: oggm.workflow: Execute entity task gdir_from_prepro on 1 glaciers
2021-09-15 09:54:05: oggm.utils: Downloading to /github/home/OGGM/download_cache/
2021-09-15 09:54:08: oggm.utils: /github/home/OGGM/download_cache/ verified successfully.


We can jump directly to a new experiment! This run uses a random climate representative of the recent climate (1985-2015) with a warm temperature bias:

workflow.execute_entity_task(tasks.run_random_climate, gdirs,
                             nyears=200, y0=2000, seed=5,
                             temperature_bias=0.5,
                             output_filesuffix='_commitment')
2021-09-15 09:54:08: oggm.workflow: Execute entity task run_random_climate on 1 glaciers
2021-09-15 09:54:08: oggm.core.flowline: (RGI60-11.00787) run_random_climate_commitment
2021-09-15 09:54:09: oggm.core.flowline: (RGI60-11.00787) flowline_model_run_commitment

The problem

ds = utils.compile_run_output(gdirs, input_filesuffix='_commitment')
ds = ds.isel(rgi_id=0)  # take just the one glacier
2021-09-15 09:54:12: oggm.utils: Applying global task compile_run_output on 1 glaciers
2021-09-15 09:54:12: oggm.utils: Applying compile_run_output on 1 gdirs.
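As an aside, `compile_run_output` returns an `xarray.Dataset` with an `rgi_id` dimension, so `isel(rgi_id=0)` simply picks the first (and here, only) glacier by position. A minimal sketch with a toy dataset (the variable names and values are illustrative, not OGGM's exact output):

```python
import numpy as np
import xarray as xr

# Toy dataset mimicking a compiled run output: one variable over (time, rgi_id)
toy = xr.Dataset(
    {'area': (('time', 'rgi_id'), np.ones((3, 1)))},
    coords={'time': [0, 1, 2], 'rgi_id': ['RGI60-11.00787']},
)

one = toy.isel(rgi_id=0)  # select the single glacier by position
print(one.area.dims)      # the rgi_id dimension is dropped
```

After the selection, `rgi_id` remains available as a scalar coordinate, but all variables are indexed by `time` only.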

Once the glacier has become small, the time series show the unrealistic “spikes” described above.


A good way to deal with the issue is to run a moving filter which keeps the smallest area or length within a given time window:

roll_yrs = 5
# Take the minimum out of 5 years
ts = ds.area.to_series()
ts = ts.rolling(roll_yrs).min()
# The first roll_yrs values are NaN after rolling: fill them with the
# first valid filtered value
ts.iloc[0:roll_yrs] = ts.iloc[roll_yrs]
# Plot
ts.plot();
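To see what the rolling minimum does, here is a self-contained sketch on a synthetic series (the numbers are made up for illustration; only the filtering logic matches the code above):

```python
import pandas as pd

roll_yrs = 5
# Synthetic 'area' series with an artificial one-year spike (the 14)
ts = pd.Series([10.0, 10, 10, 10, 10, 14, 10, 9, 9, 9])
filtered = ts.rolling(roll_yrs).min()
# The first roll_yrs values are NaN after rolling: fill them with the
# first valid filtered value
filtered.iloc[0:roll_yrs] = filtered.iloc[roll_yrs]
print(filtered.tolist())  # the spike is removed
```

Note the trade-off: the rolling minimum removes upward spikes, but it also delays any genuine increase by up to the window length.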

It works the same with length:

# Take the minimum out of 5 years
ts = ds.length.to_series()
ts = ts.rolling(roll_yrs).min()
ts.iloc[0:roll_yrs] = ts.iloc[roll_yrs]
# Plot
ts.plot();
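If you filter several variables or many glaciers, the two snippets above can be wrapped in a small helper. This is a hypothetical convenience function, not part of OGGM's API:

```python
import pandas as pd

def filter_spikes(ts: pd.Series, roll_yrs: int = 5) -> pd.Series:
    """Remove upward spikes by keeping the rolling minimum over roll_yrs."""
    out = ts.rolling(roll_yrs).min()
    # The first roll_yrs values are NaN after rolling: fill them with the
    # first valid filtered value so the series keeps its full length
    out.iloc[0:roll_yrs] = out.iloc[roll_yrs]
    return out

# Example: a flat series with a single artificial spike (the 8)
series = pd.Series([5.0, 5, 5, 5, 5, 8, 5, 5, 5, 5])
print(filter_spikes(series).max())  # the spike at 8 is gone
```

You would then call it as `filter_spikes(ds.area.to_series())` or `filter_spikes(ds.length.to_series())`.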

What’s next?