Mass balance parameter perturbation experiments with OGGM#

By now, we assume that you have read and run the mass balance calibration tutorial. We know that it is a lot of new information to take in!

In this notebook, we will:

  • re-iterate and discuss the important role that mass balance calibration plays in the projections

  • give you some tools to run parameter perturbation experiments (this will also illustrate useful aspects of the OGGM internals)

  • provide some pointers on how to approach the calibration process in your own use case


import matplotlib.pyplot as plt
import matplotlib
import pandas as pd
import xarray as xr
import numpy as np
import os

import oggm
from oggm import cfg, utils, workflow, tasks, graphics
from oggm.core import massbalance
from oggm.core.massbalance import mb_calibration_from_scalar_mb, mb_calibration_from_geodetic_mb, mb_calibration_from_wgms_mb
cfg.PATHS['working_dir'] = utils.gettempdir(dirname='OGGM-calib-pertubation', reset=True)
cfg.PARAMS['border'] = 80
2024-02-02 17:12:33: oggm.cfg: Reading default parameters from the OGGM `params.cfg` configuration file.
2024-02-02 17:12:33: oggm.cfg: Multiprocessing switched OFF according to the parameter file.
2024-02-02 17:12:33: oggm.cfg: Multiprocessing: using all available processors (N=4)

We start from our two well-known glaciers in the Austrian Alps, Kesselwandferner and Hintereisferner. But you can also choose any other glacier, e.g. from this list.

# we start from preprocessing level 5
base_url = ''
gdirs = workflow.init_glacier_directories(['RGI60-11.00787',  'RGI60-11.00897'], from_prepro_level=5, prepro_base_url=base_url)
2024-02-02 17:12:34: oggm.workflow: init_glacier_directories from prepro level 5 on 2 glaciers.
2024-02-02 17:12:34: oggm.workflow: Execute entity tasks [gdir_from_prepro] on 2 glaciers

Changing the mass balance parameters for the mass balance model#

We just downloaded the data. Let’s have a look at the calibrated parameters for these glaciers:

# Pick each glacier
gdir_kwf = gdirs[0]
gdir_hef = gdirs[1]
gdir_hef.read_json('mb_calib')
{'rgi_id': 'RGI60-11.00897',
 'bias': 0,
 'melt_f': 5.0,
 'prcp_fac': 3.5331695984128135,
 'temp_bias': 1.7583130779353044,
 'reference_mb': -1100.3,
 'reference_mb_err': 171.8,
 'reference_period': '2000-01-01_2020-01-01',
 'mb_global_params': {'temp_default_gradient': -0.0065,
  'temp_all_solid': 0.0,
  'temp_all_liq': 2.0,
  'temp_melt': -1.0},
 'baseline_climate_source': 'GSWP3_W5E5'}
gdir_kwf.read_json('mb_calib')
{'rgi_id': 'RGI60-11.00787',
 'bias': 0,
 'melt_f': 5.0,
 'prcp_fac': 2.96122858393561,
 'temp_bias': 1.7583130779353044,
 'reference_mb': -514.8000000000001,
 'reference_mb_err': 136.3,
 'reference_period': '2000-01-01_2020-01-01',
 'mb_global_params': {'temp_default_gradient': -0.0065,
  'temp_all_solid': 0.0,
  'temp_all_liq': 2.0,
  'temp_melt': -1.0},
 'baseline_climate_source': 'GSWP3_W5E5'}

These parameters are stored in a file called mb_calib.json in the glacier directory. This file is then read by the mass balance model when created:

mbmod = massbalance.MonthlyTIModel(gdir_hef)
mbmod.calib_params
{'rgi_id': 'RGI60-11.00897',
 'bias': 0,
 'melt_f': 5.0,
 'prcp_fac': 3.5331695984128135,
 'temp_bias': 1.7583130779353044,
 'reference_mb': -1100.3,
 'reference_mb_err': 171.8,
 'reference_period': '2000-01-01_2020-01-01',
 'mb_global_params': {'temp_default_gradient': -0.0065,
  'temp_all_solid': 0.0,
  'temp_all_liq': 2.0,
  'temp_melt': -1.0},
 'baseline_climate_source': 'GSWP3_W5E5'}

Therefore, if you want to mess around with these parameters, “all you have to do” is to overwrite this file somehow, or create a new one and ask the mass balance model to read it instead of the default one. Let’s do that:

params = gdir_hef.read_json('mb_calib')
params['melt_f'] = 7  # a new value
gdir_hef.write_json(params, 'mb_calib', filesuffix='_perturbed')  # write a new file, with perturbed parameters

We can read it in with:

mbmod_perturbed = massbalance.MonthlyTIModel(gdir_hef, mb_params_filesuffix='_perturbed')
mbmod_perturbed.calib_params

Just for fun, check what this means for the mass balance:

h = np.linspace(2000, 3800, 80)
mb_default = mbmod.get_annual_mb(h, year=1980) * cfg.SEC_IN_YEAR * cfg.PARAMS['ice_density']
mb_perturbed = mbmod_perturbed.get_annual_mb(h, year=1980) * cfg.SEC_IN_YEAR * cfg.PARAMS['ice_density']
plt.plot(mb_default, h);
plt.plot(mb_perturbed, h);
plt.xlabel('Mass balance (mm w.e)'); plt.ylabel('Elevation');

So far so good. But how do we feed this into the heavy OGGM pipeline? Many OGGM users will be more familiar with the run_* entity tasks. These tasks “hide” the process of creating the mass balance model and therefore make it look like we can’t change anything internally.

We could have added a mechanism to pass the mb_params_filesuffix from, for example, run_random_climate, to the underlying mass balance model (similar to the “climate_input_filesuffix” mechanism). We may add this one day, but for now I’d like to use this opportunity to demonstrate another possible mechanism:

# So far so good: default run with the default mass balance
tasks.run_random_climate(gdir_hef, y0=2000, nyears=100, seed=1, output_filesuffix='_default');
# Let' create another "mass balance model" which is like the default one but with another default parameter
from functools import partial
PerturbedMassBalance = partial(massbalance.MonthlyTIModel, mb_params_filesuffix='_perturbed')

# Pass it to the run task
tasks.run_random_climate(gdir_hef, y0=2000, nyears=100, seed=1, mb_model_class=PerturbedMassBalance, output_filesuffix='_perturbed');

The partial function allows us to create a new function by fixing a certain number of arguments of another function. Here we create a new “class” which is the same as the original one, but with one parameter set to another value. This proves very useful here, since we are just tricking OGGM into using the new one!
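If you have never used functools.partial before, here is a minimal, OGGM-independent illustration (the `power` function is made up for this example):

```python
from functools import partial

def power(base, exponent=2):
    """Toy stand-in for a constructor taking a keyword argument."""
    return base ** exponent

# Fix the keyword once; the result is a new callable with that value baked in
cube = partial(power, exponent=3)

print(power(2), cube(2))  # 4 8
```

This is exactly what we do with MonthlyTIModel above: the partial object is still callable like the original class, so OGGM can instantiate it without noticing the difference.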

Let’s check the outcome:

with xr.open_dataset(gdir_hef.get_filepath('model_diagnostics', filesuffix='_default')) as ds:
    ds_default = ds.load()
with xr.open_dataset(gdir_hef.get_filepath('model_diagnostics', filesuffix='_perturbed')) as ds:
    ds_perturbed = ds.load()
ds_default.volume_m3.plot(label='Default');
ds_perturbed.volume_m3.plot(label='Perturbed');
plt.legend();

Quite a big difference for “just” 2 units of melt factor more, from 5 to 7!

More structured parameter perturbation experiments#

OK, so let’s say we want to do this “at scale”. We actually had such an assignment recently for the PROTECT SLR project. We were asked to do a number of perturbed simulations with parameters diverging from their default values, for example +1 temp_bias everywhere. But how to do this, knowing that each glacier has a different temp_bias? We can’t simply set the bias to 1 everywhere (we need +=1).

For this, I wrote a “task” that was originally outside of OGGM but is now (since v1.6.4) part of the main codebase. Let’s have a look at it:

@entity_task(log, writes=['mb_calib'])
def perturbate_mb_params(gdir, perturbation=None, reset_default=False, filesuffix=''):
    """Replaces pre-calibrated MB params with perturbed ones for this glacier.

    It simply replaces the existing `mb_calib.json` file with an
    updated one with perturbed parameters. The original ones
    are stored in the file for re-use after perturbation.

    Users can change the following 4 parameters:
    - 'melt_f': unit [kg m-2 day-1 K-1], the melt factor
    - 'prcp_fac': unit [-], the precipitation factor
    - 'temp_bias': unit [K], the temperature correction applied to the timeseries
    - 'bias': unit [mm we yr-1], *subtracted* from the computed MB. Rarely used.

    All parameter perturbations are additive, i.e. the value
    provided by the user is added to the *precalibrated* value.
    For example, `temp_bias=1` means that the temp_bias used by the
    model will be the precalibrated one, plus 1 Kelvin.

    The only exception is prcp_fac, which is multiplicative.
    For example prcp_fac=1 will leave the precalibrated prcp_fac unchanged,
    while 2 will double it.

    Parameters
    ----------
    perturbation : dict
        the parameters to change and the associated value (see doc above)
    reset_default : bool
        reset the parameters to their original value. This might be
        unnecessary if using the filesuffix mechanism.
    filesuffix : str
        write the modified parameters in a separate mb_calib.json file
        with the filesuffix appended. This can then be read by the
        MassBalanceModel for example instead of the default one.
        Note that it's always the default, precalibrated params
        file which is read to start with.
    """
    df = gdir.read_json('mb_calib')

    # Save original params if not there
    if 'bias_orig' not in df:
        for k in ['bias', 'melt_f', 'prcp_fac', 'temp_bias']:
            df[k + '_orig'] = df[k]

    if reset_default:
        for k in ['bias', 'melt_f', 'prcp_fac', 'temp_bias']:
            df[k] = df[k + '_orig']
        gdir.write_json(df, 'mb_calib', filesuffix=filesuffix)
        return df

    for k, v in perturbation.items():
        if k == 'prcp_fac':
            df[k] = df[k + '_orig'] * v
        elif k in ['bias', 'melt_f', 'temp_bias']:
            df[k] = df[k + '_orig'] + v
        else:
            raise InvalidParamsError(f'Perturbation not valid: {k}')

    gdir.write_json(df, 'mb_calib', filesuffix=filesuffix)
    return df
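The update rules are easy to test in isolation. Here is a standalone sketch applying the same additive and multiplicative logic on a plain dictionary (the `perturb` helper is made up for illustration; no glacier directory or OGGM machinery is involved):

```python
def perturb(params, perturbation):
    """Additive for bias/melt_f/temp_bias, multiplicative for prcp_fac."""
    out = dict(params)
    for k, v in perturbation.items():
        if k == 'prcp_fac':
            out[k] = params[k] * v
        elif k in ('bias', 'melt_f', 'temp_bias'):
            out[k] = params[k] + v
        else:
            raise ValueError(f'Perturbation not valid: {k}')
    return out

# Values loosely based on the Hintereisferner parameters shown earlier
params = {'melt_f': 5.0, 'prcp_fac': 3.53, 'temp_bias': 1.76, 'bias': 0}
new = perturb(params, {'temp_bias': 1, 'prcp_fac': 2})
print(new)  # temp_bias increased by 1 K, prcp_fac doubled, melt_f unchanged
```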

It’s a fairly easy piece of code, isn’t it? Let’s apply it to a Latin hypercube parameter set where we change all parameters in a structured way:

from scipy.stats import qmc
sampler = qmc.LatinHypercube(d=3)
sample = sampler.random(n=30)

def log_scale_value(value, low, high):
    """This is to sample multiplicative factors in log space to avoid asymmetry (detail, but important)."""
    return 2**((np.log2(high) - np.log2(low))*value + np.log2(low))

sample[:,0] = 4*sample[:,0] - 2  # DDF factor (melt_f): apply change [-2, +2] mm/(°C day)
sample[:,1] = 2*sample[:,1] - 1  # temperature bias (temp_bias): apply change [-1, +1] °C
sample[:,2] = log_scale_value(sample[:,2], 0.5, 2)  # precipitation scaling factor (prcp_fac): apply scaling [0.5, 2] on log2

params_df = pd.DataFrame(sample, columns=['melt_f', 'temp_bias', 'prcp_fac'])
params_df.plot.scatter(x='temp_bias', y='prcp_fac');
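To see what the log-space sampling buys us: reproducing the helper from the cell above, uniform samples of 0, 0.5 and 1 map to factors of 0.5, 1 and 2, so halving and doubling precipitation are equally likely. A linear mapping from [0.5, 2] would bias the sample towards factors above 1.

```python
import numpy as np

def log_scale_value(value, low, high):
    """Map a uniform [0, 1] sample to [low, high] in log2 space."""
    return 2**((np.log2(high) - np.log2(low)) * value + np.log2(low))

for u in (0.0, 0.5, 1.0):
    print(u, log_scale_value(u, 0.5, 2))  # 0.5, 1.0, 2.0
```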
cfg.set_logging_config('CRITICAL')  # shut down log output (bad!)
cfg.PARAMS['continue_on_error'] = True

for exp in range(len(params_df)):
    params = params_df.loc[exp]
    # clim_params = {k: params[k] for k in ('temp_bias', 'prcp_fac', 'melt_f')}
    exp = f'{exp:02d}'
    workflow.execute_entity_task(tasks.perturbate_mb_params, gdirs, perturbation=params, filesuffix=f'_{exp}')

    PerturbedMassBalance = partial(massbalance.MonthlyTIModel, mb_params_filesuffix=f'_{exp}')

    workflow.execute_entity_task(tasks.run_random_climate, gdirs,
                                 y0=2000,
                                 nyears=100,
                                 seed=1,
                                 mb_model_class=PerturbedMassBalance,
                                 output_filesuffix=f'_{exp}',  # recognize the run for later
                                 );
out_df = pd.DataFrame()
for exp in range(len(params_df)):
    try:
        ds = utils.compile_run_output(gdirs, input_filesuffix=f'_{exp:02d}')
        if np.any(ds.volume.isnull()):
            continue
        out_df[f'{exp:02d}'] = ds.volume.sum(dim='rgi_id').to_series()
    except RuntimeError:
        pass
out_df.plot(legend=False, color='k', alpha=0.5);

Parameter perturbation experiments which match observations#

The section above is nice, but works only for a problem setting where we don’t ask the mass balance model to match observations. If we were to match observations, things would be quite different!

To do this, we could define a new task very much like the one above, but this time performing a calibration step before writing its solution down.

This exercise is left to the reader ;-)

What’s next?#