Storing glacier directories for later use

“Glacier directories” are the fundamental data structure used by OGGM. They make it possible to share data between runs, between the OGGM developers and users, and among users themselves.

Glacier directories can also be confusing at times, and they can contain a large number of files, making them hard to move between clusters or computers. This notebook explains how these directories are structured and how to store them for moving and later use.

The main use-cases documented by this notebook are:

  • pre-process a number of glacier directories

  • stop working, and then re-start again from the same location

  • stop working, store them and copy them to another storage, or move them to another machine

  • re-start from them on another machine / instance

# Libs
import os
import shutil

# Locals
import oggm.cfg as cfg
from oggm import utils, workflow, tasks

The structure of the working directory

Let’s open a new workflow for two glaciers:

# Initialize OGGM and set up the default run parameters
rgi_version = '62'
cfg.PARAMS['border'] = 80

# Local working directory (where OGGM will write its output)
WORKING_DIR = utils.gettempdir('oggm_gdirs_wd', reset=True)
cfg.PATHS['working_dir'] = WORKING_DIR

# RGI glaciers: Hintereisferner and Kesselwandferner
rgi_ids = utils.get_rgi_glacier_entities(['RGI60-11.00897', 'RGI60-11.00787'])

# Go - get the pre-processed glacier directories
# With OGGM >= v1.6, the base url of the pre-processed directories must be given explicitly.
# The url below is an example from the OGGM documentation - adapt it to the
# pre-processed directories you want to start from.
base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/elev_bands/W5E5'
gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3, prepro_base_url=base_url)
2022-10-07 13:10:09: oggm.cfg: Reading default parameters from the OGGM `params.cfg` configuration file.
2022-10-07 13:10:09: oggm.cfg: Multiprocessing switched OFF according to the parameter file.
2022-10-07 13:10:09: oggm.cfg: Multiprocessing: using all available processors (N=2)
2022-10-07 13:10:09: oggm.cfg: PARAMS['border'] changed from `40` to `80`.

OGGM downloaded the pre-processed directories, stored the tar files in your cache, and extracted them in your working directory. But how is this working directory structured? Let’s have a look:

def file_tree_print():
    # Just a utility function to show the dir structure and selected files
    base = cfg.PATHS['working_dir']
    for dirname, dirnames, filenames in os.walk(base):
        # Indent according to the depth below the working directory
        tab = '  ' * (dirname.replace(base, '').count(os.sep) + 1)
        for subdirname in dirnames:
            print(tab + subdirname + '/')
        for filename in filenames:
            if '.tar' in filename and 'RGI' in filename:
                print(tab + filename)

file_tree_print()

OK, so from the WORKING_DIR, OGGM creates a per_glacier folder (always) where the glacier directories are stored. In order to avoid cluttering the folder (and for other reasons which will become apparent later), the directories are organised per RGI region (here RGI60-11) and then in folders containing up to 1000 glaciers each (here RGI60-11.00, i.e. for ids RGI60-11.00000 to RGI60-11.00999).
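
This layout can be sketched with a few lines of plain Python. Note that `gdir_subpath` is a hypothetical helper written for illustration of the on-disk convention only; it is not an OGGM API:

```python
import os

def gdir_subpath(rgi_id):
    # Sketch of the on-disk layout: per_glacier/<region>/<1000-glacier folder>/<id>
    region_dir = rgi_id[:8]        # e.g. 'RGI60-11'
    thousand_dir = rgi_id[:11]     # e.g. 'RGI60-11.00'
    return os.path.join('per_glacier', region_dir, thousand_dir, rgi_id)

subpath = gdir_subpath('RGI60-11.00897')
print(subpath)
```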

Our files are located in the final folders of this tree (not shown in the tree). For example:

gdirs[0].get_filepath('dem').replace(WORKING_DIR, 'WORKING_DIR')

Let’s add some steps to our workflow, for example a spinup run that we would like to store for later:

# Run
workflow.execute_entity_task(tasks.run_from_climate_data, gdirs,
                             output_filesuffix='_spinup',  # to use the files as input later on
                             );

Stop there and restart from the same spot

The glacier directories are on disk, and won’t move away. This means that next time you’ll open OGGM, from this notebook or another script, you can start from them again. The only steps you have to take:

  • set the working directory to the one you want to start from

  • initialize the glacier directories without arguments (or, faster, with the list of IDs)

See for example:

# Set the working dir correctly
cfg.PATHS['working_dir'] = utils.gettempdir('oggm_gdirs_wd')

# Go - re-open the pre-processed glacier directories from what's there
gdirs = workflow.init_glacier_directories()

The step above can be quite slow (because OGGM has to parse quite a lot of information from the directories). It is faster to start from the list of glaciers you want to work with:

# Go - re-open the pre-processed glacier directories from what's there but with the list of glaciers
gdirs = workflow.init_glacier_directories(rgi_ids)

!!!CAREFUL!!! Do not start from a pre-processed level (or from a tar file), or your local directories (which may contain new data) will be overwritten, i.e. workflow.init_glacier_directories(rgi_ids, from_prepro_level=3) will always start from the fresh, pre-processed state.

Store the single glacier directories into tar files

The gdir_to_tar task will compress each single glacier directory into the same folder by default (but you can also put the compressed files somewhere else, e.g. in a folder in your $HOME):

workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=False);

Most of the time, you will actually want to delete the original directories, because they are no longer needed for this run:

workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=True);

Now the original directories are gone, and the gdirs objects are useless (attempting to do anything with them will lead to an error).

Since they are already available in the correct file structure, however, OGGM will know how to reconstruct them from the tar files if asked to:

gdirs = workflow.init_glacier_directories(rgi_ids, from_tar=True, delete_tar=True)

These directories are now ready to be used again! To summarize: thanks to this first step, you already reduced the number of files to move around from N x M (where M is the number of files in each glacier directory) to N (where N is the number of glaciers).
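
Conceptually, this compression step can be mimicked with the standard tarfile module. The `dir_to_targz` helper below is a simplified sketch of the idea, not the OGGM implementation:

```python
import os
import shutil
import tarfile
import tempfile

def dir_to_targz(path, delete=False):
    # Compress one directory into '<path>.tar.gz' next to it,
    # optionally deleting the original afterwards
    tar_path = path + '.tar.gz'
    with tarfile.open(tar_path, 'w:gz') as tf:
        tf.add(path, arcname=os.path.basename(path))
    if delete:
        shutil.rmtree(path)
    return tar_path

# Demo on a fake glacier directory containing two (empty) files
tmp = tempfile.mkdtemp()
gdir_path = os.path.join(tmp, 'RGI60-11.00897')
os.makedirs(gdir_path)
for fname in ('dem.tif', 'climate_historical.nc'):
    open(os.path.join(gdir_path, fname), 'w').close()

tar_path = dir_to_targz(gdir_path, delete=True)
with tarfile.open(tar_path) as tf:
    names = sorted(tf.getnames())
print(names)
```

The M files of the directory end up in one archive, which is the N x M to N reduction described above.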

You can now move this working directory somewhere else, and in another OGGM run instance, simply start from them as shown above.

Bundle of directories

It turned out that the file structure above was a bit cumbersome to use, in particular for glacier directories that we wanted to share online. For this, we found it more convenient to bundle the directories into groups of 1000 glaciers. Fortunately, this is easy to do:

# Tar the individual ones first
workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=True);
# Then tar the bundles
utils.base_dir_to_tar(WORKING_DIR, delete=True)

Now the glacier directories are bundled in files one level higher up the tree. This is even more convenient to move around (fewer files), but it is not a mandatory step. The nice part about this bundling is that you can still select individual glaciers, as we will see in the next section. In the meantime, you can do:

gdirs = workflow.init_glacier_directories(rgi_ids, from_tar=True)

Which did the trick! Note that the bundled tar files are never deleted. This is why they are useful for another purpose explained in the next section: creating your own “pre-processed directories”.
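
The reason individual glaciers can still be selected is that tar archives support selective extraction: one member can be pulled out without unpacking the whole bundle. Below is a simplified illustration with the standard tarfile module (a fake bundle built for the demo, not the OGGM implementation):

```python
import os
import tarfile
import tempfile

tmp = tempfile.mkdtemp()

# Build a fake bundle containing two per-glacier tar files
bundle_path = os.path.join(tmp, 'RGI60-11.00.tar')
with tarfile.open(bundle_path, 'w') as tf:
    for rgi_id in ('RGI60-11.00897', 'RGI60-11.00787'):
        src = os.path.join(tmp, rgi_id + '.tar.gz')
        with open(src, 'wb') as f:
            f.write(b'dummy glacier archive')
        tf.add(src, arcname='RGI60-11.00/' + rgi_id + '.tar.gz')

# Extract only the glacier we are interested in
wanted = 'RGI60-11.00/RGI60-11.00897.tar.gz'
with tarfile.open(bundle_path) as tf:
    tf.extract(wanted, path=tmp)

extracted = os.path.join(tmp, wanted)
print(os.path.exists(extracted))
```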

Self-made pre-processed directories for “restart” workflows

This workflow is the one used by OGGM to prepare the preprocessed directories that many of you are using. It is a variant of the workflow above, the only difference being that the directories are re-started from a file which is located elsewhere than in the working directory:

# Where to put the compressed dirs
PREPRO_DIR = utils.gettempdir('prepro_dir')
if os.path.exists(PREPRO_DIR):
    shutil.rmtree(PREPRO_DIR)

# Lets start from a clean state
# Beware! If you use `reset=True` in `utils.mkdir`, ALL DATA in this folder will be deleted! Use with caution!
utils.mkdir(WORKING_DIR, reset=True)
# With OGGM >= v1.6, the base url must be given explicitly (example url from the OGGM documentation)
base_url = 'https://cluster.klima.uni-bremen.de/~oggm/gdirs/oggm_v1.6/L3-L5_files/2023.3/elev_bands/W5E5'
gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3, prepro_base_url=base_url)

# Then tar the gdirs and bundle
workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=True)

# Copy the outcome in a new directory: scratch folder, new machine, etc.
shutil.copytree(os.path.join(WORKING_DIR, 'per_glacier'), PREPRO_DIR);

OK, so this PREPRO_DIR directory is where the files will stay for the longer term. You can start from there whenever you wish with:

# Lets start from a clean state
utils.mkdir(WORKING_DIR, reset=True)
# This needs to work
# It uses the files you prepared beforehand to start the dirs
gdirs = workflow.init_glacier_directories(rgi_ids, from_tar=PREPRO_DIR)

What’s next?