Storing glacier directories for later use

“Glacier directories” are the fundamental data structure used by OGGM. They make it possible to share data between runs, between the OGGM developers and users, and among users themselves.

Glacier directories can also be confusing at times, and they can contain a large number of files, which makes them hard to move between clusters or computers. This notebook explains how these directories are structured and how to store them for moving and later use.

The main use-cases documented by this notebook are:

  • pre-process a number of glacier directories

  • stop working, then re-start later from the same location

  • stop working, store the directories, and copy or move them to another storage location or machine

  • re-start from them on another machine / instance

# Libs
import os
import shutil

# Locals
import oggm.cfg as cfg
from oggm import utils, workflow, tasks

The structure of the working directory

Let’s open a new workflow for two glaciers:

# Initialize OGGM and set up the default run parameters
cfg.initialize(logging_level='WARNING')
rgi_version = '62'
cfg.PARAMS['border'] = 80

# Local working directory (where OGGM will write its output)
WORKING_DIR = utils.gettempdir('oggm_gdirs_wd')
utils.mkdir(WORKING_DIR, reset=True)
cfg.PATHS['working_dir'] = WORKING_DIR

# RGI glaciers: Hintereisferner and Kesselwandferner
rgi_ids = utils.get_rgi_glacier_entities(['RGI60-11.00897', 'RGI60-11.00787'])

# Go - get the pre-processed glacier directories
gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3)
2021-06-07 15:11:13: oggm.cfg: Reading default parameters from the OGGM `params.cfg` configuration file.
2021-06-07 15:11:13: oggm.cfg: Multiprocessing switched OFF according to the parameter file.
2021-06-07 15:11:13: oggm.cfg: Multiprocessing: using all available processors (N=2)
2021-06-07 15:11:13: oggm.cfg: PARAMS['border'] changed from `40` to `80`.
2021-06-07 15:11:14: oggm.workflow: init_glacier_directories from prepro level 3 on 2 glaciers.
2021-06-07 15:11:14: oggm.workflow: Execute entity task gdir_from_prepro on 2 glaciers

OGGM downloaded the pre-processed directories, stored the tar files in your cache, and extracted them in your working directory. But how is this working directory structured? Let’s have a look:

def file_tree_print(prepro_dir=False):
    # Just a utility function to show the dir structure and selected files
    print("cfg.PATHS['working_dir']/")
    tab = '  '
    for dirname, dirnames, filenames in os.walk(cfg.PATHS['working_dir']):
        for subdirname in dirnames:
            print(tab + subdirname + '/')
        for filename in filenames:
            if '.tar' in filename and 'RGI' in filename:
                print(tab + filename)
        tab += '  '
file_tree_print()
cfg.PATHS['working_dir']/
  per_glacier/
    RGI60-11/
      RGI60-11.00/
        RGI60-11.00787/
        RGI60-11.00897/

OK, so from the WORKING_DIR, OGGM always creates a per_glacier folder where the glacier directories are stored. To avoid cluttering this folder (and for other reasons which will become apparent later), the directories are organised in regional folders (here RGI60-11) and then in folders containing up to 1000 glaciers each (here RGI60-11.00, i.e. for ids RGI60-11.00000 to RGI60-11.00999).

Our files are located in the deepest folders of this tree (their content is not shown above). For example:

gdirs[0].get_filepath('dem').replace(WORKING_DIR, 'WORKING_DIR')
'WORKING_DIR/per_glacier/RGI60-11/RGI60-11.00/RGI60-11.00787/dem.tif'
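
As an illustration of how this layout relates to the RGI ids, here is a sketch only: OGGM builds these paths internally, so you never have to construct them yourself.

# Illustration only: derive the expected directory location from the RGI id
rgi_id = 'RGI60-11.00897'
expected = os.path.join(WORKING_DIR, 'per_glacier',
                        rgi_id[:8],   # regional folder, e.g. 'RGI60-11'
                        rgi_id[:11],  # folder of up to 1000 glaciers, e.g. 'RGI60-11.00'
                        rgi_id)       # the glacier directory itself
print(expected.replace(WORKING_DIR, 'WORKING_DIR'))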

Let’s add some steps to our workflow, for example a spinup run that we would like to store for later:

# Run
workflow.execute_entity_task(tasks.run_from_climate_data, gdirs, 
                             output_filesuffix='_spinup',  # to use the files as input later on
                             );
2021-06-07 15:11:15: oggm.workflow: Execute entity task run_from_climate_data on 2 glaciers

Stop there and restart from the same spot

The glacier directories are on disk and won’t move away. This means that the next time you open OGGM, from this notebook or another script, you can start from them again. The only steps you have to take are:

  • set the working directory to the one you want to start from

  • initialize the glacier directories without arguments (or, faster, with the list of IDs)

See for example:

# Set the working dir correctly
cfg.PATHS['working_dir'] = utils.gettempdir('oggm_gdirs_wd')

# Go - re-open the pre-processed glacier directories from what's there
gdirs = workflow.init_glacier_directories()
2021-06-07 15:11:16: oggm.workflow: init_glacier_directories by parsing all available folders (this takes time: if possible, provide rgidf instead).

The step above can be quite slow (because OGGM has to parse quite a lot of information from the directories). It is better to start from the list of glaciers you want to work with:

# Go - re-open the pre-processed glacier directories from what's there but with the list of glaciers
gdirs = workflow.init_glacier_directories(rgi_ids)
2021-06-07 15:11:17: oggm.workflow: Execute entity task GlacierDirectory on 2 glaciers

!!!CAREFUL!!! Do not start from a pre-processed level (or from a tar file), or your local directories (which may contain new data) will be overwritten: workflow.init_glacier_directories(rgi_ids, from_prepro_level=3) will always start from the fresh, pre-processed state.
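
If you want to be sure that the results you computed earlier survived the restart, a simple check is to look for the files written by the spinup run. A minimal sketch, assuming (as in the run above) that run_from_climate_data wrote a model_diagnostics file with the '_spinup' suffix:

# Sanity check (sketch): the spinup output written earlier should still be there
for gdir in gdirs:
    fpath = gdir.get_filepath('model_diagnostics', filesuffix='_spinup')
    print(gdir.rgi_id, os.path.exists(fpath))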

Store the single glacier directories into tar files

By default, the gdir_to_tar task compresses each single glacier directory into a tar file located in the same folder (but you can also put the compressed files somewhere else, e.g. in a folder in your $HOME):

utils.gdir_to_tar?
workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=False);
file_tree_print()
2021-06-07 15:11:19: oggm.workflow: Execute entity task gdir_to_tar on 2 glaciers
cfg.PATHS['working_dir']/
  per_glacier/
    RGI60-11/
      RGI60-11.00/
        RGI60-11.00787/
        RGI60-11.00897/
        RGI60-11.00787.tar.gz
        RGI60-11.00897.tar.gz
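
If you prefer to collect the compressed files somewhere else (as mentioned above), gdir_to_tar also accepts a base_dir argument. Here is a sketch with a hypothetical target folder (check the utils.gdir_to_tar? help above for the exact signature in your OGGM version):

# Sketch: write the tar files to another (hypothetical) folder instead
ANOTHER_DIR = utils.gettempdir('oggm_gdirs_tars')
workflow.execute_entity_task(utils.gdir_to_tar, gdirs,
                             base_dir=ANOTHER_DIR, delete=False);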

Most of the time, you will actually want to delete the original directories because they are not needed for this run anymore:

workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=True);
file_tree_print()
2021-06-07 15:11:20: oggm.workflow: Execute entity task gdir_to_tar on 2 glaciers
cfg.PATHS['working_dir']/
  per_glacier/
    RGI60-11/
      RGI60-11.00/
        RGI60-11.00787.tar.gz
        RGI60-11.00897.tar.gz

Now the original directories are gone, and the gdirs objects are useless (attempting to do anything with them will lead to an error).
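
For illustration, here is a minimal sketch of what happens if you try to read from one of these now-deleted directories (the exact exception may vary; 'inversion_flowlines' is just one of the files the directory used to contain):

# Illustration: reading a file from a deleted glacier directory fails
try:
    gdirs[0].read_pickle('inversion_flowlines')
except Exception as err:
    print('Error, as expected:', type(err).__name__)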

Since the tar files are already available in the correct file structure, however, OGGM knows how to reconstruct the directories from them if asked to:

gdirs = workflow.init_glacier_directories(rgi_ids, from_tar=True, delete_tar=True)
file_tree_print()
2021-06-07 15:11:20: oggm.workflow: Execute entity task GlacierDirectory on 2 glaciers
cfg.PATHS['working_dir']/
  per_glacier/
    RGI60-11/
      RGI60-11.00/
        RGI60-11.00787/
        RGI60-11.00897/

These directories are now ready to be used again! To summarize: thanks to this first step, you already reduced the number of files to move around from N x M (where M is the number of files in each glacier directory) to N (where N is the number of glaciers).

You can now move this working directory somewhere else, and in another OGGM run instance, simply start from them as shown above.
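
For example, here is a minimal sketch of such a move with shutil (the target location below is just another temporary folder, standing in for a scratch disk or a shared drive):

# Sketch: copy the per_glacier folder to another (hypothetical) location
TARGET_DIR = os.path.join(utils.gettempdir('oggm_gdirs_copy'), 'per_glacier')
if os.path.exists(TARGET_DIR):
    shutil.rmtree(TARGET_DIR)
shutil.copytree(os.path.join(WORKING_DIR, 'per_glacier'), TARGET_DIR);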

Bundle of directories

It turned out that the file structure above was a bit cumbersome to use, in particular for glacier directories that we wanted to share online. For this, we found it more convenient to bundle the directories into groups of 1000 glaciers. Fortunately, this is easy to do:

utils.base_dir_to_tar?
# Tar the individual ones first
workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=True);
# Then tar the bundles
utils.base_dir_to_tar(WORKING_DIR, delete=True)
file_tree_print()
2021-06-07 15:11:20: oggm.workflow: Execute entity task gdir_to_tar on 2 glaciers
cfg.PATHS['working_dir']/
  per_glacier/
    RGI60-11/
      RGI60-11.00.tar

Now the glacier directories are bundled in a single file one level higher up. This is even more convenient to move around (fewer files), but it is not a mandatory step. The nice part about this bundling is that you can still select individual glaciers, as we will see in the next section. In the meantime, you can do:

gdirs = workflow.init_glacier_directories(rgi_ids, from_tar=True)
file_tree_print()
2021-06-07 15:11:20: oggm.workflow: Execute entity task GlacierDirectory on 2 glaciers
cfg.PATHS['working_dir']/
  per_glacier/
    RGI60-11/
      RGI60-11.00/
      RGI60-11.00.tar
        RGI60-11.00787/
        RGI60-11.00897/

This did the trick! Note that the bundled tar files are never deleted, which makes them useful for another purpose explained in the next section: creating your own “pre-processed directories”.

Self-made pre-processed directories for “restart” workflows

This workflow is the one used by OGGM to prepare the pre-processed directories that many of you are using. It is a variant of the workflow above, the only difference being that the directories are re-started from a file located outside the working directory:

# Where to put the compressed dirs
PREPRO_DIR = utils.get_temp_dir('prepro_dir')
if os.path.exists(PREPRO_DIR):
    shutil.rmtree(PREPRO_DIR)

# Lets start from a clean state
utils.mkdir(WORKING_DIR, reset=True)
gdirs = workflow.init_glacier_directories(rgi_ids, from_prepro_level=3)

# Then tar the gdirs and bundle
workflow.execute_entity_task(utils.gdir_to_tar, gdirs, delete=True)
utils.base_dir_to_tar(delete=True)

# Copy the outcome in a new directory: scratch folder, new machine, etc.
shutil.copytree(os.path.join(WORKING_DIR, 'per_glacier'), PREPRO_DIR);
2021-06-07 15:11:20: oggm.workflow: init_glacier_directories from prepro level 3 on 2 glaciers.
2021-06-07 15:11:20: oggm.workflow: Execute entity task gdir_from_prepro on 2 glaciers
2021-06-07 15:11:20: oggm.workflow: Execute entity task gdir_to_tar on 2 glaciers

OK, so this PREPRO_DIR directory is where the files will stay for the longer term. You can start from there whenever you want with:

# Lets start from a clean state
utils.mkdir(WORKING_DIR, reset=True)
# This needs https://github.com/OGGM/oggm/pull/1158 to work
# It uses the files you prepared beforehand to start the dirs
gdirs = workflow.init_glacier_directories(rgi_ids, from_tar=PREPRO_DIR)
file_tree_print()
2021-06-07 15:11:20: oggm.workflow: Execute entity task gdir_from_tar on 2 glaciers
cfg.PATHS['working_dir']/
  per_glacier/
    RGI60-11/
      RGI60-11.00/
        RGI60-11.00787/
        RGI60-11.00897/

What’s next?