author:    tradke <schnetter@cct.lsu.edu>  2007-10-19 09:14:00 +0000
committer: tradke <schnetter@cct.lsu.edu>  2007-10-19 09:14:00 +0000
commit:    4ff785680bea6a8509b9f12fd6d1aa1c480b0d76 (patch)
tree:      b857d1f44af37f023cade29d1b0b013be7d9400c /Carpet
parent:    66eff79629268735c6695f7700bd8e8a1eadd2fb (diff)
CarpetIOHDF5: workaround for excessive HDF5-internal memory requirements during recovery
Various people reported running out of memory when recovering from multiple
chunked checkpoint files. It turned out that the HDF5 library itself requires
a considerable amount of memory for each open HDF5 file. When all chunked
files of a checkpoint are opened at the same time during recovery (which is
the default), this can cause the simulation to abort with an 'out of memory'
error in extreme cases.
This patch introduces a new steerable boolean parameter
IOHDF5::open_one_input_file_at_a_time
which, if set to "yes", tells the recovery code to open, read, and close the
chunked files one after another for each refinement level, thus avoiding
excessive HDF5-internal memory requirements due to multiple open files.
The default behaviour is (as before) to keep all input files open until all
refinement levels are recovered.
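In a Cactus parameter file, the new option would be enabled like this (a minimal sketch; only the parameter introduced by this patch is shown, any other recovery settings are left unchanged):

  # read chunked checkpoint files one at a time during recovery
  # to reduce HDF5-internal memory requirements
  IOHDF5::open_one_input_file_at_a_time = "yes"

Since the parameter is STEERABLE = ALWAYS, it can also be changed at run time.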
darcs-hash:20071019091424-3fd61-834471be8da361b235d0a4cbf3d6f16ae0b653f0.gz
Diffstat (limited to 'Carpet')

 Carpet/CarpetIOHDF5/param.ccl    | 6 ++++++
 Carpet/CarpetIOHDF5/src/Input.cc | 6 ++++++
 2 files changed, 12 insertions(+), 0 deletions(-)
diff --git a/Carpet/CarpetIOHDF5/param.ccl b/Carpet/CarpetIOHDF5/param.ccl
index b8dc67158..a133a754f 100644
--- a/Carpet/CarpetIOHDF5/param.ccl
+++ b/Carpet/CarpetIOHDF5/param.ccl
@@ -99,6 +99,12 @@ BOOLEAN one_file_per_group "Write one file per group instead of per variable" ST
 {
 } "no"
 
+BOOLEAN open_one_input_file_at_a_time "Open only one HDF5 file at a time when reading data from multiple chunked checkpoint/data files" STEERABLE = ALWAYS
+{
+  "no"  :: "Open all input files first, then import data (most efficient)"
+  "yes" :: "Process input files one after another (reduces memory requirements)"
+} "no"
+
 INT compression_level "Compression level to use for writing HDF5 data" STEERABLE = ALWAYS
 {
   0:9 :: "Higher numbers compress better, a value of zero disables compression"
diff --git a/Carpet/CarpetIOHDF5/src/Input.cc b/Carpet/CarpetIOHDF5/src/Input.cc
index 7e046b914..8d3ebda71 100644
--- a/Carpet/CarpetIOHDF5/src/Input.cc
+++ b/Carpet/CarpetIOHDF5/src/Input.cc
@@ -486,6 +486,12 @@ int Recover (cGH* cctkGH, const char *basefilename, int called_from)
     if (all_done) {
       break;
     }
+
+    // keep the file open if not requested otherwise by the user
+    if (open_one_input_file_at_a_time) {
+      HDF5_ERROR (H5Fclose (file.file));
+      file.file = -1;
+    }
   }
 
   // check that all variables have been read completely on this mglevel/reflevel
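The memory effect of the patch can be sketched generically, independent of HDF5: with one-at-a-time reading, at most one file handle is live at any moment instead of one per chunk. This is a hypothetical illustration using plain stdio (the function name `read_chunks` and the file names are made up, not part of Carpet):

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Sketch of the two recovery strategies. With one_at_a_time == true, each
// file is closed right after it is read, mirroring what
// IOHDF5::open_one_input_file_at_a_time = "yes" does for chunked HDF5 files.
// Returns the peak number of simultaneously open files.
static int read_chunks(const std::vector<std::string>& names, bool one_at_a_time)
{
    int max_open = 0, open_now = 0;
    std::vector<std::FILE*> handles;
    for (const std::string& name : names) {
        std::FILE* f = std::fopen(name.c_str(), "rb");
        if (!f) continue;            // skip missing chunk files
        ++open_now;
        if (open_now > max_open) max_open = open_now;
        // ... read this chunk's data here ...
        if (one_at_a_time) {
            std::fclose(f);          // release per-file resources early
            --open_now;
        } else {
            handles.push_back(f);    // default: keep all input files open
        }
    }
    for (std::FILE* f : handles) std::fclose(f);
    return max_open;
}
```

For N chunk files, the default strategy peaks at N open files, while the one-at-a-time strategy peaks at 1; with HDF5, each open file additionally carries the library's internal caches, which is what caused the reported out-of-memory aborts.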