author | tradke <schnetter@cct.lsu.edu> | 2007-10-19 09:14:00 +0000
---|---|---
committer | tradke <schnetter@cct.lsu.edu> | 2007-10-19 09:14:00 +0000
commit | 4ff785680bea6a8509b9f12fd6d1aa1c480b0d76
tree | b857d1f44af37f023cade29d1b0b013be7d9400c /Carpet/CarpetLib
parent | 66eff79629268735c6695f7700bd8e8a1eadd2fb
CarpetIOHDF5: workaround for excessive HDF5-internal memory requirements during recovery
Several people reported running out of memory when recovering from multiple
chunked checkpoint files. It turned out that the HDF5 library itself requires
a considerable amount of memory for each opened HDF5 file. When all chunked
files of a checkpoint are opened at the same time during recovery (which is
the default), this can in extreme cases cause the simulation to abort with an
'out of memory' error.
This patch introduces a new steerable boolean parameter
IOHDF5::open_one_input_file_at_a_time
which, if set to "yes", tells the recovery code to open, read, and close the
chunked files one after another for each refinement level, thus avoiding the
excessive HDF5-internal memory requirements of holding multiple files open.
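In a Cactus parameter file the new option would be enabled like this (a minimal sketch; the checkpoint/recovery settings shown alongside it are generic examples, not taken from this commit):

```
# Recover from a chunked checkpoint, opening one input file at a time
IO::recover                            = "auto"
IO::recover_dir                        = "checkpoints"
IOHDF5::open_one_input_file_at_a_time  = "yes"
```

Since the parameter is steerable, it can also be changed at run time; leaving it at its default ("no") preserves the previous behaviour.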
The default behaviour is (as before) to keep all input files open until all
refinement levels are recovered.
darcs-hash:20071019091424-3fd61-834471be8da361b235d0a4cbf3d6f16ae0b653f0.gz
Diffstat (limited to 'Carpet/CarpetLib')
0 files changed, 0 insertions, 0 deletions