Commit message | Author | Age

writing string attributes, another in querying the "Checkpoint" tag of CCTK groups

function CarpetIOHDF5_InitCheckpointingIntervals
Calling the scheduled function CarpetIOHDF5_InitCheckpointingIntervals
leads to namespace problems with PGI compiler.

Conflicts:
	Carpet/CarpetWeb/index.html

variables also in the POST_RECOVER_VARIABLES bin so that the
last checkpoint iteration counter starts counting from the recovered
iteration number.
(see also discussion thread starting at
http://lists.carpetcode.org/archives/developers/2008-August/002309.html)

(<last-checkpoint-iteration> + IO::checkpoint_every)"
This reverts commit 30f1c46a7f94d423bda65e04220015e0296e1347.

(<last-checkpoint-iteration> + IO::checkpoint_every)

Broadcast the result of checkpoint_every_walltime_hours, since different
processors may come to different decisions.
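The synchronisation issue can be sketched without MPI: each process compares its own walltime clock against the checkpoint interval, and slightly drifting clocks can yield different answers near the threshold. A minimal C++ sketch, with illustrative names (in the real code the unification is an MPI broadcast of the root process's decision):

```cpp
#include <cassert>
#include <vector>

// Each process decides locally whether enough walltime has elapsed to
// warrant a checkpoint; clocks drift, so decisions can diverge.
bool want_checkpoint(double elapsed_hours, double every_hours) {
  return every_hours > 0.0 && elapsed_hours >= every_hours;
}

// Stand-in for broadcasting rank 0's decision: every rank adopts the
// root's answer, so either all processes checkpoint or none do.
bool unified_decision(const std::vector<double> &per_rank_elapsed,
                      double every_hours) {
  return want_checkpoint(per_rank_elapsed[0], every_hours);
}
```

With elapsed times of 3.001 and 2.999 hours against a 3-hour interval, the two ranks would disagree locally, but both follow rank 0 and write the checkpoint together.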

Correct a multi-processor synchronisation problem when using IO::out_dt.

The CactusBase/IOUtil routine IOUtil_ParseVarsForOutput now accepts an
additional parameter out_dt_default.

Introduce a tree data structure "fulltree", which decomposes a single,
rectangular region into a tree of non-overlapping, rectangular sub-regions.
Move the processor decomposition from the regridding thorns into Carpet.
Create such trees during processor decomposition.
Store these trees with the grid hierarchy.
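The fulltree idea can be illustrated with a minimal one-dimensional sketch (the names `Region`, `FullTree`, `decompose` are hypothetical; Carpet's actual structure works on multi-dimensional bounding boxes): a node is either a leaf holding one region, or an inner node whose children partition its region into non-overlapping halves that cover it exactly.

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Half-open index interval [lo, hi); Carpet would use a multi-dim bbox.
struct Region { int lo, hi; };

// A node is a leaf (both children null) or an inner node whose children
// tile its region with non-overlapping sub-regions.
struct FullTree {
  Region region{};
  std::unique_ptr<FullTree> left, right;
};

// Recursively bisect until each leaf is small enough; during processor
// decomposition each leaf would be assigned to one processor.
std::unique_ptr<FullTree> decompose(Region r, int max_size) {
  auto node = std::make_unique<FullTree>();
  node->region = r;
  if (r.hi - r.lo > max_size) {
    const int mid = r.lo + (r.hi - r.lo) / 2;
    node->left = decompose(Region{r.lo, mid}, max_size);
    node->right = decompose(Region{mid, r.hi}, max_size);
  }
  return node;
}

// Collect the leaf regions; together they tile the root region exactly.
void leaves(const FullTree &t, std::vector<Region> &out) {
  if (!t.left) {
    out.push_back(t.region);
  } else {
    leaves(*t.left, out);
    leaves(*t.right, out);
  }
}
```

Decomposing [0, 10) with a maximum leaf size of 3 yields the contiguous leaves [0, 2), [2, 5), [5, 7), [7, 10): non-overlapping, and covering the original region with no gaps.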

for generalized grids.

Correct error in determining checkpointing interval after restarting.
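The intended arithmetic, referred to in the commits above, can be sketched as follows (function names are illustrative, not the actual CarpetIOHDF5 code): after a restart, the next checkpoint falls at `<last-checkpoint-iteration> + IO::checkpoint_every`, i.e. counting from the recovered iteration rather than restarting the count from zero.

```cpp
#include <cassert>

// Next checkpoint iteration, counted from the last (recovered) checkpoint.
int next_checkpoint_iteration(int last_checkpoint_iteration,
                              int checkpoint_every) {
  return last_checkpoint_iteration + checkpoint_every;
}

// True once the evolution reaches or passes that iteration; a
// non-positive interval disables iteration-based checkpointing.
bool checkpoint_due(int iteration, int last_checkpoint_iteration,
                    int checkpoint_every) {
  return checkpoint_every > 0 &&
         iteration >= next_checkpoint_iteration(last_checkpoint_iteration,
                                                checkpoint_every);
}
```

For example, recovering from a checkpoint written at iteration 1000 with IO::checkpoint_every = 256 schedules the next checkpoint for iteration 1256, not for the next multiple of 256 counted from the restart.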

integer-type grid variables

This isn't necessary, but it is very convenient for generating complete
thornlists with the MakeThornList script.

initialisation scheme (i.e. explicit initialisation of all three
timelevels, resulting in slightly different output data)

changed

- don't output sum norms (not suitable for testsuites)

darcs-hash:20080220003219-dae7b-0505c565d4989163b001ee53b1ff6577211649f1.gz

Change some CCTK_REAL variables to double, because they are read by
HDF5 routines as "native double".
darcs-hash:20080219052326-dae7b-0bca03938f51c0ed598e671f5278659e6e051827.gz

Change sprintf to snprintf. Add assert statements.
darcs-hash:20080219052249-dae7b-abbdbb9df6c099cdd62ebaac135b654062659619.gz
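The pattern behind this change can be sketched as follows (the filename format and function name here are made up for illustration): snprintf bounds the write to the buffer size, and an assert on its return value catches truncation that sprintf would have turned into a silent buffer overflow.

```cpp
#include <cassert>
#include <cstdio>
#include <cstring>

// Format a filename into a fixed-size buffer; returns the formatted length.
int format_name(char *buf, std::size_t bufsize, int iteration) {
  // snprintf never writes more than bufsize bytes (including the NUL) ...
  const int n = std::snprintf(buf, bufsize, "checkpoint.it_%d.h5", iteration);
  // ... and its return value reveals whether the output was truncated.
  assert(n >= 0 && static_cast<std::size_t>(n) < bufsize);
  return n;
}
```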

Add functions which write a single HDF5 attribute. Call these
functions to write the attributes.
darcs-hash:20080219052032-dae7b-dbb3e2ee3aad27cce26735639b9065c86cc9cef3.gz

Write attributes containing the configuration, source tree, and run
ids into the output files.
darcs-hash:20080203190955-dae7b-e22cbe007b26040e7131b1f93307382dc0cc5882.gz

Fix creative change of argument order to a call to WriteMetaData which resulted
in nioprocs=-1. This should fix checkpoint/recovery.
darcs-hash:20080203024904-fff0f-5d3514b62c80cd2920ac45a98758fff8c00135c1.gz

Without this bugfix, whenever there is an aliased function
"Multipatch_MapIsCartesian", the corresponding write call of this attribute
fails because the attribute's scalar dataspace was already closed.
darcs-hash:20080131151816-79e7e-3678879958c776fc1240e8b93f054732c64b62c5.gz

darcs-hash:20080130222154-dae7b-113029d4e40be633fca3253f5e2d47f656ae41ac.gz

darcs-hash:20080130222322-dae7b-223b30e93a6f7860c1234ed181453c77b1ee056e.gz

darcs-hash:20080130221846-dae7b-08cd77d33269fc3ec8d20db87731b2b2097d5d38.gz

Add an HDF5 attribute "MapisCartesian" specifying whether the
coordinate system is Cartesian. This attribute is added if there is a
thorn providing this information.
darcs-hash:20080128155820-dae7b-759cc1608ba8a7c9d34203b114ceb3836db0f64d.gz

darcs-hash:20080123132645-3fd61-ed195918c2c55889623aaf2195237e2ff7bd6e17.gz

Enclose the macro "HDF5_ERROR" in a do { ... } while (0) pair to make
it safe to use with a trailing semicolon.
darcs-hash:20080111111512-dae7b-b65c7b375ee6ac882b59414db2810f06fcc3d799.gz
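The idiom can be demonstrated with a simplified stand-in for HDF5_ERROR (the real macro checks an HDF5 return code and reports the failing call): wrapping the body in do { ... } while (0) turns the expansion into a single statement, so a trailing semicolon after an if-branch does not detach an attached else.

```cpp
#include <cassert>

static int error_count = 0;

// Simplified stand-in for HDF5_ERROR: count calls whose status is
// negative.  The do { ... } while (0) wrapper makes the macro expand to
// exactly one statement, so
//   if (cond) CHECK_ERROR(x); else ...
// parses as intended; a bare { ... } block would break on the semicolon.
#define CHECK_ERROR(status)    \
  do {                         \
    if ((status) < 0) {        \
      ++error_count;           \
    }                          \
  } while (0)
```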

Output an additional attribute "Datasets" into the "Parameters and
Global Attributes" group. This attribute is an array of strings and
contains the full variable names of all variables in this file.
darcs-hash:20080111111324-dae7b-895d9c619e1f2126ea367cb93b77f407d50f15b4.gz

darcs-hash:20071130052144-fff0f-57258dcf6536be5b2d8a14f2a5ef5be56d6f038f.gz

darcs-hash:20071102224421-dae7b-79e89575a8e23c46fea47ef043d28ab7331da985.gz

during recovery
Various people had reported problems of running out of memory when recovering
from multiple chunked checkpoint files. It turned out that the HDF5 library
itself requires a considerable amount of memory for each opened HDF5 file.
When all chunked files of a checkpoint are opened at the same time during
recovery (which is the default) this may cause the simulation to abort with an
'out of memory' error in extreme cases.
This patch introduces a new steerable boolean parameter
IOHDF5::open_one_input_file_at_a_time
which, if set to "yes", will tell the recovery code to open/read/close chunked
files one after another for each refinement level, thus avoiding excessive
HDF5-internal memory requirements due to multiple open files.
The default behaviour is (as before) to keep all input files open until all
refinement levels are recovered.
darcs-hash:20071019091424-3fd61-834471be8da361b235d0a4cbf3d6f16ae0b653f0.gz
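The strategy can be sketched abstractly (the names below are hypothetical; the real code opens HDF5 files): processing the chunked files strictly one after another bounds the number of simultaneously open files at one, instead of letting it grow to the full file count.

```cpp
#include <algorithm>
#include <cassert>

// Tracks how many "files" are open right now and the peak ever reached.
struct OpenCounter {
  int open_now = 0, peak = 0;
  void open()  { ++open_now; peak = std::max(peak, open_now); }
  void close() { --open_now; }
};

// Open/read/close each chunked file in turn, as with
// IOHDF5::open_one_input_file_at_a_time = "yes"; returns files processed.
int recover_one_at_a_time(int nfiles, OpenCounter &c) {
  int processed = 0;
  for (int f = 0; f < nfiles; ++f) {
    c.open();     // open chunked file f
    ++processed;  // read its datasets for the current refinement level
    c.close();    // close before the next file, releasing per-file buffers
  }
  return processed;
}
```

The default behaviour corresponds to calling open() for every file before any close(), so the peak equals the number of chunked files; the one-at-a-time loop keeps the peak at one.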