Commit messages
Use "m" instead of "map" as the local variable name.
Remove the "Carpet::" qualifier in front of the variable "maps".
darcs-hash:20070112224022-dae7b-0c5241b73c1f4a8ff4722e04bc70ed047d6158da.gz
Implement the variable-specific output request option 'compression_level'
so that users can specify, e.g.,
  IOHDF5::compression_level = 1
  IOHDF5::out_vars = "admbase::metric
                      admconstraints::hamiltonian
                      admbase::lapse{ compression_level = 0 }"
to request HDF5 dataset compression for every output variable except the
lapse.
This modification also requires an update of thorn CactusBase/IOUtil.
darcs-hash:20061117132206-776a0-0e1d07a85cf206fa262a94fd0dd63c6f27e50fa2.gz
With the new steerable integer parameter IOHDF5::compression_level, users can
now request gzip dataset compression while writing HDF5 files: values from 1
to 9 select the compression level, and 0 (the default) disables dataset
compression.
darcs-hash:20061117115153-776a0-7aaead5d2a0216841a27e091fddb9b6c4f40eed4.gz
darcs-hash:20061005133135-776a0-1550cde9db6e0a375b661b2ab6b0ed9c762bbe9d.gz
If no thorn provides this aliased function, CarpetIOHDF5 assumes that no
coordinate information is available.
darcs-hash:20061004144616-776a0-00da12cca7d6b6ad1ae0a38a96923f771239de79.gz
darcs-hash:20060925220348-dae7b-303594fd2b999c93d2b816a9d3f11d0d97e391c2.gz
darcs-hash:20060925220323-dae7b-040ddfb0afc83c15cd4802fe26fe4822826a2e8a.gz
Save the grid structure in the output files whenever the parameters are
saved, even when the file is not a checkpoint file.
darcs-hash:20060925220235-dae7b-09e23cc6ec48e20df0b560356f19648a67e955dd.gz
darcs-hash:20060925220415-dae7b-2b72aef51ae3d8b056b18221acd9989668a2864d.gz
../exe/wave/hdf5toascii_slicer [--match <regex string>]
...
where
[--match <regex string>] selects HDF5 datasets by their names
matching a regex string using POSIX
Extended Regular Expression syntax
darcs-hash:20060901085302-776a0-3a9cfb71f9008b1a7bf93d9857195f92b67f1e25.gz
darcs-hash:20060831174548-776a0-bf352f210d8736f48a666654309f0ad3443cf3a3.gz
files
darcs-hash:20060828173429-776a0-b231124b73645983b8ed56efa35d5ec2c9354add.gz
Clarify an ambiguity in the --help output: timestep selection is by
cctk_time value, not by cctk_iteration.
darcs-hash:20060822142353-b0a3f-e0ad1ea0e2f4eca7038fc2d77b0ff3d724b97aab.gz
Adds a missing #include <cmath> so std::fabs() can be used
darcs-hash:20060822114437-b0a3f-f85a976d02c36aa6311c7b6916aad107747954b2.gz
The new command line parameter option '--timestep <timestep>' allows users to
select an individual timestep in an HDF5 file.
darcs-hash:20060821200601-776a0-8e977e93014eded2d6f6cab376209ce7d4293b94.gz
While Intel C++ (9.0) had no problems compiling hdf5toascii_slicer.cc,
GNU C++ generates an error when assigning -424242424242 to a long int.
Also fixed some other things that g++ warns about.
darcs-hash:20060821094951-776a0-84d68b511de0bbd65b212755d699a415779c9674.gz
is used
darcs-hash:20060817173012-dae7b-857f1313b12144e1ba997d029990a0464bb36df5.gz
darcs-hash:20060814142824-776a0-1553d2404adc099fea75b546d4169f184dc5d3ab.gz
This utility program extracts 2D slices from CarpetIOHDF5 output files and
prints them to stdout in CarpetIOASCII format.
darcs-hash:20060808123300-776a0-4088993ec64ad291510977ee5c10d521863ef2ac.gz
Due to a bug in my previous patch, the logic for removing the checkpoint file
after successful recovery was inverted: the file was removed if
IO::recover_and_remove was set to "false".
This patch fixes the bug by reversing the logic.
Thanks to Ian Hinder for noticing this and presenting the fix.
darcs-hash:20060626162548-776a0-8d3ebc0c43a74cb3faa892aa2a410e13bb37825e.gz
darcs-hash:20060613171412-dae7b-cf8a7c6112d6c364bd7f4e7568e0df2c683c01f3.gz
If IO::recover_and_remove is set, the recovery file will also be removed after
IO::checkpoint_keep successful checkpoints have been written.
darcs-hash:20060607102431-776a0-92edd93f6dc004ab824b237fbd03ee732f7a3841.gz
points (zero size)
darcs-hash:20060512115514-776a0-dba29d6e31a12d4cff6772e69bd1ef54e3aa2d8b.gz
datatype for H5Sselect_hyperslab() arguments
This patch lets you compile CarpetIOHDF5 also with HDF5-1.8.x (and future
versions).
darcs-hash:20060511172957-776a0-acbc1bd6b8d92223c0b52a43babf394c0ab9b0f4.gz
Check whether a group has storage only after checking whether it
should be output.
darcs-hash:20060511203215-dae7b-20604fda3117034cccf38998561b7e3bed1e6873.gz
Correct errors in the handling of the parameter
"use_grid_structure_from_checkpoint".
darcs-hash:20060508193609-dae7b-c5cf907171eb31e8298669cf4bd4aa03f2c79429.gz
Add a parameter "use_grid_structure_from_checkpoint" that reads the
grid structure from the checkpoint file, and sets up the Carpet grid
hierarchy accordingly.
The Carpet grid hierarchy is written unconditionally to all checkpoint
files.
darcs-hash:20060413202124-dae7b-f97e6aac2267ebc5f5e3867cbf78ca52bbd33016.gz
When recovering from a checkpoint, each processor now reads through
all chunked files until all grid variables on this processor have been fully
recovered. This should minimise the number of individual checkpoint files
that need to be opened on each processor.
darcs-hash:20060217160928-776a0-28c076749861c0b26d1c41a6f4ef3bdb00c23274.gz
This patch introduces an optimisation for the case of recovering with the
same number of processors as were used during checkpointing: each processor
opens only its own chunked file and reads its metadata, skipping all others.
darcs-hash:20060212200032-776a0-3dd501d20b8efb66faa715b401038218bb388b4f.gz
While outputting dataset attributes, an HDF5 dataspace wasn't closed properly.
darcs-hash:20060212143008-776a0-41e46c61bce2dc22fbfc7093d2ad776bfae00687.gz
The scheduled routine CarpetIOHDF5_CloseFiles() was declared to return an int
and take no arguments. Instead it must be declared to take a 'const cGH* const'
argument, and it should return void.
See http://www.cactuscode.org/old/pipermail/developers/2006-February/001656.html.
This patch also fixes a couple of g++ warnings about signed-unsigned integer
comparisons.
darcs-hash:20060209165534-776a0-24101ebd8c09cea0a9af04acc48f8e2aa2961e34.gz
This patch finally closes the long-standing issue of keeping both the old and
the new CarpetIOHDF5 parameters around. The old parameters have now been
removed; only the new ones can be used from now on.
If you still have old-style parameter files, you must convert them now.
The Perl script CarpetIOHDF5/srcutil/SubstituteDeprecatedParameters.pl does
that for you automatically.
darcs-hash:20060206184519-776a0-29d9d612e011dda4bf2b6054cee73546beae373a.gz
Accumulate any low-level errors returned by HDF5 library calls and check them
after writing a checkpoint. Do not remove an existing checkpoint if there were
any low-level errors in generating the previous one.
darcs-hash:20060206183846-776a0-549e715d7a3fceafe70678aaf1329052dce724bb.gz
When the filereader was used, CarpetIOHDF5 still checked for all grid
variables whether they had been read completely from a data file, even for
those which weren't specified in the IO::filereader_ID_vars parameter.
darcs-hash:20060201174945-776a0-faa9fe295ef273ffd38308bbda7fde092503513c.gz
warnings about the use of deprecated I/O parameters
darcs-hash:20060127164814-776a0-89f59f04f6118191ba7a965cf72e3c6c548c817d.gz
The recovery code didn't properly recover grid functions with multiple maps:
all maps were initialised with the data from map 0.
This patch fixes the problem, so checkpointing/recovery should now work
also for multipatch applications.
The patch only affects the recovery code, meaning it will also work with
older checkpoint files.
darcs-hash:20060120164515-776a0-68f93cb5fb197f805beedfdc176fd8da9b7bfc49.gz
darcs-hash:20051119212808-dae7b-023ea8552306cda54ab6204ee338809a55228a3b.gz
darcs-hash:20051119212959-dae7b-d50e2cc4c8a980720b44cfafd9504eb201e3aa8b.gz
darcs-hash:20051119212924-dae7b-3447198d7a1d4090ffc6cff4cde12ccf037c5e8f.gz
Before reading a variable from a dataset, also check that its timelevel is
valid. This fixes problems when recovering from a checkpoint that was created
with 'Carpet::enable_all_storage = true' while this boolean parameter is now
set to 'false'.
darcs-hash:20051120134642-776a0-4fe21611ca733ecb42f8e2a82bfa1fe51a5d9e81.gz
Don't remove an initial data checkpoint file if IO::checkpoint_keep is set to
a value larger than 0.
darcs-hash:20051116133326-776a0-5fa5bd333cd26434609e920cf49434551db9ff2e.gz
be output
Apart from setting the parameter IO::out_unchunked to choose the output mode
for all variables, the mode can be overridden for individual variables via an
option string appended to the variable's name in the IOHDF5::out_vars
parameter.
darcs-hash:20051005100152-776a0-9f6f2e4b691a46b12aefab555440625f39836aaf.gz
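As a hedged illustration, a parameter file could override the global mode for one variable like this; the exact option key inside the braces is an assumption, inferred from the pattern of the other per-variable option strings shown in this log:

```
IO::out_unchunked = "no"                # global default: chunked output
IOHDF5::out_vars  = "admbase::metric
                     admbase::lapse{ out_unchunked = 'yes' }"
```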
single-processor runs
For single-processor runs, CarpetIOHDF5 unconditionally wrote HDF5 output
files in unchunked format. As for multi-processor runs, the user can now
choose between chunked and unchunked output via the out_unchunked parameter.
darcs-hash:20050918214401-776a0-882e8b1e6dcee4d25330bc11d4b6973e297f1a52.gz
darcs-hash:20050913162936-776a0-7b3fa7d3f08c37321b6ea836178168131fa98964.gz
"yes" during recovery
When IOHDF5::use_reflevels_from_checkpoint is set, the parameter
CarpetLib::refinement_levels is steered to the number of levels found in the
checkpoint.
This steering used to happen during parameter recovery, where it had no
effect if the parameter had already been set in the parfile.
Now it is done in a separate routine scheduled at STARTUP.
darcs-hash:20050906140808-776a0-bae608c103b161ac67690da2a8803bdff84cf2f4.gz
Before a variable is output, it is checked whether it has already been output
during the current iteration (e.g. due to triggers).
This check was only variable-based and therefore caused problems when the
same variable was to be output to multiple files (using different alias
names). The check now also takes the output filenames into account.
darcs-hash:20050823135345-776a0-1555987b4aee34bb646e67f491375dbcc44dddad.gz
CarpetLib's comm_state class (actually, it's still just a struct) has been
extended to handle collective buffer communications for all possible C datatypes
at the same time. This makes it unnecessary for the higher-level communication
routines to loop over each individual datatype separately.
darcs-hash:20050815150023-776a0-dddc1aca7ccaebae872f9f451b2c3595cd951fed.gz
from a single chunked checkpoint file
The list of HDF5 datasets to process is now reordered so that processor-local
components are processed first. When processing the list of datasets, only
those files from which more data is to be read are reopened. The check for
whether a variable of a given timelevel has been fully recovered was improved.
This closes http://bugs.carpetcode.org/show_bug.cgi?id=87 ("Single-file,
many-cpu recovery very slow").
darcs-hash:20050728122546-776a0-21dfceef87e12e72b8a0ccb0911c76066521e192.gz
Util_asprintf()) to construct C output strings
There was a small memory leak from using Util_asprintf() to continuously
append to an allocated string buffer. The code has been rewritten to use C++
string objects, which are destroyed automatically.
This closes http://bugs.carpetcode.org/show_bug.cgi?id=89.
darcs-hash:20050726122331-776a0-874ccd0d5766b85b1110fcd6f501a7e39c35e965.gz
written successfully by all output processors
darcs-hash:20050725150549-776a0-fe03ace195af6a723af91ca7d0a63eaeae25b050.gz