| Commit message | Author | Age |
| |
The last commit to CarpetIOHDF5 broke this.
This commit also updates the test suite data so that it actually tests
the file format, and adds a level-2 warning if no grid structure is
found in the file.
|
| |
This allows the reader to read a dataset into a different variable than
the one it was written as, e.g. GRHydro::dens into PPAnalysis::dens.
|
| |
out_group_separator chooses the string by which thorn name and group name
are separated in file names. The default is "::" for backward compatibility.
This parameter only affects output where CarpetIO*::one_file_per_group is
set; otherwise, the thorn name does not appear in the file name.
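A minimal parfile sketch of how this parameter might be used; the exact parameter prefixes (IO:: vs. IOHDF5::) and the resulting file name shown in the comment are assumptions for illustration, not taken from the commit.

```
# Write one file per group, separating thorn and group name with "-"
# instead of the default "::" (e.g. "wavetoy-scalarevolve.h5"):
IOHDF5::one_file_per_group = "yes"
IO::out_group_separator    = "-"
```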
|
| |
Unfortunately, there is currently no facility to actually run this test.
|
| |
| |
example and testsuite parfiles
|
| |
This isn't necessary, but it is very convenient for generating complete
thorn lists with the MakeThornList script.
|
| |
| |
initialisation scheme (i.e. explicit initialisation of all three
timelevels, resulting in slightly different output data)
|
| |
changed
|
| |
- don't output sum norms (not suitable for testsuites)
|
| |
In order to make the recovery testsuites pass, one would have to specify
a rather large absolute tolerance for the sum norms. Since this isn't
possible for an individual norm, I decided to omit the output of sum norms
entirely.
darcs-hash:20070223165809-776a0-83dbbb59855996cec51bd6b9da945d4bf467bd4e.gz
|
| |
Some of CarpetIOHDF5's testsuites produce CarpetIOASCII output files
whose format depends on the number of processors used.
The newly added test.ccl configuration file explicitly specifies this
number so that the testsuites run successfully.
darcs-hash:20070118100106-776a0-1317671e489dd815dafd7c4e376f9d5dcc45c552.gz
|
| |
Update the filereader testsuite so that it passes again, also on multiple
processors. Added output of more norms for comparison.
The testsuite was broken because its norms output files had been created by
CactusBase/IOBasic. Temporarily going back to using that I/O thorn verified
that the testsuite still gave the same results.
After updating the parfile to use Carpet's scalar output method, both the
output filenames changed (hence the removal of old output files and the
addition of new ones) and their data contents - the latter because
CarpetIOScalar calls CCTK_Reduce() in global mode whereas CactusBase/IOBasic
calls it in level mode.
darcs-hash:20060915115726-776a0-dbf8ce75815a6e302b90dbd799c1e0e56274da43.gz
|
| |
Doubled Driver::global_nsize in order to get the grids properly nested.
This requires an update of the checkpoint and all output files.
While both CarpetWaveToyRecover_test_[14]proc testsuites continue to work,
the CarpetWaveToyNewRecover_test_1proc testsuite is still broken, although
it should give exactly the same results as CarpetWaveToyRecover_test_1proc.
It remains to be investigated whether IOHDF5::use_reflevels_from_checkpoint
really works as expected.
darcs-hash:20060911164811-776a0-70a5d06de9506fa4ea68672ed5776c0b236546d0.gz
|
| |
again
This patch partially undoes the patch 'CarpetIOHDF5: Add test case for grid
structure recovery' (recorded Mon May 8 21:46:21 CEST 2006) by
* renewing the checkpoint file (containing the same data as before that
patch, now with the grid structure added)
* fixing the 1D output files (by removing comment lines which are no
longer output by CarpetIOASCII)
It leaves the CarpetWaveToyNewRecover_test_1proc testsuite unchanged,
i.e. broken as it has been from the beginning.
darcs-hash:20060911160325-776a0-5339d1271436f39e6d9c2fc55e136bcbf9f5fe4e.gz
|
| |
darcs-hash:20060508194621-dae7b-3094a05a1414c3ba19a0661a03d78102417a918b.gz
|
| |
The checkpoint/recovery testsuite now uses a tarball which no longer
contains the old I/O parameters.
All parameter files now use CarpetIOBasic and CarpetIOScalar as a
replacement for CactusBase/IOBasic. This modification also required an
update of various output files.
darcs-hash:20060206190059-776a0-1c88d51f696442a15fd4c3182af23f9c9a5d5048.gz
|
| |
Filereader files are found in IO::filereader_ID_dir and not in IO::recover_dir.
darcs-hash:20050623155818-776a0-cc24468227060880e5cb1a7c6259ffceb7536fad.gz
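A sketch of the parfile settings involved, assuming the standard IOUtil filereader parameters; the directory and variable names are illustrative only.

```
# Read initial data via the filereader: files are looked up in
# IO::filereader_ID_dir (not in IO::recover_dir).
IO::filereader_ID_dir   = "../initial_data"
IO::filereader_ID_files = "phi"
IO::filereader_ID_vars  = "WaveToy::phi"
```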
|
| |
The CarpetIOASCII 1D output files now have two additional comment lines:
# 1D ASCII output created by CarpetIOASCII
#
This broke the testsuite.
darcs-hash:20050614154107-776a0-c530d6ce356996d8c6f14b63c7d4ec3dd5c56b8e.gz
|
| |
testsuite
The processor decomposition has changed recently, so the chunked output
looks different now (but still yields the same results).
darcs-hash:20050606115735-776a0-c1d26041bb4cbb73b76908f0a42b086772fd131b.gz
|
| |
The old checkpoint file contained additional variables from my local version
of WaveToyC.
darcs-hash:20050604153525-776a0-ee642d7a86461be70c9aa64a9f614b166794fc34.gz
|
| |
The wavetoy checkpoint parfile set CarpetLib::buffer_width = 6, which is
not necessary for wavetoy. It even caused slight differences on 16
processors, because there apparently were more buffer zones than real
grid points on a processor. Carpet should test for this case.
darcs-hash:20050526161905-776a0-843b778a8175c27e966ae7d237c46d843b7dc75b.gz
|
| |
darcs-hash:20050101162121-891bb-ac9d070faecc19f91b4b57389d3507bfc6c6e5ee.gz
|
| |
CarpetIOHDF5 does output for grid variables of any dimension, not only 3D.
Therefore parameters with '3D' in their names have now been marked as
deprecated and should no longer be used. They are still valid, but you
will get a level-1 warning if you use them.
At some point in the future these deprecated parameters will be removed,
so you should eventually fix your parameter files by substituting all
occurrences with the newly introduced counterparts (parameters of the same
name but without the '3D').
CarpetIOHDF5/src/util/ contains a small Perl script which can be applied
to parfiles to automatically substitute old parameter names:
~/cactus/arrangements/Carpet/CarpetIOHDF5/src/util> ./SubstituteDeprecatedParameters.pl
This perl script automatically substitutes deprecated parameter names in a parfile.
Usage: ./SubstituteDeprecatedParameters.pl <parameter file>
darcs-hash:20041203134032-3fd61-5d49fdff6c13f19772c6b441d5d558708dd88c71.gz
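A before/after sketch of the rename described above; the specific parameter names and values are assumed for illustration and should be checked against the thorn's param.ccl.

```
# Old (deprecated; still accepted, but emits a level-1 warning):
IOHDF5::out3D_every = 10
IOHDF5::out3D_vars  = "WaveToy::phi"

# New counterparts (same names without the '3D'):
IOHDF5::out_every = 10
IOHDF5::out_vars  = "WaveToy::phi"
```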
|
| |
darcs-hash:20041201135259-3fd61-f3dd4087fd35e004b6a103f4dc604cd4a4a191b3.gz
|
| |
Replace all CVS header tags with the standard "$Header:$".
darcs-hash:20040918132147-891bb-dea889bdd94a479ec412d14d08e9efca63e5c24d.gz
|
| |
darcs-hash:20040823114441-1d9bf-272546310287747271232487d564efb988d55176.gz
|
| |
darcs-hash:20040819094731-1d9bf-0bc6a754da6dc4e5f66407d838cbf1d2e15a75f8.gz
|
| |
and input directory.
darcs-hash:20040709133922-1d9bf-856e524dbf98e8dea9c2c78d2d4dbc0a96892b77.gz
|
| |
darcs-hash:20040707141357-1d9bf-184aa06d2e62ed79d5742887ef4fff92b613a22b.gz
|
| |
darcs-hash:20040707134610-1d9bf-5ca669e79f72fde76705199642b44efca0c195d7.gz
|
| |
darcs-hash:20040707134404-1d9bf-c6281af47ac4af8f3914549686c533921952e119.gz
|
| |
a variable name change in CarpetIOASCII.
darcs-hash:20040628094110-1d9bf-a7f270de47a14ebc4239eacc0908faff6af2ed20.gz
|
| |
darcs-hash:20040628093937-1d9bf-71fc13748248a2329bdda3580d03b07eb2f0123f.gz
|
| |
darcs-hash:20040607093019-19929-1fcc19ce88d347d656f8f4908e552d6c50e58bbf.gz
|
| |
darcs-hash:20040607092228-19929-8555cf7a779133c3bd7786552a956b3d9a080428.gz
|
darcs-hash:20010301114010-f6438-12fb8a9ffcc80e86c0a97e37b5b0dae0dbc59b79.gz
|