CarpetCode
Carpet is a mesh refinement driver for Cactus. Cactus is a framework for solving time-dependent partial differential equations on uniform grids, and Carpet is an extension of Cactus that makes mesh refinement possible. Carpet was originally written in 2001 by Erik Schnetter at the TAT (Theoretische Astrophysik Tübingen) and later brought into production use by Erik Schnetter, Scott Hawley, and Ian Hawke at the AEI (Max-Planck-Institut für Gravitationsphysik, Albert-Einstein-Institut). Carpet is currently maintained at the CCT (Center for Computation & Technology) at LSU. These pages describe Carpet and its current development.

News
October 3, 2007: Carpet's timing infrastructure has been extended to automatically measure both the time spent computing and the time spent in I/O. The performance of large simulations depends not only on computational efficiency and communication latency, but also on the throughput to file servers. The new statistics give a real-time overview and can point out performance problems; they are collected in the existing Carpet::timing variables.

August 30, 2007: So far this year, ten publications from three research groups examining the dynamics of binary black hole systems have been based on simulations performed with Cactus and Carpet.
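The October 3 entry above describes separating the time spent computing from the time spent in I/O. As a rough illustration of that kind of accounting (a generic sketch only, not Carpet's actual implementation; the PhaseTimer class and all names in it are hypothetical):

```cpp
#include <chrono>
#include <map>
#include <string>

// Accumulates wall-clock seconds per named phase (e.g. "compute", "io"),
// in the spirit of reporting computation and I/O time separately.
class PhaseTimer {
public:
    void start(const std::string& phase) {
        current_ = phase;
        t0_ = std::chrono::steady_clock::now();
    }
    void stop() {
        auto t1 = std::chrono::steady_clock::now();
        // Add the elapsed interval to the running total for the phase.
        seconds_[current_] += std::chrono::duration<double>(t1 - t0_).count();
    }
    double seconds(const std::string& phase) const {
        auto it = seconds_.find(phase);
        return it == seconds_.end() ? 0.0 : it->second;
    }
private:
    std::string current_;
    std::chrono::steady_clock::time_point t0_;
    std::map<std::string, double> seconds_;
};
```

A driver could then bracket evolution steps with start("compute")/stop() and checkpoint or output calls with start("io")/stop(), and periodically report both totals to give the kind of real-time overview described above.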
August 15, 2007: We are happy to hear that our proposal ALPACA: Cactus tools for Application Level Profiling And Correctness Analysis will be funded by NSF's SDCI programme for three years. The ALPACA project aims at developing tools for complex, collaborative scientific applications that are appropriate for highly scalable hardware architectures and that provide fault tolerance, advanced debugging, and transparency against new developments in communication, programming, and execution models. Such tools are especially rare at the application level, where they are most critically needed.

July 31, 2007: We are happy to hear that our proposal XiRel: Cyberinfrastructure for Numerical Relativity will be funded by NSF's PIF programme for three years. XiRel is a collaborative proposal by LSU, PSU, and UTB (now RIT). The central goal of XiRel is the development of a highly scalable, efficient, and accurate adaptive mesh refinement layer based on the current Carpet driver, fully integrated and supported in Cactus and optimised for numerical relativity.

February 26, 2007: The thorn LSUPETSc implements a generic elliptic solver for Carpet's multi-patch infrastructure, based on PETSc. It assumes touching (not overlapping) patches and uses inter-patch interface conditions very similar to those developed by Harald Pfeiffer. LSUPETSc can solve "arbitrary" systems of coupled, non-linear elliptic equations. It does not support mesh refinement.

January 12, 2007: In order to be able to restructure some of Carpet's internals without disturbing ongoing production simulations, we have created an experimental version. The main goals of this experimental version are to improve performance on many (>100) processors and to re-arrange some internal details to simplify future development. Few new features are planned, but some of the changes may be incompatible.
December 15, 2006: The AEI hosted a small workshop to improve the performance of the AEI/LSU CCATIE code for binary black hole simulations, which uses Carpet as its AMR driver. We examined in particular the effect of various grid structures on accuracy and speed, and sped up the wave extraction routine. We were able to improve the overall performance of the code by a factor of six for a benchmark problem simulating a QC-0 configuration.

Documentation

We have accumulated a few pieces of documentation:
Interacting with the developers

Most discussions about Carpet, i.e. user questions, feature requests, and bug reports, take place on the Carpet developers' mailing list developers@lists.carpetcode.org. You can subscribe and unsubscribe from our list management web page, where you will also find the mailing list archive. We thank Daniel Kobras for managing the mailing list server. We have started to use Bugzilla to keep track of requested features and reported bugs in Carpet. You can submit or comment on issues from our Bugzilla pages once you have created an account there. The old list of missing features has not yet been moved over to Bugzilla.

Pretty pictures

Here are some pretty pictures of simulations that were performed with Carpet:
Moving pictures: We can show a movie (animated gif, 3.3 MB) of a scalar wave equation with adaptive mesh refinement. The refinement criterion is a very simplistic local truncation error estimate. We also have a movie (animated gif, 730 kB) of a moving refinement region tracking a black hole.

Making sense of results

Three-dimensional time-dependent simulation results are difficult enough to interpret when the grid is uniform. With mesh refinement, the sheer amount of available data makes it necessary to use professional tools to examine the data. This is not only the case for "big physics runs", where one (should) know in advance what to expect, but especially during development, where things do not always go as planned. Thomas Radke was kind enough to write an import module for the visualisation tool OpenDX.

Related projects

Erik Schnetter
Last modified: Fri Oct 5 2007