% Thorn documentation template
\documentclass{article}
\begin{document}

\title{EllPETSc}
\author{Paul Walker, Gerd Lanfermann}
\date{\today}
\maketitle

\abstract{{\tt EllPETSc} provides 3D elliptic solvers for the various
classes of elliptic problems defined in {\tt EllBase}. {\tt EllPETSc}
uses the ``Portable, Extensible Toolkit for Scientific Computation'' (PETSc)
by Argonne National Laboratory. PETSc is a suite of routines and data
structures that can be employed for solving partial differential
equations in parallel. {\tt EllPETSc} is called through the
interfaces provided in {\tt EllBase}.}

\section{Purpose}
This thorn provides sophisticated solvers based on the PETSc
libraries. It supports all the interfaces defined in {\tt EllBase}.
At this point it is not optimized for performance. Expect improvements as
we develop the elliptic solver arrangement.


\section{Technical Details}
This thorn supports three elliptic problem classes:
\begin{enumerate}
\item {\bf LinFlat} for a standard 3D Cartesian Laplace operator,
using the standard 7-point computational molecule.
\item {\bf LinMetric} for a Laplace operator derived from the metric,
using a 19-point stencil.
\item {\bf LinConfMetric} for a Laplace operator derived from the
metric and a conformal factor, using a 19-point stencil.
\end{enumerate}
The code of the solvers differs for these classes and is explained in
the following sections.

\subsection{Installing PETSc}
PETSc needs to be installed on the machine, and the environment
variables {\tt PETSC\_ARCH} and {\tt PETSC\_DIR} have to be set to compile
{\tt EllPETSc}. PETSc can be obtained for free at {\tt
http://www-fp.mcs.anl.gov/petsc/}. Cactus needs to be compiled with
MPI. While PETSc can be compiled for single-processor mode (without
MPI), Cactus has only been tested and used with the parallel version
of PETSc, which requires MPI. For detailed information on how to install
PETSc, refer to the PETSc documentation.
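
For example, with a {\tt bash}-type shell the two variables might be
set as follows; the paths and architecture name here are placeholders,
so use the values matching your own installation and PETSc build:
\begin{verbatim}
export PETSC_DIR=/usr/local/petsc    # top-level PETSc directory
export PETSC_ARCH=linux-gnu          # name of the compiled architecture
\end{verbatim}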

\subsection{{\bf LinFlat}}
For this class we employ the standard 7-point stencil, based on the
flat metric only. The stencil values are constant at each gridpoint.
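
As a minimal sketch, assuming a uniform grid spacing {\tt h} (the
actual coefficient setup in the code may differ), the constant
coefficients take the familiar form:
\begin{verbatim}
/* Sketch of the constant 7-point coefficients for the flat Cartesian
 * Laplacian; h is an assumed uniform grid spacing and the names are
 * illustrative, not taken from the EllPETSc sources. */
static void flat_stencil(double h, double *center, double *face)
{
  *center = -6.0 / (h * h);  /* coefficient at (i,j,k)               */
  *face   =  1.0 / (h * h);  /* same value for the six face neighbours
                                (i+/-1,j,k), (i,j+/-1,k), (i,j,k+/-1) */
}
\end{verbatim}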

\subsection{{\bf LinMetric}}
For this class the standard 19-point stencil is initialized, taking the
underlying metric into account. The values of the stencil function
differ at each gridpoint.

\subsection{{\bf LinConfMetric}}
For this class the standard 19-point stencil is initialized, taking the
underlying metric and its conformal factor into account. The values
of the stencil function differ at each gridpoint.

\subsection{Interfacing PETSc}
The main task when interfacing PETSc consists of transferring data
from the Cactus parallel data structures (gridfunctions) to the
parallel structures provided by PETSc.

Here we explain the main steps, to be read with the code at hand; a
schematic sketch in PETSc calls follows the list.
\begin{enumerate}
\item The indices {\tt imin,imax \ldots} are calculated and describe
the starting/ending points in {\em 3D local index space}; ghostzones
are not included here.
\item A {\em linear global index} is calculated, describing the
starting/ending points in {\em linear global index space}. Ghostzones
are not included here.
\item A lookup gridfunction {\tt wsp} is loaded, identifying the
{\em 3D local index} with the {\em linear global index}. Values of
zero indicate boundaries.
\item PETSc matrices/vectors are created, specifying the linear size:
global endpoint minus global startpoint.
\item For the elliptic class {\tt LinFlat} the stencil functions are
initialized with the standard 7-point stencil; the classes {\tt
LinMetric} and {\tt LinConfMetric} require the more sophisticated
treatment described above.
\item Looping over the processor-local grid points (in 3D local
index space), the PETSc vectors and the coefficient matrix are loaded
if no boundary is present ({\tt wsp[i,j,k]} not equal to zero).
\item The PETSc vector and matrix assembly is started, nested for
performance as recommended by PETSc.
\item The elliptic solver context is created and options are set,
followed by the call to the PETSc solver.
\item Upon completion of the solve, the PETSc solution has to be
transferred back to the Cactus data structures.
\end{enumerate} 
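
As a rough guide, the steps above map onto PETSc calls as in the
following sketch. It uses the current {\tt KSP} solver interface
(the thorn itself was written against an older PETSc interface) and a
1D 3-point Laplacian as a stand-in for the 7/19-point molecules; all
names and sizes are illustrative assumptions, not code from
{\tt EllPETSc}.
\begin{verbatim}
#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat      A;                        /* coefficient matrix           */
  Vec      x, b;                     /* solution and source vectors  */
  KSP      ksp;                      /* linear solver context        */
  PetscInt i, istart, iend, N = 100; /* assumed global problem size  */

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* Step 4: create matrix/vectors with the global linear size. */
  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
  MatSetFromOptions(A);
  MatSetUp(A);
  VecCreate(PETSC_COMM_WORLD, &b);
  VecSetSizes(b, PETSC_DECIDE, N);
  VecSetFromOptions(b);
  VecDuplicate(b, &x);

  /* Steps 5-6: loop over the processor-local rows and load the
   * stencil coefficients (a 1D 3-point Laplacian stand-in). */
  MatGetOwnershipRange(A, &istart, &iend);
  for (i = istart; i < iend; i++) {
    if (i > 0)     MatSetValue(A, i, i - 1, 1.0, INSERT_VALUES);
    if (i < N - 1) MatSetValue(A, i, i + 1, 1.0, INSERT_VALUES);
    MatSetValue(A, i, i, -2.0, INSERT_VALUES);
    VecSetValue(b, i, 1.0, INSERT_VALUES);  /* source term */
  }

  /* Step 7: nested assembly, as recommended by PETSc. */
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  VecAssemblyBegin(b);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);
  VecAssemblyEnd(b);

  /* Step 8: create the solver context, set options, solve. */
  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetOperators(ksp, A, A);
  KSPSetFromOptions(ksp);
  KSPSolve(ksp, b, x);

  /* Step 9: here the solution in x would be transferred back
   * into the Cactus gridfunctions. */

  KSPDestroy(&ksp);
  VecDestroy(&x);
  VecDestroy(&b);
  MatDestroy(&A);
  PetscFinalize();
  return 0;
}
\end{verbatim}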

\section{Comments}
The sizes of the arrays {\tt Mlinear} for the coefficient matrix and
{\tt Nsource} for the source are passed into the solver. A storage
flag is set if these variables are of a size greater than 1. In this
case the arrays can be accessed.
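
A minimal illustration of this size convention (the names and access
pattern are assumptions for illustration, not the actual Cactus
macros):
\begin{verbatim}
#include <stdio.h>

/* Sketch of the convention described above: an array is only
 * dereferenced when its advertised size exceeds 1. */
static void use_coefficients(const double *Mlinear, int Mlinear_size)
{
  if (Mlinear_size > 1) {     /* storage flag set: safe to access */
    printf("first matrix entry: %g\n", Mlinear[0]);
  } else {                    /* size <= 1: no storage allocated  */
    printf("coefficient matrix not supplied\n");
  }
}
\end{verbatim}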

%\section{My own section}

% Automatically created from the ccl files
% Do not worry for now.
\include{interface}
\include{param}
\include{schedule}

\end{document}