EpetraExt Development
Trilinos/PETSc Interface

Introduction

The Epetra_PETScAIJMatrix class is a lightweight wrapper that encapsulates PETSc serial and parallel AIJ matrices. It is intended for PETSc users who would like to build and apply Trilinos preconditioners. The class derives from Epetra_RowMatrix.

Epetra_PETScAIJMatrix implements as many access and apply functions as possible as callbacks to PETSc. In particular, fetching individual rows and matrix-vector multiplication are callbacks to PETSc's native routines.

Requirements

This class was first released with Trilinos 9.0 and PETSc 2.3.3; build instructions for that combination can be found in the Trilinos 9.0 doxygen documentation. The most current version of the interface is available in the Trilinos development branch (as of 4/30/2009) and was released with Trilinos 10.0. It has been tested in parallel with PETSc versions 3.0.0-p4, 3.0.0-p7, and 3.0.0-p12, built in static library mode. The interface currently requires MPI, i.e., you cannot build a purely serial version. PETSc must be built with a C compiler, i.e., do not use the option "--with-clanguage=cxx". Note: the interface will not work with an installed version of PETSc (i.e., one where you have run "make install"). The reason is that the interface requires access to certain low-level functions for good performance, and these functions are not exposed in an installed version of PETSc.

Configuring and building the Trilinos libraries

You must specify PETSC_DIR, PETSC_ARCH, and PETSC_LIB when configuring Trilinos. PETSC_LIB can be determined by simply typing "make getlinklibs" in the root directory where you built PETSc.

In the following, we assume that the PETSc build directory is ~/petsc-3.0.0-p4, the Trilinos source directory is ~/Trilinos, the Trilinos build directory is ~/TrilinosBuild, and that your MPI compilers are in /usr/local/mpich-1.2.7p1.

First, set the following environment variables.

setenv TRILINOS_HOME /home/joeblow/Trilinos
setenv PETSC_DIR /home/joeblow/petsc-3.0.0-p4
setenv PETSC_ARCH linux-gnu-c-debug
setenv PETSC_LIB "-Wl,-rpath,/home/joeblow/petsc-3.0.0-p4/linux-gnu-c-debug/lib
  -L/home/joeblow/petsc-3.0.0-p4/linux-gnu-c-debug/lib -lpetscts -lpetscsnes
  -lpetscksp -lpetscdm -lpetscmat -lpetscvec -lpetsc -lX11
  -lmkl_intel_lp64 -Wl,--start-group -lmkl_intel_thread -lmkl_core -Wl,--end-group -lguide -lpthread
  -lm -L/usr/local/mpich-1.2.7p1/lib -L/usr/local/intel/mkl/10.0.3.020/lib/em64t
  -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib64 -L/lib64 -ldl
  -lmpich -lpthread -lrt -lgcc_s -lg2c -lm -L/usr/lib/gcc/x86_64-redhat-linux/3.4.6
  -lm -ldl -lmpich -lpthread -lrt -lgcc_s -ldl"

setenv PETSC_INCLUDE_PATH "${PETSC_DIR}/${PETSC_ARCH}/include;${PETSC_DIR}/include;${PETSC_DIR}"

Then change to the TrilinosBuild directory and issue the configure command below.

cmake \
  -D Trilinos_ENABLE_STRONG_C_COMPILE_WARNINGS:BOOL=OFF \
  -D CMAKE_INSTALL_PREFIX:PATH="${PWD}" \
  -D TPL_ENABLE_MPI:BOOL=ON \
\
  -D Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=OFF \
  -D Trilinos_ENABLE_Amesos:BOOL=ON \
  -D Trilinos_ENABLE_AztecOO:BOOL=ON \
  -D Trilinos_ENABLE_Belos:BOOL=ON \
  -D Trilinos_ENABLE_Epetra:BOOL=ON \
  -D Trilinos_ENABLE_EpetraExt:BOOL=ON \
  -D Trilinos_ENABLE_Galeri:BOOL=ON \
  -D Trilinos_ENABLE_Ifpack:BOOL=ON \
  -D Trilinos_ENABLE_Isorropia:BOOL=ON \
  -D Trilinos_ENABLE_ML:BOOL=ON \
  -D Trilinos_ENABLE_Teuchos:BOOL=ON \
  -D Trilinos_ENABLE_TrilinosCouplings:BOOL=ON \
  -D Trilinos_ENABLE_Triutils:BOOL=ON \
\
  -D EpetraExt_USING_PETSC:BOOL=ON \
  -D TPL_ENABLE_PETSC:BOOL=ON \
  -D PETSC_LIBRARY_DIRS:FILEPATH="${PETSC_LIB}" \
  -D PETSC_INCLUDE_DIRS:FILEPATH="${PETSC_INCLUDE_PATH}" \
  -D TPL_PETSC_LIBRARIES:STRING="${PETSC_LIB}" \
  -D TPL_PETSC_INCLUDE_DIRS:STRING="${PETSC_INCLUDE_PATH}" \
\
  -D TrilinosCouplings_ENABLE_TESTS:BOOL=ON \
  -D TrilinosCouplings_ENABLE_EXAMPLES:BOOL=ON \
  -D ML_ENABLE_TESTS:BOOL=ON \
  -D ML_ENABLE_EXAMPLES:BOOL=ON \
  -D DART_TESTING_TIMEOUT:STRING=600 \
  ${TRILINOS_HOME}

You may need to specify the location of your MPI compilers with these CMake options:

CMAKE_CXX_COMPILER:FILEPATH=/full/path/to/c++/compiler
CMAKE_C_COMPILER:FILEPATH=/full/path/to/c/compiler
CMAKE_Fortran_COMPILER:FILEPATH=/full/path/to/fortran/compiler

or, if your MPI is installed in a base directory with subdirectories "bin", "lib", and "include":

MPI_BASE_DIR:PATH=/full/path/to/mpi/base/directory

You may also need to specify the locations and names of the BLAS and LAPACK libraries with these CMake options:

BLAS_LIBRARY_DIRS:FILEPATH="/full/path/to/blas"
LAPACK_LIBRARY_DIRS:FILEPATH="/full/path/to/lapack"
BLAS_LIBRARY_NAMES:STRING="blasLibraryName"
LAPACK_LIBRARY_NAMES:STRING="lapackLibraryName"

Once Trilinos has configured successfully, type the following in the TrilinosBuild directory:

make
make install

The following Trilinos packages must be enabled: Epetra (basic linear algebra), EpetraExt (contains the PETSc interface), and Teuchos (parameter lists, smart pointers, and other useful utility classes). We also strongly suggest enabling ML (algebraic multigrid), Ifpack (SOR, incomplete factorizations, and domain decomposition methods), and Amesos (sparse direct solvers) for a richer preconditioner set. If you want to build and run the example from within Trilinos, you must also enable TrilinosCouplings.

Linking a PETSc application to the Trilinos libraries

We expect most applications to link to Trilinos as a set of third-party libraries. Toward this end, the directory where the Trilinos headers are installed (~/TrilinosBuild/include in the above example) contains a stub file called "Makefile.export.XX" for each package, where XX is the package name. The stub defines variables that contain all of the include and link dependencies for that particular package. For example, "Makefile.export.EpetraExt" defines EPETRAEXT_TPL_INCLUDES and EPETRAEXT_TPL_LIBRARIES, which contain EpetraExt's include and link dependencies, respectively. An application developer simply includes this stub in her Makefile and uses the variables in the appropriate compile and link lines.

The example in the following section can be built as an "external" application with the following simple Makefile, assuming that Trilinos was built with ML enabled:

TRILINOS_BUILD_DIR=/home/joeblow/TrilinosBuild
include $(TRILINOS_BUILD_DIR)/include/Makefile.export.ML

all:
    $(ML_CXX_COMPILER) -c -o EpetraExt_petsc.o $(ML_CXX_FLAGS) -I$(TRILINOS_BUILD_DIR)/include $(ML_TPL_INCLUDES) EpetraExt_petsc.cpp
    $(ML_CXX_COMPILER) -o petsc.exe EpetraExt_petsc.o $(ML_CXX_FLAGS) -L$(TRILINOS_BUILD_DIR)/lib $(ML_LIBRARIES) \
                 $(ML_TPL_LIBRARIES) $(ML_EXTRA_LD_FLAGS)

Example: Solving a PETSc linear system with PETSc's CG preconditioned by Trilinos' algebraic multigrid

We now dissect the example Trilinos/packages/trilinoscouplings/examples/epetraext/EpetraExt_petsc.cpp. In this example, a PETSc AIJ matrix corresponding to the 2D Laplacian and a right-hand side corresponding to a solution of all ones are constructed. The resulting linear system is solved twice: first with CG from AztecOO, then with CG from PETSc. In both cases, the preconditioner is algebraic multigrid (AMG) from the Trilinos package ML.

The include file "ml_config.h" contains definitions for all the appropriate Trilinos preprocessor macros. Of the following include files, "EpetraExt_PETScAIJMatrix.h" and "ml_MultiLevelPreconditioner.h" are required to wrap the PETSc matrix and create the ML preconditioner.

#include "ml_config.h"
#include "EpetraExt_config.h"
#if defined(HAVE_PETSC) && defined(HAVE_ML_EPETRA) && defined(HAVE_ML_TEUCHOS) && defined(HAVE_ML_AZTECOO) && defined(HAVE_MPI)
#include "petscksp.h"
#include "Epetra_Vector.h"
#include "Epetra_Map.h"
#include "ml_MultiLevelPreconditioner.h"
#include "AztecOO.h"
#include "Epetra_LinearProblem.h"

Here, we have omitted the details of the AIJ matrix construction. After the PETSc matrix is constructed, it is wrapped as an Epetra_PETScAIJMatrix:
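
A minimal sketch of this step, assuming the constructor that takes a PETSc Mat (A is the matrix assembled above), might look like:

/* Wrap the PETSc matrix A; only a view is taken, no values are copied. */
Epetra_PETScAIJMatrix epA(A);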

Note that this is a lightweight wrap – no matrix data is copied!

The parameter list for the multigrid preconditioner is created and populated. (For more information on multigrid options, please see the ML user's guide.)

Teuchos::ParameterList mlList;
/* Set recommended defaults for Poisson-like problems. */
ML_Epetra::SetDefaults("SA",mlList);
/* Specify how much information ML prints to screen.
0 is the minimum (no output), 10 is the maximum. */
mlList.set("ML output",10);

In this case, we are going to use symmetric Gauss-Seidel as the fine-grid smoother. Since this requires access to individual matrix rows, it is much more efficient to use the PETSc implementation, which will be optimized for the underlying AIJ data structure. (There is no performance penalty associated with Trilinos smoothers whose kernel is a matrix-vector multiply, as the multiply is simply a callback to PETSc's native implementation.) To do this, the KSP object kspSmoother is created and populated:

ierr = KSPCreate(comm,&kspSmoother);CHKERRQ(ierr);
ierr = KSPSetOperators(kspSmoother,A,A,SAME_NONZERO_PATTERN);CHKERRQ(ierr);
ierr = KSPSetType(kspSmoother,KSPRICHARDSON);CHKERRQ(ierr);
ierr = KSPSetTolerances(kspSmoother,1e-12,1e-50,1e7,1);CHKERRQ(ierr);
ierr = KSPSetInitialGuessNonzero(kspSmoother,PETSC_TRUE);CHKERRQ(ierr);
ierr = KSPGetPC(kspSmoother,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc, PCSOR);CHKERRQ(ierr);
ierr = PCSORSetSymmetric(pc,SOR_LOCAL_SYMMETRIC_SWEEP);CHKERRQ(ierr);
ierr = PCSetFromOptions(pc);CHKERRQ(ierr);
ierr = KSPSetUp(kspSmoother);CHKERRQ(ierr);

The fine-grid smoother (level 0) is set to be of PETSc type:

mlList.set("smoother: type (level 0)","petsc");

and kspSmoother is placed in the parameter list:

mlList.set("smoother: petsc ksp (level 0)",(void*)kspSmoother);

Note that trying to set a PETSc smoother on any other level will generate an error. Also note that ML has many other smoothers available, e.g., polynomial and incomplete factorizations. Please see the ML User's Guide, available from the ML homepage or in Trilinos/packages/ml/doc/mlguide.pdf, for more information.

Finally, the ML AMG preconditioner is constructed.

ML_Epetra::MultiLevelPreconditioner *Prec = new ML_Epetra::MultiLevelPreconditioner(epA,mlList);

The linear system is solved first using CG from the Trilinos package AztecOO.

epu.PutScalar(0.0);
Epetra_LinearProblem Problem(&epA, &epu, &epb);
AztecOO solver(Problem);
solver.SetAztecOption(AZ_solver, AZ_cg);
solver.SetAztecOption(AZ_output, 1);
solver.SetAztecOption(AZ_conv, AZ_noscaled);
solver.SetPrecOperator(Prec);
solver.Iterate(30, 1e-12);

The system is solved a second time using CG from PETSc. The AMG preconditioner Prec must be wrapped as a PETSc shell preconditioner

ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
ierr = PCSetType(pc,PCSHELL);CHKERRQ(ierr);
ierr = PCShellSetApply(pc,ShellApplyML);CHKERRQ(ierr);
ierr = PCShellSetContext(pc,(void*)Prec);CHKERRQ(ierr);
ierr = PCShellSetName(pc,"ML AMG");CHKERRQ(ierr);

where the apply method is given by
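a routine such as ShellApplyML (the function registered with PCShellSetApply above). A minimal sketch of this routine, assuming the PETSc 3.0 shell-apply signature (PC,Vec,Vec) and illustrative variable names, retrieves the ML preconditioner from the shell context, views the PETSc vectors as Epetra_Vectors, and calls ApplyInverse:

int ShellApplyML(PC pc, Vec x, Vec y)
{
  int ierr;
  void *ctx;

  /* Retrieve the ML preconditioner stored with PCShellSetContext. */
  ierr = PCShellGetContext(pc,&ctx);CHKERRQ(ierr);
  ML_Epetra::MultiLevelPreconditioner *mlp =
      (ML_Epetra::MultiLevelPreconditioner*)ctx;

  /* View the PETSc vectors as Epetra_Vectors (no data is copied). */
  PetscScalar *xvals, *yvals;
  ierr = VecGetArray(x,&xvals);CHKERRQ(ierr);
  ierr = VecGetArray(y,&yvals);CHKERRQ(ierr);
  Epetra_Vector epx(View, mlp->OperatorDomainMap(), xvals);
  Epetra_Vector epy(View, mlp->OperatorRangeMap(), yvals);

  /* Apply the AMG preconditioner: epy = M^{-1} epx. */
  mlp->ApplyInverse(epx,epy);

  ierr = VecRestoreArray(x,&xvals);CHKERRQ(ierr);
  ierr = VecRestoreArray(y,&yvals);CHKERRQ(ierr);
  return 0;
}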

Other preconditioners, such as those provided by Ifpack or AztecOO, can be wrapped in a similar fashion.
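
For instance, an Ifpack incomplete factorization built on the wrapped matrix epA could be stored as the shell context and applied through an analogous routine that calls its ApplyInverse method. A minimal sketch, assuming the Ifpack factory interface (the preconditioner type "ILU" and overlap level 0 are illustrative choices):

#include "Ifpack.h"

Ifpack Factory;
Teuchos::ParameterList ifpackList;  /* accept the default ILU parameters */

/* Build an incomplete LU factorization of the wrapped PETSc matrix epA. */
Ifpack_Preconditioner *ifpackPrec = Factory.Create("ILU", &epA, 0);
ifpackPrec->SetParameters(ifpackList);
ifpackPrec->Initialize();
ifpackPrec->Compute();

/* ifpackPrec is an Epetra_Operator; store it with PCShellSetContext and
   call its ApplyInverse in a shell apply routine analogous to ShellApplyML. */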