We now report and comment on an example of usage of ML_Epetra::MultiLevelOperator. The source code can be found in ml_operator.cpp.
Trilinos (and hence ML) requires the macro HAVE_CONFIG_H to be defined, either in the Makefile or in the file itself. We suggest the following:
#ifndef HAVE_CONFIG_H
#define HAVE_CONFIG_H
#endif
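Alternatively, to define the macro in the Makefile rather than in the source file, a line such as the following (assuming a GNU-make based build; adapt the variable name to your own setup) is enough:
CXXFLAGS += -DHAVE_CONFIG_H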
Then, we need to include several header files. Note that the example works for both MPI and non-MPI configurations. However, some coarse solvers require MPI (for instance, AMESOS-Superludist and AMESOS-Mumps). Trilinos_Util_CrsMatrixGallery.h is required by this example, and not by ML.
#include "ml_include.h"
#ifdef HAVE_MPI
#include "mpi.h"
#include "Epetra_MpiComm.h"
#else
#include "Epetra_SerialComm.h"
#endif
#include "Epetra_Map.h"
#include "Epetra_IntVector.h"
#include "Epetra_SerialDenseVector.h"
#include "Epetra_Vector.h"
#include "Epetra_CrsMatrix.h"
#include "Epetra_VbrMatrix.h"
#include "Epetra_LinearProblem.h"
#include "Epetra_Time.h"
#include "AztecOO.h"
#include "ml_epetra_preconditioner.h"
#include "Trilinos_Util_CommandLineParser.h"
#include "Trilinos_Util_CrsMatrixGallery.h"
#include "ml_epetra_operator.h"
The macro HAVE_ML_TRIUTILS and the Triutils header files are required by the example, and not by the ML source. Class Trilinos_Util::CrsMatrixGallery is used to create an example matrix. This Epetra_CrsMatrix is then converted to an ML_Operator (heavy-weight conversion).
Note: ML accepts Epetra_RowMatrix's; please refer to ml_preconditioner.cpp.
using namespace ML_Epetra;
using namespace Teuchos;
using namespace Trilinos_Util;
int main(int argc, char *argv[])
{
#ifdef EPETRA_MPI
MPI_Init(&argc,&argv);
Epetra_MpiComm Comm(MPI_COMM_WORLD);
#else
Epetra_SerialComm Comm;
#endif
We parse the command line, create the example matrix, and get pointers to the matrix, the map, and the linear problem.
CommandLineParser CLP(argc,argv);
CrsMatrixGallery Gallery("", Comm);
if( CLP.Has("-problem_type") == false ) CLP.Add("-problem_type", "laplace_2d" );
if( CLP.Has("-problem_size") == false ) CLP.Add("-problem_size", "100" );
// initialize the gallery as specified on the command line
Gallery.Set(CLP);
// get pointers to the matrix, the map, and the linear problem
Epetra_CrsMatrix * A = Gallery.GetMatrix();
const Epetra_Map * Map = Gallery.GetMap();
Epetra_LinearProblem * Problem = Gallery.GetLinearProblem();
AztecOO solver(*Problem);
This is the beginning of the ML part. We need to create an ML handle, convert the input matrix into ML_Operator format, and create an ML_Aggregate structure (that will hold the data about the aggregates).
int nLevels = 10;
int maxMgLevels = 6;
ML_Set_PrintLevel(10);
ML *ml_handle;
ML_Create(&ml_handle, maxMgLevels);
// convert the Epetra matrix into ML_Operator format; the finest-level
// matrix occupies position maxMgLevels-1 of the hierarchy
EpetraMatrix2MLMatrix(ml_handle, maxMgLevels-1, A);
ML_Aggregate *agg_object;
ML_Aggregate_Create(&agg_object);
At this point we select our aggregation scheme (Uncoupled in this case), and we build the hierarchy.
ML_Aggregate_Set_CoarsenScheme_Uncoupled(agg_object);
nLevels = ML_Gen_MGHierarchy_UsingAggregation(ml_handle, maxMgLevels-1,
ML_DECREASING, agg_object);
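Several other coarsening schemes are available in ML. As an illustration only (the calls below mirror the Uncoupled one used above), one could select MIS or METIS-based aggregation instead:
ML_Aggregate_Set_CoarsenScheme_MIS(agg_object);
ML_Aggregate_Set_CoarsenScheme_METIS(agg_object);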
Now, we define the ID of the coarsest level and set up the smoothers. We assume the problem is symmetric, and use a Chebyshev smoother on each level. We also set a simple coarse solver: symmetric Gauss-Seidel.
int coarsestLevel = maxMgLevels - nLevels;
int nits = 1;
// Chebyshev smoother on all levels but the coarsest
for(int level = maxMgLevels-1; level > coarsestLevel; level--)
  ML_Gen_Smoother_Cheby(ml_handle, level, ML_BOTH, 30., 3);
// symmetric Gauss-Seidel as coarse solver
ML_Gen_Smoother_SymGaussSeidel(ml_handle, coarsestLevel, ML_BOTH,
                               nits, ML_DEFAULT);
// generate the multigrid solver (V-cycle)
ML_Gen_Solver(ml_handle, ML_MGV, maxMgLevels-1, coarsestLevel);
MultiLevelOperator MLPrec(ml_handle, Comm, *Map, *Map);
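As MultiLevelOperator derives from Epetra_Operator, MLPrec can also be applied directly to a vector if desired. A minimal sketch, where X and Y are hypothetical Epetra_MultiVector's defined on *Map:
Epetra_MultiVector X(*Map, 1);
Epetra_MultiVector Y(*Map, 1);
X.Random();
MLPrec.ApplyInverse(X, Y); // Y now contains one multigrid cycle applied to X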
This is the end of the ML settings. We just need to instruct AztecOO about the existence of MLPrec.
solver.SetPrecOperator(&MLPrec);
solver.SetAztecOption(AZ_solver, AZ_cg_condnum);
solver.SetAztecOption(AZ_output, 16);
solver.Iterate(500, 1e-5);
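After the solve, it may be useful to report the iteration count and the true residual norm. A short sketch using standard AztecOO accessors (assuming <iostream> is available through the headers above):
std::cout << "Solver performed " << solver.NumIters() << " iterations" << std::endl;
std::cout << "Norm of the true residual = " << solver.TrueResidual() << std::endl;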
// free the ML and aggregate structures
ML_Aggregate_Destroy(&agg_object);
ML_Destroy(&ml_handle);
#ifdef EPETRA_MPI
MPI_Finalize();
#endif
return 0;
}
To execute this example from the command line, you may try something like this:
$ mpirun -np 4 ./ml_operator.exe -problem_type=laplace_3d -problem_size=1000
For more options for Trilinos_Util::CrsMatrixGallery, consult the Trilinos 4.0 tutorial.