From: Wolfgang Bangerth
Date: Wed, 7 May 2025 21:07:35 +0000 (-0600)
Subject: Better document the difference between GMRES and FGMRES.
X-Git-Url: https://gitweb.dealii.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=fba48289d333a32524cb3b4e06a600c85b197fba;p=dealii.git

Better document the difference between GMRES and FGMRES.
---

diff --git a/include/deal.II/lac/solver_gmres.h b/include/deal.II/lac/solver_gmres.h
index 1a91447193..29c92e61e9 100644
--- a/include/deal.II/lac/solver_gmres.h
+++ b/include/deal.II/lac/solver_gmres.h
@@ -293,7 +293,7 @@ namespace internal
 /**
  * Implementation of the Restarted Preconditioned Direct Generalized Minimal
- * Residual Method. The stopping criterion is the norm of the residual.
+ * Residual Method (GMRES). The stopping criterion is the norm of the residual.
  *
  * The AdditionalData structure allows to control the size of the Arnoldi
  * basis used for orthogonalization (default: 30 vectors). It is related to
@@ -308,13 +308,24 @@ namespace internal
  * <h3>Left versus right preconditioning</h3>
  *
  * @p AdditionalData allows you to choose between left and right
- * preconditioning. As expected, this switches between solving for the systems
- * $P^{-1}A$ and $AP^{-1}$, respectively.
+ * preconditioning. Left preconditioning, conceptually, corresponds to
+ * replacing the linear system $Ax=b$ by $P^{-1}Ax=P^{-1}b$ where
+ * $P^{-1}$ is the preconditioner (i.e., an approximation of
+ * $A^{-1}$). In contrast, right preconditioning should be understood
+ * as replacing $Ax=b$ by $AP^{-1}y=b$, solving for $y$, and then
+ * computing the solution of the original problem as $x=P^{-1}y$. Note
+ * that in either case, $P^{-1}$ is simply an operator that can be
+ * applied to a vector; that is, it is not the inverse of some
+ * operator that also separately has to be available. In practice,
+ * $P^{-1}$ should be an operator that approximates multiplying by
+ * $A^{-1}$.
  *
- * A second consequence is the type of residual used to measure
- * convergence. With left preconditioning, this is the preconditioned
- * residual, while with right preconditioning, it is the residual of the
- * unpreconditioned system.
+ * The choice between left and right preconditioning also affects
+ * which kind of residual is used to measure convergence. With left
+ * preconditioning, this is the preconditioned residual
+ * $r_k=P^{-1}b-P^{-1}Ax_k$ given the approximate solution $x_k$ in
+ * the $k$th iteration, while with right preconditioning, it is the
+ * residual $r_k=b-Ax_k$ of the unpreconditioned system.
  *
  * Optionally, this behavior can be overridden by using the flag
  * AdditionalData::use_default_residual. A true value refers to the
@@ -323,6 +334,47 @@ namespace internal
  * case, impeding the overall performance of the solver.
  *
  *
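+ * As a minimal usage sketch (assuming an already assembled matrix `A`,
+ * vectors `x` and `b`, and a preconditioner object `preconditioner`;
+ * the exact names of the AdditionalData fields may differ between
+ * versions), right preconditioning and the corresponding default
+ * residual can be selected as follows:
+ * @code
+ *   SolverControl control(1000, 1e-12 * b.l2_norm());
+ *
+ *   SolverGMRES<Vector<double>>::AdditionalData data;
+ *   data.max_basis_size        = 50;    // size of the Arnoldi basis
+ *   data.right_preconditioning = true;  // solve A P^{-1} y = b, then x = P^{-1} y
+ *   data.use_default_residual  = true;  // i.e., monitor the residual b - A x_k
+ *
+ *   SolverGMRES<Vector<double>> solver(control, data);
+ *   solver.solve(A, x, b, preconditioner);
+ * @endcode
+ *
+ *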
+ * <h3>Preconditioners need to be linear operators</h3>
+ *
+ * GMRES expects the preconditioner to be a *linear* operator, i.e.,
+ * the operator $P^{-1}$ used as preconditioner needs to satisfy
+ * $P^{-1}(x+y) = P^{-1}x + P^{-1}y$ and
+ * $P^{-1}(\alpha x) = \alpha P^{-1}x$. For many preconditioners, this
+ * is true. For example, if you use Jacobi preconditioning, then
+ * $P^{-1}$ is a diagonal matrix whose diagonal entries equal
+ * $\frac{1}{A_{ii}}$. In this case, the operator $P^{-1}$ is clearly
+ * linear since it is simply the multiplication of a given vector by a
+ * fixed matrix.
+ *
+ * On the other hand, if $P^{-1}$ involves more complicated
+ * operations, it is sometimes *not* linear. The typical case to
+ * illustrate this is where $A$ is a block matrix and $P^{-1}$
+ * involves multiplication with blocks (as done, for example, in
+ * step-20, step-22, and several other tutorial programs) where one
+ * block involves a linear solve. For example, in a Stokes problem,
+ * the preconditioner may involve a linear solve with the upper left
+ * $A_{uu}$ block. If this linear solve is done exactly (e.g., via a
+ * direct solver, or an iterative solver with a very tight tolerance),
+ * then the linear solve corresponds to multiplying by $A^{-1}_{uu}$,
+ * which is a linear operation. On the other hand, if one uses an
+ * iterative solver with a loose tolerance (e.g.,
+ * `1e-3*right_hand_side.l2_norm()`), then many solvers like CG will
+ * find the solution in a Krylov subspace of fairly low dimension;
+ * crucially, this subspace is built iteratively starting with the
+ * initial residual -- in other words, the *subspace depends on the
+ * right hand side*, and consequently the solution returned by such a
+ * solver *is not a linear operation on the given right hand side* of
+ * the linear system being solved.
+ *
+ * In cases such as these, the preconditioner with its inner, inexact
+ * linear solve is not a linear operator. This violates the
+ * assumptions of GMRES and often leads to unnecessarily many GMRES
+ * iterations. The solution is to use the SolverFGMRES class instead:
+ * it does not rely on the assumption that the preconditioner is a
+ * linear operator, but instead explicitly performs the extra work
+ * needed to accommodate a preconditioner whose action may change from
+ * one iteration to the next.
+ *
+ *
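+ * To illustrate (with a purely hypothetical class; the names and the
+ * surrounding setup are placeholders, not part of the library), a
+ * preconditioner whose action involves such an inner, inexact solve
+ * might look like the following; it should then be handed to
+ * SolverFGMRES rather than SolverGMRES:
+ * @code
+ *   class InnerSolvePreconditioner
+ *   {
+ *   public:
+ *     explicit InnerSolvePreconditioner(const SparseMatrix<double> &matrix)
+ *       : A_uu(matrix)
+ *     {}
+ *
+ *     void vmult(Vector<double> &dst, const Vector<double> &src) const
+ *     {
+ *       // The stopping tolerance depends on `src`, and CG builds its
+ *       // Krylov subspace from `src` itself: the map src -> dst is
+ *       // therefore not a linear operation.
+ *       SolverControl            inner_control(100, 1e-3 * src.l2_norm());
+ *       SolverCG<Vector<double>> inner_cg(inner_control);
+ *       dst = 0;
+ *       inner_cg.solve(A_uu, dst, src, PreconditionIdentity());
+ *     }
+ *
+ *   private:
+ *     const SparseMatrix<double> &A_uu;
+ *   };
+ * @endcode
+ *
+ *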
  * <h3>The size of the Arnoldi basis</h3>
  *
  * The maximal basis size is controlled by AdditionalData::max_basis_size. If
@@ -816,10 +868,13 @@ private:
  * preconditioning (flexible GMRES or FGMRES).
  *
  * This flexible version of the GMRES method allows for the use of a
- * different preconditioner in each iteration step. Therefore, it is also
- * more robust with respect to inaccurate evaluation of the preconditioner.
- * An important application is the use of a Krylov space method inside the
- * preconditioner with low solver tolerance.
+ * different preconditioner in each iteration step; in particular,
+ * this also allows for the use of preconditioners that are not linear
+ * operators. Therefore, it is also more robust with respect to
+ * inaccurate evaluation of the preconditioner. An important
+ * application is the use of a Krylov space method inside the
+ * preconditioner with low solver tolerance. See the documentation of
+ * the SolverGMRES class for an elaboration of the issues involved.
  *
  * For more details see @cite Saad1991.
  *
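+ * As a rough sketch of such a use (again with placeholder names for
+ * the matrix, the vectors, and the preconditioner object, for example
+ * the InnerSolvePreconditioner illustration above):
+ * @code
+ *   SolverControl control(1000, 1e-12 * b.l2_norm());
+ *   SolverFGMRES<Vector<double>> solver(control);
+ *
+ *   // A preconditioner that internally runs an iterative solver with a
+ *   // loose tolerance; fine for FGMRES, problematic for plain GMRES:
+ *   InnerSolvePreconditioner preconditioner(A);
+ *
+ *   solver.solve(A, x, b, preconditioner);
+ * @endcode
+ *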