From: Wolfgang Bangerth
Date: Thu, 1 Apr 2004 16:26:47 +0000 (+0000)
Subject: Add pieces of true experience to the documentation.
X-Git-Tag: v8.0.0~15403
X-Git-Url: https://gitweb.dealii.org/cgi-bin/gitweb.cgi?a=commitdiff_plain;h=b97fd23a21fa38102adc87fcb990878ef51dd677;p=dealii.git

Add pieces of true experience to the documentation.

git-svn-id: https://svn.dealii.org/trunk@8945 0785d39b-7218-0410-832d-ea1e28bc413d
---

diff --git a/deal.II/lac/include/lac/petsc_solver.h b/deal.II/lac/include/lac/petsc_solver.h
index f0ebc9999d..ad0a56eed3 100644
--- a/deal.II/lac/include/lac/petsc_solver.h
+++ b/deal.II/lac/include/lac/petsc_solver.h
@@ -47,6 +47,14 @@ namespace PETScWrappers
        * control object and the MPI
        * communicator over which parallel
        * computations are to happen.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverBase (SolverControl &cn,
                   MPI_Comm &mpi_communicator);
@@ -185,6 +193,14 @@ namespace PETScWrappers
        * call in this default argument
        * because otherwise gcc 2.95 generates
        * a compiler fault.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverRichardson (SolverControl &cn,
                         MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -242,6 +258,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverChebychev (SolverControl &cn,
                        MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -299,6 +323,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverCG (SolverControl &cn,
                 MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -356,6 +388,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverBiCG (SolverControl &cn,
                   MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -431,6 +471,14 @@ namespace PETScWrappers
        * call in this default argument
        * because otherwise gcc 2.95 generates
        * a compiler fault.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverGMRES (SolverControl &cn,
                    MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -488,6 +536,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverBicgstab (SolverControl &cn,
                       MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -545,6 +601,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverCGS (SolverControl &cn,
                  MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -602,6 +666,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverTFQMR (SolverControl &cn,
                    MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -664,6 +736,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverTCQMR (SolverControl &cn,
                    MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -721,6 +801,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverCR (SolverControl &cn,
                 MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
@@ -778,6 +866,14 @@ namespace PETScWrappers
        * The last argument takes a structure
        * with additional, solver dependent
        * flags for tuning.
+       *
+       * Note that the communicator used here
+       * must match the communicator used in
+       * the system matrix, solution, and
+       * right hand side object of the solve
+       * to be done with this
+       * solver. Otherwise, PETSc will
+       * generate hard to track down errors.
        */
       SolverLSQR (SolverControl &cn,
                   MPI_Comm &mpi_communicator = PETSC_COMM_SELF,
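
The note added to each constructor above boils down to one usage rule: create the solver on the same MPI communicator on which the matrix and the vectors of the linear system live. The following is a minimal sketch of what that looks like in user code; it is not part of this patch, the include paths and the use of PETScWrappers::PreconditionJacobi are assumed to follow the PETSc wrapper API of roughly this era (compare the step-17 example program), and the solve_system() function and its arguments are made up purely for illustration.

    #include <lac/petsc_parallel_sparse_matrix.h>
    #include <lac/petsc_parallel_vector.h>
    #include <lac/petsc_precondition.h>
    #include <lac/petsc_solver.h>
    #include <lac/solver_control.h>

    // Hypothetical helper: system_matrix, solution and system_rhs are
    // assumed to have been created and assembled on 'mpi_communicator'
    // elsewhere in the program.
    void solve_system (MPI_Comm                                &mpi_communicator,
                       const PETScWrappers::MPI::SparseMatrix &system_matrix,
                       PETScWrappers::MPI::Vector              &solution,
                       const PETScWrappers::MPI::Vector        &system_rhs)
    {
      SolverControl solver_control (1000, 1e-12);

      // Correct: the solver is constructed on the *same* communicator as
      // the matrix and vectors it is later handed in the solve() call.
      PETScWrappers::SolverCG solver (solver_control, mpi_communicator);

      PETScWrappers::PreconditionJacobi preconditioner (system_matrix);
      solver.solve (system_matrix, solution, system_rhs, preconditioner);

      // In contrast, relying on the PETSC_COMM_SELF default of the solver
      // constructors while matrix and vectors are distributed over
      // MPI_COMM_WORLD is exactly the mismatch the documentation above
      // warns about, and produces hard to track down PETSc errors.
    }

A simple way to make such a mismatch impossible is to store the communicator once in the user program (for example MPI_COMM_WORLD) and pass that single object to matrices, vectors, and solvers alike.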