From 5d85de85f54d33179c62d660a44bd0091c62430c Mon Sep 17 00:00:00 2001
From: Timo Heister
Date: Thu, 27 Aug 2020 10:22:16 -0400
Subject: [PATCH] add step-50 DOI

---
 examples/step-50/doc/intro.dox | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/examples/step-50/doc/intro.dox b/examples/step-50/doc/intro.dox
index cd8b05742d..21d34cdb1c 100644
--- a/examples/step-50/doc/intro.dox
+++ b/examples/step-50/doc/intro.dox
@@ -9,6 +9,7 @@ Infrastructure in Geodynamics initiative (CIG), through the NSF under Award
 EAR-0949446 and EAR-1550901 and The University of California -- Davis.
 
+@dealiiTutorialDOI{10.5281/zenodo.4004166,https://zenodo.org/badge/DOI/10.5281/zenodo.4004166.svg}
 
 @note As a prerequisite of this program, you need to have both p4est and
 either the PETSc or Trilinos library installed. The installation of deal.II
 together with these additional
@@ -28,6 +29,11 @@ in step-16 (but for parallel computations) and a matrix-free version
 discussed in step-37. The goal is to find out which approach leads to the best
 solver for large parallel computations.
 
+This tutorial is based on one of the numerical examples in @cite
+clevenger_par_gmg. Please see that publication for a detailed background on
+the multigrid implementation in deal.II. We will summarize some of the results
+in the following text.
+
 Algebraic multigrid methods are obviously the easiest to implement
 with deal.II since classes such as TrilinosWrappers::PreconditionAMG
 and PETScWrappers::PreconditionBoomerAMG are, in essence, black box
-- 
2.39.5
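
The patched text describes TrilinosWrappers::PreconditionAMG and PETScWrappers::PreconditionBoomerAMG as black-box preconditioners. The following is a minimal sketch (not part of the patch or of step-50 itself) of how the Trilinos AMG class is typically used in deal.II, in the style of other parallel tutorials such as step-40; the function name `solve_with_amg` and the specific AdditionalData settings are illustrative assumptions.

```cpp
#include <deal.II/lac/solver_cg.h>
#include <deal.II/lac/solver_control.h>
#include <deal.II/lac/trilinos_precondition.h>
#include <deal.II/lac/trilinos_sparse_matrix.h>
#include <deal.II/lac/trilinos_vector.h>

using namespace dealii;

// Assumes `system_matrix`, `solution`, and `system_rhs` have already been
// set up and assembled elsewhere (e.g. as in step-40 or step-50).
void solve_with_amg(const TrilinosWrappers::SparseMatrix &system_matrix,
                    TrilinosWrappers::MPI::Vector        &solution,
                    const TrilinosWrappers::MPI::Vector  &system_rhs)
{
  // The AMG hierarchy is built entirely from the assembled matrix; the
  // user supplies no mesh or level information, which is what makes the
  // class a "black box" from the application's point of view.
  TrilinosWrappers::PreconditionAMG::AdditionalData amg_data;
  amg_data.elliptic        = true; // illustrative choice for a Laplace-type problem
  amg_data.smoother_sweeps = 2;    // illustrative choice

  TrilinosWrappers::PreconditionAMG preconditioner;
  preconditioner.initialize(system_matrix, amg_data);

  // Use the preconditioner inside a standard CG solve.
  SolverControl solver_control(1000, 1e-12 * system_rhs.l2_norm());
  SolverCG<TrilinosWrappers::MPI::Vector> cg(solver_control);
  cg.solve(system_matrix, solution, system_rhs, preconditioner);
}
```

By contrast, the geometric multigrid variants that the patched paragraph refers to (matrix-based as in step-16, matrix-free as in step-37) require the application to provide level matrices or level operators, which is exactly the trade-off step-50 sets out to measure.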