+++ /dev/null
-<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
- "http://www.w3.org/TR/html4/loose.dtd">
-<html>
- <head>
- <title>The deal.II Readme on interfacing to PETSc and Trilinos</title>
- <link href="../screen.css" rel="StyleSheet">
- <meta name="author" content="the deal.II authors <authors @ dealii.org>">
- <meta name="copyright" content="Copyright (C) 2008, 2009, 2010, 2011, 2012 by the deal.II authors">
- <meta name="date" content="$Date$">
- <meta name="svn_id" content="$Id$">
- <meta name="keywords" content="deal.II">
- </head>
-
- <body>
-
- <h1>Interfacing <acronym>deal.II</acronym> to PETSc and Trilinos</h1>
-
- <h3>About PETSc, Trilinos, and the <acronym>deal.II</acronym>
- interfaces</h3>
-
- <p>
- <acronym>deal.II</acronym> can interface to the
- <a href="http://www.mcs.anl.gov/petsc/" target="_top">PETSc</a> and
- <a href="http://trilinos.sandia.gov" target="_top">Trilinos</a> software
- libraries. Both of these libraries provide lots of functions for linear
- algebra, among other things, for example implementations of a variety of
- linear solvers, as well as various different sparse and dense matrix and
- vector formats. Trilinos also has many subpackages that deal with
- problems that go far beyond linear algebra, for example nonlinear
- solvers, automatic differentiation packages, uncertainty propagation
- engines, etc. Of particular interest to deal.II is their ability to
- provide this functionality both on sequential and parallel (using MPI)
- computers. PETSc is written in C; Trilinos is written in C++ and can be
- considered to be a more modern version of PETSc though both packages are
- under continuing development at their respective national laboratories.
- </p>
-
- <p>
- <acronym>deal.II</acronym> has wrapper classes to the linear algebra
- parts of both packages that provide almost the
- same interfaces as the built-in <acronym>deal.II</acronym> linear
- algebra classes. We use these interfaces for parallel computations based
- on MPI since the native deal.II linear algebra classes lack this ability.
- </p>
-
- <h3>Configuring the interfaces to PETSc and Trilinos</h3>
-
- <p>
- The use of PETSc and Trilinos is optional. To use the wrapper classes,
- you first have to install these packages and
- point <acronym>deal.II</acronym>'s <code>./configure</code> to the
- installation directories. This happens in similar ways for the two
- packages:
- </p>
-
- <h4>PETSc</h4>
-
- <p>
- PETSc usually requires you to set the
- environment variables <code>PETSC_DIR</code> and <code>PETSC_ARCH</code>
- to a path to PETSc and denoting the architecture for which PETSc is
- compiled. If these environment variables are set, then
- <acronym>deal.II</acronym> will pick them up during
- configuration, and store them. It will then also recognize that
- PETSc shall be used, and enable the wrapper classes.
- </p>
-
- <p>
- Alternatively, the <code>-DPETSC_DIR=DIR</code> and
- <code>-DPETSC_ARCH=ARCH</code> options for <code>cmake</code>
- can be used to override the values of <code>PETSC_DIR</code>
- and <code>PETSC_ARCH</code> or if these environment
- variables are not set at all. If you do have a PETSc
- installation and have set the <code>PETSC_DIR</code> and
- <code>PETSC_ARCH</code> environment variables but do not wish
- <acronym>deal.II</acronym> to be configured for PETSc use, you
- should specify <code>-DDEAL_II_WITH_PETSC=OFF</code> as a flag
- during configuration.
- </p>
-
- <p>
- There is an additional caveat: PETSc appears not to co-operate
- well when using threads and some programs crash when deal.II is
- compiled in its usual mode supporting multithreading. If you see
- this sort of behavior, we recommend to try disabling
- multithreading upon configuration of <acronym>deal.II</acronym>
- using the <code>-DDEAL_II_WITH_THREADS=OFF</code> switch for
- <code>cmake</code>.
- </p>
-
- <p>
- Installing both PETSc and deal.II together can be a bit of a
- challenge. A good summary of the relevant steps can be found on
- the <a href="http://dealii.sourceforge.net/index.php?title=Deal.II_Questions_and_Answers"
- target="_top">Frequently Asked Questions</a> page.
- </p>
-
- <p><b>Note:</b> <acronym>deal.II</acronym> can be installed with both
- PETSc and Trilinos (see below) and they do not usually get in their
- respective ways. There are, however, occasions where this is not true
- and this fundamentally comes from the fact that both of these packages
- are built from subpackages that are developed by independent
- groups. Unfortunately, some of these sub-packages can be configured to
- be part of both PETSc and Trilinos, and if you try to
- use <acronym>deal.II</acronym> with versions of PETSc and Trilinos
- that <i>both</i> contain a particular sub-package, little good will come
- of it in general. In particular, we have experienced this with the ML
- package that can serve as an algebraic multigrid method to both PETSc
- and Trilinos. If both of these packages are configured to use ML, then
- difficult to understand error messages at compile or link time are
- almost inevitable, and there is little the <acronym>deal.II</acronym>
- build system can do to prevent this. Thus, <i>don't try to do such a
- thing!</i>
- </p>
-
-
- <h4>Trilinos</h4>
-
- <p>
- As above, set the <code>TRILINOS_DIR</code>
- environment variable to the path to an existing Trilinos installation,
- or use the <code>-DTRILINOS_DIR=/path/to/trilinos</code> switch of
- the deal.II <code>cmake</code> script. It should point to the path
- of which the include and lib directories are subdirs.
- </p>
-
- <p>
- <acronym>deal.II</acronym> and its tutorial programs use several of the
- Trilinos sub-packages.
- </p>
-
- <h5>Trilinos starting with version 10.0</h5>
-
- <p style="color: red">
- Note: Trilinos versions 10.6.x, 10.8.0, 10.8.1, 10.10.2, 10.12.1 are not
- compatible with deal.II. They contain subtle bugs related to (parallel)
- matrices and vectors. Versions tested to work are 10.4.2, 10.8.5, and
- 10.12.2. We recommend only using one of the tested versions for the time
- being.
- </p>
-
- <p>
- Trilinos uses <a href="http://cmake.org/">cmake</a> to configure and
- build. The following slightly longish set of commands will set up a
- reasonable configuration:
- <code>
- <pre>
- cd trilinos-10.4.2
- mkdir build
- cd build
-
- cmake -D Trilinos_ENABLE_OPTIONAL_PACKAGES:BOOL=ON \
- -D Trilinos_ENABLE_Sacado:BOOL=ON \
- -D Trilinos_ENABLE_Stratimikos:BOOL=ON \
- -D CMAKE_BUILD_TYPE:STRING=RELEASE \
- -D CMAKE_CXX_FLAGS:STRING="-g -O3" \
- -D CMAKE_C_FLAGS:STRING="-g -O3" \
- -D CMAKE_FORTRAN_FLAGS:STRING="-g -O5" \
- -D Trilinos_EXTRA_LINK_FLAGS:STRING="-lgfortran" \
- -D CMAKE_VERBOSE_MAKEFILE:BOOL=FALSE \
- -D Trilinos_VERBOSE_CONFIGURE:BOOL=FALSE \
- -D TPL_ENABLE_MPI:BOOL=OFF \
- -D CMAKE_INSTALL_PREFIX:PATH=/w/bangerth/share/x86_64/trilinos-10.4.2 \
- -D BUILD_SHARED_LIBS:BOOL=ON ..
-
- make
- make install
- </pre>
- </code>
- Again, the path into which you want to install Trilinos in the second to
- last line of the cmake command needs to be adjusted. Obviously, if you
- want to use Trilinos with MPI on parallel machines, you also need to
- flip the value of the <code>TPL_ENABLE_MPI</code> flag above. Finally,
- if your computer has more than one processor core, you may want to add
- the <code>-jN</code> flag to the calls to <code>make</code>
- where <code>N</code> is the number of compile jobs you want to run in
- parallel.
- </p>
-
- <p>
- Trilinos sometimes searches for other libraries but can't find
- them if they are not in the usual directories or have other
- names. A common example are BLAS or LAPACK. In a case like
- this, you may have to specifically pass the directories and/or
- library names under which they can be found
- to <code>cmake</code>. For example, this may mean to add the
- following flags to the call above:
- <code>
- <pre>
-
- -D BLAS_LIBRARY_NAMES:STRING=goto \
- -D BLAS_LIBRARY_DIRS:STRING=/apps/GotoBLAS/lib64 \
- -D LAPACK_LIBRARY_NAMES:STRING=lapack \
- -D LAPACK_LIBRARY_DIRS:STRING=/apps/lapack-3.2.1/lib64
- </pre>
- </code>
- </p>
-
- <h3>Configuring for installed Trilinos packages</h3>
-
- <p>
- A system wide installation Support for Trilinos will be enabled
- automatically if a system wide installation of Trilinos can
- be found. If the Trilinos installation does not reside under a
- default system location provide TRILINOS_DIR in the same fashion as
- for the self compiled case:
- <pre>
-
- -DTRILINOS_DIR=/path/to/trilinos
- </pre>
- </p>
-
- <hr>
-
- <address>
- <a href="mail.html" target="body">The deal.II Group</a>
- $Date$
- </address>
- </body>
-</html>
--- /dev/null
+<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
+ "http://www.w3.org/TR/html4/loose.dtd">
+<html>
+ <head>
+ <title>The deal.II Readme on interfacing to PETSc</title>
+ <link href="../screen.css" rel="StyleSheet">
+ <meta name="author" content="the deal.II authors <authors @ dealii.org>">
+ <meta name="copyright" content="Copyright (C) 2008, 2009, 2010, 2011, 2012 by the deal.II authors">
+ <meta name="date" content="$Date$">
+ <meta name="svn_id" content="$Id$">
+ <meta name="keywords" content="deal.II">
+ </head>
+
+ <body>
+
+ <h1>Interfacing <acronym>deal.II</acronym> to PETSc</h1>
+
+ <p>
+ <a href="http://www.mcs.anl.gov/petsc/"
+ target="_top">PETSc</a> is a
+ software package that provides lots of functionality for linear
+ algebra, among other things. For example, it includes implementations of a variety of
+    linear solvers, as well as various sparse and dense matrix and
+    vector formats. Of particular interest to deal.II is its ability to
+ provide this functionality both on sequential and parallel (using MPI)
+ computers.
+ </p>
+
+ <p>
+ <acronym>deal.II</acronym> has wrapper classes to the linear algebra
+ parts of PETSc that provide almost the
+ same interfaces as the built-in <acronym>deal.II</acronym> linear
+ algebra classes. We use these interfaces for parallel computations based
+ on MPI since the native deal.II linear algebra classes lack this
+ ability. They are used, among other programs, in step-17, step-18 and
+ step-40.
+ </p>
+
+ <h4>Installing <acronym>deal.II</acronym> with PETSc</h4>
+
+ <p>
+ For a general overview of building <acronym>deal.II</acronym> with
+ PETSc, first see the <a href="../readme.html">ReadMe file</a>.
+ </p>
+
+ <p>
+    PETSc usually requires you to set the
+    environment variables <code>PETSC_DIR</code> and <code>PETSC_ARCH</code>
+    to the path of your PETSc installation and to the architecture for
+    which PETSc is compiled, respectively. If these environment variables are set, then
+ <acronym>deal.II</acronym> will pick them up during
+ configuration, and store them. It will then also recognize that
+ PETSc shall be used, and enable the wrapper classes.
+ </p>
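+
+  <p>
+    For example, if PETSc lives
+    in <code>/path/to/petsc-x-y-z</code> (a placeholder path used here only
+    for illustration), the two variables could be set in a bash shell
+    before configuring <acronym>deal.II</acronym> like this:
+    <pre>
+
+	export PETSC_DIR=/path/to/petsc-x-y-z
+	export PETSC_ARCH=x86_64
+    </pre>
+  </p>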
+
+ <p>
+    Alternatively, the <code>-DPETSC_DIR=DIR</code> and
+    <code>-DPETSC_ARCH=ARCH</code> options for <code>cmake</code>
+    can be used to override the values of the <code>PETSC_DIR</code>
+    and <code>PETSC_ARCH</code> environment variables, or to provide
+    these values if the variables are not set at all. If you do have a PETSc
+ installation and have set the <code>PETSC_DIR</code> and
+ <code>PETSC_ARCH</code> environment variables but do not wish
+ <acronym>deal.II</acronym> to be configured for PETSc use, you
+ should specify <code>-DDEAL_II_WITH_PETSC=OFF</code> as a flag
+ during configuration.
+ </p>
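+
+  <p>
+    As a sketch, such a configuration call could look like the following,
+    where both paths are placeholders for your own PETSc installation and
+    <acronym>deal.II</acronym> source directories:
+    <pre>
+
+	cmake -DPETSC_DIR=/path/to/petsc-x-y-z -DPETSC_ARCH=x86_64 /path/to/dealii/sources
+    </pre>
+  </p>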
+
+ <p>
+    <b>Note:</b> PETSc appears not to co-operate
+    well with threads, and some programs crash when deal.II is
+    compiled in its usual mode that supports multithreading. If you see
+ this sort of behavior, disable
+ multithreading upon configuration of <acronym>deal.II</acronym>
+ using the <code>-DDEAL_II_WITH_THREADS=OFF</code> switch for
+ <code>cmake</code>.
+ </p>
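+
+  <p>
+    In other words, a configuration call for this case might look like the
+    following (the source path again being only a placeholder):
+    <pre>
+
+	cmake -DDEAL_II_WITH_THREADS=OFF /path/to/dealii/sources
+    </pre>
+  </p>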
+
+ <p><b>Note:</b> <acronym>deal.II</acronym> can be installed with both
+    PETSc and Trilinos, and the two do not usually get in each other's
+    way. There are, however, occasions where this is not true,
+ and this fundamentally comes from the fact that both of these packages
+ are built from subpackages that are developed by independent
+ groups. Unfortunately, some of these sub-packages can be configured to
+ be part of both PETSc and Trilinos, and if you try to
+ use <acronym>deal.II</acronym> with versions of PETSc and Trilinos
+ that <i>both</i> contain a particular sub-package, little good will come
+ of it in general. In particular, we have experienced this with the ML
+    package, which provides an algebraic multigrid preconditioner to both PETSc
+ and Trilinos. If both of these packages are configured to use ML, then
+    difficult-to-understand error messages at compile or link time are
+ almost inevitable, and there is little the <acronym>deal.II</acronym>
+ build system can do to prevent this. Thus, <i>don't try to do that!</i>
+ </p>
+
+
+ <h4>Installing PETSc</h4>
+
+
+ <p>
+ Installing PETSc correctly can be a bit of a
+ challenge. To start, take a look at
+ the <a href="http://www.mcs.anl.gov/petsc/documentation/installation.html"
+ target="_top">PETSc installation instructions</a>. We have found that
+    the following steps generally appear to work; here, we simply unpack and
+    build PETSc in its final location (i.e., we do not first build and then
+ install it into a separate directory):
+ <pre>
+
+ tar xvzf petsc-x-y-z.tar.gz
+ cd petsc-x-y-z
+ export PETSC_DIR=`pwd`
+ export PETSC_ARCH=x86_64 # or any other identifying text for your machine
+ export LD_LIBRARY_PATH=$PETSC_DIR/$PETSC_ARCH/lib:$LD_LIBRARY_PATH
+	# build shared libraries with MPI support, download and build Hypre, disable X11:
+	./config/configure.py --with-shared=1 --with-x=0 --with-mpi=1 --download-hypre=1
+ make
+ </pre>
+ </p>
+
+ <p>
+ This automatically builds PETSc with both MPI and the algebraic
+ multigrid preconditioner package Hypre (which we use in step-40). You
+ may wish to put the <code>export</code> commands into
+ your <code>~/.bashrc</code> or <code>~/.cshrc</code> files, with the
+ first one replaced by something of the kind
+ <pre>
+
+ export PETSC_DIR=/path/to/petsc-x-y-z
+ </pre>
+ </p>
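+
+  <p>
+    Note that the <code>export</code> syntax above is for bash-like shells;
+    in a C shell configuration file such as <code>~/.cshrc</code>, the
+    equivalent lines would use <code>setenv</code>, for example:
+    <pre>
+
+	setenv PETSC_DIR /path/to/petsc-x-y-z
+	setenv PETSC_ARCH x86_64
+    </pre>
+  </p>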
+
+ <hr>
+
+ <address>
+ <a href="../mail.html" target="body">The deal.II Group</a>
+ $Date$
+ </address>
+ </body>
+</html>
--- /dev/null
+<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
+ "http://www.w3.org/TR/html4/loose.dtd">
+<html>
+ <head>
+ <title>The deal.II Readme on interfacing to Trilinos</title>
+ <link href="../screen.css" rel="StyleSheet">
+ <meta name="author" content="the deal.II authors <authors @ dealii.org>">
+ <meta name="copyright" content="Copyright (C) 2008, 2009, 2010, 2011, 2012 by the deal.II authors">
+ <meta name="date" content="$Date$">
+ <meta name="svn_id" content="$Id$">
+ <meta name="keywords" content="deal.II">
+ </head>
+
+ <body>
+
+ <h1>Interfacing <acronym>deal.II</acronym> to Trilinos</h1>
+
+ <p>
+ <a href="http://trilinos.sandia.gov" target="_top">Trilinos</a> is a
+ software package that provides lots of functionality for linear
+ algebra, among other things. For example, it includes implementations of a variety of
+    linear solvers, as well as various sparse and dense matrix and
+ vector formats. Trilinos also has many subpackages that deal with
+ problems that go far beyond linear algebra, for example nonlinear
+ solvers, automatic differentiation packages, uncertainty propagation
+    engines, etc. Of particular interest to deal.II is its ability to
+ provide this functionality both on sequential and parallel (using MPI)
+ computers. Compared to <a href="http://www.mcs.anl.gov/petsc/"
+ target="_top">PETSc</a>, which is written in C, Trilinos is written in
+    C++ and can be
+    considered a more modern version of PETSc, though both packages are
+ under continuing development at their respective national laboratories.
+ </p>
+
+ <p>
+ <acronym>deal.II</acronym> has wrapper classes to the linear algebra
+ parts of Trilinos that provide almost the
+ same interfaces as the built-in <acronym>deal.II</acronym> linear
+ algebra classes. We use these interfaces for parallel computations based
+ on MPI since the native deal.II linear algebra classes lack this
+ ability. They are used, among other programs, in step-31 and step-32.
+ </p>
+
+ <p>
+    While building deal.II with Trilinos is covered in
+    the <a href="../readme.html">ReadMe file</a>, here we describe how to
+    build Trilinos in such a way that it contains
+    everything that <acronym>deal.II</acronym> needs.
+ </p>
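+
+  <p>
+    Once Trilinos is installed, pointing the <acronym>deal.II</acronym>
+    configuration at it typically amounts to something like the following,
+    where both paths are placeholders for your own installation and source
+    directories:
+    <pre>
+
+	cmake -DTRILINOS_DIR=/path/to/trilinos /path/to/dealii/sources
+    </pre>
+  </p>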
+
+
+  <h4>Installing Trilinos</h4>
+
+ <p style="color: red">
+ Note: Trilinos versions 10.6.x, 10.8.0, 10.8.1, 10.10.2, 10.12.1 are not
+ compatible with deal.II. They contain subtle bugs related to (parallel)
+ matrices and vectors. Versions tested to work are 10.4.2, 10.8.5, and
+ 10.12.2. We recommend only using one of the tested versions for the time
+ being.
+ </p>
+
+ <p>
+ Trilinos uses <a href="http://cmake.org/">cmake</a> to configure and
+ build. The following slightly longish set of commands will set up a
+ reasonable configuration:
+ <pre>
+
+ cd trilinos-10.12.2
+ mkdir build
+ cd build
+
+ cmake -D Trilinos_ENABLE_OPTIONAL_PACKAGES:BOOL=ON \
+ -D Trilinos_ENABLE_Sacado:BOOL=ON \
+ -D Trilinos_ENABLE_Stratimikos:BOOL=ON \
+ -D CMAKE_BUILD_TYPE:STRING=RELEASE \
+ -D CMAKE_CXX_FLAGS:STRING="-g -O3" \
+ -D CMAKE_C_FLAGS:STRING="-g -O3" \
+ -D CMAKE_FORTRAN_FLAGS:STRING="-g -O5" \
+ -D Trilinos_EXTRA_LINK_FLAGS:STRING="-lgfortran" \
+ -D CMAKE_VERBOSE_MAKEFILE:BOOL=FALSE \
+ -D Trilinos_VERBOSE_CONFIGURE:BOOL=FALSE \
+ -D TPL_ENABLE_MPI:BOOL=OFF \
+ -D CMAKE_INSTALL_PREFIX:PATH=/w/bangerth/share/x86_64/trilinos-10.12.2 \
+ -D BUILD_SHARED_LIBS:BOOL=ON ..
+
+ make
+ make install
+ </pre>
+ You will need to adjust the path into which you want to install Trilinos
+ in the second to last line of the cmake command.
+ </p>
+
+ <p>
+ <b>Using MPI:</b> If you
+ want to use Trilinos with MPI on parallel machines,
+ use <code>TPL_ENABLE_MPI=ON</code> instead.
+ </p>
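+
+  <p>
+    In other words, in the <code>cmake</code> call shown above, the line
+    <pre>
+
+	-D TPL_ENABLE_MPI:BOOL=OFF \
+    </pre>
+    would simply be replaced by
+    <pre>
+
+	-D TPL_ENABLE_MPI:BOOL=ON \
+    </pre>
+  </p>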
+
+ <p>
+ <b>Parallel builds:</b>
+ If your computer has more than one processor core, use
+ <code>make -jN</code> instead of <code>make</code> in the last two lines
+ above, where <code>N</code> is the number of processors you have.
+ </p>
+
+ <p>
+    Trilinos sometimes searches for other libraries but can't find
+    them if they are not in the usual directories or have other
+    names. Common examples are BLAS and LAPACK. In a case like
+    this, you may have to specifically pass the directories and/or
+    library names under which they can be found
+    to <code>cmake</code>. For example, this may mean adding the
+    following flags to the call above:
+ <pre>
+
+ -D BLAS_LIBRARY_NAMES:STRING=goto \
+ -D BLAS_LIBRARY_DIRS:STRING=/apps/GotoBLAS/lib64 \
+ -D LAPACK_LIBRARY_NAMES:STRING=lapack \
+ -D LAPACK_LIBRARY_DIRS:STRING=/apps/lapack-3.2.1/lib64
+ </pre>
+ </p>
+
+ <hr>
+
+ <address>
+ <a href="../mail.html" target="body">The deal.II Group</a>
+ $Date$
+ </address>
+ </body>
+</html>
To disable the PETSc interfaces in cases where <code>cmake</code>
automatically finds it, use <code>-DDEAL_II_WITH_PETSC=OFF</code>.
More information on configuring and building PETSc can be
- found <a href="external-libs/petsc-trillinos.html" target="body">here</a>.
+ found <a href="external-libs/petsc.html" target="body">here</a>.
</p>
</dd>
<code>cmake</code> automatically finds it, use
<code>-DDEAL_II_WITH_TRILINOS=OFF</code>. More details about
compatibility and configuration can be found
- <a href="external-libs/petsc-trillinos.html" target="body">here</a>.
+ <a href="external-libs/trilinos.html" target="body">here</a>.
</p>
</dd>