https://gitweb.dealii.org/ - dealii.git/commitdiff
Fix some spelling errors in documentation of AD module 8120/head
author    Jean-Paul Pelteret <jppelteret@gmail.com>
Sun, 12 May 2019 22:22:26 +0000 (00:22 +0200)
committer Jean-Paul Pelteret <jppelteret@gmail.com>
Sun, 12 May 2019 23:10:27 +0000 (01:10 +0200)
doc/doxygen/headers/automatic_and_symbolic_differentiation.h

index 6832ff3e7ceeaeb7ca59752d6f1f372a92270460..ff8f13cec973f82454d2d7db4861c8ea911fb9af 100644 (file)
@@ -65,7 +65,7 @@
  * the Tensor and SymmetricTensor classes should support calculations performed with these specialized
  * numbers.
  * (In theory an entire program could be made differentiable. This could be useful in, for example,
- * the sentitivity analysis of solutions with respect to input parameters. However, to date this has
+ * the sensitivity analysis of solutions with respect to input parameters. However, to date this has
  * not been tested.)
  *
  * Implementations of specialized frameworks based on <em>operator overloading</em> typically fall into
  *    function.
  *
  * Each of these methods, of course, has its advantages and disadvantages, and one may be more appropriate
- * than another for a given problem that is to be solved. As the aforemetioned implementational details
+ * than another for a given problem that is to be solved. As the aforementioned implementational details
  * (and others not discussed) may be hidden from the user, it may still be important to understand the
  * implications, run-time cost,  and potential limitations, of using any one of these "black-box"
  * auto-differentiable numbers.
  * In the most practical sense, any of the above categories exploit the chain-rule to compute the total
  * derivative of a composite function. To perform this action, they typically use one of two mechanisms to
  * compute derivatives, specifically
- * - <em>forward-mode</em> (or <em>forward accumulation</em>) auto-differentation, or
- * - <em>reverse-mode</em> (or <em>reverse accumulation</em>) auto-differentation.
+ * - <em>forward-mode</em> (or <em>forward accumulation</em>) auto-differentiation, or
+ * - <em>reverse-mode</em> (or <em>reverse accumulation</em>) auto-differentiation.
  *
  * As a point of interest, the <em>optimal Jacobian accumulation</em>, which performs a minimal set of
  * computations, lies somewhere between these two limiting cases. Its computation for a general composite
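
The documentation touched by this diff explains how operator-overloading AD frameworks exploit the chain rule, in either forward or reverse mode, to accumulate derivatives of a composite function. The following minimal C++ sketch (editor-added, purely illustrative) shows the forward-mode idea with a hand-rolled dual-number type; it is not deal.II's AD interface, which instead wraps external packages such as ADOL-C and Sacado, and the Dual type and its overloads below are hypothetical.

// Minimal sketch of forward-mode AD via operator overloading.
// NOT deal.II's AD interface; "Dual" is a hypothetical illustration
// of how the chain rule is propagated alongside each value.
#include <cmath>
#include <iostream>

struct Dual
{
  double value;      // f(x)
  double derivative; // df/dx, carried along with the value

  Dual(const double v, const double d = 0.0)
    : value(v), derivative(d)
  {}
};

// Each overloaded operator applies the corresponding differentiation rule.
Dual operator*(const Dual &a, const Dual &b)
{
  return Dual(a.value * b.value,
              a.derivative * b.value + a.value * b.derivative); // product rule
}

Dual operator+(const Dual &a, const Dual &b)
{
  return Dual(a.value + b.value, a.derivative + b.derivative);
}

Dual sin(const Dual &a)
{
  return Dual(std::sin(a.value), std::cos(a.value) * a.derivative); // chain rule
}

int main()
{
  // Seed dx/dx = 1 to differentiate with respect to x.
  const Dual x(2.0, 1.0);

  // f(x) = x*x + sin(x); the derivative 2x + cos(x) accumulates automatically.
  const Dual f = x * x + sin(x);

  std::cout << "f(2)  = " << f.value << '\n'
            << "f'(2) = " << f.derivative << '\n';
}

In reverse mode, by contrast, a framework records ("tapes") the operations during the forward pass and then propagates adjoints backwards through that record, which tends to be more economical when a function has many inputs and few outputs.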
