author	Adam Siembida <w1res@users.noreply.github.com>	2015-11-12 16:03:41 -0500
committer	Adam Siembida <w1res@users.noreply.github.com>	2015-11-12 16:03:41 -0500
commit	d81ffbff8cda56de1fe6c41b7156d781f775c7b3 (patch)
tree	980331b90a085148c13fb56750a42bd1e821c4b8 /docs/tutorial
parent	19028e7975fa464ce6835d513a25111e6aedda47 (diff)
Add parentheses to backward_{cpu,gpu} method.
Diffstat (limited to 'docs/tutorial')
-rw-r--r--	docs/tutorial/forward_backward.md	2
1 file changed, 1 insertion, 1 deletion
diff --git a/docs/tutorial/forward_backward.md b/docs/tutorial/forward_backward.md
index a645f002..528b993b 100644
--- a/docs/tutorial/forward_backward.md
+++ b/docs/tutorial/forward_backward.md
@@ -29,7 +29,7 @@ The backward pass begins with the loss and computes the gradient with respect to
These computations follow immediately from defining the model: Caffe plans and carries out the forward and backward passes for you.
- The `Net::Forward()` and `Net::Backward()` methods carry out the respective passes while `Layer::Forward()` and `Layer::Backward()` compute each step.
-- Every layer type has `forward_{cpu,gpu}()` and `backward_{cpu,gpu}` methods to compute its steps according to the mode of computation. A layer may only implement CPU or GPU mode due to constraints or convenience.
+- Every layer type has `forward_{cpu,gpu}()` and `backward_{cpu,gpu}()` methods to compute its steps according to the mode of computation. A layer may only implement CPU or GPU mode due to constraints or convenience.
The [Solver](solver.html) optimizes a model by first calling forward to yield the output and loss, then calling backward to generate the gradient of the model, and then incorporating the gradient into a weight update that attempts to minimize the loss. Division of labor between the Solver, Net, and Layer keep Caffe modular and open to development.