author     Evan Shelhamer <shelhamer@imaginarynumber.net>  2017-02-27 11:54:37 -0800
committer  GitHub <noreply@github.com>                     2017-02-27 11:54:37 -0800
commit     85ab6100a122042c7dfd4adaf06f4c0b2e71148d (patch)
tree       8058758a9bb9ed908c5c52dd498dadadbdf47964 /docs/tutorial
parent     16467ff149c880f752414ee2c241c01040d1a05f (diff)
fix broken link to hinge loss
Diffstat (limited to 'docs/tutorial')
 docs/tutorial/layers.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/tutorial/layers.md b/docs/tutorial/layers.md
index a903d5ac..2faacc58 100644
--- a/docs/tutorial/layers.md
+++ b/docs/tutorial/layers.md
@@ -128,7 +128,7 @@ Layers:
* [Infogain Loss](layers/infogainloss.html) - a generalization of MultinomialLogisticLossLayer.
* [Softmax with Loss](layers/softmaxwithloss.html) - computes the multinomial logistic loss of the softmax of its inputs. It's conceptually identical to a softmax layer followed by a multinomial logistic loss layer, but provides a more numerically stable gradient.
* [Sum-of-Squares / Euclidean](layers/euclideanloss.html) - computes the sum of squares of differences of its two inputs, $$\frac 1 {2N} \sum_{i=1}^N \| x^1_i - x^2_i \|_2^2$$.
-* [Hinge / Margin](layers/hiddenloss.html) - The hinge loss layer computes a one-vs-all hinge (L1) or squared hinge loss (L2).
+* [Hinge / Margin](layers/hingeloss.html) - The hinge loss layer computes a one-vs-all hinge (L1) or squared hinge loss (L2).
* [Sigmoid Cross-Entropy Loss](layers/sigmoidcrossentropyloss.html) - computes the cross-entropy (logistic) loss, often used for predicting targets interpreted as probabilities.
* [Accuracy / Top-k layer](layers/accuracy.html) - scores the output as an accuracy with respect to target -- it is not actually a loss and has no backward step.
* [Contrastive Loss](layers/contrastiveloss.html)
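
For reference, the hinge loss layer whose link this commit repairs is attached to a network like any other loss layer. Below is a minimal sketch using the pycaffe net-specification interface; the blob names and shapes (`scores`, `label`, batch of 64, 10 classes) are illustrative placeholders, not taken from the tutorial.

```python
import caffe
from caffe import layers as L, params as P

# Minimal sketch of a net that ends in a one-vs-all hinge loss.
# Blob names and shapes are illustrative only.
n = caffe.NetSpec()
n.scores = L.Input(shape=[dict(dim=[64, 10])])  # per-class scores, e.g. from an InnerProduct layer
n.label = L.Input(shape=[dict(dim=[64])])       # integer class labels
# Standard (L1) hinge by default; norm: L2 selects the squared hinge loss.
n.loss = L.HingeLoss(n.scores, n.label,
                     hinge_loss_param=dict(norm=P.HingeLoss.L2))
print(n.to_proto())  # emits the equivalent prototxt definition
```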