path: root/docs/source/optim.rst
author: Kai Arulkumaran <Kaixhin@users.noreply.github.com> 2017-12-18 07:43:08 +0000
committer: Soumith Chintala <soumith@gmail.com> 2017-12-18 02:43:08 -0500
commit: e9ef20eab5e5cf361bdc7a425c7f8b873baad9d3 (patch)
tree: 8e4564b78f2e5880d40d21db453b59415808233f /docs/source/optim.rst
parent: 847c56aeb5857fc4d3f5df88b9e8f937939bb8cc (diff)
Add Cosine Annealing LR Scheduler (#3311)
* Add Cosine Annealing LR Scheduler
* Update eta_min in tests to prevent numerical mistakes
* Use non-zero min_eta in test_cos_anneal_lr
Diffstat (limited to 'docs/source/optim.rst')
-rw-r--r-- docs/source/optim.rst | 4
1 file changed, 3 insertions(+), 1 deletion(-)
diff --git a/docs/source/optim.rst b/docs/source/optim.rst
index 2125d043d1..f44f51a8b8 100644
--- a/docs/source/optim.rst
+++ b/docs/source/optim.rst
@@ -130,7 +130,7 @@ How to adjust Learning Rate
---------------------------
:mod:`torch.optim.lr_scheduler` provides several methods to adjust the learning
-rate based on the number of epoches. :class:`torch.optim.lr_scheduler.ReduceLROnPlateau`
+rate based on the number of epochs. :class:`torch.optim.lr_scheduler.ReduceLROnPlateau`
allows dynamic learning rate reducing based on some validation measurements.
.. autoclass:: torch.optim.lr_scheduler.LambdaLR
@@ -141,5 +141,7 @@ allows dynamic learning rate reducing based on some validation measurements.
:members:
.. autoclass:: torch.optim.lr_scheduler.ExponentialLR
:members:
+.. autoclass:: torch.optim.lr_scheduler.CosineAnnealingLR
+ :members:
.. autoclass:: torch.optim.lr_scheduler.ReduceLROnPlateau
:members:
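The schedule added by this commit follows a half cosine curve from the base learning rate down to ``eta_min`` over ``T_max`` epochs. A minimal stdlib-only sketch of that formula (the function name ``cosine_annealing_lr`` is illustrative, not part of the PyTorch API):

```python
import math

def cosine_annealing_lr(base_lr, eta_min, epoch, t_max):
    """Cosine annealing: decay base_lr toward eta_min over t_max
    epochs along a half cosine curve, as in CosineAnnealingLR."""
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * epoch / t_max)) / 2

# The rate starts at base_lr, passes through the midpoint halfway,
# and reaches eta_min at epoch t_max.
for epoch in (0, 50, 100):
    print(cosine_annealing_lr(0.1, 0.0, epoch, 100))
```

In PyTorch itself the scheduler wraps an optimizer (``CosineAnnealingLR(optimizer, T_max, eta_min)``) and is stepped once per epoch; using a non-zero ``eta_min``, as the commit message notes for the tests, avoids numerically delicate comparisons against zero.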