author     jekbradbury <jekbradbury@gmail.com>      2017-08-14 18:35:40 -0700
committer  Adam Paszke <adam.paszke@gmail.com>      2017-08-15 10:01:36 +0530
commit     5e088da5ba021735ccb8e562ecd2f39fdd1162a7 (patch)
tree       7f97e560877eb4525e0745dddbaf56f8ae78e43e /docs/source
parent     b0da5bf0fbfc43c9e6ce94b38889b7f6fb871425 (diff)
Add DistributedDataParallel to docs
DataParallel was included twice.
Diffstat (limited to 'docs/source')
 docs/source/nn.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/source/nn.rst b/docs/source/nn.rst
index 221e8be7d5..7de95cc95a 100644
--- a/docs/source/nn.rst
+++ b/docs/source/nn.rst
@@ -620,7 +620,7 @@ DataParallel layers (multi-GPU, distributed)
 :hidden:`DistributedDataParallel`
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-.. autoclass:: torch.nn.parallel.DataParallel
+.. autoclass:: torch.nn.parallel.DistributedDataParallel
     :members:
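For context, a rough usage sketch of the class this directive now documents, torch.nn.parallel.DistributedDataParallel. The sketch is not part of the commit; it assumes the script is launched with one process per worker by an external launcher (so MASTER_ADDR, MASTER_PORT, RANK, and WORLD_SIZE are set in the environment) and uses the gloo backend on CPU:

    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.nn.parallel import DistributedDataParallel

    # Join the process group; with the default env:// init method this reads
    # MASTER_ADDR/MASTER_PORT/RANK/WORLD_SIZE from the environment.
    dist.init_process_group(backend="gloo")

    # Wrapping a module in DistributedDataParallel makes backward() all-reduce
    # gradients across the workers, so every replica applies the same update.
    model = nn.Linear(10, 1)
    ddp_model = DistributedDataParallel(model)

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss = ddp_model(torch.randn(20, 10)).sum()
    loss.backward()   # gradient all-reduce happens during backward
    optimizer.step()

    dist.destroy_process_group()

Unlike torch.nn.DataParallel (single process, one thread per GPU), each DistributedDataParallel replica lives in its own process, which is why the two classes get separate sections in nn.rst.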