author:    albanD <alban@robots.ox.ac.uk>  2018-11-28 15:25:09 -0800
committer: Facebook Github Bot <facebook-github-bot@users.noreply.github.com>  2018-11-28 15:28:17 -0800
commit:    f80d34a1c8e3b413d134c9648090b123f042fd2b (patch)
tree:      6e4ad56cee85f67a9f671f5fcaf7ed931c88b495 /torch/nn
parent:    fb7e40b7eb395c812327018415050bb25b1c5b6b (diff)
Update Tensor doc (#14339)
Summary:
Add info about `.device`, `.is_cuda`, `.requires_grad`, `.is_leaf` and `.grad` to the Tensor doc.
Update the `register_backward_hook` doc with a warning stating that it does not work in all cases.
Add support in the `_add_docstr` function for adding docstrings to attributes.

There is an explicit cast here, and I am not sure how to handle it properly. The `doc` field for `getsetdescr` is documented as a `const char *` (like all the other `doc` fields in descriptor objects) in the CPython online documentation, but in the code it is the only one that is not const. I assumed this is a bug in the code, since it follows neither the documentation nor the convention of the other descriptors, so I cast out the const.

EDIT: the online doc I was looking at is for 3.7, and in that version both the code and the doc are const. For older versions, both are non-const.

Please let me know if this should not be done, and if it should, whether there is a cleaner way to do it!

Pull Request resolved: https://github.com/pytorch/pytorch/pull/14339

Differential Revision: D13243266

Pulled By: ezyang

fbshipit-source-id: 75b7838f7cd6c8dc72b0c61950e7a971baefaeeb
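For reference, a minimal sketch of the attributes now documented on torch.Tensor (the tensor values and shapes here are illustrative, not from the patch):

    import torch

    # A leaf tensor created by the user; requires_grad asks autograd to track it.
    x = torch.ones(2, 2, requires_grad=True)

    print(x.device)         # device(type='cpu') on a CPU build
    print(x.is_cuda)        # False for a CPU tensor
    print(x.requires_grad)  # True
    print(x.is_leaf)        # True: created by the user, not by an operation

    y = (x * 2).sum()
    print(y.is_leaf)        # False: produced by operations on x

    y.backward()
    print(x.grad)           # gradients accumulate on leaves: a 2x2 tensor of 2s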
Diffstat (limited to 'torch/nn')
-rw-r--r--  torch/nn/modules/module.py  | 10 ++++++++++
1 file changed, 10 insertions(+), 0 deletions(-)
diff --git a/torch/nn/modules/module.py b/torch/nn/modules/module.py
index 4f020df7b4..2bd6775e73 100644
--- a/torch/nn/modules/module.py
+++ b/torch/nn/modules/module.py
@@ -398,6 +398,16 @@ class Module(object):
:class:`torch.utils.hooks.RemovableHandle`:
a handle that can be used to remove the added hook by calling
``handle.remove()``
+
+ .. warning::
+
+ The current implementation will not exhibit the described behavior
+ for a complex :class:`Module` that performs many operations.
+ In some failure cases, :attr:`grad_input` and :attr:`grad_output`
+ will only contain the gradients for a subset of the inputs and
+ outputs. For such a :class:`Module`, you should use
+ :func:`torch.Tensor.register_hook` directly on a specific input or
+ output to get the required gradients.
+
"""
handle = hooks.RemovableHandle(self._backward_hooks)
self._backward_hooks[handle.id] = hook
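To make the workaround recommended in the warning concrete, here is a minimal sketch (the model, shapes, and printing hook are illustrative assumptions, not part of this patch): rather than registering a backward hook on the module, attach torch.Tensor.register_hook to the specific tensor whose gradient is needed.

    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 1))
    x = torch.randn(2, 4, requires_grad=True)
    out = model(x)

    # Hook directly on the tensor of interest; it is called with that
    # tensor's gradient during backward. Returning nothing leaves the
    # gradient unchanged.
    handle = out.register_hook(lambda grad: print("grad of out:", grad.shape))

    out.sum().backward()
    handle.remove()  # the returned RemovableHandle detaches the hook

The handle.remove() call here mirrors the RemovableHandle contract described in the docstring above.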