author     Tongzhou Wang <SsnL@users.noreply.github.com>    2018-04-30 21:21:56 +0800
committer  GitHub <noreply@github.com>                      2018-04-30 21:21:56 +0800
commit     d9aeb7e71ba3a17c14d437cdd2b8e1e3ca2f09a1 (patch)
tree       3e888e4a8c6f0b432eb75c9f14d0915015dcbb99 /tools
parent     8fbab83c2a036b3ba00944d6c95504b430734db3 (diff)
clamp now has subgradient 1 at min and max (#7049)
* use subgradient 1 at min and max for clamp
* do the same for clamp_max and clamp_min
* add an explanatory comment in derivatives.yaml
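
For reference, a sketch of the math behind the change (notation mine, not from the patch): clamp(x; m, M) = max(m, min(x, M)) is piecewise linear, so its derivative is undefined exactly at x = m and x = M, where the subdifferential is the interval [0, 1]. This patch picks the subgradient 1 at those points:

\frac{d}{dx}\,\operatorname{clamp}(x;\, m,\, M) =
\begin{cases}
0, & x < m \ \text{or}\ x > M,\\
1, & m \le x \le M \quad (\text{subgradient } 1 \text{ chosen at } x = m \text{ and } x = M).
\end{cases}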
Diffstat (limited to 'tools')
-rw-r--r--    tools/autograd/derivatives.yaml    9
1 file changed, 6 insertions(+), 3 deletions(-)
diff --git a/tools/autograd/derivatives.yaml b/tools/autograd/derivatives.yaml
index ff24c9d08e..a711b040ea 100644
--- a/tools/autograd/derivatives.yaml
+++ b/tools/autograd/derivatives.yaml
@@ -163,14 +163,17 @@
 - name: ceil(Tensor self)
   self: zeros_like(grad)
 
+# For clamp, clamp_min, and clamp_max, gradient is not defined at the
+# boundaries. But empirically it's helpful to be able to get gradient on min and
+# max, so we return the subgradient 1 for these cases.
 - name: clamp(Tensor self, Scalar min, Scalar max)
-  self: grad * (self > min).type_as(grad) * (self < max).type_as(grad)
+  self: grad * ((self >= min) * (self <= max)).type_as(grad)
 
 - name: clamp_min(Tensor self, Scalar min)
-  self: grad * (self > min).type_as(grad)
+  self: grad * (self >= min).type_as(grad)
 
 - name: clamp_max(Tensor self, Scalar max)
-  self: grad * (self < max).type_as(grad)
+  self: grad * (self <= max).type_as(grad)
 
 - name: clone(Tensor self)
   self: grad
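
To illustrate the effect, a minimal, hypothetical repro sketch (not part of the patch) using PyTorch's public clamp API; the before/after gradients in the comments follow directly from the inequalities changed above:

    import torch

    # One element exactly at each clamp boundary, one strictly inside.
    x = torch.tensor([0.0, 0.5, 1.0], requires_grad=True)
    y = x.clamp(min=0.0, max=1.0)
    y.sum().backward()

    # Before this patch (strict > and <), boundary elements get gradient 0:
    #   x.grad == tensor([0., 1., 0.])
    # After this patch (>= and <=), boundary elements get subgradient 1:
    #   x.grad == tensor([1., 1., 1.])
    print(x.grad)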