| author | Tongzhou Wang <SsnL@users.noreply.github.com> | 2018-05-23 11:03:12 -0400 |
|---|---|---|
| committer | GitHub <noreply@github.com> | 2018-05-23 11:03:12 -0400 |
| commit | e3e15b5d9534f1c10170e169f1423e9298648e86 (patch) | |
| tree | d74cb50a982ca753d74e539a27d66731189130e2 /torch/optim | |
| parent | 6a604f16cc452424c58ce456da7f46e1619b9e18 (diff) | |
[PyTorch] [gradcheck] change backward() to grad() (#7710)
* Change backward calls to grad to avoid memory leak from #7343; replace unnecessary create_graph=True with retain_graph=True (see the sketch after this list)
* fix gradgradcheck use of make_non_contiguous
* allow non-contiguous target
* remove unnecessary .grad.zero_()
* remove contiguous_detach
* fix PReLU double backward always returning ggW as a scalar
* let non-contiguous gO require grad
* move requires_grad to return
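As a rough illustration (not the actual gradcheck code), the sketch below contrasts `backward()`, which accumulates into `.grad` and has to be zeroed between evaluations, with `torch.autograd.grad()`, which returns the gradients directly; it also shows `retain_graph=True` (keep the graph alive for repeated calls) versus `create_graph=True` (only needed when the gradient itself must be differentiated again, as in gradgradcheck). The tensor names are illustrative only.

```python
import torch

x = torch.randn(5, requires_grad=True)
y = (x ** 2).sum()

# backward() accumulates into x.grad, which then has to be zeroed by hand
# between evaluations; retain_graph=True keeps the graph around for reuse.
y.backward(retain_graph=True)
g_from_backward = x.grad.clone()
x.grad.zero_()

# torch.autograd.grad() returns the gradient directly and leaves .grad alone,
# so nothing accumulates across repeated calls.  create_graph=True would also
# build a graph of this gradient, which is only needed for double backward
# (e.g. gradgradcheck), so plain retain_graph=True suffices here.
(g_from_grad,) = torch.autograd.grad(y, x, retain_graph=True)

assert torch.allclose(g_from_backward, g_from_grad)
```

In a numerical check that queries autograd many times, returning gradients this way avoids the accumulation and leftover state behind the memory leak reported in #7343.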
Diffstat (limited to 'torch/optim')
0 files changed, 0 insertions, 0 deletions