path: root/tools/autograd/derivatives.yaml
author    Elias Ellison <eellison@fb.com>    2018-11-28 19:14:16 -0800
committer Facebook Github Bot <facebook-github-bot@users.noreply.github.com>    2018-11-28 19:16:38 -0800
commit 6d63e9dbfffba9f925ac3af5232390a76aa54dce (patch)
tree   7924d57fc8f42429c12a911e3234c7ce49cbd991 /tools/autograd/derivatives.yaml
parent 105fa58748076a19682a4c0c9dee878a9575d7ed (diff)
Support Embedding + EmbeddingBag in Script + (Ignore flaky test) (#14509)
Summary: Resubmission of PR #14415. The tests added for Embedding + EmbeddingBag used random numbers as input, which perturbed the global random number generator state and caused the flaky test to break. Everything but the last two commits has already been accepted.
Pull Request resolved: https://github.com/pytorch/pytorch/pull/14509
Differential Revision: D13247917
Pulled By: eellison
fbshipit-source-id: ea6963c47f666c07687787e2fa82020cddc6aa15
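The fix keeps the new tests deterministic by not drawing random inputs. A minimal sketch of the idea, assuming current torch.jit and torch.nn.functional APIs (an illustration, not the PR's actual test code):

    import torch
    import torch.nn.functional as F

    # Scripting F.embedding is part of what this PR enables; fixed inputs
    # mean the test consumes nothing from the global RNG, so later tests
    # that do depend on RNG state are unaffected.
    @torch.jit.script
    def scripted_embedding(indices: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
        return F.embedding(indices, weight, max_norm=1.0)

    indices = torch.tensor([0, 2, 1])
    weight = torch.arange(12.0).reshape(4, 3)  # deterministic, non-random weights
    print(scripted_embedding(indices, weight))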
Diffstat (limited to 'tools/autograd/derivatives.yaml')
-rw-r--r--  tools/autograd/derivatives.yaml  3
1 file changed, 3 insertions(+), 0 deletions(-)
diff --git a/tools/autograd/derivatives.yaml b/tools/autograd/derivatives.yaml
index ad83012a3b..1ea59ec93d 100644
--- a/tools/autograd/derivatives.yaml
+++ b/tools/autograd/derivatives.yaml
@@ -898,6 +898,9 @@
 - name: embedding_renorm_(Tensor self, Tensor indices, double max_norm, double norm_type)
   self: not_implemented("embedding_renorm")

+- name: no_grad_embedding_renorm_(Tensor self, Tensor indices, double max_norm, double norm_type)
+  output_differentiability: [False, False, False, False]
+
 - name: kl_div(Tensor self, Tensor target, int64_t reduction)
   self: kl_div_backward(grad, self, target, reduction)
   target: kl_div_target_backward(grad, self, target, reduction)
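The added entry marks every output of no_grad_embedding_renorm_ as non-differentiable, so the autograd code generator emits no backward function for it: the renorm runs outside of autograd, and gradients flow only through the subsequent lookup. A minimal sketch of the user-visible path that reaches this op, assuming current torch.nn.functional APIs (a hypothetical example, not code from this PR):

    import torch
    import torch.nn.functional as F

    weight = torch.randn(10, 3, requires_grad=True)
    indices = torch.tensor([1, 2, 4, 5])

    # max_norm renormalizes the looked-up rows of `weight` in place,
    # without recording the renorm in the autograd graph.
    out = F.embedding(indices, weight, max_norm=1.0)
    out.sum().backward()  # grad flows through the lookup, not the renorm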