author     HE, Tao <sighingnow@gmail.com>           2018-01-12 20:44:56 +0800
committer  Soumith Chintala <soumith@gmail.com>     2018-01-12 07:44:56 -0500
commit     b42f1638351259675edc6aa234c1304a98e2c3ce (patch)
tree       39e9755a7c92d361c98cb9cec0f31801810cd22c /docs
parent     7e3da987346edd3b83844b268c29f1f9a87e821d (diff)
[ONNX] Export sum, prod, sqrt; improve log_softmax. (#4579)
* ONNX: export sum, prod, sqrt, improve log_softmax, and fix a typo in the docs.
Signed-off-by: HE, Tao <sighingnow@gmail.com>
* Add newly exported ops to the docs.
Signed-off-by: HE, Tao <sighingnow@gmail.com>
* Double quotes.
Signed-off-by: HE, Tao <sighingnow@gmail.com>
* Update trace log of log_softmax.
Signed-off-by: HE, Tao <sighingnow@gmail.com>
* Improve export when dim is None; axes_i should be a list of ints.
Signed-off-by: HE, Tao <sighingnow@gmail.com>
* Fix prod when no dim is given.
Signed-off-by: HE, Tao <sighingnow@gmail.com>
* Update line ends in test expected file.
Signed-off-by: HE, Tao <sighingnow@gmail.com>
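
For context, the ops this commit makes exportable can be exercised end to end with ``torch.onnx.export``. Below is a minimal sketch; the module and file names are illustrative only, and the Variable wrapper reflects the PyTorch 0.3-era API this commit targets:

    import torch
    from torch.autograd import Variable

    class ReduceModel(torch.nn.Module):
        def forward(self, x):
            # sqrt, sum and prod are among the ops this commit teaches the exporter
            return torch.sqrt(x).sum(dim=1) + x.prod(dim=1)

    model = ReduceModel()
    dummy = Variable(torch.randn(2, 3))
    # verbose=True prints the traced graph along with the emitted ONNX ops
    torch.onnx.export(model, dummy, "reduce.onnx", verbose=True)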
Diffstat (limited to 'docs')
-rw-r--r-- | docs/source/onnx.rst | 5 |
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/docs/source/onnx.rst b/docs/source/onnx.rst
index d1d8c9ffcf..fb8baf4357 100644
--- a/docs/source/onnx.rst
+++ b/docs/source/onnx.rst
@@ -130,9 +130,12 @@ The following operators are supported:
 * mm
 * addmm
 * neg
+* sqrt
 * tanh
 * sigmoid
 * mean
+* sum
+* prod
 * t
 * expand (only when used before a broadcasting ONNX operator; e.g., add)
 * transpose
@@ -189,7 +192,7 @@ for installing PyTorch from source.
 If the wanted operator is standardized in ONNX, it should be easy to add
 support for exporting such operator (adding a symbolic function for the operator).
 To confirm whether the operator is standardized or not, please check the
-`ONNX operator list <http://https://github.com/onnx/onnx/blob/master/docs/Operators.md>`_.
+`ONNX operator list <https://github.com/onnx/onnx/blob/master/docs/Operators.md>`_.
 
 If the operator is an ATen operator, which means you can find the declaration
 of the function in ``torch/csrc/autograd/generated/VariableType.h``
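
The docs text above points at the extension path: a standardized ONNX operator gets a symbolic function in the exporter (``torch/onnx/symbolic.py`` in this era). A minimal sketch of what such symbolic functions might look like, following that file's ``g.op(...)`` convention where an ``_i`` suffix marks an integer or list-of-int attribute such as ``axes_i`` (mentioned in the commit message); the exact signatures here are an assumption, not a copy of the committed code:

    def sqrt(g, self):
        # aten::sqrt maps one-to-one onto the ONNX Sqrt op
        return g.op("Sqrt", self)

    def sum(g, self, dim=None, keepdim=None):
        # With no dim, reduce over all axes; otherwise pass the axes as a
        # list of ints via axes_i, per the commit message.
        if dim is None:
            return g.op("ReduceSum", self, keepdims_i=0)
        return g.op("ReduceSum", self, axes_i=[dim], keepdims_i=1 if keepdim else 0)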