author | Sebastian Raschka <mail@sebastianraschka.com> | 2017-10-17 04:16:06 -0400
---|---|---
committer | Adam Paszke <adam.paszke@gmail.com> | 2017-10-17 10:16:06 +0200
commit | f176c864f0d04a8b5d24a1774b3b766cb89b8a29 (patch) |
tree | f91a72e2bfb0c86ed053e01750cf59c82c35f2a4 /README.md |
parent | 9d4d0640f26b044747927aa8cfc818d6e7861b7a (diff) |
minor autograd reference change in readme (#3144)
Diffstat (limited to 'README.md')
-rw-r--r-- | README.md | 2 |
1 file changed, 1 insertion(+), 1 deletion(-)
```diff
@@ -89,7 +89,7 @@
 Changing the way the network behaves means that one has to start from scratch. With PyTorch, we use a technique called reverse-mode auto-differentiation, which allows you to change the way your network behaves arbitrarily with zero lag or overhead. Our inspiration comes from several research papers on this topic, as well as current and past work such as
-[autograd](https://github.com/twitter/torch-autograd),
+[torch-autograd](https://github.com/twitter/torch-autograd),
 [autograd](https://github.com/HIPS/autograd), [Chainer](http://chainer.org), etc.
```
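The README paragraph touched by this commit describes reverse-mode auto-differentiation. As an illustrative aside, here is a toy sketch of that technique in plain Python (the `Var` class is hypothetical, not PyTorch's implementation): gradients flow backward from the output, each node multiplying the incoming gradient by its local derivative and accumulating into its parents.

```python
class Var:
    """Minimal reverse-mode autodiff node (illustrative sketch only)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent_var, local_gradient)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value,
                   parents=[(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   parents=[(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Accumulate the gradient arriving along this path, then
        # propagate seed * local_gradient to each parent (chain rule).
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)


# y = x*x + x, so dy/dx = 2x + 1; at x = 3 the gradient is 7.
x = Var(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```

Because the graph is rebuilt on every forward pass, control flow (loops, branches) can change between iterations without any recompilation, which is the "zero lag or overhead" point the paragraph makes.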