author    Edgar Riba <edgar.riba@gmail.com>  2017-02-21 15:33:48 +0100
committer Soumith Chintala <soumith@gmail.com>  2017-02-21 12:58:04 -0500
commit    6073f9b46ccc1bc42aa0b9fbe49270d124a42cbf (patch)
tree      661c8014bd7df841ceb3e0e417d38a581a01bd95 /README.md
parent    240372a991f380283d99bf4638855b6fac92aa27 (diff)
update table in README.md

It removes the empty top row.
Diffstat (limited to 'README.md')
 README.md | 35 ++++++++++++++++++++++---------
1 file changed, 26 insertions(+), 9 deletions(-)
diff --git a/README.md b/README.md
index 52de9cd7b1..ae2d86feaa 100644
--- a/README.md
+++ b/README.md
@@ -30,15 +30,32 @@ We are in an early-release Beta. Expect some adventures and rough edges.
At a granular level, PyTorch is a library that consists of the following components:
-| \_ | \_ |
-| ------------------------ | --- |
-| torch | a Tensor library like NumPy, with strong GPU support |
-| torch.autograd | a tape based automatic differentiation library that supports all differentiable Tensor operations in torch |
-| torch.nn | a neural networks library deeply integrated with autograd designed for maximum flexibility |
-| torch.optim | an optimization package to be used with torch.nn with standard optimization methods such as SGD, RMSProp, LBFGS, Adam etc. |
-| torch.multiprocessing | python multiprocessing, but with magical memory sharing of torch Tensors across processes. Useful for data loading and hogwild training. |
-| torch.utils | DataLoader, Trainer and other utility functions for convenience |
-| torch.legacy(.nn/.optim) | legacy code that has been ported over from torch for backward compatibility reasons |
+<table>
+<tr>
+ <td><b> torch </b></td>
+ <td> a Tensor library like NumPy, with strong GPU support </td>
+</tr>
+<tr>
+ <td><b> torch.autograd </b></td>
+ <td> a tape based automatic differentiation library that supports all differentiable Tensor operations in torch </td>
+</tr>
+<tr>
+ <td><b> torch.nn </b></td>
+ <td> a neural networks library deeply integrated with autograd designed for maximum flexibility </td>
+</tr>
+<tr>
+ <td><b> torch.multiprocessing </b></td>
+ <td> python multiprocessing, but with magical memory sharing of torch Tensors across processes. Useful for data loading and hogwild training. </td>
+</tr>
+<tr>
+ <td><b> torch.utils </b></td>
+ <td> DataLoader, Trainer and other utility functions for convenience </td>
+</tr>
+<tr>
+ <td><b> torch.legacy(.nn/.optim) </b></td>
+ <td> legacy code that has been ported over from torch for backward compatibility reasons </td>
+</tr>
+</table>
Usually one uses PyTorch either as:
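The table in the hunk above describes `torch.autograd` as a tape-based automatic differentiation library. As a rough illustration of what "tape-based" means (this is a toy sketch of the general technique, not PyTorch's actual implementation), each operation records how to propagate gradients, and `backward()` replays those records in reverse:

```python
# Toy tape-based reverse-mode autodiff (illustration only; PyTorch's
# real autograd is implemented very differently). Each op stores a
# closure that propagates gradients to its inputs; backward() walks
# the recorded graph in reverse order.

class Var:
    def __init__(self, value):
        self.value = value
        self.grad = 0.0
        self._backward = lambda: None
        self._parents = ()

    def __add__(self, other):
        out = Var(self.value + other.value)
        out._parents = (self, other)
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Var(self.value * other.value)
        out._parents = (self, other)
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.value * out.grad
            other.grad += self.value * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the recorded graph, then replay the
        # backward closures in reverse (the "tape" replay).
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x, y = Var(2.0), Var(3.0)
z = x * y + x      # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
```

In real PyTorch the same idea is exposed as `Tensor` operations that record a graph, with gradients accumulated into `.grad` by `backward()`.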