author    | Zachary DeVito <zdevito@fb.com> | 2019-04-11 13:30:42 -0700
committer | Facebook Github Bot <facebook-github-bot@users.noreply.github.com> | 2019-04-11 13:55:48 -0700
commit    | ef406ee925b0cca35d227f458eab9af0b927d6ac
tree      | f95f6a963f7f2c9937c089d12e426ea98c0664ec /aten/src/ATen/SparseTensorImpl.cpp
parent    | b6ee83a5b4a9706f6abde011aea158a07d4d76f4
First class modules in the compiler, round 2 (#19167)
Summary:
This PR propagates first-class module objects into the compiler. This creates a transitional state where:
* compiler.cpp creates Graphs where `self` is a Module class and attributes/parameters/buffers/submodules are looked up with `prim::GetAttr`
* GraphExecutor still runs "lowered graphs" where the self object has been removed by a compiler pass `lower_first_class_method`.
* Tracing still creates "lowered graphs", and a pass `lift_lowered_method` creates the corresponding first-class method graphs.
* This PR separates out Method and Function. A `script::Function` is a pure Graph with no `self` bound. Similar to Python's bound methods, a `script::Method` is just a `self` object bound to an underlying `script::Function`.
* This PR also separates CompilationUnit from Module. A CompilationUnit is just a list of named `script::Function`s. Classes have a CompilationUnit holding the class methods, and Modules also have a CompilationUnit holding their Methods. This avoids the weird circular case: Module --has a--> Class --has a--> Module ...
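The Function/Method/CompilationUnit split above can be sketched in plain Python. This is a toy model of the relationships, not the actual `script::` API; all names here are hypothetical:

```python
class Function:
    """A free function: a graph with no `self` bound (cf. script::Function)."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def __call__(self, *args):
        return self.fn(*args)


class Method:
    """A Function bound to a particular object (cf. script::Method)."""
    def __init__(self, owner, function):
        self.owner = owner
        self.function = function

    def __call__(self, *args):
        # Binding just means passing the owner as the first argument.
        return self.function(self.owner, *args)


class CompilationUnit:
    """A named collection of Functions, owned by a class or a Module."""
    def __init__(self):
        self.functions = {}

    def define(self, name, fn):
        self.functions[name] = Function(name, fn)
        return self.functions[name]


class Module:
    def __init__(self):
        # Methods live in the CompilationUnit, not on the Module itself,
        # which breaks the Module -> Class -> Module ownership cycle.
        self.cu = CompilationUnit()
        self.weight = 2

    def get_method(self, name):
        return Method(self, self.cu.functions[name])


m = Module()
m.cu.define("scale", lambda self, x: self.weight * x)
print(m.get_method("scale")(3))  # 6
```

The key design point mirrored here is that a Method owns no code of its own; it is only a (`self`, Function) pair, so the same Function can be shared or inspected independently of any module instance.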
Details:
* In this transitional state, we maintain two copies of each Graph: first-class module and lowered. The first-class one has a `self` argument whose type is the module's class type. The lowered one is the lowered graph that uses the `initial_ivalues` inputs.
* When defining lowered methods using `_defined_lowered`, we immediately create the first-class equivalent. The reverse is done lazily: lowered methods are created on demand from the class.
* The two-way conversions will be deleted in a future PR when the executor itself runs first-class objects. However, this requires more changes to (1) the traces, (2) the python bindings, and (3) the onnx export pass, and would make this PR way too large.
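The lowering direction described above can be illustrated with a toy sketch (hypothetical helper, not the real `lower_first_class_method` pass): each attribute the method reads off `self` becomes an explicit input to the lowered function, and `self` itself disappears:

```python
def lower(method_body, attr_names, module):
    """Return a lowered callable: `self` is removed and every attribute
    the method read becomes an explicit initial-value input
    (cf. the initial_ivalues inputs of a lowered graph)."""
    initial_ivalues = [getattr(module, name) for name in attr_names]

    def lowered(*args):
        # The lowered graph takes (initial_ivalues..., user args...)
        # and never touches the module object.
        return method_body(*initial_ivalues, *args)

    return lowered


class Linear:
    def __init__(self):
        self.w, self.b = 3, 1


# First-class form would be forward(self, x), reading self.w and self.b
# via GetAttr. The lowered body receives those attributes as plain inputs:
def forward_body(w, b, x):
    return w * x + b


m = Linear()
lowered_forward = lower(forward_body, ["w", "b"], m)
print(lowered_forward(4))  # 13
```

Going the other way ("lifting", per the PR) would wrap a lowered body back into a method that fetches `w` and `b` from `self` at call time, which is why the two forms can coexist during the transition.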
Pull Request resolved: https://github.com/pytorch/pytorch/pull/19167
Differential Revision: D14891966
Pulled By: zdevito
fbshipit-source-id: 0b5f03118aa65448a15c7a7818e64089ec93d7ea
Diffstat (limited to 'aten/src/ATen/SparseTensorImpl.cpp')
0 files changed, 0 insertions, 0 deletions