# Prepare nnpackage

## Convert a TensorFlow pb file to nnpackage
Follow the [compiler guide](https://github.sec.samsung.net/STAR/nnfw/blob/master/docs/nncc/Release_2019/tutorial.md) to generate an nnpackage from a TensorFlow pb file.

## Convert a tflite file to nnpackage
Please see [model2nnpkg](https://github.sec.samsung.net/STAR/nnfw/tree/master/tools/nnpackage_tool/model2nnpkg) for converting a tflite model file.
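
Either way, the result is an nnpackage: a directory bundling the model file with a small manifest. A rough sketch of the layout (the exact manifest fields are defined by the nnpackage spec in this repository, so treat the names below as illustrative):

```
mymodel/
├── metadata/
│   └── MANIFEST      <- JSON describing the packaged model file(s) and their type
└── mymodel.tflite
```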
-
# Build an app with the nnfw API

Here are the basic steps to build an app with the [nnfw C API](https://github.sec.samsung.net/STAR/nnfw/blob/master/runtime/neurun/api/include/nnfw.h):

1) Initialize an nnfw_session
``` c
nnfw_session *session = nullptr;
nnfw_create_session(&session);
```
2) Load the nnpackage
``` c
nnfw_load_model_from_file(session, nnpackage_path);
```
3) (Optional) Assign a specific backend to operations
``` c
  // Use the acl_neon backend for CONV_2D and acl_cl otherwise.
  // Note that the default backend is acl_cl.
  nnfw_set_op_backend(session, "CONV_2D", "acl_neon");
```

4) Compilation
``` c
  // Compile the model
  nnfw_prepare(session);
```

5) Prepare input/output
``` c
  // Prepare input. Here we just allocate dummy input arrays.
  std::vector<float> input;
  nnfw_tensorinfo ti;
  nnfw_input_tensorinfo(session, 0, &ti); // get the first input's info
  uint32_t input_elements = num_elems(&ti);
  input.resize(input_elements);
  // TODO: Please add initialization for your input.
  nnfw_set_input(session, 0, ti.dtype, input.data(), sizeof(float) * input_elements);

  // Prepare output
  std::vector<float> output;
  nnfw_output_tensorinfo(session, 0, &ti); // get the first output's info
  uint32_t output_elements = num_elems(&ti);
  output.resize(output_elements);
  nnfw_set_output(session, 0, ti.dtype, output.data(), sizeof(float) * output_elements);
```
6) Inference
``` c
  // Do inference
  nnfw_run(session);
```
When the session is no longer needed, release it with `nnfw_close_session(session)`.
## Run inference with the app on target devices
Reference app: [minimal app](https://github.sec.samsung.net/STAR/nnfw/blob/master/runtime/neurun/sample/minimal)

```
$ ./minimal path_to_nnpackage_directory
```