path: root/debian/control
Source: openvino
Maintainer: Wook Song <wook16.song@samsung.com>
Section: libs
Priority: optional
Build-Depends: debhelper (>= 9.0.0), quilt,
 cmake, pkg-config,
 gcc-9 | gcc-8 | gcc-7 | gcc-6 | gcc-5,
 g++-9 | g++-8 | g++-7 | g++-6 | g++-5,
 libtbb-dev, libusb-1.0-0-dev
Standards-Version: 4.1.4
Vcs-Git: git://git.tizen.org/platform/upstream/dldt
Vcs-Browser: https://git.tizen.org/cgit/platform/upstream/dldt

Package: openvino
Section: libs
Priority: optional
Architecture: any
Multi-Arch: same
Depends: ${shlibs:Depends}, ${misc:Depends}, libtbb
Recommends: openvino-cpu-mkldnn [amd64]
Description: OpenVINO™ Toolkit - Deep Learning Deployment Toolkit
 OpenVINO™ toolkit, short for Open Visual Inference and Neural network
 Optimization toolkit, provides developers with improved neural network
 performance on a variety of Intel® processors and helps them further
 unlock cost-effective, real-time vision applications.
 .
 The toolkit enables deep learning inference and easy heterogeneous
 execution across multiple Intel® platforms (CPU, Intel® Processor Graphics)
 providing implementations from cloud architectures to edge devices.
 This open source distribution provides flexibility and availability to
 the developer community to innovate deep learning and AI solutions.

Package: openvino-dev
Section: libs
Priority: optional
Architecture: any
Multi-Arch: same
Depends: ${shlibs:Depends}, ${misc:Depends}, openvino
Description: OpenVINO™ Toolkit Development package
 This is the development package for the Intel® OpenVINO™ Toolkit.

Package: openvino-cpu-mkldnn
Section: libs
Priority: optional
Architecture: amd64
Multi-Arch: same
Depends: ${shlibs:Depends}, ${misc:Depends}, openvino
Description: CPU plugin for OpenVINO™ Toolkit
 This package contains the CPU plugin of the OpenVINO™ toolkit, which
 enables high-performance scoring of neural networks on the CPU using the
 Intel® Math Kernel Library for Deep Neural Networks (Intel® MKL-DNN).