Inference Engine Demos

The Inference Engine demo applications are simple console applications that demonstrate how you can use the Intel Deep Learning Inference Engine in your applications.

The Deep Learning Inference Engine release package provides demo applications in the demos directory of the Inference Engine installation directory.

Demos that Support Pre-Trained Models Shipped with the Product

(!) Important Note: The Inference Engine MYRIAD and FPGA plugins are available in the proprietary distribution only.

The product includes several pre-trained [models](../intel_models/index.html). The table below shows the correlation between models and demos/plugins (the plugin names are exactly as they are passed to the demos with the -d option).

| Model | Demos supported on the model | CPU | GPU | HETERO:FPGA,CPU | MYRIAD |
|-------|------------------------------|-----|-----|-----------------|--------|
| face-detection-adas-0001 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| age-gender-recognition-retail-0013 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| head-pose-estimation-adas-0001 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| emotions-recognition-retail-0003 | Interactive Face Detection Demo | Supported | Supported | Supported | Supported |
| facial-landmarks-35-adas-0001 | Interactive Face Detection Demo | Supported | Supported | Supported | |
| vehicle-license-plate-detection-barrier-0106 | Security Barrier Camera Demo | Supported | Supported | Supported | Supported |
| vehicle-attributes-recognition-barrier-0039 | Security Barrier Camera Demo | Supported | Supported | Supported | Supported |
| license-plate-recognition-barrier-0001 | Security Barrier Camera Demo | Supported | Supported | Supported | Supported |
| person-detection-retail-0001 | Object Detection Demo | Supported | Supported | Supported | |
| person-vehicle-bike-detection-crossroad-0078 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| person-attributes-recognition-crossroad-0031 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| person-reidentification-retail-0031 | Crossroad Camera Demo, Pedestrian Tracker Demo | Supported | Supported | Supported | Supported |
| person-reidentification-retail-0076 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| person-reidentification-retail-0079 | Crossroad Camera Demo | Supported | Supported | Supported | Supported |
| road-segmentation-adas-0001 | Image Segmentation Demo | Supported | Supported | | |
| semantic-segmentation-adas-0001 | Image Segmentation Demo | Supported | Supported | | |
| person-detection-retail-0013 | any demo that supports SSD*-based models, above; Pedestrian Tracker Demo | Supported | Supported | Supported | Supported |
| face-detection-retail-0004 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| face-person-detection-retail-0002 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| pedestrian-detection-adas-0002 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | |
| vehicle-detection-adas-0002 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | Supported |
| pedestrian-and-vehicle-detector-adas-0001 | any demo that supports SSD*-based models, above | Supported | Supported | Supported | |
| person-detection-action-recognition-0003 | Smart Classroom Demo | Supported | Supported | Supported | |
| landmarks-regression-retail-0009 | Smart Classroom Demo | Supported | Supported | Supported | |
| face-reidentification-retail-0071 | Smart Classroom Demo | Supported | Supported | Supported | Supported |
| human-pose-estimation-0001 | Human Pose Estimation Demo | Supported | Supported | Supported | |
| single-image-super-resolution-0034 | Super Resolution Demo | Supported | | | |

A few of the demos referenced above have simplified Python equivalents in the python_demos subfolder.

Note that FPGA support comes through [heterogeneous execution](@ref PluginHETERO), for example, when the post-processing happens on the CPU.
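For instance, a heterogeneous run could be requested through the -d option like this (a sketch only; the demo binary and model file names below are placeholders, not verified paths):

```shell
# Run a demo with FPGA offload and CPU fallback for layers the FPGA
# plugin cannot execute (e.g. post-processing layers).
# Binary and model names are placeholders for illustration.
./interactive_face_detection_demo \
    -m face-detection-adas-0001.xml \
    -d HETERO:FPGA,CPU
```

The device list after HETERO: is ordered by priority, so FPGA is tried first and the CPU picks up whatever the FPGA plugin declines.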

Building the Demo Applications

To build the demos, you need to source the Inference Engine and OpenCV environment from a binary package, which is available as a proprietary distribution. Run the following command (assuming that the binary package was installed to <INSTALL_DIR>):

source <INSTALL_DIR>/deployment_tools/bin/setupvars.sh

Alternatively, you can build the Inference Engine binaries from the dldt repository. In this case, set InferenceEngine_DIR to the CMake folder where you built the dldt binaries, and set the OpenCV_DIR variable to point to the required OpenCV package.
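For example, a minimal sketch, assuming hypothetical locations for the dldt build tree and the OpenCV CMake package (adjust both paths to your setup):

```shell
# Both paths are assumptions; point them at your actual dldt build folder
# and the directory containing OpenCVConfig.cmake.
export InferenceEngine_DIR="$HOME/dldt/inference-engine/build"
export OpenCV_DIR="/usr/local/lib/cmake/opencv4"
# CMake reads both variables when generating the demo build files, e.g.:
#   cmake -DCMAKE_BUILD_TYPE=Release <path_to_inference_engine_demos_directory>
```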

Linux* OS

The officially supported Linux build environment is the following:

  • Ubuntu* 16.04 LTS 64-bit or CentOS* 7.4 64-bit
  • GCC* 5.4.0 (for Ubuntu* 16.04) or GCC* 4.8.5 (for CentOS* 7.4)
  • CMake* version 2.8 or higher.
  • OpenCV* 3.3 or later (required for some demos)


You can build the demo applications using the CMake file in the demos directory.

Create a new directory and change your current directory to the new one:

mkdir build
cd build

Run CMake to generate Make files:

cmake -DCMAKE_BUILD_TYPE=Release <path_to_inference_engine_demos_directory>

To build demos with debug information, use the following command:

cmake -DCMAKE_BUILD_TYPE=Debug <path_to_inference_engine_demos_directory>

Run Make to build the application:

make

After that, you can find binaries for all demo applications in the intel64/Release subfolder.

Microsoft Windows* OS

The recommended Windows build environment is the following:

  • Microsoft Windows* 10
  • Microsoft Visual Studio* 2015 (including the Community edition) or Microsoft Visual Studio 2017
  • CMake* version 2.8 or later
  • OpenCV* 3.3 or later

Generate a Microsoft Visual Studio solution file using the create_msvc2015_solution.bat or create_msvc2017_solution.bat file, then build the resulting Demos.sln solution in Microsoft Visual Studio 2015 or Microsoft Visual Studio 2017, respectively.

Running the Demo Applications

Before running compiled binary files, make sure your application can find the Inference Engine libraries. Use the setupvars.sh script (or setupvars.bat on Windows), which sets all necessary environment variables pointing to the binaries from the binary package installed to <INSTALL_DIR>.

For that, run:

source <INSTALL_DIR>/deployment_tools/bin/setupvars.sh

on Linux or

call <INSTALL_DIR>\deployment_tools\bin\setupvars.bat

to set the required environment on Windows.
If you are using Inference Engine binaries from the dldt repository, you need to configure the LD_LIBRARY_PATH variable (or PATH on Windows) manually.
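On Linux, a minimal sketch might look like the following, assuming a hypothetical dldt build output directory:

```shell
# Hypothetical path to libraries built from the dldt repository;
# adjust to your actual build output directory.
IE_LIB_DIR="$HOME/dldt/inference-engine/bin/intel64/Release/lib"
# Prepend it so the demo binaries can locate the Inference Engine libraries
# at runtime (the ${VAR:+...} form avoids a trailing colon if the
# variable was previously unset).
export LD_LIBRARY_PATH="$IE_LIB_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```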

Finally, run the required demo with the appropriate command-line options, providing the IR model (typically with the "-m" option). Note that the Inference Engine assumes the weights (.bin) file is in the same folder as the .xml file.
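For example, a sketch of a typical invocation (the demo binary, model, and input file names here are placeholders, not files shipped under these exact names):

```shell
# The .bin weights file must sit next to the .xml file.
# All names below are placeholders for illustration only.
./object_detection_demo_ssd_async \
    -m person-detection-retail-0013.xml \
    -i input_video.mp4 \
    -d CPU
```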


* Other names and brands may be claimed as the property of others.