The ML Suite Docker container provides users with a largely self-contained software environment for running inference on Xilinx FPGAs. The only external dependencies are:
- docker-ce
- Xilinx XRT 2018.2 (Xilinx Run Time)
- Install Docker

  Note: Ensure `/var/lib/docker` has sufficient free space (should be > 5 GB); if not, move your Docker root elsewhere.
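The free-space check above can be scripted. A minimal sketch, assuming only the 5 GB threshold from the note; `DOCKER_ROOT` defaults to Docker's standard location and can be overridden if you have relocated it:

```shell
# Verify the Docker root has at least 5 GB free before loading images.
# DOCKER_ROOT defaults to the standard location; override if relocated.
DOCKER_ROOT=${DOCKER_ROOT:-/var/lib/docker}
min_kb=$((5 * 1024 * 1024))   # 5 GB expressed in 1K blocks
# df -Pk prints POSIX-format output in 1K blocks; column 4 is "Available"
avail_kb=$(df -Pk "$DOCKER_ROOT" 2>/dev/null | awk 'NR==2 {print $4}')
if [ -n "$avail_kb" ] && [ "$avail_kb" -ge "$min_kb" ]; then
  echo "OK: ${avail_kb} KB available under $DOCKER_ROOT"
else
  echo "WARNING: less than 5 GB free under $DOCKER_ROOT (or path not found)"
fi
```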
- Download the appropriate ML Suite container from xilinx.com
- Load the appropriate container

  ```sh
  # May need sudo
  $ docker load < xilinx-ml-suite-ubuntu-16.04-xrt-2018.2-tensorflow-1.12.0-mls-1.5.tar.gz
  $ docker load < xilinx-ml-suite-ubuntu-16.04-xrt-2018.2-caffe-1.0-mls-1.5.tar.gz
  ```
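After a `docker load`, it is worth confirming the image actually landed. A small check (guarded so it degrades gracefully on hosts without Docker; the `mls-1.5` pattern matches the tags above):

```shell
# List loaded ML Suite images; --format prints repository:tag, one per line.
if command -v docker >/dev/null 2>&1; then
  loaded=$(docker images --format '{{.Repository}}:{{.Tag}}' | grep 'mls-1.5' || true)
  echo "Loaded ML Suite images: ${loaded:-none found}"
else
  loaded=""
  echo "docker not found; install docker-ce first"
fi
```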
- Use the provided script to launch and interact with the container

  ```sh
  # May need sudo; the argument is the image tag, e.g.:
  $ ./docker_run.sh xilinx-ml-suite-ubuntu-16.04-xrt-2018.2-tensorflow-1.12.0-mls-1.5
  $ ./docker_run.sh xilinx-ml-suite-ubuntu-16.04-xrt-2018.2-caffe-1.0-mls-1.5
  ```

  By default the script launches the container with the `--rm` flag, so the container is removed after you exit and any changes to source files are lost. The directory `$MLSUITE_ROOT/share` is mounted in the container and can be used for easy file transfer between container and host.
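The shared directory mentioned above is useful precisely because it outlives the `--rm` cleanup. A minimal host-side sketch (the `$MLSUITE_ROOT` default below is only an example path, not a documented location):

```shell
# Stage a file on the host; inside the container it appears under the
# mounted share directory and survives the container's removal.
MLSUITE_ROOT=${MLSUITE_ROOT:-$HOME/ml-suite}   # example default; adjust to your checkout
mkdir -p "$MLSUITE_ROOT/share"
echo "model weights or results go here" > "$MLSUITE_ROOT/share/note.txt"
# Inside the running container you would read the same file back, e.g.:
#   cat $MLSUITE_ROOT/share/note.txt
```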
- Follow the Jupyter notebook or command line examples in the container
- Follow the Container Pipeline Example for an example of how to stitch together a container pipeline that prepares the model and runs an inference server
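As a rough illustration of the pipeline idea only, not the actual Container Pipeline Example: one container prepares the model into the shared directory, and a second one serves it. The in-container commands below are placeholders, not real ML Suite entry points.

```shell
# Hypothetical two-stage pipeline sketch; the /bin/bash -c payloads are
# placeholders standing in for the real preparation and serving steps.
MLSUITE_ROOT=${MLSUITE_ROOT:-$HOME/ml-suite}
PREP_IMG=xilinx-ml-suite-ubuntu-16.04-xrt-2018.2-caffe-1.0-mls-1.5
SERVE_IMG=xilinx-ml-suite-ubuntu-16.04-xrt-2018.2-tensorflow-1.12.0-mls-1.5
if command -v docker >/dev/null 2>&1; then
  # Stage 1: quantize/compile the model, writing artifacts to the share
  docker run --rm -v "$MLSUITE_ROOT/share:/share" "$PREP_IMG" \
    /bin/bash -c 'echo "prepare model into /share (placeholder)"'
  # Stage 2: serve inference using the prepared artifacts
  docker run --rm -v "$MLSUITE_ROOT/share:/share" "$SERVE_IMG" \
    /bin/bash -c 'echo "run inference server from /share (placeholder)"'
  pipeline_status="ran"
else
  pipeline_status="skipped (docker not installed)"
fi
echo "pipeline sketch: $pipeline_status"
```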