
# Evaluating Experiments with Inference Testing

Nauta provides the ability to test your trained model using TensorFlow Serving or OpenVINO Model Server (OVMS). OVMS is a serving component that hosts models on the OpenVINO inference runtime.
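Both servers speak the TensorFlow Serving REST API (OVMS implements a compatible endpoint), so a deployed model can be queried with a plain HTTP request. The sketch below is illustrative only; the host, port (8501 is TensorFlow Serving's default REST port), and the model name `my_model` are placeholders for your own deployment.

```
# Send one input instance to a served model; adjust the host, port, and
# model name to match your deployment. The response is JSON containing a
# "predictions" list with one entry per input instance.
curl -X POST http://localhost:8501/v1/models/my_model:predict \
  -H "Content-Type: application/json" \
  -d '{"instances": [[0.1, 0.2, 0.3, 0.4]]}'
```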

For guidance on using inference testing to evaluate an experiment, refer to the topics below.

- For the `nctl predict` command, its subcommands, and parameter information, refer to predict Commands; a hypothetical usage sketch follows this list.
- For how-to instructions for TensorFlow Serving, refer to the Inference Example on TensorFlow Serving.
- To run prediction on OpenVINO Model Server, refer to the Inference Example on OpenVINO Model Server.
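As an illustration of the `nctl predict` workflow, a typical session might look like the following. The subcommand names and options shown here (`launch`, `stream`, `cancel`, `--name`, `--model-location`, `--data`) are assumptions for this sketch; confirm the exact syntax in predict Commands.

```
# Hypothetical flow; verify each subcommand and flag against predict Commands.

# Start a prediction instance from a trained model (the path is a placeholder).
nctl predict launch --name my-inference --model-location /mnt/output/home/my-model

# Send a JSON payload to the running instance (input.json is a placeholder).
nctl predict stream --name my-inference --data input.json

# Remove the instance when finished.
nctl predict cancel --name my-inference
```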


Return to Start of Document