When trying to get an ONNX model from AutoML, you need to set configurations in 3 places:

1. Enable ONNX-compatible model generation when configuring the AutoML run.
2. Get the best run output:
   best_run, onnx_mdl = remote_run.get_output(return_onnx_model=True)
3. Save the ONNX model:
   from azureml.automl.runtime.onnx_convert import OnnxConverter
   onnx_fl_path = "./best_model.onnx"
   OnnxConverter.save_onnx_model(onnx_mdl, onnx_fl_path)
Ideally this should be controlled in just one place, perhaps when getting the model (step 2). Step 1 should go away once we have 100% ONNX support for AutoML models, so it is acceptable in the short term.

It's unclear why step 3 requires a separate OnnxConverter. Can this step be merged with step 2? The mechanism/convention for saving an ONNX model should be the same as for saving a non-ONNX model.
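As a sketch of the suggestion above, a single save entry point could dispatch on the model kind internally, so callers use one convention for ONNX and non-ONNX models alike. All names below (save_model, BestRunModel, is_onnx) are illustrative assumptions, not the azureml API:

```python
import pickle

class BestRunModel:
    """Toy stand-in for the object get_output() might return (hypothetical)."""
    def __init__(self, payload, is_onnx=False):
        self.payload = payload    # ONNX bytes, or a picklable Python model
        self.is_onnx = is_onnx

def save_model(model, path):
    """One save convention for every model kind; dispatch happens inside."""
    if model.is_onnx:
        # Real code would delegate to OnnxConverter.save_onnx_model here.
        with open(path, "wb") as f:
            f.write(model.payload)
    else:
        # Non-ONNX models would keep their existing pickle-based save path.
        with open(path, "wb") as f:
            pickle.dump(model.payload, f)

# Callers never need to know about a separate converter class:
save_model(BestRunModel(b"onnx-bytes", is_onnx=True), "best_model.onnx")
save_model(BestRunModel({"algo": "LightGBM"}), "best_model.pkl")
```

With this shape, step 3 collapses into the same call used for non-ONNX models, which is the convention the issue is asking for.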
From the linked PR, "1st working version of deployment notebook" (#2): returning the ONNX model as part of the best run outputs via best_run, onnx_mdl = remote_run.get_output(return_onnx_model=True) is optional, not required.
(reference notebook: https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification-bank-marketing-all-features/auto-ml-classification-bank-marketing-all-features.ipynb)