What should the output_names of the custom model be?
# Load the trained emotion model (imports assumed from the face_classification repo layout)
from keras.models import load_model
from utils.datasets import get_labels

emotion_labels = get_labels('fer2013')
emotion_model_path = 'trained_models/emotion_models/fer2013_mini_XCEPTION.102-0.66.hdf5'
emotion_classifier = load_model(emotion_model_path, compile=False)
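Since both 'input_name' and 'output_names' in the config below refer to graph op names rather than Keras layer names, one way to find them is to print them from the loaded model. This is a minimal sketch, assuming Keras on a TensorFlow 1.x backend and the emotion_classifier loaded above; the printed names will be specific to this model (something like 'input_1' and 'predictions/Softmax'), so the InceptionResnetV2 names from the example entry should not be reused.

# Sketch: print the graph op names to use for 'input_name' / 'output_names'.
# Tensor names end in ':0'; the conversion config expects the op name without that suffix.
print('input_name   :', emotion_classifier.input.op.name)
print('output_names :', [emotion_classifier.output.op.name])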
'emotions': {
    'model': emotion_classifier,
    'arg_scope': emotion_arg_scope,  # what should this be changed to?
    'num_classes': 7,
    'input_name': 'input',  # what should this be changed to?
    'output_names': ['InceptionResnetV2/Logits/Logits/BiasAdd'],  # what should this be changed to?
    'input_width': 64,
    'input_height': 64,
    'input_channels': 1,
    'preprocess_fn': preprocess_emotion,  # preprocessing
    'postprocess_fn': postprocess_emotion,  # postprocessing
    'checkpoint_filename': CHECKPOINT_DIR + 'emotions.ckpt',
    'frozen_graph_filename': FROZEN_GRAPHS_DIR + 'emotions.pb',
    'trt_convert_status': "works",  # what should this be changed to?
    'plan_filename': PLAN_DIR + 'inception_resnet_v2.plan'  # what should this be changed to?
}}
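preprocess_emotion and postprocess_emotion are not defined in the snippet. Below is a hedged sketch of what they could look like, assuming the mini_XCEPTION model was trained on 64x64 grayscale faces scaled to [-1, 1] as in the face_classification repo; verify the scaling against your own training pipeline before using it.

import numpy as np

def preprocess_emotion(image):
    # image: HxW (or HxWx1) uint8 grayscale face crop, resized to 64x64 beforehand
    x = image.astype(np.float32) / 255.0
    x = (x - 0.5) * 2.0                 # scale to [-1, 1]
    return x.reshape(1, 64, 64, 1)      # NHWC batch of one

def postprocess_emotion(output, emotion_labels):
    # output: raw network output of shape (1, 7); return the most likely label
    probs = output.reshape(-1)
    return emotion_labels[int(np.argmax(probs))]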
I am trying to convert an emotion detection model into TensorRT. I am on the first step, tf_to_trt_image_classification/scripts/models_to_frozen_graphs.py. Please guide me on the second part of this question.
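Note that models_to_frozen_graphs.py in that repo is written around TF-Slim checkpoints, so a Keras .hdf5 model usually needs its own freezing step to produce the .pb the later conversion stages expect. This is a minimal sketch, assuming TensorFlow 1.x and the emotion_classifier / FROZEN_GRAPHS_DIR names from the snippets above:

import tensorflow as tf
import keras.backend as K

# Put the graph in inference mode (ideally set before load_model is called)
K.set_learning_phase(0)
sess = K.get_session()

# Freeze variables into constants, keeping only the ops needed for the output node
frozen = tf.graph_util.convert_variables_to_constants(
    sess,
    sess.graph.as_graph_def(),
    [emotion_classifier.output.op.name])   # same op name used for 'output_names'

with tf.gfile.GFile(FROZEN_GRAPHS_DIR + 'emotions.pb', 'wb') as f:
    f.write(frozen.SerializeToString())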