single frame inference slow #970
-
Hi!
Any ideas for how I could speed it up? Thanks! Details: bottom-up (~same time for top-down)
Replies: 1 comment 1 reply
-
Hi @RubenTeunisse,
Can you try the following?

```python
self.sleap_model.inference_model.predict_on_batch(img)
```

This lower-level API doesn't do some preprocessing steps, but those might not be necessary for your data anyway. After the first call(s), the model will get traced and inference should be significantly faster as well.
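To illustrate the warm-up pattern this describes, here is a minimal runnable sketch. The class below is a mock stand-in for the real predictor (which would come from loading a trained SLEAP model); it only simulates the one-time tracing cost so the timing behavior can be shown end to end. The shape `(13, 2)` for the pose output is an arbitrary placeholder, not something from the thread.

```python
import time
import numpy as np

class MockInferenceModel:
    """Stand-in for a SLEAP inference model: slow first call (tracing),
    fast subsequent calls. Not the real API, just the access pattern."""

    def __init__(self):
        self._traced = False

    def predict_on_batch(self, imgs):
        if not self._traced:
            time.sleep(0.05)  # simulate one-time graph tracing cost
            self._traced = True
        # Placeholder output: one (13, 2) pose per image in the batch.
        return np.zeros((imgs.shape[0], 13, 2))

model = MockInferenceModel()

frame = np.zeros((384, 384, 1), dtype="uint8")
batch = frame[np.newaxis, ...]  # batch-style APIs expect a leading batch axis

t0 = time.perf_counter()
model.predict_on_batch(batch)   # first call: pays the tracing cost
warmup = time.perf_counter() - t0

t0 = time.perf_counter()
poses = model.predict_on_batch(batch)  # subsequent calls skip tracing
steady = time.perf_counter() - t0

print(poses.shape)      # (1, 13, 2)
print(steady < warmup)  # warmed-up call is faster
```

The practical takeaway is the same as above: run a couple of throwaway predictions once at startup so the per-frame latency you measure afterwards reflects steady-state inference, not tracing.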
We have some info on this topic in the Interactive and realtime inference notebook. This Discussion thread might also be relevant.
Let us know if any of that works for you!
Cheers,
Talmo