single frame inference slow #970

Answered by talmo
RubenTeunisse asked this question in Help!

Hi @RubenTeunisse,

Can you try self.sleap_model.inference_model.predict_on_batch(img)?

This lower-level API skips some preprocessing steps, but those may not be necessary for your data anyway. After the first call(s), the model will get traced by TensorFlow, so subsequent inference should be significantly faster as well.
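Here's a rough sketch of what that could look like; the model path and frame shape below are just placeholders for your own setup:

```python
import numpy as np
import sleap

# Placeholder path -- point this at your own trained model folder.
predictor = sleap.load_model("models/my_trained_model")

# Dummy frame standing in for your camera input. predict_on_batch()
# expects a batched array of shape (batch, height, width, channels),
# so add a leading batch axis for a single frame.
img = np.zeros((1, 384, 384, 1), dtype="uint8")

# The first call is slow while TensorFlow traces the model; repeated
# calls with same-shaped inputs should run much faster.
preds = predictor.inference_model.predict_on_batch(img)
```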

We have some info on this topic in the Interactive and realtime inference notebook. This Discussion thread might also be relevant.

Let us know if any of that works for you!

Cheers,

Talmo

Answer selected by RubenTeunisse