Is it possible to use nn-meter for TFLite micro? #61
Comments
Hi @Lynazhang, note that you need to implement the following: (i) set up the device and the connection; (ii) profile a model/kernel and obtain the latency results on your edge device.
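The two steps above could be sketched roughly as follows. This is a hypothetical illustration, not nn-Meter's actual backend API: the `TFLiteMicroBackend` class, its method names, and the `transport` protocol (anything with `send()`/`recv()`) are all assumptions standing in for whatever serial/USB link your board uses.

```python
import time


class TFLiteMicroBackend:
    """Hypothetical backend sketch for a TFLite Micro device.

    Step (i): set up the device and the connection.
    Step (ii): profile a model/kernel and obtain latency results.
    Names here are illustrative, not nn-Meter's real interface.
    """

    def __init__(self, transport):
        # `transport` is any object exposing send()/recv(),
        # e.g. a wrapper around a serial port to the microcontroller.
        self.transport = transport
        self.connected = False

    def connect(self):
        # (i) establish the connection with a simple handshake
        self.transport.send(b"PING")
        self.connected = self.transport.recv() == b"PONG"
        return self.connected

    def profile(self, model_bytes, num_runs=10):
        # (ii) send the model to the device, trigger inference
        # num_runs times, and return the mean latency in milliseconds
        assert self.connected, "call connect() first"
        self.transport.send(model_bytes)
        latencies = []
        for _ in range(num_runs):
            start = time.perf_counter()
            self.transport.send(b"RUN")
            self.transport.recv()  # block until the device reports done
            latencies.append((time.perf_counter() - start) * 1000)
        return sum(latencies) / len(latencies)


class FakeTransport:
    """Stand-in for a real serial/USB link, for illustration only."""

    def __init__(self):
        self._reply = b""

    def send(self, data):
        self._reply = b"PONG" if data == b"PING" else b"DONE"

    def recv(self):
        return self._reply


backend = TFLiteMicroBackend(FakeTransport())
backend.connect()
avg_ms = backend.profile(b"<model bytes>", num_runs=3)
```

In a real setup, the handshake and `RUN` protocol would be matched by firmware on the device that loads the TFLite Micro model and replies when each inference finishes.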
Hi @Ramsonjehu, we plan to release the quantized predictor after submitting our research paper, around October or later. 😃
Closing this issue given no updates in a long time. Please reach out if there is anything we can assist with.
Hi,
I would like to measure the inference time for my edge device, which supports the TensorFlow Lite Micro framework. Is it possible to add support for my edge device? If so, can I do it from my side, or does it require updates to the library itself?
Thanks & Regards,
Ramson Jehu K