
Is it possible to use nn-meter for TFLite micro? #61

Closed
Ramsonjehu opened this issue Mar 22, 2022 · 8 comments
Comments

@Ramsonjehu

Hi,
I would like to measure the inference time for my edge device, which supports the TensorFlow Lite Micro framework. Is it possible to add support for my edge device? If so, can I do it from my side, or does it require updates in the library itself?

Thanks & Regards,
Ramson Jehu K

@Lynazhang
Contributor

Hi, Ramson,
Thanks for your interest! nn-Meter 2.0 provides building tools for users to build latency predictors for custom devices. You can download and try the latest main branch. Currently, we support tflite/openvino backends. Maybe you can try to connect the TFLite Micro framework via the nn-Meter backend interface doc. Note that you need to implement the following: (i) set up the device and the connection; (ii) profile a model/kernel and get the latency results on your edge device.
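To make the two steps concrete, here is a minimal sketch of what a custom backend for a TFLite Micro device might look like. Everything here is hypothetical: the class name, the serial-port transport, and the stubbed methods are illustrative stand-ins, not real nn-Meter API. In practice you would subclass nn-Meter's backend base class as described in the backend interface doc and replace the stubs with real device communication.

```python
import random
import statistics

class TFLiteMicroBackend:
    """Hypothetical backend sketch for a TFLite Micro device.

    Does NOT use real nn-Meter classes; in practice you would inherit from
    nn-Meter's backend base class and implement model conversion and
    on-device profiling over your actual connection (serial, USB, network).
    """

    def __init__(self, port: str):
        # Step (i): set up the device and the connection.
        self.port = port  # e.g. a serial port to the microcontroller

    def convert_model(self, model_path: str) -> bytes:
        # Stub: a real implementation would run the TFLite converter and
        # produce a flatbuffer to flash onto the device.
        return b"tflite-micro-flatbuffer"

    def profile(self, model_path: str, repeats: int = 10) -> float:
        # Step (ii): flash the model, trigger inference on-device, and read
        # back timings over the connection. Stubbed here with random values.
        _binary = self.convert_model(model_path)
        timings_ms = [random.uniform(4.0, 6.0) for _ in range(repeats)]
        return statistics.median(timings_ms)

backend = TFLiteMicroBackend(port="/dev/ttyUSB0")
latency_ms = backend.profile("conv_kernel.tflite")
print(f"median latency: {latency_ms:.2f} ms")
```

The key point is that nn-Meter only needs a way to hand your backend a model and get a latency number back; everything device-specific stays inside the class.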

@Ramsonjehu
Author

Hi @Lynazhang ,
Can you elaborate a little bit on the following?

Note that you need to implement the following: (i) set up the device and the connection; (ii) profile a model/kernel and get the latency results on your edge device.

@JiahangXu
Collaborator

Hi Ramson,
The nn-Meter building tool provides a whole pipeline for users to build customized latency predictors. The first step, setting up the device and the connection, is to build a backend class for TFLite Micro; here is a guidance: https://github.com/microsoft/nn-Meter/blob/main/docs/builder/prepare_backend.md#-build-customized-backend-.
The second step, profiling a model/kernel and getting the latency results on your edge device, is to sample several kernel models, profile the kernels on the device to get their latencies, and use the profiled results to train a latency predictor. Here is a guidance: https://github.com/microsoft/nn-Meter/blob/main/docs/builder/build_kernel_latency_predictor.md. Note that we only support devices which can profile kernel models with tensors of different shapes as inputs.
In addition, there are also some tasks to do, such as creating a workspace and detecting fusion rules. Here is an overview of the whole pipeline: https://github.com/microsoft/nn-Meter/blob/main/docs/builder/overview.md. If you have any questions about using nn-Meter, please feel free to contact us.
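The second step above can be sketched end to end: sample kernel configurations, profile them (stubbed here with a synthetic latency model standing in for on-device measurements), and fit a predictor on the results. The configuration space, the FLOPs-based synthetic latency, and the plain linear fit are all illustrative assumptions; nn-Meter's own kernel predictors are trained on real profiled data with random forest regressors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample hypothetical conv-kernel configs: (input HW, Cin, Cout, kernel size).
configs = rng.integers(low=[8, 8, 8, 1], high=[64, 128, 128, 6], size=(200, 4))

def flops(cfg):
    hw, cin, cout, k = cfg
    # Multiply-adds of a stride-1 convolution over an hw x hw feature map.
    return hw * hw * cin * cout * k * k

# Stand-in for on-device profiling: latency grows with FLOPs plus noise.
x = np.array([flops(c) for c in configs], dtype=float)
y = 0.5 + 2e-7 * x + rng.normal(0, 0.05, size=x.shape)  # milliseconds

# Fit a simple linear latency predictor on the "profiled" results.
A = np.stack([np.ones_like(x), x], axis=1)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef
mape = np.mean(np.abs(pred - y) / np.abs(y))
print(f"intercept={coef[0]:.3f} ms, slope={coef[1]:.2e}, MAPE={mape:.1%}")
```

Swapping the synthetic `y` for real latencies returned by the backend, and the linear fit for a random forest, gives the shape of the actual predictor-building step.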

@Ramsonjehu
Author

Hi @JiahangXu ,
I have implemented the backend class and detected the fusion rules, and am currently building the kernel latency predictors. I would like to know if there are any updates on the reference code for dataset generation mentioned in issue #53.

@JiahangXu JiahangXu reopened this Jul 11, 2022
@JiahangXu
Collaborator

Hi Ramson,
I'm sorry that we are still working on it. The dataset generation code actually contains two features: the code to generate TensorFlow models, and a converter from tf2 models (Keras h5 files) to nn-Meter IR. The first part is almost ready by now; the second part is being prepared for a PR and still needs stability testing. We plan to finish arranging the full dataset generation code this week.

@Ramsonjehu
Author

Hi @JiahangXu ,
Thanks for the update. I'm also curious about the quantize predictor: when is it planned to be released?

@JiahangXu
Collaborator

Hi @Ramsonjehu We plan to release the quantize predictor after submitting our research paper, in about October or later. 😃

@JiahangXu
Collaborator

Closing this issue given no updates in a long time. Please reach out if there is anything we can assist with.
