This repository has been archived by the owner on Aug 28, 2023. It is now read-only.

How to use whisper-small.tflite? #25

Open
wall3001 opened this issue Mar 15, 2023 · 32 comments
Comments

@wall3001

I used whisper-small.tflite to transcribe, but the ASR result contains tokens like '[_extra_token_50258], [_extra_token_50260], [_extra_token_50358]'. The result is not accurate at all.

I just replaced whisper.tflite with whisper-small.tflite.

@wall3001
Author

OK. I replaced filters_vocab_gen.bin with filters_vocab_multilingual.bin
and added "&& (output_int[i] != 50258) && (output_int[i] != 50260) && (output_int[i] != 50358)" in native-lib.cpp to filter those tokens out (a more general version of this filter is sketched below).

@wall3001
Author

I found that the result was translated into English. Can I control it so that it does not translate?

@nyadla-sys
Contributor

Can you try whisper-medium.tflite, which may work for transcription? However, the single (all-in-one) models have some issues with multilingual transcription.
Please refer to the bottom of this message for more details.

Please see some comments below about the transcribe and translation features:
This also might be of interest to you @nyadla-sys: the base model does translation to English, whereas the tiny and small models just returned the detected language.

Thanks for your information; I will look into it.

mycroft@OpenVoiceOS-e3830c:~/whisper $ minimal models/whisper-tiny.tflite de_speech_thorsten_sample03_8s.wav

n_vocab:50257

mel.n_len:3000

mel.n_mel:80
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Inference time 7 seconds

[_extra_token_50258][_extra_token_50261][_extra_token_50359][BEG] Für mich sind alle Menschen gleich unabhängig von Geschlecht, sexuelle Orientierung, Religion, Hautfarbe oder Geo-Kordinaten der Geburt.[SOT]

mycroft@OpenVoiceOS-e3830c:~/whisper $ minimal models/whisper-base.tflite de_speech_thorsten_sample03_8s.wav

n_vocab:50257

mel.n_len:3000

mel.n_mel:80
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Inference time 12 seconds

[_extra_token_50258][_extra_token_50261][_extra_token_50358][BEG] For me, all people are equally independent of gender, sex, orientation, religion, hate, or gender coordinates of birth.[SOT]

mycroft@OpenVoiceOS-e3830c:~/whisper $ minimal models/whisper-small.tflite de_speech_thorsten_sample03_8s.wav

n_vocab:50257

mel.n_len:3000

mel.n_mel:80
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Inference time 43 seconds

[_extra_token_50258][_extra_token_50261][_extra_token_50359][BEG] Für mich sind alle Menschen gleich, unabhängig von Geschlecht, sexueller Orientierung, Religion, Hautfarbe oder Geo-Koordinaten der Geburt.[SOT]

If you are really looking to transcribe all languages, I recommend using the models generated by this notebook: https://colab.research.google.com/github/usefulsensors/openai-whisper/blob/main/notebooks/whisper_encoder_decoder_tflite.ipynb

Please see the link below for using the multilingual models from C++:
https://github.com/ipsilondev/whisper-cordova/blob/main/android/cpp/native-lib.cpp
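
For illustration only (this is not the repo's or that file's actual code): with the separate encoder/decoder models from the notebook above, the greedy decode loop runs in your own C++ code, so you can force the <|transcribe|> task token instead of <|translate|> and keep the output in the spoken language. The token IDs below match the ones visible in the logs above (50258, 50261, 50359/50358); the tensor order, shapes and fixed decode length are assumptions about how the model was exported.

#include <cstring>
#include <vector>
#include "tensorflow/lite/interpreter.h"

// Multilingual Whisper special tokens (IDs match the logs above).
constexpr int kSot        = 50258;  // <|startoftranscript|>
constexpr int kLangDe     = 50261;  // <|de|> -- use the desired/detected language token
constexpr int kTranscribe = 50359;  // <|transcribe|> -> keep the spoken language
constexpr int kTranslate  = 50358;  // <|translate|>  -> would translate to English
constexpr int kEot        = 50257;  // <|endoftext|>
constexpr int kVocabSize  = 51865;  // multilingual vocab size incl. special tokens

std::vector<int> greedy_decode(tflite::Interpreter* decoder,
                               const float* encoder_out, int encoder_out_size,
                               int max_tokens) {
  // Forced decoder prompt: transcribe, do not translate.
  std::vector<int> tokens = {kSot, kLangDe, kTranscribe};

  for (int step = 0; step < max_tokens; ++step) {
    // Assumption: decoder input 0 = token ids (int32), input 1 = encoder output (float).
    std::memcpy(decoder->typed_input_tensor<int>(0), tokens.data(),
                tokens.size() * sizeof(int));
    std::memcpy(decoder->typed_input_tensor<float>(1), encoder_out,
                encoder_out_size * sizeof(float));
    if (decoder->Invoke() != kTfLiteOk) break;

    // Assumption: output 0 = logits for the last position, one float per vocab id.
    const float* logits = decoder->typed_output_tensor<float>(0);
    int next = 0;
    for (int v = 1; v < kVocabSize; ++v)
      if (logits[v] > logits[next]) next = v;

    if (next == kEot) break;
    tokens.push_back(next);
  }
  return tokens;  // drop the prompt/special tokens before detokenizing
}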

@wall3001
Author

I can't use whisper-medium.tflite on my phone. My phone's RAM is only 4 GB, and the medium model needs at least 5 GB of RAM.
I then used another phone with 12 GB of RAM, but it also did not work, and the memory usage rose to 8 GB.

@nyadla-sys
Contributor

I guess the best approach for a multilingual model is to use the encoder/decoder models from here:
https://colab.research.google.com/github/usefulsensors/openai-whisper/blob/main/notebooks/whisper_encoder_decoder_tflite.ipynb

@wall3001
Author

OK. I tried the tiny model. Only one was recognized; the others were translated into English.

@wall3001
Author

I used https://github.com/ipsilondev/whisper-cordova/blob/main/android/cpp/native-lib.cpp
It does not work:

2023-03-16 11:35:22.540 11364-11364/com.whisper.android.tflitecpp D/TFLiteASRDemo: On Record Stop Click
2023-03-16 11:35:22.541 11364-11364/com.whisper.android.tflitecpp E/HwCustAudioRecordImpl: isOpenEC : false
2023-03-16 11:35:22.626 11364-11364/com.whisper.android.tflitecpp I/HwAudioRecordImpl: sendStateChangedIntent, state=1
2023-03-16 11:35:22.628 11364-11364/com.whisper.android.tflitecpp E/HwCustAudioRecordImpl: isOpenEC : false
2023-03-16 11:35:22.630 11364-11364/com.whisper.android.tflitecpp I/HwAudioRecordImpl: sendStateChangedIntent, state=1
2023-03-16 11:35:22.638 11364-11364/com.whisper.android.tflitecpp E/HwCustAudioRecordImpl: isOpenEC : false
2023-03-16 11:35:22.656 11364-11364/com.whisper.android.tflitecpp E/HwCustAudioRecordImpl: isOpenEC : false
2023-03-16 11:35:22.657 11364-11364/com.whisper.android.tflitecpp I/HwAudioRecordImpl: sendStateChangedIntent, state=1
2023-03-16 11:35:22.663 11364-11364/com.whisper.android.tflitecpp E/HwCustAudioRecordImpl: isOpenEC : false
2023-03-16 11:35:22.681 11364-11364/com.whisper.android.tflitecpp D/HwAppInnerBoostImpl: asyncReportData com.whisper.android.tflitecpp,2,1,1,0 interval=256
2023-03-16 11:35:22.681 11364-11364/com.whisper.android.tflitecpp E/TFLiteASRDemo: /storage/emulated/0/Android/data/com.whisper.android.tflitecpp/cache/android_record.wav
2023-03-16 11:35:23.612 11364-11364/com.whisper.android.tflitecpp I/tflite: Initialized TensorFlow Lite runtime.
2023-03-16 11:35:23.613 11364-11364/com.whisper.android.tflitecpp E/libc: Access denied finding property "ro.hardware.chipname"
2023-03-16 11:35:23.614 11364-11364/com.whisper.android.tflitecpp I/tflite: Created TensorFlow Lite XNNPACK delegate for CPU.
2023-03-16 11:35:23.614 11364-11364/com.whisper.android.tflitecpp E/tflite: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
2023-03-16 11:35:23.614 11364-11364/com.whisper.android.tflitecpp E/tflite: Node number 7 (FlexErf) failed to prepare.
2023-03-16 11:35:23.614 11364-11364/com.whisper.android.tflitecpp E/tflite: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
2023-03-16 11:35:23.614 11364-11364/com.whisper.android.tflitecpp E/tflite: Node number 7 (FlexErf) failed to prepare.

@nyadla-sys
Contributor

You need to generate the Flex TFLite library using a bazel build.

@wall3001
Author

// NEW: Prepare GPU delegate.
    //  auto* delegate = TfLiteGpuDelegateV2Create(nullptr);
    // if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) {
    //     __android_log_print(ANDROID_LOG_VERBOSE, "Whisper ASR", "gpu delegate failed \n");
    // }

    // Allocate tensor buffers.
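    // Note: for the multilingual model this is the call that fails ("FlexErf ... failed to prepare"):
    // see the FlexErf error in the log above and the Flex (TF Select) library build steps below.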
    TFLITE_MINIMAL_CHECK(g_whisper_tflite_params.interpreter->AllocateTensors() == kTfLiteOk);

@wall3001
Author

It happened at line 290: TFLITE_MINIMAL_CHECK(g_whisper_tflite_params.interpreter->AllocateTensors() == kTfLiteOk);

@wall3001
Author

You need to generate the Flex TFLite library using a bazel build.

I can't use this yet; I'll try it later. Do you mean that I need to build a TFLite library called flex?

@nyadla-sys
Contributor

bazel build -c opt --config=monolithic tensorflow/lite:libtensorflowlite
bazel build -c opt --config=monolithic tensorflow/lite:libtensorflowlite_flex

@nyadla-sys
Contributor

Follow the link below for more details:
https://www.tensorflow.org/lite/guide/ops_select

@wall3001
Author

undefined reference to `tflite::ops::builtin::BuiltinOpResolver::BuiltinOpResolver()'

@wall3001
Author

CMakeFiles/native-lib.dir/native-lib.cpp.o: In function `whisper_tflite': undefined reference to `tflite::ops::builtin::BuiltinOpResolver::BuiltinOpResolver()'

@wall3001
Author

In CMakeLists.txt I added:
set_target_properties(tflite PROPERTIES IMPORTED_LOCATION
${CMAKE_CURRENT_LIST_DIR}/tf-lite-api/generated-libs/${ANDROID_ABI}/libtensorflowlite_flex_jni.so)

@wall3001
Author

ERROR: Skipping 'tensorflow/lite:libtensorflowlite_flex': error loading package 'tensorflow/lite': Every .bzl file must have a corresponding package, but '//tensorflow:tensorflow.bzl' does not have one. Please create a BUILD file in the same or any parent directory. Note that this BUILD file does not need to do anything except exist.
WARNING: Target pattern parsing failed.
ERROR: error loading package 'tensorflow/lite': Every .bzl file must have a corresponding package, but '//tensorflow:tensorflow.bzl' does not have one. Please create a BUILD file in the same or any parent directory. Note that this BUILD file does not need to do anything except exist.
INFO: Elapsed time: 0.074s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
currently loading: tensorflow/lite

@nyadla-sys
Contributor

Take a fresh TensorFlow clone, first build the TFLite Flex library with bazel, and then include the newly generated library as part of the minimal CMake build.

@nyadla-sys
Contributor

nyadla-sys commented Mar 16, 2023

Refer to the issue below:
tensorflow/tensorflow#55536

I have built the tensorflowlite_flex lib as below using this bazel command:
bazel build -c opt --config=monolithic tensorflow/lite/delegates/flex:tensorflowlite_flex
I followed it as you mentioned above and it worked for me on a Linux PC!
diff --git a/tensorflow/lite/examples/minimal/CMakeLists.txt b/tensorflow/lite/examples/minimal/CMakeLists.txt
index 7f8301162bb..1dd8ae05089 100644
--- a/tensorflow/lite/examples/minimal/CMakeLists.txt
+++ b/tensorflow/lite/examples/minimal/CMakeLists.txt
@@ -35,10 +35,14 @@ add_subdirectory(
   EXCLUDE_FROM_ALL
 )
 
+find_library(TF_LIB_FLEX tensorflowlite_flex HINTS "${TENSORFLOW_SOURCE_DIR}/bazel-bin/tensorflow/lite/delegates/flex/")
+
 set(CMAKE_CXX_STANDARD 17)
 add_executable(minimal
   minimal.cc
 )
 target_link_libraries(minimal
+  -Wl,--no-as-needed # Need --no-as-needed to link tensorflowlite_flex
   tensorflow-lite
+  ${TF_LIB_FLEX}
 )

@wall3001
Author

I have successfully built the tensorflowlite_flex lib on Linux. Then I put the .so into Android, but it does not work. Have you tried it on an Android phone?

@wall3001
Author

How do I do real-time transcription? I need real-time transcription.

@nyadla-sys
Contributor

I have successfully built the tensorflowlite_flex lib on Linux. Then I put the .so into Android, but it does not work. Have you tried it on an Android phone?

You may need to cross-compile the lib for ARM cores.

@nyadla-sys
Contributor

How do I do real-time transcription? I need real-time transcription.

Refer to stream_standalone for this.
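
For context, a very rough sketch of the chunked approach (not the stream_standalone code itself): slice the microphone stream into fixed windows and run inference per window. capture_audio_chunk(), whisper_transcribe() and on_partial_result() are hypothetical placeholders for the capture, inference and UI code an app already has.

#include <atomic>
#include <string>
#include <vector>

constexpr int kSampleRate   = 16000;                        // Whisper expects 16 kHz mono PCM
constexpr int kChunkSeconds = 5;                            // latency vs. accuracy trade-off
constexpr int kChunkSamples = kSampleRate * kChunkSeconds;

// Hypothetical hooks -- stand-ins for the app's existing audio capture,
// TFLite inference and UI callback.
bool capture_audio_chunk(std::vector<float>& out, int samples);
std::string whisper_transcribe(const std::vector<float>& pcm);
void on_partial_result(const std::string& text);

void run_streaming_loop(const std::atomic<bool>& keep_running) {
  std::vector<float> chunk;
  while (keep_running.load()) {
    if (!capture_audio_chunk(chunk, kChunkSamples)) break;  // blocks until a chunk is filled
    // Each chunk is transcribed independently; overlapping the tail of one chunk
    // with the head of the next helps avoid cutting words in half.
    on_partial_result(whisper_transcribe(chunk));
  }
}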

@wall3001
Author

(screenshot)

@wall3001
Author

I used this command: bazel build -c opt --config=elinux_aarch64 tensorflow/lite/delegates/flex:tensorflowlite_flex

@wall3001
Author

bazel build -c opt --fat_apk_cpu=x86,x86_64,arm64-v8a,armeabi-v7a \
  --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
  //tensorflow/lite/java:tensorflow-lite

@wall3001
Author

(screenshot)

@wall3001
Author

I picked libtensorflowlite_jni.so out of https://central.sonatype.com/artifact/org.tensorflow/tensorflow-lite/2.11.0,
but the build shows the error: undefined reference to 'tflite::FlatBufferModel::~FlatBufferModel()'

@wall3001
Author

(screenshot)

Fixed it by using JDK 11.

@nyadla-sys
Contributor

Are you able to build the Flex TFLite lib for Android phones?

@wall3001
Author

(screenshot)
Failed to build Flex.
