Commit v0.3.0
See https://github.com/quic/ai-hub-apps/releases/v0.3.0 for changelog.

Signed-off-by: QAIHM Team <[email protected]>
qaihm-bot committed Oct 8, 2024
1 parent 60a8d7e commit 88d9771
Showing 80 changed files with 97,673 additions and 96 deletions.
26 changes: 18 additions & 8 deletions README.md
@@ -12,9 +12,11 @@ With this repository, you can...

### Supported runtimes
* [TensorFlow Lite](https://www.tensorflow.org/lite)
* [ONNX](https://onnxruntime.ai/)

### Supported Deployment Targets
* Android 11 Red Velvet Cake & Newer, API v30+
* Windows 11

### Supported compute units
* CPU, GPU, NPU (includes [hexagon HTP](https://developer.qualcomm.com/hardware/qualcomm-innovators-development-kit/ai-resources-overview/ai-hardware-cores-accelerators))
@@ -38,15 +40,23 @@ __NOTE: These apps will run without NPU acceleration on non-Snapdragon® chipsets__

2. The README of the selected app will contain build & installation instructions.

## App Directory
## _Android_ App Directory

| Task | OS | Language | Inference API | Special Tags
| -- | -- | -- | -- | --
| [Image Classification](apps/android/ImageClassification) | Android | Java | TensorFlow Lite |
| [Super Resolution](apps/android/SuperResolution) | Android | Java | TensorFlow Lite |
| [Semantic Segmentation](apps/android/SemanticSegmentation) | Android | Java | TensorFlow Lite | OpenCV, Live Camera Feed |
| [Object Detection](apps/windows/ObjectDetection) | Windows | C++ | ONNX Runtime | OpenCV |
| Task | Language | Inference API | Special Tags |
| -- | -- | -- | -- |
| [Image Classification](apps/android/ImageClassification) | Java | TensorFlow Lite |
| [Semantic Segmentation](apps/android/SemanticSegmentation) | Java | TensorFlow Lite | OpenCV, Live Camera Feed |
| [Super Resolution](apps/android/SuperResolution) | Java | TensorFlow Lite |

## _Windows_ App Directory

| Task | Language | Inference API | Special Tags |
| -- | -- | -- | -- |
| [Image Classification](apps/windows/cpp/Classification) | C++ | ONNX | OpenCV |
| [Llama 2 Chat](apps/windows/cpp/ChatApp) | C++ | ONNX |
| [Object Detection](apps/windows/cpp/ObjectDetection) | C++ | ONNX | OpenCV |
| [Super Resolution](apps/windows/cpp/SuperResolution) | C++ | ONNX | OpenCV |
| [Whisper Speech-to-Text](apps/windows/python/Whisper) | Python | ONNX |

## LICENSE

2 changes: 1 addition & 1 deletion apps/android/ImageClassification/README.md
@@ -18,7 +18,7 @@ The app aims to showcase best practices for using **TF Lite** for model inference
## Build the APK
1. Download or export a [compatible model](#compatible-ai-hub-models) from [AI Hub Models](https://aihub.qualcomm.com/mobile/models).
2. Copy the `.tflite` file to `src/main/assets/<your_model>.tflite`
3. In [../gradle.properties](../gradle.properties), modify the value of `imageclassification_tfLiteModelAsset` to the name of your model file (`<your_model>.tflite`)
3. In [../gradle.properties](../gradle.properties), modify the value of `classification_tfLiteModelAsset` to the name of your model file (`<your_model>.tflite`)
4. Open **the PARENT folder (`android`) (NOT THIS FOLDER)** in Android Studio, run gradle sync, and build the `ImageClassification` target.
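
For instance, after steps 2 and 3 the relevant line in `../gradle.properties` might read as follows (the model filename here is a hypothetical placeholder, not a file shipped with the repository):

```properties
# Hypothetical example: point the classification app at your exported model.
# Replace mobilenet_v2.tflite with the file you copied to src/main/assets/.
classification_tfLiteModelAsset=mobilenet_v2.tflite
```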

## Supported Hardware (TF Lite Delegates)
9 changes: 2 additions & 7 deletions apps/android/ImageClassification/build.gradle
@@ -23,12 +23,6 @@ android {
}
}

sourceSets {
main {
jniLibs.srcDirs = [rootProject.ext.qnnJniLibs]
}
}

compileOptions {
sourceCompatibility JavaVersion.valueOf("VERSION_$javaSourceCompatibilityVersion")
targetCompatibility JavaVersion.valueOf("VERSION_$javaTargetCompatibilityVersion")
@@ -61,7 +55,8 @@ dependencies {
implementation 'org.tensorflow:tensorflow-lite-gpu:2.16.1'
implementation 'org.tensorflow:tensorflow-lite-gpu-api:2.16.1'
implementation 'org.tensorflow:tensorflow-lite-gpu-delegate-plugin:0.4.4'
implementation project(":qnn_sdk")
implementation "com.qualcomm.qti:qnn-runtime:$qnnVersion"
implementation "com.qualcomm.qti:qnn-tflite-delegate:$qnnVersion"
}

if (System.getProperty("user.dir") != project.rootDir.path) {
9 changes: 2 additions & 7 deletions apps/android/SemanticSegmentation/build.gradle
@@ -22,12 +22,6 @@ android {
}
}

sourceSets {
main {
jniLibs.srcDirs = [rootProject.ext.qnnJniLibs]
}
}

compileOptions {
sourceCompatibility JavaVersion.valueOf("VERSION_$javaSourceCompatibilityVersion")
targetCompatibility JavaVersion.valueOf("VERSION_$javaTargetCompatibilityVersion")
@@ -51,7 +45,8 @@ dependencies {
implementation 'org.tensorflow:tensorflow-lite-gpu-api:2.16.1'
implementation 'org.tensorflow:tensorflow-lite-gpu-delegate-plugin:0.4.4'
implementation 'org.opencv:opencv:4.10.0'
implementation project(":qnn_sdk")
implementation "com.qualcomm.qti:qnn-runtime:$qnnVersion"
implementation "com.qualcomm.qti:qnn-tflite-delegate:$qnnVersion"
}

if (System.getProperty("user.dir") != project.rootDir.path) {
9 changes: 2 additions & 7 deletions apps/android/SuperResolution/build.gradle
@@ -22,12 +22,6 @@ android {
}
}

sourceSets {
main {
jniLibs.srcDirs = [rootProject.ext.qnnJniLibs]
}
}

compileOptions {
sourceCompatibility JavaVersion.valueOf("VERSION_$javaSourceCompatibilityVersion")
targetCompatibility JavaVersion.valueOf("VERSION_$javaTargetCompatibilityVersion")
@@ -60,7 +54,8 @@ dependencies {
implementation 'org.tensorflow:tensorflow-lite-gpu:2.16.1'
implementation 'org.tensorflow:tensorflow-lite-gpu-api:2.16.1'
implementation 'org.tensorflow:tensorflow-lite-gpu-delegate-plugin:0.4.4'
implementation project(":qnn_sdk")
implementation "com.qualcomm.qti:qnn-runtime:$qnnVersion"
implementation "com.qualcomm.qti:qnn-tflite-delegate:$qnnVersion"
}

if (System.getProperty("user.dir") != project.rootDir.path) {
4 changes: 0 additions & 4 deletions apps/android/build.gradle
@@ -2,7 +2,3 @@
plugins {
id 'com.android.application' version '8.2.0' apply false
}

ext {
qnnJniLibs = project(":qnn_sdk").getLayout().buildDirectory.dir("libs")
}
2 changes: 1 addition & 1 deletion apps/android/gradle.properties
@@ -38,7 +38,7 @@ androidTargetSDK=34
androidCompileSDK=34

# QNN Settings
qnnVersion=2.22.6.240515
qnnVersion=2.27.0

# Classifier Application Settings
includeClassificationApp=true
48 changes: 0 additions & 48 deletions apps/android/qnn_sdk/build.gradle

This file was deleted.

12 changes: 0 additions & 12 deletions apps/android/settings.gradle
@@ -10,15 +10,6 @@ dependencyResolutionManagement {
repositories {
google()
mavenCentral()
ivy {
url 'https://qaihub-public-assets.s3.us-west-2.amazonaws.com/'
patternLayout {
artifact '/qai-hub-apps/[organization]_[module]/v[revision].[ext]'
}
metadataSources {
artifact()
}
}
}
}

@@ -36,6 +27,3 @@ if (Boolean.valueOf(properties["includeSuperResolutionApp"])) {
if (Boolean.valueOf(properties["includeSemanticSegmentationApp"])) {
include ':SemanticSegmentation'
}

/** Qualcomm Helper Libraries **/
include ":qnn_sdk"
173 changes: 173 additions & 0 deletions apps/windows/cpp/ChatApp/ChatApp.cpp
@@ -0,0 +1,173 @@
// ---------------------------------------------------------------------
// Copyright (c) 2024 Qualcomm Innovation Center, Inc. All rights reserved.
// SPDX-License-Identifier: BSD-3-Clause
// ---------------------------------------------------------------------
#include "ChatApp.hpp"
#include "PromptHandler.hpp"
#include <fstream>
#include <iostream>
#include <regex>

using namespace App;

namespace
{

constexpr const int c_chat_separater_length = 80;

//
// ChatSplit - Line to split during Chat for UX
// Adds split line to separate out sections in output.
//
void ChatSplit(bool end_line = true)
{
std::string split_line(c_chat_separater_length, '-');
std::cout << "\n" << split_line;
if (end_line)
{
std::cout << "\n";
}
}

//
// GenieCallBack - Callback to handle response from Genie
// - Captures response from Genie into user_data
// - Print response to stdout
// - Add ChatSplit upon sentence completion
//
void GenieCallBack(const char* response_back, const GenieDialog_SentenceCode_t sentence_code, const void* user_data)
{
std::string* user_data_str = static_cast<std::string*>(const_cast<void*>(user_data));
user_data_str->append(response_back);

// Write user response to output.
std::cout << response_back;
if (sentence_code == GenieDialog_SentenceCode_t::GENIE_DIALOG_SENTENCE_END)
{
ChatSplit(false);
}
}

//
// LoadModelConfig - Loads model config file
// - Loads config file in memory
// - Replaces place-holders with user provided values
//
std::string LoadModelConfig(const std::string& model_config_path,
const std::string& models_path,
const std::string& htp_model_config_path,
const std::string& tokenizer_path)
{
std::string config;
// Read config file into memory
std::getline(std::ifstream(model_config_path), config, '\0');

// Replace place-holders in config file with user provided paths
config = std::regex_replace(config, std::regex("<models_path>"), models_path);
config = std::regex_replace(config, std::regex("<htp_backend_ext_path>"), htp_model_config_path);
config = std::regex_replace(config, std::regex("<tokenizer_path>"), tokenizer_path);
return config;
}

} // namespace
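
The placeholder substitution performed by `LoadModelConfig` above can be exercised in isolation. The sketch below mirrors its `std::regex_replace` calls for two of the placeholders; the paths and template strings are hypothetical, not the actual Genie config schema:

```cpp
#include <regex>
#include <string>

// Mirror of the std::regex_replace substitutions in LoadModelConfig, reduced
// to two placeholders. The replacement paths supplied by callers are
// hypothetical examples.
std::string ReplacePlaceholders(std::string config,
                                const std::string& models_path,
                                const std::string& tokenizer_path)
{
    // Angle brackets have no special meaning in ECMAScript regex syntax,
    // so each placeholder matches literally.
    config = std::regex_replace(config, std::regex("<models_path>"), models_path);
    config = std::regex_replace(config, std::regex("<tokenizer_path>"), tokenizer_path);
    return config;
}
```

For example, `ReplacePlaceholders("<models_path>/llama2.bin", "C:/models", "C:/models/tokenizer.json")` fills in the models path while leaving the rest of the template untouched.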

ChatApp::ChatApp(const std::string& model_config_path,
const std::string& models_path,
const std::string& htp_config_path,
const std::string& tokenizer_path)
{

// Load model config in-memory
std::string config = LoadModelConfig(model_config_path, models_path, htp_config_path, tokenizer_path);

// Create Genie config
if (GENIE_STATUS_SUCCESS != GenieDialogConfig_createFromJson(config.c_str(), &m_config_handle))
{
throw std::runtime_error("Failed to create the Genie Dialog config. Please check config file.");
}

// Create Genie dialog handle
if (GENIE_STATUS_SUCCESS != GenieDialog_create(m_config_handle, &m_dialog_handle))
{
throw std::runtime_error("Failed to create the Genie Dialog.");
}
}

ChatApp::~ChatApp()
{
if (m_config_handle != nullptr)
{
if (GENIE_STATUS_SUCCESS != GenieDialogConfig_free(m_config_handle))
{
std::cerr << "Failed to free the Genie Dialog config.";
}
}

if (m_dialog_handle != nullptr)
{
if (GENIE_STATUS_SUCCESS != GenieDialog_free(m_dialog_handle))
{
std::cerr << "Failed to free the Genie Dialog.";
}
}
}

void ChatApp::ChatWithUser(const std::string& user_name)
{
AppUtils::Llama2PromptHandler prompt_handler;

// Initiate Chat with infinite loop.
// User to provide `exit` as a prompt to exit.
while (true)
{
std::string user_prompt;
std::string model_response;

// Input user prompt
ChatSplit();
std::cout << user_name << ": ";
std::getline(std::cin, user_prompt);

// Exit prompt
if (user_prompt.compare(c_exit_prompt) == 0)
{
std::cout << "Exiting chat as per " << user_name << "'s request.";
return;
}
// User provides an empty prompt
if (user_prompt.empty())
{
std::cout << "\nPlease enter prompt.\n";
continue;
}

std::string tagged_prompt = prompt_handler.GetPromptWithTag(user_prompt);

// Bot's response
std::cout << c_bot_name << ":";
if (GENIE_STATUS_SUCCESS != GenieDialog_query(m_dialog_handle, tagged_prompt.c_str(),
GenieDialog_SentenceCode_t::GENIE_DIALOG_SENTENCE_COMPLETE,
GenieCallBack, &model_response))
{
throw std::runtime_error("Failed to get response from GenieDialog. Please restart the ChatApp.");
}

if (model_response.empty())
{
// If model response is empty, reset dialog to re-initiate dialog.
// During local testing, we found that in certain cases,
// model response bails out after few iterations during chat.
// If that happens, just reset Dialog handle to continue the chat.
if (GENIE_STATUS_SUCCESS != GenieDialog_reset(m_dialog_handle))
{
throw std::runtime_error("Failed to reset Genie Dialog.");
}
if (GENIE_STATUS_SUCCESS != GenieDialog_query(m_dialog_handle, tagged_prompt.c_str(),
GenieDialog_SentenceCode_t::GENIE_DIALOG_SENTENCE_COMPLETE,
GenieCallBack, &model_response))
{
throw std::runtime_error("Failed to get response from GenieDialog. Please restart the ChatApp.");
}
}
}
}
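
The `GenieCallBack` function above follows a common C-API streaming pattern: the library invokes a plain function pointer with each chunk of generated text plus an opaque `user_data` pointer, and the callback casts that pointer back to the caller's accumulator. A minimal self-contained sketch of the pattern, with a hypothetical stand-in for `GenieDialog_query` since the Genie SDK itself is not reproduced here:

```cpp
#include <string>

// Callback signature mirroring the text/user_data shape of GenieCallBack
// (the real Genie callback also receives a sentence code, omitted here).
using TextCallback = void (*)(const char* chunk, const void* user_data);

// Same cast as GenieCallBack: recover the std::string behind the opaque
// const void* and append the streamed chunk.
void AppendChunk(const char* chunk, const void* user_data)
{
    std::string* out = static_cast<std::string*>(const_cast<void*>(user_data));
    out->append(chunk);
}

// Hypothetical stand-in for GenieDialog_query: streams a response in chunks,
// invoking the callback once per chunk.
void FakeQuery(TextCallback cb, const void* user_data)
{
    cb("Hello, ", user_data);
    cb("world", user_data);
}

// Accumulate every streamed chunk into one response string, the way
// ChatWithUser accumulates model_response across callback invocations.
std::string CollectResponse()
{
    std::string response;
    FakeQuery(AppendChunk, &response);
    return response;
}
```

The `const_cast` is needed because the API declares `user_data` as `const void*` even though the callback mutates the object it points to, a convention the Genie callback above uses as well.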
