Added exportFormat parameter to LLMChatView #48

Merged
6 changes: 4 additions & 2 deletions Sources/SpeziLLM/SpeziLLM.docc/SpeziLLM.md
@@ -158,13 +158,15 @@ struct LLMDemoView: View {

### LLM Chat View

- The ``LLMChatView`` and ``LLMChatViewSchema`` present a basic chat views that enables users to chat with a Spezi LLM in a typical chat-like fashion. The input can be either typed out via the iOS keyboard or provided as voice input and transcribed into written text.
+ The ``LLMChatView`` and ``LLMChatViewSchema`` present basic chat views that enable users to chat with a Spezi LLM in a typical chat-like fashion. The input can be either typed out via the iOS keyboard or provided as voice input and transcribed into written text.
The ``LLMChatViewSchema`` takes an ``LLMSchema`` instance to define which LLM in what configuration should be used for the text inference.
- The ``LLMChatView`` is passed an ``LLMSession`` that represents the LLM in execution containing state and context.
+ The ``LLMChatView`` is passed an ``LLMSession`` that represents the LLM in execution containing state and context, and an optional `ChatView/ChatExportFormat` that defines the format of the to-be-exported `SpeziChat/Chat` (can be any of `.pdf`, `.text`, `.json`).

> Tip: The ``LLMChatView`` and ``LLMChatViewSchema`` build on top of the [SpeziChat package](https://swiftpackageindex.com/stanfordspezi/spezichat/documentation).
For more details, please refer to the DocC documentation of the [`ChatView`](https://swiftpackageindex.com/stanfordspezi/spezichat/documentation/spezichat/chatview).

+ > Tip: By default, the ``LLMChatView`` presents no share button in the toolbar for exporting the current `SpeziChat/Chat`. To add this element or change the export functionality, pass the desired export format via the `exportFormat` parameter of ``LLMChatView/init(session:exportFormat:)``.

#### Usage

An example usage of the ``LLMChatViewSchema`` can be seen in the following example.
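For reference, a minimal sketch of such a usage (assuming the ``LLMMockSchema`` that SpeziLLM provides for testing, and the `exportFormat` parameter introduced in this PR) could look like this:

```swift
import SpeziLLM
import SwiftUI

struct LLMDemoChatView: View {
    var body: some View {
        // LLMChatViewSchema instantiates the LLMSession internally from the schema;
        // exportFormat: .pdf adds a toolbar share button that exports the chat as a PDF.
        LLMChatViewSchema(with: LLMMockSchema(), exportFormat: .pdf)
    }
}
```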
19 changes: 13 additions & 6 deletions Sources/SpeziLLM/Views/LLMChatView.swift
@@ -13,7 +13,7 @@ import SwiftUI

/// Chat view that enables users to interact with an LLM based on an ``LLMSession``.
///
- /// The ``LLMChatView`` takes an ``LLMSession`` instance as parameter within the ``LLMChatView/init(session:)``. The ``LLMSession`` is the executable version of the LLM containing context and state as defined by the ``LLMSchema``.
+ /// The ``LLMChatView`` takes an ``LLMSession`` instance and an optional `ChatView/ChatExportFormat` as parameters within the ``LLMChatView/init(session:exportFormat:)``. The ``LLMSession`` is the executable version of the LLM containing context and state as defined by the ``LLMSchema``.
///
/// The input can be either typed out via the iOS keyboard or provided as voice input and transcribed into written text.
///
@@ -28,6 +28,7 @@ import SwiftUI
/// The next code examples demonstrate how to use the ``LLMChatView`` with ``LLMSession``s.
///
/// The ``LLMChatView`` must be passed a ``LLMSession``, meaning a ready-to-use LLM, resulting in the need for the developer to manually allocate the ``LLMSession`` via the ``LLMRunner`` and ``LLMSchema`` (which includes state management).
+ /// The ``LLMChatView`` may also be passed a `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`; possible export formats are `.pdf`, `.text`, and `.json`.
///
/// In order to simplify the usage of an ``LLMSession``, SpeziLLM provides the ``LLMSessionProvider`` property wrapper that conveniently instantiates an ``LLMSchema`` to an ``LLMSession``.
/// The `@LLMSessionProvider` wrapper abstracts away the necessity to use the ``LLMRunner`` from the SwiftUI `Environment` within a `.task()` view modifier to instantiate the ``LLMSession``.
@@ -42,7 +43,7 @@ import SwiftUI
/// @State var muted = true
///
/// var body: some View {
- ///         LLMChatView(session: $llm)
+ ///         LLMChatView(session: $llm, exportFormat: .pdf)
/// .speak(llm.context, muted: muted)
/// .speechToolbarButton(muted: $muted)
/// }
@@ -51,18 +52,19 @@ import SwiftUI
public struct LLMChatView<Session: LLMSession>: View {
/// The LLM in execution, as defined by the ``LLMSchema``.
@Binding private var llm: Session

/// Indicates if the input field is disabled.
@MainActor private var inputDisabled: Bool {
llm.state.representation == .processing
}
+     /// Defines the export format of the to-be-exported `SpeziChat/Chat`
+     private let exportFormat: ChatView.ChatExportFormat?


public var body: some View {
ChatView(
$llm.context,
disableInput: inputDisabled,
-             exportFormat: .pdf,
+             exportFormat: exportFormat,
messagePendingAnimation: .automatic
)
.viewStateAlert(state: llm.state)
@@ -92,9 +94,14 @@ public struct LLMChatView<Session: LLMSession>: View {
/// Creates a ``LLMChatView`` with a `Binding` of a ``LLMSession`` that provides developers with a basic chat view to interact with a Spezi LLM.
///
/// - Parameters:
-     ///   - model: A `Binding` of a ``LLMSession`` that contains the ready-to-use LLM to generate outputs based on user input.
-     public init(session: Binding<Session>) {
+     ///   - session: A `Binding` of a ``LLMSession`` that contains the ready-to-use LLM to generate outputs based on user input.
+     ///   - exportFormat: An optional `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`.
+     public init(
+         session: Binding<Session>,
+         exportFormat: ChatView.ChatExportFormat? = nil
+     ) {
self._llm = session
+         self.exportFormat = exportFormat
}
}
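To illustrate the revised initializer, a hypothetical SwiftUI view (using the `LLMMockSchema`/`LLMMockSession` mock types that SpeziLLM ships for testing) might wire everything up as follows:

```swift
import SpeziChat
import SpeziLLM
import SwiftUI

struct LLMDemoView: View {
    // The property wrapper instantiates the LLMSession from the given LLMSchema,
    // abstracting away manual allocation via the LLMRunner.
    @LLMSessionProvider(schema: LLMMockSchema()) var llm: LLMMockSession

    var body: some View {
        // Passing `.json` renders a share button that exports the chat as JSON;
        // omitting `exportFormat` (it defaults to nil) hides the share button entirely.
        LLMChatView(session: $llm, exportFormat: .json)
    }
}
```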

15 changes: 12 additions & 3 deletions Sources/SpeziLLM/Views/LLMChatViewSchema.swift
@@ -6,6 +6,7 @@
// SPDX-License-Identifier: MIT
//

+ import SpeziChat
import SwiftUI


@@ -32,21 +33,29 @@ import SwiftUI
/// }
/// }
/// ```
+ ///
+ /// The ``LLMChatViewSchema`` may also be passed a `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`; possible export formats are `.pdf`, `.text`, and `.json`.
public struct LLMChatViewSchema<Schema: LLMSchema>: View {
@LLMSessionProvider<Schema> var llm: Schema.Platform.Session
+     private let exportFormat: ChatView.ChatExportFormat?


public var body: some View {
-         LLMChatView(session: $llm)
+         LLMChatView(session: $llm, exportFormat: exportFormat)
}


/// Creates a ``LLMChatViewSchema`` with an ``LLMSchema`` that provides developers with a basic chat view to interact with a Spezi LLM.
///
/// - Parameters:
/// - schema: The ``LLMSchema`` that defines the to-be-used LLM to generate outputs based on user input.
-     public init(with schema: Schema) {
+     ///   - exportFormat: An optional `ChatView/ChatExportFormat` to enable the chat export functionality and define the format of the to-be-exported `SpeziChat/Chat`.
+     public init(
+         with schema: Schema,
+         exportFormat: ChatView.ChatExportFormat? = nil
+     ) {
self._llm = LLMSessionProvider(schema: schema)
+         self.exportFormat = exportFormat
}
}
