[BUG]: Error occurred while streaming response. Streaming error #2302
Labels
needs info / can't replicate
Issues that require additional information and/or cannot currently be replicated, but possible bug
How are you running AnythingLLM?
Docker (remote machine)
What happened?
I'm using Ollama's nomic-embed-text as the embedding model and llama3.1:8b as the LLM.
After I upload the documents and ask a question, the response is generated partially and then fails with the streaming error.
The Docker container logs are as follows:
[Event Logged] - workspace_documents_added
[OllamaEmbedder] Embedding 1 chunks of text with nomic-embed-text:latest.
[STREAM ABORTED] Client requested to abort stream. Exiting LLM stream handler early.
[TELEMETRY SENT] {
  event: 'sent_chat',
  distinctId: 'xxxxxxxx-xxxx-xxxxx-xxxxxx-xxxxxxxxxxxxxxxx',
  properties: {
    multiUserMode: false,
    LLMSelection: 'ollama',
    Embedder: 'ollama',
    VectorDbSelection: 'lancedb',
    runtime: 'docker'
  }
}
[Event Logged] - sent_chat
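The `[STREAM ABORTED]` line suggests the stream ended before the final chunk arrived. For anyone trying to narrow this down outside AnythingLLM: Ollama's chat endpoint streams newline-delimited JSON chunks, with the last chunk carrying `done: true`. A minimal sketch (the sample lines below are hypothetical, mimicking the partial-sentence behaviour described above) that assembles such a stream and flags a truncated one:

```python
import json

def collect_stream(lines):
    """Accumulate an Ollama-style NDJSON chat stream.

    Each line is a JSON object with a partial message; the final
    chunk carries "done": true. Returns the assembled text and
    whether the stream finished cleanly.
    """
    text = []
    done = False
    for line in lines:
        chunk = json.loads(line)
        text.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            done = True
            break
    return "".join(text), done

# Hypothetical sample: a stream cut off before any "done": true chunk,
# which is what an aborted/partial response would look like.
aborted = [
    '{"message": {"content": "The answer"}, "done": false}',
    '{"message": {"content": " is"}, "done": false}',
]
text, done = collect_stream(aborted)
print(repr(text), done)
```

If `done` comes back `False` when calling Ollama directly (e.g. via `curl http://<host>:11434/api/chat`), the truncation is on the Ollama side rather than in AnythingLLM's stream handler.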
Are there known steps to reproduce?
No response