
Error when using more than 999 characters in prompt. #143

Open
ebildebil opened this issue Oct 6, 2024 · 0 comments
Labels
bug Something isn't working

Comments


ebildebil commented Oct 6, 2024

Which version of assistant are you using?

1.1.0 as configured by Nextcloud AIO

Which version of Nextcloud are you using?

29.0.7

Which browser are you using? In case you are using the phone App, specify the Android or iOS version and device please.

Chrome or Firefox latest

Describe the Bug

May be connected to #59 .

I have configured an external API for use with Nextcloud Assistant. In this case Mistral.
When I tried a prompt today, I got the error below.
My prompt has a total of 1009 characters and 182 words.

Text generation error: An exception occurred while executing a query: SQLSTATE[22001]: String data, right truncated: 7 ERROR: value too long for type character varying(1000)

In the logs I see below error:

{"reqId":"V85FoTrYUb0uA2GqxpJj","level":3,"time":"2024-10-01T16:46:29+00:00","remoteAddr":"127.0.0.1","user":"bisu","app":"PHP","method":"POST","url":"/apps/assistant/f/process_prompt","message":"Undefined array key 7 at /var/www/html/lib/private/AppFramework/Http.php#128","userAgent":"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36","version":"29.0.7.1","data":{"app":"PHP"},"id":"66fc27c87b130"}

General googling suggests this is likely a limit on the number of characters that can be stored in the DB for that field (I could be wrong).
If so, note that newer models have context sizes of 128k tokens, roughly 80k English words, or even 2 million tokens (Gemini Pro).

I have set 18000 as the prompt token limit.

When I edit my prompt down to 991 characters and 178 words, I do not get an error.
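The mismatch between the limits can be sketched as follows. This is a minimal illustration, assuming the prompt is persisted in a VARCHAR(1000) column; the column width is inferred from the SQLSTATE[22001] error message, not from the assistant's actual schema, and the chars-per-token ratio is a rough rule of thumb:

```python
# Sketch of the suspected failure mode: the configured token limit (18000)
# allows prompts far longer than the 1000-character database column.
DB_COLUMN_LIMIT = 1000   # inferred from "character varying(1000)" in the error
TOKEN_LIMIT = 18000      # the configured prompt token limit
CHARS_PER_TOKEN = 4      # rough rule of thumb for English text

max_allowed_chars = TOKEN_LIMIT * CHARS_PER_TOKEN  # ~72000 characters


def fits_in_column(prompt: str) -> bool:
    """True if the prompt can be stored without truncation."""
    return len(prompt) <= DB_COLUMN_LIMIT


print(fits_in_column("x" * 1009))           # False: triggers "value too long"
print(fits_in_column("x" * 991))            # True: stores fine
print(max_allowed_chars > DB_COLUMN_LIMIT)  # True: the two limits disagree
```

This matches the observed behavior: 1009 characters fails, 991 characters succeeds, while the token limit suggests prompts tens of thousands of characters long should be accepted.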

Expected Behavior

There should not be any errors.
The assistant should be able to handle prompt sizes appropriate to the context limits of the models being used.

To Reproduce

Try a prompt with more than 1000 characters.

@ebildebil ebildebil added the bug Something isn't working label Oct 6, 2024