Make GPT-Model configurable #56

Open
fleesty-bfs opened this issue Apr 25, 2024 · 0 comments
Hi there,

It would be nice to be able to configure the GPT model.
Also, it would be helpful to have the error messages of the OpenAI API written to the logs whenever the status code is not 200. I ran into the problem that the standard model was not enabled for my account, so the API only returned a 403 error; only after printing the API output could I see that the error message said the selected model is not allowed.
There were also two errors with bootstrap and ob_flush(), which I have removed in the patch.

My patch for setting the GPT model is quite ugly, but it worked for me. Maybe you have a better idea how to do it (a slightly cleaner variant is sketched after the patch).

--- private/app/php/stream-api.php.orig 2024-04-25 16:51:11.169003708 +0000
+++ private/app/php/stream-api.php      2024-04-25 16:52:22.938023452 +0000
@@ -1,5 +1,4 @@
 <?php
-define('BOOTSTRAP_PATH',  '../../bootstrap.php');
 require_once BOOTSTRAP_PATH;

 session_start();
@@ -31,6 +30,11 @@

 // Read the request payload from the client
 $requestPayload = file_get_contents('php://input');
+if (isset($env) && isset($env['GPT_MODEL'])) {
+       $dataRequest = json_decode($requestPayload, true);
+       $dataRequest["model"] = $env['GPT_MODEL'];
+       $requestPayload = json_encode($dataRequest);
+}

 $ch = curl_init();
 curl_setopt($ch, CURLOPT_URL, $apiUrl);
@@ -45,7 +49,9 @@
 ]);
 curl_setopt($ch, CURLOPT_WRITEFUNCTION, function($ch, $data) {
        echo $data;
-       ob_flush();
+        if (curl_getinfo($ch, CURLINFO_HTTP_CODE) != 200) {
+               error_log($data);
+        }
        flush();
        return strlen($data);
 });
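For the record, a slightly tidier sketch of the same override, not taken from the project: it just factors the JSON handling into a small helper and forwards the payload unchanged if it cannot be decoded. It still assumes that bootstrap.php exposes the .env values as $env, as in the patch above; the helper name overrideModel() is made up.

<?php
// Hypothetical helper (not part of the project): override the "model" field
// of the request payload when GPT_MODEL is configured in the environment.
function overrideModel(string $requestPayload, array $env): string
{
    if (empty($env['GPT_MODEL'])) {
        return $requestPayload;   // nothing configured, pass through unchanged
    }

    $dataRequest = json_decode($requestPayload, true);
    if (!is_array($dataRequest)) {
        error_log('stream-api: could not decode request payload, forwarding as-is');
        return $requestPayload;   // do not break the request on malformed JSON
    }

    $dataRequest['model'] = $env['GPT_MODEL'];
    return json_encode($dataRequest);
}

// Usage in stream-api.php, replacing the block added in the patch:
$requestPayload = file_get_contents('php://input');
$requestPayload = overrideModel($requestPayload, $env ?? []);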

Then you should add the GPT_MODEL option to .env.example:

;GPT model to be used
;GPT_MODEL="gpt-4-turbo-preview"
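And purely as an illustration of how that value could end up in $env (I don't know how the project's loader actually works), a minimal sketch assuming bootstrap.php reads the INI-style .env with parse_ini_file():

<?php
// Hypothetical loader, e.g. in bootstrap.php; the project's real .env handling
// may differ. parse_ini_file() understands the ';' comment lines used above.
$env = parse_ini_file(__DIR__ . '/.env') ?: [];

// $env['GPT_MODEL'] is "gpt-4-turbo-preview" once the line is uncommented and
// simply absent otherwise, so the isset() check in the patch still applies.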

Kind regards
Felix

fleesty-bfs changed the title from "Support Proxy and make GPT-Model configurable" to "Make GPT-Model configurable" on Apr 25, 2024