If it's costing you $1.80 for 2 questions, that's $0.90 a question, which works out to around 15k tokens per question, since every 1,000 tokens costs $0.06 with GPT-4. At that point your question prompts are too heavy and are eating up too many tokens. I suggest reducing prompt tokens when questioning by:
reducing the system prompt for GPT
reducing the length of context you're bringing back from Cognitive Search by chunking the searchable data into smaller text chunks (see the sketch after this list)
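For the second point, something along these lines can help. This is not the repo's own ingestion code, just a minimal sketch of splitting text into smaller, token-bounded chunks before indexing and capping how many retrieved passages go into the prompt; `chunk_text`, `cap_context`, and the size/overlap/budget values are illustrative assumptions you'd tune for your data:

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by GPT-4-family models

def chunk_text(text: str, max_tokens: int = 500, overlap: int = 50) -> list[str]:
    """Split text into chunks of at most max_tokens tokens, with a small
    overlap so sentences cut at a boundary also appear in the next chunk."""
    tokens = enc.encode(text)
    chunks = []
    start = 0
    while start < len(tokens):
        end = start + max_tokens
        chunks.append(enc.decode(tokens[start:end]))
        start = end - overlap  # step back slightly to create the overlap
    return chunks

def cap_context(passages: list[str], budget_tokens: int = 2000) -> str:
    """Keep only as many retrieved passages as fit within the prompt token budget."""
    kept, used = [], 0
    for p in passages:
        cost = len(enc.encode(p))
        if used + cost > budget_tokens:
            break
        kept.append(p)
        used += cost
    return "\n\n".join(kept)
```

Smaller chunks plus a hard context budget is usually the biggest lever: the model only ever sees a few hundred tokens per retrieved passage instead of whole documents.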
We have deployed this repository using our own data, with GPT-4-32k as the model.
We found that Azure OpenAI becomes extremely expensive when using this solution.
One example:
2 questions --> $1.80
On top of that, Azure Cognitive Search alone is around $7 a day, and the frontend and backend apps are each around $15 a day.
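To put that in perspective, a rough back-of-the-envelope estimate (the 100 questions per day is just an assumption to illustrate scaling):

```python
per_question = 1.80 / 2            # ~$0.90 per question with GPT-4-32k
fixed_per_day = 7 + 15 + 15        # Cognitive Search + frontend + backend
questions_per_day = 100            # hypothetical organisation-wide usage
print(fixed_per_day + per_question * questions_per_day)  # ~$127 per day
```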
If more people in the organisation start using this tool, it will get expensive very fast.
Are there any ideas on how to lower the cost of using this solution?
Thank you in advance :)