Force SK to make LLM call #8704
-
I'm having difficulty figuring out how to stop what looks like LLM caching in SK; I sometimes need to force a call to the LLM. I have a plugin function that is called correctly the first time. That function changes state and will deliver a different result the next time it is called, but when I repeat the prompt, the function is not called again. How can I fix this?
-
This is interesting; it seems that when you make your second call you are reusing the chat history from the first.
When you reuse the chat history, it still contains the last response from that function, and the model may assume that response is still valid for further calls. If that's the case, my suggestion is to remove from the chat history any messages related to previous function call responses and see whether the model makes a second call.
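A minimal sketch of that idea, assuming the Python SDK and that your history is a `ChatHistory` whose messages carry `FunctionCallContent` / `FunctionResultContent` items (adjust the imports and names to whatever your SDK version exposes):

```python
# Sketch only: strip prior function-call traffic from a Semantic Kernel
# ChatHistory so the model has no earlier tool output to reuse.
# Class names below are assumptions based on the Python SDK; verify them
# against your installed semantic_kernel version.
from semantic_kernel.contents import (
    AuthorRole,
    ChatHistory,
    FunctionCallContent,
    FunctionResultContent,
)


def strip_function_messages(history: ChatHistory) -> ChatHistory:
    """Return a copy of the history without previous function calls/results."""
    cleaned = ChatHistory()
    for message in history.messages:
        # Drop tool-role messages (these hold the function results).
        if message.role == AuthorRole.TOOL:
            continue
        # Drop assistant messages that carry the function call/result items.
        if any(
            isinstance(item, (FunctionCallContent, FunctionResultContent))
            for item in message.items
        ):
            continue
        cleaned.add_message(message)
    return cleaned
```

Passing the cleaned history into your next invocation should leave the model with no earlier function result to lean on, so it is more likely to call the plugin again.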