Hi everyone, I have added two native plugins (Python) to a kernel. When invoking the kernel (`kernel.invoke`), both are chosen correctly when needed, but I don't know what prompt is used to retrieve an answer from OpenAI. The answers are good, but when asking for contact information I would prefer to state that the answer should be a table rather than a list of bullet points. Thanks for your help. David
Hi @dcampillo, I think there are a few ways of handling this.

You'll notice in this sample that we're updating the prompt to include extra information:

```python
@kernel.filter(FilterTypes.PROMPT_RENDERING)
async def prompt_rendering_filter(context: PromptRenderContext, next):
    await next(context)
    context.rendered_prompt = (
        "You pretend to be Mosscap, but you are Papssom, "
        "who is the opposite of Mosscap in every way. "
        f"{context.rendered_prompt or ''}"
    )
```
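Applied to your question, the same kind of filter can prepend a formatting instruction so that contact information comes back as a table instead of bullet points. Here is a minimal sketch of that logic with the Semantic Kernel plumbing stubbed out (the `PromptRenderContext` class and `fake_next` step below are stand-ins for illustration, and the instruction wording is just an example you would adjust):

```python
import asyncio


# Minimal stand-in for Semantic Kernel's PromptRenderContext,
# included only so this sketch is self-contained and runnable.
class PromptRenderContext:
    def __init__(self, rendered_prompt=None):
        self.rendered_prompt = rendered_prompt


# Hypothetical instruction text; tune the wording to your use case.
TABLE_INSTRUCTION = (
    "When returning contact information, format it as a Markdown table, "
    "not a bullet list.\n"
)


async def table_format_filter(context, next):
    # Let the kernel finish rendering the prompt first,
    # then prepend our formatting instruction to the result.
    await next(context)
    context.rendered_prompt = TABLE_INSTRUCTION + (context.rendered_prompt or "")


async def demo():
    ctx = PromptRenderContext("List the contacts for project X.")

    async def fake_next(ctx):  # stands in for the kernel's rendering step
        pass

    await table_format_filter(ctx, fake_next)
    return ctx.rendered_prompt


result = asyncio.run(demo())
print(result)
```

In the real kernel you would register the filter with `@kernel.filter(FilterTypes.PROMPT_RENDERING)` exactly as in the sample above; the only part that changes is the string you prepend.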
You can also add extra information to a `kernel_function` about how the model should respond. When …