System role? #35

Open
beccac00ke opened this issue Aug 11, 2023 · 3 comments

Comments

@beccac00ke

Does anyone with experience with OpenAI know if it is possible to assign a role to the system to prevent it from being able to answer questions outside the information supplied via the data files?

For example, a simpler script found online follows this format (though I don't believe this is possible while using langchain as in the script in this repo):

import openai

openai.api_key = "YOUR_API_KEY"

prompt = """You're a nutritionist chatbot that creates customized meal plans.
Only answer questions related to nutrition.
Only ask questions related to nutrition, health and meal plans."""

messages = [
    {
        "role": "system",
        "content": prompt
    }
]

def get_completion(messages, model="gpt-3.5-turbo"):
    response = openai.ChatCompletion.create(
        model=model,
        messages=messages,
        temperature=0
    )
    return response.choices[0].message["content"]

print(get_completion(messages))

Note: I'm specifically referring to the 'prompt' and 'messages = [ { "role": "system", "content": prompt } ]' lines in the example above.
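For reference, here is a minimal sketch of how a system-style instruction could be attached when going through langchain. It assumes the older langchain 0.0.x chat interfaces (ChatOpenAI, ChatPromptTemplate, LLMChain), and the restriction wording is just an illustration; the retrieval part of the repo's script is left out:

from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.chains import LLMChain

# Hypothetical restriction instruction; plays the same role as the
# messages=[{"role": "system", ...}] entry in the plain OpenAI example above.
system_prompt = (
    "You are an assistant that only answers questions using the supplied documents. "
    "If the answer is not in the documents, say you cannot answer."
)

prompt_template = ChatPromptTemplate.from_messages([
    ("system", system_prompt),
    ("human", "{question}"),
])

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
chain = LLMChain(llm=llm, prompt=prompt_template)

print(chain.run(question="What is the capital of England?"))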

@aaronysl

Hello, have you solved your problem yet?

@beccac00ke
Author

Hello, have you solved your problem yet?

Hi! Sort of. For the project I'm working on, I was able to create a workaround, which was to have a prefix (invisible to the user) prepended to the user input (rough sketch below). As for the program answering questions out of scope: honestly, when I ran the script again and entered a question it shouldn't be able to answer (e.g. 'What is the capital of England?'), it produced the desired output, letting the user know it cannot answer that question. So to be honest I'm not sure what the issue there was, but I have noticed I get a lot of odd problems like that when I use VSCode.
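Rough sketch of the workaround, reusing the prompt and get_completion from the example in the first post; the exact prefix wording here is only an illustration:

# Hidden instruction prepended to whatever the user types; the user never sees it.
PREFIX = (
    "Answer only using the information in the supplied data files. "
    "If the question is out of scope, reply that you cannot answer it.\n\nQuestion: "
)

def ask(user_input, model="gpt-3.5-turbo"):
    messages = [
        {"role": "system", "content": prompt},
        {"role": "user", "content": PREFIX + user_input},  # prefix is invisible to the user
    ]
    return get_completion(messages, model=model)

print(ask("What is the capital of England?"))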

@aaronysl

Thank you for your reply. I have the same problem as you. My need is, for example: a user asks "Is Dr. Wang going to work today?", and the content of my local document is "You can check the working hours of Dr. XXX here." I hope that when I send the content read by langchain from the local document to ChatGPT, it can understand that whether the question is about Dr. Wang or Dr. Li, it should reply to the user with where to check. At present, it seems that Dr. Wang and Dr. Li must each be named explicitly, and there is no way to use "XXX" as a variable.
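One thing that might work here (untested, and the document text below is only a stand-in) is to tell the model explicitly in the system prompt that XXX in the retrieved text is a placeholder for any doctor's name, rather than trying to enumerate every doctor. Sketch, reusing get_completion from the first post:

# Text retrieved from the local document via langchain; XXX is a placeholder.
retrieved_context = "You can check the working hours of Dr. XXX here: <link>"

system_prompt = (
    "Answer using only the context below. In the context, 'XXX' is a placeholder "
    "that stands for any doctor's name (Dr. Wang, Dr. Li, etc.). When the user asks "
    "about a specific doctor, substitute that doctor's name for XXX in your answer.\n\n"
    "Context:\n" + retrieved_context
)

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "Is Dr. Wang going to work today?"},
]

print(get_completion(messages))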
