
Commit

…xamples into enhancement/streamlit-components
MarcSkovMadsen committed Oct 20, 2023
2 parents fb40662 + 13f7e6e commit 63787a1
Showing 54 changed files with 40 additions and 33 deletions.
Binary file modified docs/assets/thumbnails/basic_chat.png
Binary file modified docs/assets/thumbnails/basic_streaming_chat.png
Binary file modified docs/assets/thumbnails/basic_streaming_chat_async.png
Binary file modified docs/assets/thumbnails/component_environment_widget.png
Binary file modified docs/assets/thumbnails/feature_chained_response.png
Binary file modified docs/assets/thumbnails/feature_delayed_placeholder.png
Binary file modified docs/assets/thumbnails/feature_replace_response.png
Binary file modified docs/assets/thumbnails/feature_slim_interface.png
Binary file modified docs/assets/thumbnails/langchain_llama_and_mistral.png
Binary file modified docs/assets/thumbnails/langchain_math_assistant.png
Binary file modified docs/assets/thumbnails/langchain_pdf_assistant.png
Binary file modified docs/assets/thumbnails/langchain_with_memory.png
Binary file added docs/assets/thumbnails/mistral_and_llama.png
Binary file modified docs/assets/thumbnails/mistral_chat.png
Binary file modified docs/assets/thumbnails/mistral_with_memory.png
Binary file modified docs/assets/thumbnails/openai_async_chat.png
Binary file modified docs/assets/thumbnails/openai_authentication.png
Binary file modified docs/assets/thumbnails/openai_chat.png
Binary file modified docs/assets/thumbnails/openai_hvplot.png
Binary file modified docs/assets/thumbnails/openai_image_generation.png
Binary file modified docs/assets/thumbnails/openai_two_bots.png
Binary file modified docs/assets/videos/basic_chat.mp4
Binary file modified docs/assets/videos/basic_streaming_chat.mp4
Binary file modified docs/assets/videos/basic_streaming_chat_async.mp4
Binary file modified docs/assets/videos/component_environment_widget.mp4
Binary file modified docs/assets/videos/feature_chained_response.mp4
Binary file modified docs/assets/videos/feature_delayed_placeholder.mp4
Binary file modified docs/assets/videos/feature_replace_response.mp4
Binary file modified docs/assets/videos/feature_slim_interface.mp4
Binary file modified docs/assets/videos/langchain_llama_and_mistral.mp4
Binary file modified docs/assets/videos/langchain_math_assistant.mp4
Binary file modified docs/assets/videos/langchain_pdf_assistant.mp4
Binary file modified docs/assets/videos/langchain_with_memory.mp4
Binary file added docs/assets/videos/mistral_and_llama.mp4
Binary file modified docs/assets/videos/mistral_chat.mp4
Binary file modified docs/assets/videos/mistral_with_memory.mp4
Binary file modified docs/assets/videos/openai_async_chat.mp4
Binary file modified docs/assets/videos/openai_authentication.mp4
Binary file modified docs/assets/videos/openai_chat.mp4
Binary file modified docs/assets/videos/openai_hvplot.mp4
Binary file modified docs/assets/videos/openai_image_generation.mp4
Binary file modified docs/assets/videos/openai_two_bots.mp4
8 changes: 4 additions & 4 deletions docs/examples/features/feature_chained_response.py
@@ -18,16 +18,16 @@ async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
         yield {
             "user": ARM_BOT,
             "avatar": "🦾",
-            "value": f"Hey, {LEG_BOT}! Did you hear the user?",
+            "object": f"Hey, {LEG_BOT}! Did you hear the user?",
         }
         instance.respond()
     elif user == ARM_BOT:
-        user_message = instance.value[-2]
-        user_contents = user_message.value
+        user_message = instance.objects[-2]
+        user_contents = user_message.object
         yield {
             "user": LEG_BOT,
             "avatar": "🦿",
-            "value": f'Yeah! They said "{user_contents}".',
+            "object": f'Yeah! They said "{user_contents}".',
         }


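This hunk shows the rename that recurs through the whole commit: the chat message payload moves from a `value` key/attribute to `object`, and the interface's message list from `ChatInterface.value` to `ChatInterface.objects`. A minimal stdlib-only sketch of the renamed access pattern, using a hypothetical `Message` dataclass as a stand-in for `pn.chat.ChatMessage` (not Panel's real class):

```python
from dataclasses import dataclass


# Stand-in for pn.chat.ChatMessage -- hypothetical, stdlib-only.
# The point is the attribute name: the payload now lives on `.object`.
@dataclass
class Message:
    user: str
    object: str


# Stand-in for ChatInterface.objects, the list of messages in the chat.
objects = [
    Message(user="User", object="Hello bots!"),
    Message(user="Arm Bot", object="Hey, Leg Bot! Did you hear the user?"),
]

# Same access pattern as the diff: step two messages back, read its payload,
# and build a reply dict keyed by "object" (formerly "value").
user_message = objects[-2]
user_contents = user_message.object
reply = {
    "user": "Leg Bot",
    "avatar": "🦿",
    "object": f'Yeah! They said "{user_contents}".',
}
```

Under the old API the two reads would have been `instance.value[-2]` and `user_message.value`; the rename makes chat messages consistent with the `object` parameter Panel panes already use.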
2 changes: 1 addition & 1 deletion docs/examples/langchain/langchain_math_assistant.py
@@ -14,7 +14,7 @@

 async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
     final_answer = await llm_math.arun(question=contents)
-    instance.stream(final_answer, message=instance.value[-1])
+    instance.stream(final_answer, message=instance.objects[-1])


 chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="Langchain")
6 changes: 3 additions & 3 deletions docs/examples/langchain/langchain_pdf_assistant.py
@@ -132,7 +132,7 @@ def _send_not_ready_message(chat_interface) -> bool:
     message = _get_validation_message()

     if message:
-        chat_interface.send({"user": "System", "value": message}, respond=False)
+        chat_interface.send({"user": "System", "object": message}, respond=False)
     return bool(message)


@@ -142,14 +142,14 @@ async def respond(contents, user, chat_interface):
     if chat_interface.active == 0:
         chat_interface.active = 1
         chat_interface.active_widget.placeholder = "Ask questions here!"
-        yield {"user": "OpenAI", "value": "Let's chat about the PDF!"}
+        yield {"user": "OpenAI", "object": "Let's chat about the PDF!"}
         return

     response, documents = _get_response(contents)
     pages_layout = pn.Accordion(*documents, sizing_mode="stretch_width", max_width=800)
     answers = pn.Column(response["result"], pages_layout)

-    yield {"user": "OpenAI", "value": answers}
+    yield {"user": "OpenAI", "object": answers}


 chat_interface = pn.chat.ChatInterface(
8 changes: 4 additions & 4 deletions docs/examples/mistral/mistral_with_memory.py
@@ -18,12 +18,12 @@ def apply_template(history):
     prompt = ""
     for i, message in enumerate(history):
         if i == 0:
-            prompt += f"<s>[INST]{SYSTEM_INSTRUCTIONS} {message.value}[/INST]"
+            prompt += f"<s>[INST]{SYSTEM_INSTRUCTIONS} {message.object}[/INST]"
         else:
             if message.user == "Mistral":
-                prompt += f"{message.value}</s>"
+                prompt += f"{message.object}</s>"
             else:
-                prompt += f"""[INST]{message.value}[/INST]"""
+                prompt += f"""[INST]{message.object}[/INST]"""
     return prompt


@@ -42,7 +42,7 @@ async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
     )

     llm = llms["mistral"]
-    history = [message for message in instance.value]
+    history = [message for message in instance.objects]
     prompt = apply_template(history)
     response = llm(prompt, stream=True)
     message = ""
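The `apply_template` function being updated here builds the Mistral-instruct prompt format, and its logic can be exercised without Panel or a model. A sketch with a stand-in `ChatMessage` dataclass and made-up `SYSTEM_INSTRUCTIONS` text (both hypothetical; the real example iterates over `pn.chat.ChatMessage` objects from `instance.objects`):

```python
from dataclasses import dataclass

SYSTEM_INSTRUCTIONS = "Answer briefly."  # made-up system text for illustration


@dataclass
class ChatMessage:  # stand-in: only the two fields the template reads
    user: str
    object: str


def apply_template(history):
    # Mistral-instruct format: the first user turn carries the system text,
    # assistant turns are closed with </s>, later user turns get bare [INST] tags.
    prompt = ""
    for i, message in enumerate(history):
        if i == 0:
            prompt += f"<s>[INST]{SYSTEM_INSTRUCTIONS} {message.object}[/INST]"
        elif message.user == "Mistral":
            prompt += f"{message.object}</s>"
        else:
            prompt += f"[INST]{message.object}[/INST]"
    return prompt


history = [
    ChatMessage("User", "Hi!"),
    ChatMessage("Mistral", "Hello!"),
    ChatMessage("User", "What is Panel?"),
]
print(apply_template(history))
# -> <s>[INST]Answer briefly. Hi![/INST]Hello!</s>[INST]What is Panel?[/INST]
```

Because the whole history is re-templated on every callback, the model sees the full conversation each turn, which is what gives this example its "memory".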
2 changes: 1 addition & 1 deletion docs/examples/openai/openai_hvplot.py
@@ -56,7 +56,7 @@ async def respond_with_executor(code: str):
     plot = exec_with_return(code=code, global_context=context)
     return {
         "user": "Executor",
-        "value": pn.Tabs(
+        "object": pn.Tabs(
             ("Plot", plot),
             ("Code", code_block),
         ),
4 changes: 2 additions & 2 deletions docs/examples/openai/openai_two_bots.py
@@ -32,9 +32,9 @@ async def callback(
     message = ""
     async for chunk in response:
         message += chunk["choices"][0]["delta"].get("content", "")
-        yield {"user": callback_user, "avatar": callback_avatar, "value": message}
+        yield {"user": callback_user, "avatar": callback_avatar, "object": message}

-    if len(instance.value) % 6 == 0: # stop at every 6 messages
+    if len(instance.objects) % 6 == 0: # stop at every 6 messages
         instance.send(
             "That's it for now! Thanks for chatting!", user="System", respond=False
         )
8 changes: 4 additions & 4 deletions docs/features.md
@@ -37,16 +37,16 @@ async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
         yield {
             "user": ARM_BOT,
             "avatar": "🦾",
-            "value": f"Hey, {LEG_BOT}! Did you hear the user?",
+            "object": f"Hey, {LEG_BOT}! Did you hear the user?",
         }
         instance.respond()
     elif user == ARM_BOT:
-        user_message = instance.value[-2]
-        user_contents = user_message.value
+        user_message = instance.objects[-2]
+        user_contents = user_message.object
         yield {
             "user": LEG_BOT,
             "avatar": "🦿",
-            "value": f'Yeah! They said "{user_contents}".',
+            "object": f'Yeah! They said "{user_contents}".',
         }


8 changes: 4 additions & 4 deletions docs/langchain.md
@@ -133,7 +133,7 @@ pn.extension(design="material")

 async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
     final_answer = await llm_math.arun(question=contents)
-    instance.stream(final_answer, message=instance.value[-1])
+    instance.stream(final_answer, message=instance.objects[-1])


 chat_interface = pn.chat.ChatInterface(callback=callback, callback_user="Langchain")
@@ -302,7 +302,7 @@ def _send_not_ready_message(chat_interface) -> bool:
     message = _get_validation_message()

     if message:
-        chat_interface.send({"user": "System", "value": message}, respond=False)
+        chat_interface.send({"user": "System", "object": message}, respond=False)
     return bool(message)


@@ -312,14 +312,14 @@ async def respond(contents, user, chat_interface):
     if chat_interface.active == 0:
         chat_interface.active = 1
         chat_interface.active_widget.placeholder = "Ask questions here!"
-        yield {"user": "OpenAI", "value": "Let's chat about the PDF!"}
+        yield {"user": "OpenAI", "object": "Let's chat about the PDF!"}
         return

     response, documents = _get_response(contents)
     pages_layout = pn.Accordion(*documents, sizing_mode="stretch_width", max_width=800)
     answers = pn.Column(response["result"], pages_layout)

-    yield {"user": "OpenAI", "value": answers}
+    yield {"user": "OpenAI", "object": answers}


 chat_interface = pn.chat.ChatInterface(
8 changes: 4 additions & 4 deletions docs/mistral.md
@@ -175,12 +175,12 @@ def apply_template(history):
     prompt = ""
     for i, message in enumerate(history):
         if i == 0:
-            prompt += f"<s>[INST]{SYSTEM_INSTRUCTIONS} {message.value}[/INST]"
+            prompt += f"<s>[INST]{SYSTEM_INSTRUCTIONS} {message.object}[/INST]"
         else:
             if message.user == "Mistral":
-                prompt += f"{message.value}</s>"
+                prompt += f"{message.object}</s>"
             else:
-                prompt += f"""[INST]{message.value}[/INST]"""
+                prompt += f"""[INST]{message.object}[/INST]"""
     return prompt


@@ -199,7 +199,7 @@ async def callback(contents: str, user: str, instance: pn.chat.ChatInterface):
     )

     llm = llms["mistral"]
-    history = [message for message in instance.value]
+    history = [message for message in instance.objects]
     prompt = apply_template(history)
     response = llm(prompt, stream=True)
     message = ""
8 changes: 4 additions & 4 deletions docs/openai.md
@@ -263,7 +263,7 @@ async def respond_with_executor(code: str):
     plot = exec_with_return(code=code, global_context=context)
     return {
         "user": "Executor",
-        "value": pn.Tabs(
+        "object": pn.Tabs(
             ("Plot", plot),
             ("Code", code_block),
         ),
@@ -402,9 +402,9 @@ async def callback(
     message = ""
     async for chunk in response:
         message += chunk["choices"][0]["delta"].get("content", "")
-        yield {"user": callback_user, "avatar": callback_avatar, "value": message}
+        yield {"user": callback_user, "avatar": callback_avatar, "object": message}

-    if len(instance.value) % 6 == 0: # stop at every 6 messages
+    if len(instance.objects) % 6 == 0: # stop at every 6 messages
         instance.send(
             "That's it for now! Thanks for chatting!", user="System", respond=False
         )
@@ -420,4 +420,4 @@ chat_interface.send(
 )
 chat_interface.servable()
 ```
-</details>
+</details>
2 changes: 1 addition & 1 deletion tests/ui/test_all.py
@@ -87,7 +87,7 @@ def test_app(server, app_path, port, page):
     # zoom and run should be defined for all examples
     # even if we don't run the video
     run = ACTION[name]
-    zoom = ZOOM[name]
+    zoom = ZOOM.get(name, 1.5)

     # We cannot run these tests in pipelines etc. as they require models downloaded,
     # api keys etc.
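The switch from `ZOOM[name]` to `ZOOM.get(name, 1.5)` is what lets a newly added example run before anyone registers a zoom level for it. A small sketch of the difference (the `ZOOM` contents here are made up; the real table lives elsewhere in test_all.py):

```python
# Hypothetical zoom table for illustration.
ZOOM = {"openai_chat.py": 2.0}

# Indexing raises KeyError for any example missing from the table, so a new
# example (like mistral_and_llama.py) would break the UI tests immediately.
try:
    zoom = ZOOM["mistral_and_llama.py"]
except KeyError:
    zoom = None

# dict.get falls back to a default zoom instead, so new examples just work.
zoom_known = ZOOM.get("openai_chat.py", 1.5)          # explicit entry wins
zoom_default = ZOOM.get("mistral_and_llama.py", 1.5)  # fallback for new files
```

Note the asymmetry with `run = ACTION[name]` on the line above it, which still indexes directly: an example with no driving routine is a genuine error, while a missing zoom level has a sensible default.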
9 changes: 8 additions & 1 deletion tests/ui/user.py
@@ -117,7 +117,7 @@ def langchain_llama_and_mistral(page: Page):
     # Could not get this working as it always starts by downloading models
     chat = ChatInterface(page)
     chat.send("Please explain what kind of model you are in one sentence")
-    page.wait_for_timeout(10000)
+    page.wait_for_timeout(15000)


 def langchain_with_memory(page: Page):
@@ -148,6 +148,12 @@ def langchain_pdf_assistant(page: Page):
     page.wait_for_timeout(10000)


+def mistral_and_llama(page: Page):
+    chat = ChatInterface(page)
+    chat.send("What do you think about HoloViz in a single sentence?")
+    page.wait_for_timeout(15000)
+
+
 def mistral_chat(page: Page):
     chat = ChatInterface(page)
     chat.send("What is HoloViz Panel in one sentence")
@@ -229,6 +235,7 @@ def openai_two_bots(page: Page):
     "langchain_math_assistant.py": langchain_math_assistant,
     "langchain_pdf_assistant.py": langchain_pdf_assistant,
     "langchain_with_memory.py": langchain_with_memory,
+    "mistral_and_llama.py": mistral_and_llama,
     "mistral_chat.py": mistral_chat,
     "mistral_with_memory.py": mistral_with_memory,
     "openai_async_chat.py": openai_async_chat,
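The closing hunk is the registration step: each example filename maps to the Playwright routine that drives it, so wiring in the new `mistral_and_llama` example is a single entry in the dispatch table. A reduced sketch of the pattern (function bodies trimmed to plain returns; the real routines drive a browser page and the real table lists every example):

```python
from typing import Callable, Dict


# Trimmed-down stand-ins for the Playwright routines in user.py.
def mistral_chat(page) -> str:
    return "What is HoloViz Panel in one sentence"


def mistral_and_llama(page) -> str:
    return "What do you think about HoloViz in a single sentence?"


# Dispatch table: the test runner looks up the routine by the example's filename,
# so test_all.py never needs to change when an example is added.
ACTION: Dict[str, Callable] = {
    "mistral_chat.py": mistral_chat,
    "mistral_and_llama.py": mistral_and_llama,
}

run = ACTION["mistral_and_llama.py"]
```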
