
Ollama C# Playground

License: MIT Twitter: elbruno GitHub: elbruno

This project is designed to be opened in GitHub Codespaces as an easy way for anyone to try out SLMs (small language models) entirely in the browser. It is based on the Ollama Python Playground.

  1. Create a new Codespace using the Code button at the top of the repository.

  2. Once the Codespace is loaded, it should have Ollama pre-installed as well as .NET 8.

  3. Ask Ollama to run the SLM of your choice. For example, to run the Phi-3.5 model:

    ollama run phi3.5:mini

    Downloading the model into the Codespace will take a few minutes.

  4. Once you see "success" in the output, you can send a message to that model from the prompt.

    >>> Write a joke about kittens

run ollama and ask for a joke

  5. After several seconds, you should see a response stream in from the model.

  6. To learn about different techniques used with language models, check the sample projects in the .\src folder:

| Project | Description |
| --- | --- |
| Sample01 | Uses the Phi-3 model hosted in Ollama to answer a question. |
| Sample02 | Implements a console chat using Semantic Kernel. |
| Sample03 | Implements RAG using local embeddings and Semantic Kernel. Check the details of the local RAG here. |
| Sample04 | Implements a console chat using Semantic Kernel and uses the Aspire Dashboard to track telemetry. See the Aspire Dashboard entry in the References section to learn more. |
| Sample05 | Adds an Aspire Service Telemetry project to a console chat and shows how to use the Aspire Dashboard to track telemetry. See the Aspire Dashboard entry in the References section to learn more. |
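All of the samples talk to a model hosted locally by Ollama. As a rough sketch of the idea behind Sample01 (this is not the repository's actual code), a question can be sent to Ollama's REST API with nothing but `HttpClient`, assuming Ollama is running on its default port 11434 and the model has already been pulled:

```csharp
// Minimal sketch: ask a locally hosted Ollama model a question over its REST API.
// Assumes Ollama is listening on the default port (11434) and the phi3.5:mini
// model has already been downloaded with "ollama run phi3.5:mini".
using System.Net.Http.Json;
using System.Text.Json;

using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

// Ollama's /api/generate endpoint accepts the model name and a prompt;
// stream = false returns the full answer in a single JSON response.
var response = await http.PostAsJsonAsync("/api/generate", new
{
    model = "phi3.5:mini",
    prompt = "What is the capital of Italy?",
    stream = false
});
response.EnsureSuccessStatusCode();

using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
Console.WriteLine(doc.RootElement.GetProperty("response").GetString());
```

Because the model runs entirely inside the Codespace, no API key or external service is needed; the answer comes back from the local process.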

How to run a sample

  1. Open a terminal and navigate to the desired project. For example, let's run Sample02, the console chat.

    cd .\src\Sample02\
  2. Run the project with the command:

    dotnet run
  3. The Sample02 project defines a custom system message:

    var history = new ChatHistory();
    history.AddSystemMessage("You are a useful chatbot. If you don't know an answer, say 'I don't know!'. Always reply in a funny ways. Use emojis if possible.");
  4. When the user asks a question, such as "What is the capital of Italy?", the chat replies using the local model.

    The output is similar to this one:

    Chat running demo
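The console chat above can be sketched as a Semantic Kernel loop that keeps the `ChatHistory` (including the system message) and sends each user question to the locally hosted model. This is an illustrative sketch, not the repository's Sample02 code: it assumes the Microsoft.SemanticKernel NuGet package and Ollama's OpenAI-compatible endpoint on port 11434, and connector overloads may differ between versions.

```csharp
// Sketch of a Sample02-style console chat with Semantic Kernel.
// Assumptions: Microsoft.SemanticKernel package installed, Ollama running
// locally and exposing its OpenAI-compatible endpoint at /v1.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(
        modelId: "phi3.5:mini",
        endpoint: new Uri("http://localhost:11434/v1"),
        apiKey: "ollama") // Ollama ignores the key, but the connector requires one
    .Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();

// The custom system message shown in step 3.
var history = new ChatHistory();
history.AddSystemMessage("You are a useful chatbot. If you don't know an answer, say 'I don't know!'. Always reply in a funny ways. Use emojis if possible.");

while (true)
{
    Console.Write("Q: ");
    var question = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(question)) break;

    // Keep the full conversation in the history so the model has context.
    history.AddUserMessage(question);
    var reply = await chat.GetChatMessageContentAsync(history);
    history.Add(reply);
    Console.WriteLine(reply.Content);
}
```

Keeping the assistant's replies in the history is what makes follow-up questions work: the model sees the whole conversation, not just the latest prompt.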

Video Tutorials

If you want to learn more about how to use this repo, check the following videos:

Overview of Ollama C# Playground Repository

Retrieval-Augmented Generation (RAG) with .NET 8: A Full Local Resource Guide

Test Aspire Dashboard using Codespaces

Add Aspire Service Defaults to a Console Chat App

References

Author

👤 Bruno Capuano

🤝 Contributing

Contributions, issues and feature requests are welcome!

Feel free to check the issues page.

Show your support

Give a ⭐️ if this project helped you!

📝 License

Copyright © 2024 Bruno Capuano.

This project is MIT licensed.

