
Super Simple ChatUI

Super Simple ChatUI Screenshot

Overview

Super Simple ChatUI is a React/TypeScript project that provides a simple and intuitive frontend for interacting with a local LLM (Large Language Model) through Ollama. This project enables users to interact with their own LLMs locally, ensuring privacy and control over their data.

This project was set up using Vite, which allows for rapid development thanks to features like Hot Module Replacement, TypeScript support, CSS modules, and more.
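A Vite + React + TypeScript project of this kind usually needs only a tiny build configuration. The following is a generic sketch of such a config, not this repository's actual vite.config.ts:

```ts
// Generic Vite config for a React + TypeScript project.
// Sketch only -- not copied from this repository.
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()], // enables JSX transform and Hot Module Replacement for React
});
```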

Installation

Prerequisites

  • Node (v18.17.0 or later)
  • Ollama
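You can quickly confirm both prerequisites from a terminal (on most installs Ollama also accepts `ollama -v`):

> node -v
> ollama --version

If Ollama is installed but its server is not running, `ollama serve` starts the local API.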

Steps

  1. Clone the repository
> git clone https://github.com/longevity-genie/just-chatui.git
> cd just-chatui
  2. Install dependencies
> npm install
  3. Run the development server
> npm run dev
  4. Access the application by visiting the link printed in your terminal (Vite defaults to http://localhost:5173)

Usage

  1. Ensure that Ollama is running on your machine and exposing its API at the default address, http://localhost:11434
  2. Interact with the LLM: use the super-simple-chatui interface to send queries to Ollama and receive responses (see the sketch below)
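Under the hood, a frontend like this talks to Ollama's local HTTP API. The sketch below shows one way a browser could send a chat request; the endpoint and payload follow Ollama's documented /api/chat format, but the helper name and the "llama3" model are illustrative and not taken from this repository's code:

```ts
// Minimal sketch of calling Ollama's local chat API from the browser.
type ChatMessage = { role: 'user' | 'assistant' | 'system'; content: string };

async function chatWithOllama(messages: ChatMessage[]): Promise<string> {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'llama3', messages, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content; // non-streaming responses carry the reply here
}

// Example usage:
// const reply = await chatWithOllama([{ role: 'user', content: 'Hello!' }]);
```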

TODOs

  • Add support for IndexedDB via Dexie (longer-term storage for conversations, system prompts, various settings, etc.); see the sketch after this list
  • Add support for picking from available models via Ollama
  • Add support for chatting with models via the AI Horde
  • Add support for OpenAI's ChatGPT API via API key
  • Write tests! Always with the tests.
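For the IndexedDB TODO above, a Dexie setup could look roughly like this. The table and field names are hypothetical and not part of the current codebase:

```ts
// Hypothetical Dexie schema for persisting conversations and messages.
// Names are illustrative only -- nothing here exists in the repo yet.
import Dexie, { type Table } from 'dexie';

interface Conversation {
  id?: number;        // auto-incremented primary key
  title: string;
  systemPrompt: string;
  createdAt: number;  // epoch ms, indexed for sorting
}

interface Message {
  id?: number;
  conversationId: number; // indexed key pointing back to a conversation
  role: 'user' | 'assistant' | 'system';
  content: string;
}

class ChatDatabase extends Dexie {
  conversations!: Table<Conversation, number>;
  messages!: Table<Message, number>;

  constructor() {
    super('just-chatui');
    this.version(1).stores({
      conversations: '++id, createdAt',
      messages: '++id, conversationId',
    });
  }
}

export const db = new ChatDatabase();
```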

Contributing

Contributions are welcome! Please follow these steps to contribute:

  1. Fork the repository.
  2. Create a new branch (git checkout -b feature-branch).
  3. Make your changes.
  4. Commit your changes (git commit -m 'Add new feature').
  5. Push to the branch (git push origin feature-branch).
  6. Open a pull request.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Acknowledgments
