litchi


Litchi is an AI chat extension for JupyterLab.

Requirements

  • JupyterLab >= 4.0.0
  • Node.js 20
  • yarn

Install

To install the extension, execute:

pip install jupyter_litchi

Uninstall

To remove the extension, execute:

pip uninstall jupyter_litchi

How to use it

Once the installation succeeds, start JupyterLab and create a notebook.

You will see the Litchi controls in the notebook toolbar:


Now write your content and choose a model from the model list in the toolbar.


Then use the command palette or click the "send activate cell" button.

After a moment, the reply will appear in a new cell below the current one.


By default, Litchi uses Ollama at http://localhost:11434, but you can configure it to connect to any OpenAI-compatible API.
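For reference, this is a minimal sketch of the request body that an OpenAI-compatible chat endpoint accepts. The model name and prompt below are examples only, not Litchi's actual internals:

```python
import json

# Sketch of the JSON body an OpenAI-compatible chat endpoint
# (e.g. Ollama's /v1/chat/completions) accepts.
def build_chat_request(model, messages):
    # `messages` is a list of {"role": ..., "content": ...} dicts.
    return {"model": model, "messages": messages}

payload = build_chat_request(
    "llama3",  # example model name; pick one from your own model list
    [{"role": "user", "content": "Explain list comprehensions."}],
)
body = json.dumps(payload)  # what gets POSTed to the chat endpoint
```

Any server that accepts this message shape (Ollama, vLLM, a hosted OpenAI-style API, etc.) should work as a backend.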

Contributing

Development install

Note: You will need NodeJS to build the extension package.

The jlpm command is JupyterLab's pinned version of yarn that is installed with JupyterLab. You may use yarn or npm in lieu of jlpm below.

# Clone the repo to your local environment
# Change directory to the litchi directory
# Install package in development mode
pip install -e "."
# Link your development version of the extension with JupyterLab
jupyter labextension develop . --overwrite
# Rebuild extension Typescript source after making changes
jlpm build

You can watch the source directory and run JupyterLab at the same time in different terminals to watch for changes in the extension's source and automatically rebuild the extension.

# Watch the source directory in one terminal, automatically rebuilding when needed
jlpm watch
# Run JupyterLab in another terminal
jupyter lab

With the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).

By default, the jlpm build command generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:

jupyter lab build --minimize=False

Development uninstall

pip uninstall jupyter-litchi

In development mode, you will also need to remove the symlink created by jupyter labextension develop command. To find its location, you can run jupyter labextension list to figure out where the labextensions folder is located. Then you can remove the symlink named litchi within that folder.

Testing the extension

Frontend tests

This extension uses Jest for JavaScript tests.

To run them, execute:

jlpm
jlpm test

Integration tests

This extension uses Playwright for the integration tests (aka user level tests). More precisely, the JupyterLab helper Galata is used to handle testing the extension in JupyterLab.

More information is provided in the ui-tests README.

Packaging the extension

See RELEASE

What's new

0.1.1

  • Renamed the project to jupyter-litchi

0.1.0

  • Chat with Ollama at localhost:11434
  • Select a model from a list

0.1.3

  • Fixed the installer

0.1.4

  • Added settings

0.2.0

  • Added a clean command to clear the session
  • Added settings for the model-list API and the chat API; Litchi can now connect to any OpenAI-compatible API

0.3.0

The implicit chat session has been removed. The notebook itself is now the chat session.

  • The Litchi Chat command sends only the current cell's content and places the reply below
  • The Litchi Contextual command sends the current cell's content together with every marked message above the active cell
  • The Litchi Historical command sends the current cell's content together with all cells above it

Every message sent or received has its 'role' recorded in the cell's metadata.

For the Litchi Contextual command, the messages include only cells that carry a role mark.

To see each cell's role information, use the Litchi Show Roles Toggle command.
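The role-marking idea can be sketched in plain Python. This is a hypothetical illustration only: the metadata key name used here is made up, and the real key Litchi writes may differ.

```python
# Sketch of collecting a "Historical"-style message list from notebook cells.
# The metadata key "litchi_role" is a hypothetical name for illustration.

def collect_history(cells, current_index):
    """Gather role/content messages from all cells up to and including
    the current cell, skipping cells without a role marker."""
    messages = []
    for cell in cells[: current_index + 1]:
        role = cell.get("metadata", {}).get("litchi_role")
        if role is not None:
            messages.append({"role": role, "content": cell["source"]})
    return messages

cells = [
    {"metadata": {"litchi_role": "user"}, "source": "What is a monad?"},
    {"metadata": {"litchi_role": "assistant"}, "source": "A monad is..."},
    {"metadata": {}, "source": "scratch cell, never sent"},
    {"metadata": {"litchi_role": "user"}, "source": "Give an example."},
]
history = collect_history(cells, 3)  # the unmarked scratch cell is skipped
```

A "Contextual" variant would apply the same filter; "Chat" would send only the current cell.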

0.3.1

  • Replaced the "send activate cell" button with three buttons: Chat, Contextual, Historical
  • Added the Litchi Chat Selected command

0.3.2

  • Show each message's role in the cell prompt

0.3.4

  • Disable the toolbar while Litchi is waiting for a response
  • Bug fixes

0.3.5

  • Added the chat commands to the main menu
  • Fixed the Show Roles Toggle command's state

0.3.6

  • Bug fixes
  • The pip package now works

0.3.8

Fixed a bug in the model selector.

0.4.0

The settings page has been improved. The system prompt is now edited in a textarea.

0.4.1

  • Merged the Litchi toolbar into the notebook toolbar
  • Fixed the toolbar going missing when a new notebook was created
  • Unified the chat buttons with the notebook toolbar style

0.4.2

Show an alert dialog if the communication fails.

0.4.3

  • Added Translate To English/Chinese commands and a cell button on markdown/raw cells
  • Additional language translators can be added in settings; they appear in the command palette
  • Added a Unit Test command and a cell button on code cells

0.4.5

  • Added a split cell command that splits a markdown cell's content into markdown/mermaid and code cells. This is useful when an AI response mixes markdown text and code
  • Added continue mode. When it is active, a new markdown cell is added and activated below the AI response
  • Added a cell toolbar button for a historical chat up to the current cell, even when continue mode is deactivated
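The split step can be sketched as follows. This is an approximation of the idea, not Litchi's implementation: regions inside triple-backtick fences become code cells, and everything else stays markdown.

```python
def split_mixed_response(text):
    """Split a mixed markdown/code response into (cell_type, content) pairs.

    Fenced regions (triple backticks) become "code" cells; the rest
    becomes "markdown" cells.
    """
    fence = "`" * 3
    cells = []
    in_code = False
    buf = []
    for line in text.splitlines():
        if line.strip().startswith(fence):
            chunk = "\n".join(buf).strip()
            if chunk:
                cells.append(("code" if in_code else "markdown", chunk))
            buf = []
            in_code = not in_code  # toggle between markdown and code regions
        else:
            buf.append(line)
    chunk = "\n".join(buf).strip()
    if chunk:
        cells.append(("code" if in_code else "markdown", chunk))
    return cells

fence = "`" * 3
reply = ("Here is a loop:\n" + fence + "python\n"
         "for i in range(3):\n    print(i)\n" + fence + "\nDone.")
parts = split_mixed_response(reply)
# → markdown cell, code cell, markdown cell
```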

About Me

My name is Liu Xin; my English name is Mars Liu, and I previously used March Liu. Under that pseudonym I translated the Python 2.2/2.3/2.4/2.5/2.7 Tutorial.

In recent years, I published a book titled "Construction and Implementation of Micro Lisp Interpreter", based on my Jaskell Core library (https://github.com/MarchLiu/jaskell-core). The book introduces interpreter development.

I am one of the earliest users of both the Python Chinese Community and the PostgreSQL Chinese Community. At QCon, I demonstrated a neural network algorithm implemented using SQL CTE syntax.

Donate

Your sponsorship will contribute to the healthy growth of this project.

PayPal
