Obsidian plugin: AI Chat as Markdown

AI Chat as Markdown lets GPT-4 Omni / Claude 3.5 talk directly into your Obsidian Markdown notes.

Conversations are structured by nesting of headings, so a single note can hold multiple conversations, and even branching conversations.

Please see the documented branching conversation example to understand how that works.
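As a rough sketch (heading names and replies are illustrative; the linked example documents the exact semantics), two branches can share the same parent thread like this:

```markdown
# Trip planning
Are trains or buses faster here?

## AI
Trains, usually.

### Budget branch
What about cost?

### Schedule branch
And at night?
```

Each sub-heading continues the conversation above it, so "Budget branch" and "Schedule branch" both inherit the same parent thread but diverge from there.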

The plugin supports images: with a vision-capable model such as GPT-4 Omni or Claude 3.5, you can talk about the images and diagrams embedded in your markdown.

It can be configured via the Obsidian plugin settings to use any OpenAI-compatible API server.

Demos

Screenshots and examples

Please go to the dedicated screenshots page

Features

  • Multiple branching chats as nested headings anywhere in your notes, see nesting example
  • Optionally configure different AI models for each markdown file via the frontmatter, in addition to the conventional plugin config
  • Use markdown files as system prompts, for example to build up a library of system prompts for different applications.
  • Optionally configure a different system prompt file for each note via the frontmatter, e.g. aicmd-system-prompt-file: ai-chats/system prompt productivity.md
  • Use Obsidian embeds to construct modular and dynamic system prompts, see screenshots
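A per-note override can be sketched like this, using the documented `aicmd-system-prompt-file` frontmatter key (the note body is illustrative):

```markdown
---
aicmd-system-prompt-file: ai-chats/system prompt productivity.md
---

# Focus session
How should I structure today's tasks?
```

With this frontmatter in place, threads in this note use the given system prompt file instead of the one configured in the plugin settings.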

Quickstart

  • Install plugin via community plugins
  • In Obsidian settings, under AI Chat as Markdown, configure the API host (e.g. https://api.openai.com/), the API key (sk-xxxxx), and the model name (e.g. gpt-4o)
  • In your Obsidian note, add the example text # My heading followed by Are you there?, place the cursor inside that section, then, in edit mode, invoke AI Chat as Markdown: Send current thread to AI via the command palette. The answer should appear under a new sub-heading titled ## AI.
    • You could also just select some text, and then invoke AI Chat as Markdown: Send selected text to AI and append the response.
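After the command runs, the note from the quickstart step above should look roughly like this (the AI reply is illustrative):

```markdown
# My heading
Are you there?

## AI
Yes, I'm here. How can I help?
```

Placing the cursor under ## AI, typing a follow-up, and sending the thread again continues the same conversation.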

Manually installing the plugin

  • Copy main.js and manifest.json to your vault at VaultFolder/.obsidian/plugins/your-plugin-id/.
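A minimal sketch of the manual install, run here against a temporary directory standing in for your vault (the plugin folder name is an assumption; it must match the id field in the plugin's manifest.json):

```shell
# Stand-in for your real vault path, e.g. "$HOME/MyVault"
VAULT="$(mktemp -d)"

# Folder name assumed to match the plugin id in manifest.json
PLUGIN_DIR="$VAULT/.obsidian/plugins/ai-chat-as-markdown"
mkdir -p "$PLUGIN_DIR"

# In a real install, main.js and manifest.json come from the release download
touch main.js manifest.json
cp main.js manifest.json "$PLUGIN_DIR/"
ls "$PLUGIN_DIR"
```

Restart Obsidian (or toggle the plugin) afterwards so the new files are picked up.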

Inspired by

Dev quickstart

  • Clone this repo.
  • Make sure your Node.js version is at least 16 (node --version).
  • corepack enable
  • yarn to install dependencies (we use Yarn PnP)
  • yarn run dev to start compilation in watch mode.

Dev Tasks

  • support Obsidian embeds (aka transclusion) so that system prompts can be enriched with additional notes (planned for 1.5.0)
  • Send embedded images to the model
  • settings for default model, key, etc.
  • setup yarn PnP style packages
  • Add README section explaining the nesting-to-conversation algorithm
  • Make debug mode configurable, then feature-flag logging behind that

Maybe

  • implement user-friendly file selector for the system prompt file setting
  • enable per-document / yaml-header overrides of model, system prompt, etc.
  • Optionally add used model to each AI heading
  • ignore %%...%% comment blocks

Dev publish new version

See Create a new release