- Plain chat command to send a message without code context
- Always add selected code to the prompt, even if no flag is provided
- Sliding token window to respect the model's context limit
- Use LangChain4j as the default LLM framework
- Compatibility with the latest JetBrains Platform version
- Streaming updates from the LLM
- Sensible timeouts for the Ollama client
- Use the Llama 3.1 model
- Support for selected code in prompts
- Ollama integration to query LLMs
- Chat interface to list messages and enter a new message
- Support for rendering Markdown
- Support for rendering code blocks with syntax highlighting
- Support for slash commands
- Command to start a new conversation
- Help command to list all available commands
- Support for different themes
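The sliding token window mentioned above can be sketched as follows. This is a minimal illustration, not the plugin's actual implementation: the class name, the naive whitespace-based token estimate, and the eviction policy are all assumptions for demonstration.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

// Sketch of a sliding token window: keep appending messages and evict the
// oldest ones until the total token estimate fits the context limit.
// Token counting here is a naive whitespace split — a real tokenizer
// (e.g. the model's own) would be used in practice.
public class TokenWindow {
    private final int maxTokens;
    private final Deque<String> messages = new ArrayDeque<>();

    public TokenWindow(int maxTokens) {
        this.maxTokens = maxTokens;
    }

    static int estimateTokens(String text) {
        return text.isBlank() ? 0 : text.trim().split("\\s+").length;
    }

    private int totalTokens() {
        return messages.stream().mapToInt(TokenWindow::estimateTokens).sum();
    }

    public void add(String message) {
        messages.addLast(message);
        // Drop the oldest messages until we fit, always keeping the newest one.
        while (totalTokens() > maxTokens && messages.size() > 1) {
            messages.removeFirst();
        }
    }

    public List<String> window() {
        return new ArrayList<>(messages);
    }
}
```

With a limit of 5 tokens, adding two 3-token messages evicts the first, so only the most recent message remains in the window.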
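The slash-command support (new-conversation and help commands) could be dispatched along these lines. The command names mirror the entries above, but the handler signatures and messages are illustrative assumptions, not the plugin's API.

```java
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;

// Sketch of slash-command dispatch: input starting with '/' is routed to a
// registered handler; any other input is treated as a plain chat message.
public class SlashCommands {
    // Handlers are hypothetical; a real plugin would register richer objects.
    private final Map<String, Function<String, String>> handlers = Map.of(
        "/new", args -> "Started a new conversation",
        "/help", args -> "Available commands: /new, /help"
    );

    // Returns the command's response, or empty if the input is plain chat.
    public Optional<String> dispatch(String input) {
        if (!input.startsWith("/")) {
            return Optional.empty();
        }
        int space = input.indexOf(' ');
        String name = space < 0 ? input : input.substring(0, space);
        String args = space < 0 ? "" : input.substring(space + 1);
        Function<String, String> handler = handlers.get(name);
        return Optional.ofNullable(handler).map(h -> h.apply(args));
    }
}
```

Plain messages fall through untouched, so the same entry point serves both the chat command and the slash commands.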