ConvoStack is a plug-and-play embeddable AI chatbot widget and backend deployment framework for your website. It is completely free, open source, and currently running on our docs website!
The core technologies are:
- React (frontend)
- Express.js (backend)
- Redis (production cache & pub/sub)
- Langchain (AI agent framework integration)
To learn more about the project, compatible technologies, and how to get started, check out the docs.
To see a live demo of ConvoStack, check out our free playground!
Get your AI chatbot up and running in minutes with our Quickstart repo and guide:
In the following example, we connect a Langchain OpenAI LLM to the chatbot playground.
import * as dotenv from "dotenv";
// Loads environment variables (including the OpenAI API key) from a local .env file
dotenv.config();
import { playground } from "convostack/playground";
import { IAgentContext, IAgentResponse } from "convostack/agent";
import { OpenAI } from "langchain/llms/openai";

playground({
  async reply(context: IAgentContext): Promise<IAgentResponse> {
    // `humanMessage` is the content of each message the user sends via the chatbot playground.
    let humanMessage = context.getHumanMessage().content;
    // `agent` is the OpenAI agent we want to use to respond to each `humanMessage`
    const agent = new OpenAI({ modelName: "gpt-3.5-turbo" });
    // `call` is a simple string-in, string-out method for interacting with the OpenAI agent.
    const resp = await agent.call(humanMessage);
    // `resp` is the agent's generated response to the user's `humanMessage`
    return {
      content: resp,
      contentType: "markdown",
    };
  },
});
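Note that the API key never appears in the code: `dotenv.config()` loads it from a local `.env` file, and Langchain's OpenAI wrapper reads it from the environment. Assuming the standard `OPENAI_API_KEY` variable name, your `.env` file would look something like this (keep it out of version control):

OPENAI_API_KEY=sk-...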
Follow our quickstart guide for more Langchain + ConvoStack examples.
To add the ConvoStack framework to an existing project, run the following command:
npm install --save convostack
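Once the package is installed, you wire the backend into your Express.js server and drop the widget into your React app. As a rough sketch only (the import path, component names, props, and URLs below are assumptions to verify against the docs), embedding the widget in an existing React app might look something like this:

// Hypothetical sketch: component names, props, and URLs are placeholders; see the docs for the exact frontend API.
import { ConvoStackWrapper, ConvoStackWidget } from "convostack/frontend-react";

export default function App() {
  return (
    // Wrapper that points the widget at your ConvoStack backend endpoints
    <ConvoStackWrapper
      graphqlUrl="https://your-backend.example.com/graphql"
      websocketUrl="wss://your-backend.example.com/graphql"
    >
      {/* Your existing application UI */}
      <main>My site content</main>
      {/* Renders the embeddable chatbot widget */}
      <ConvoStackWidget />
    </ConvoStackWrapper>
  );
}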
To see the full documentation, check out our docs site at docs.convostack.ai.