Add chatgpt demo for Socket.IO #597

Open · wants to merge 2 commits into `main`
@@ -0,0 +1,4 @@
.env
node_modules/
public/
sessions/
@@ -0,0 +1,87 @@
# ChatGPT in Web PubSub for Socket.IO

A simple sample written in React.js and Node.js that shows how to build an OpenAI [ChatGPT](https://chat.openai.com/)-like chat bot using the OpenAI chat completion [API](https://platform.openai.com/docs/guides/chat).

The Web PubSub for Socket.IO service is used for realtime messaging: clients connect to the service rather than to the application server.

Main features:
1. Dialogue-based chat with ChatGPT
2. Generate chat responses in a streaming way, like OpenAI's ChatGPT does (see the sketch after this list)
3. Render chat messages in rich text format (using Markdown)
4. Automatically name the chat session from its content
5. Support using a system message to customize the chat
6. Support multiple chat sessions
7. Persist chat history on the server side
8. Support both the native OpenAI API and the [Azure OpenAI Service](https://azure.microsoft.com/products/cognitive-services/openai-service) API
9. Use Web PubSub for Socket.IO as the realtime service to support large-scale concurrent clients
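
The streaming behavior in feature 2 relies on the chat completion API's streaming mode. Below is a minimal, illustrative sketch (not this sample's actual code) of how a streamed response can be consumed with the `openai` v4 SDK in native mode; the model name and prompt are placeholders.
```js
import OpenAI from "openai";

// Minimal streaming sketch (native mode). Model and prompt are placeholders.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const stream = await openai.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [{ role: "user", content: "Say hello" }],
  stream: true,
});

// Each chunk carries a small delta of the reply; forwarding these deltas to
// the client as they arrive is what produces the "typing" effect.
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```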

## How to run

### Use native OpenAI
You need to have an OpenAI account first.

1. Go to your OpenAI account, open the [API keys](https://platform.openai.com/account/api-keys) page, and create a new secret key.
2. Set the key as an environment variable:
```bash
export MODE="native"
export OPENAI_API_KEY=<your_openai_api_key>
```
Or you can save the key into the `.env` file:
```
MODE="native"
OPENAI_API_KEY=<your_openai_api_key>
```

### Use Azure OpenAI Service
You need to have an Azure OpenAI resource first.

1. Go to your Azure OpenAI Service resource, open the "Keys and Endpoint" tab, and get the required information listed below.
2. Set the environment variables:
```bash
export MODE="azure"
export AZURE_OPENAI_RESOURCE_NAME=<azure_openai_resource_name>
export AZURE_OPENAI_DEPLOYMENT_NAME=<azure_openai_model_deployment_name>
export AZURE_OPENAI_API_KEY=<azure_openai_api_key>
```
Or you can save the values into the `.env` file:
```
MODE="azure"
AZURE_OPENAI_RESOURCE_NAME=<azure_openai_resource_name>
AZURE_OPENAI_DEPLOYMENT_NAME=<azure_openai_model_deployment_name>
AZURE_OPENAI_API_KEY=<azure_openai_api_key>
```
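
For reference, here is a rough sketch of one way these Azure variables might be wired into the `openai` v4 client by pointing it at the Azure endpoint. This is an assumption about a possible setup, not necessarily how this sample does it; the `api-version` value is a placeholder.
```js
import OpenAI from "openai";

// Hypothetical Azure-mode client setup; the api-version is a placeholder.
const resource = process.env.AZURE_OPENAI_RESOURCE_NAME;
const deployment = process.env.AZURE_OPENAI_DEPLOYMENT_NAME;

const openai = new OpenAI({
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  baseURL: `https://${resource}.openai.azure.com/openai/deployments/${deployment}`,
  defaultQuery: { "api-version": "2023-05-15" },
  defaultHeaders: { "api-key": process.env.AZURE_OPENAI_API_KEY },
});
```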

3. Open your Web PubSub for Socket.IO resource and get the connection string.
Set the environment variable:
```bash
export WebPubSubConnectionString=<web-pubsub-connection-string>
```
Or you can save the connection string into the `.env` file:
```
WebPubSubConnectionString=<web-pubsub-connection-string>
```
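
On the server side, this connection string is typically consumed by the `@azure/web-pubsub-socket.io` extension when the Socket.IO server is created. A minimal sketch, in which the hub name and port are assumptions:
```js
import { Server } from "socket.io";
import { useAzureSocketIO } from "@azure/web-pubsub-socket.io";

// Attach the Socket.IO server to Web PubSub; "chat" is a hypothetical hub name.
const io = new Server(3000);
await useAzureSocketIO(io, {
  hub: "chat",
  connectionString: process.env.WebPubSubConnectionString,
});
```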

4. Run the following commands:
```bash
npm install
npm run build
npm start
```

Then open http://localhost:3000 in your browser to use the app.

There is also a CLI version where you can play with the chat bot in a command-line window:
```bash
node src/server/test.js
```

## Persist chat history

This sample has a very simple [implementation](src/server/storage.js) that persists the chat history to the file system (the files can be found under the `sessions` directory). It is only for demo purposes and should not be used in any production environment. You can plug in your own storage logic by implementing the functions in the `Storage` class, as sketched below.
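
As a rough illustration of what such an implementation could look like, here is a file-backed variant. The method names (`save`, `load`) and the directory layout are hypothetical and should be adapted to whatever the sample's `Storage` class actually exposes.
```js
import { promises as fs } from "fs";
import path from "path";

// Hypothetical file-system storage; method names are illustrative only.
class FileStorage {
  constructor(dir = "sessions") {
    this.dir = dir;
  }

  // Write the full message list of a session to <dir>/<sessionId>.json.
  async save(sessionId, messages) {
    await fs.mkdir(this.dir, { recursive: true });
    await fs.writeFile(
      path.join(this.dir, `${sessionId}.json`),
      JSON.stringify(messages, null, 2)
    );
  }

  // Read a session's history back, returning an empty list if none exists yet.
  async load(sessionId) {
    try {
      const text = await fs.readFile(path.join(this.dir, `${sessionId}.json`), "utf8");
      return JSON.parse(text);
    } catch {
      return [];
    }
  }
}
```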

## Use Web PubSub for Socket.IO for realtime messaging

Web PubSub for Socket.IO is used to handle realtime messaging and manage large-scale concurrent connections.

> No matter which transport you're using, the backend communication between the server and the OpenAI service uses [Server-Sent Events](https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events), which is not something we can customize. Also, this chat bot scenario is itself a request-response model, so there may not be a big difference when using WebSocket/Socket.IO. But you may find it useful in other scenarios (e.g. a multi-user chat room where messages are broadcast to all users), so I implemented it here for the completeness of a technical demo.
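
For completeness, a browser client connects to the Web PubSub service endpoint rather than to the application server. The sketch below shows the general shape of that connection; the endpoint, hub name, and event name are placeholders, not values taken from this sample.
```js
import { io } from "socket.io-client";

// Endpoint and hub name are placeholders for your own resource values.
const socket = io("https://<your-resource>.webpubsub.azure.com", {
  path: "/clients/socketio/hubs/chat",
});

// "message" is an illustrative event name; append each streamed chunk
// to the current chat bubble as it arrives.
socket.on("message", (chunk) => {
  console.log(chunk);
});
```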
@@ -0,0 +1,6 @@
export default {
  presets: [
    ['@babel/preset-env', { targets: { node: 'current' } }],
    '@babel/preset-react'
  ]
};
@@ -0,0 +1,40 @@
{
  "name": "mychatgpt",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "type": "module",
  "scripts": {
    "start": "node src/server/index.js",
    "build": "webpack build --mode production",
    "dev": "webpack build --mode development",
    "watch": "webpack watch --mode development"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@azure/web-pubsub-socket.io": "^1.0.0-beta.5",
    "dotenv": "^16.0.3",
    "express": "^4.18.2",
    "js-yaml": "^4.1.0",
    "openai": "^4.0.0-beta.7"
  },
  "devDependencies": {
    "@babel/preset-env": "^7.20.2",
    "@babel/preset-react": "^7.18.6",
    "babel-loader": "^9.1.2",
    "classnames": "^2.3.2",
    "copy-webpack-plugin": "^11.0.0",
    "css-loader": "^6.7.3",
    "react": "^18.2.0",
    "react-animate-height": "^3.1.1",
    "react-dom": "^18.2.0",
    "react-markdown": "^8.0.6",
    "react-syntax-highlighter": "^15.5.0",
    "remark-gfm": "^3.0.1",
    "socket.io-client": "^4.7.1",
    "style-loader": "^3.3.2",
    "webpack": "^5.75.0",
    "webpack-cli": "^5.0.1"
  }
}
@@ -0,0 +1,173 @@
body {
  position: fixed;
  height: 100%;
  width: 100%;
}

#app {
  width: 100%;
  height: 100%;
}

@media screen and (max-width: 1200px) {
  .limit-width {
    width: 100%;
  }
}

@media screen and (min-width: 1200px) {
  .limit-width {
    width: 1200px;
  }
}

.borderless-button {
  border: none;
}

.limit-width {
  margin: auto;
  padding: 0 8px;
}

.sessions {
  max-height: calc(100% - 56px);
  overflow-y: auto;
}

.session-title {
  display: flex;
  height: 40px;
}

.session-title-content {
  color: #666;
  cursor: pointer;
  margin: auto;
}

.session {
  width: 100%;
  color: #666;
  cursor: pointer;
}

.session-content {
  height: 56px;
  display: flex;
  justify-content: space-between;
  align-items: center;
}

.session:hover {
  background: #eee !important;
}

.current-session {
  font-weight: bold;
}

.floating-button {
  display: none;
  color: #888;
}

.floating-button:hover {
  color: #000;
}

.welcome-message:hover .floating-button,
.session:hover .floating-button {
  display: inline;
}

.session .btn {
  padding: 0;
}

.update-session-input {
  padding-right: 48px;
}

.delete-session {
  width: 100%;
}

.yes-button {
  margin-left: -48px;
}

.messages {
  height: calc(100% - 120px);
  overflow-y: auto;
}

.message {
  display: inline-block;
  max-width: 90%;
  background: #f0f0f0;
  border-radius: 0.5em;
  color: #444;
  box-shadow: 1px 1px 8px #ccc;
}

.message p:last-child {
  margin-bottom: 0;
}

.local-message .message {
  background: #f8f8f8;
}

.message-header {
  font-size: 12px;
  font-weight: bold;
}

.message-content {
  text-align: justify;
}

.local-message {
  display: flex;
  justify-content: flex-end;
}

.welcome-window {
  width: 100%;
  height: 100%;
  display: flex;
  justify-content: center;
  align-items: center;
}

.welcome-message {
  background: #f8f8f8;
  border-radius: 0.5em;
  width: 50%;
  color: #444;
  box-shadow: 1px 1px 8px #ddd;
}

.system-message-input {
  resize: none;
  overflow: hidden;
}

.system-message {
  padding: 7px 13px;
}

.input-box {
  height: 64px;
  display: flex;
  align-items: center;
}

.message-input {
  padding-right: 40px;
}

.send-button {
  margin-left: -40px;
}