A Google Colab version of this project is available at my fork 🐢🌹 #299
Open
soheilpaper wants to merge 23 commits into OpenBMB:main from So-AI-love:main
Changes from all commits (23 commits, all by soheilpaper):
acb6bee Created using Colaboratory
99b7d5d Rename Dockerfile to Dockerfile_main
1d6af19 Create DockerFile
af1192a Created using Colaboratory
6d9216e Created using Colaboratory
2ac7260 Created using Colaboratory
888291e Created using Colaboratory
0e606eb Created using Colaboratory
e766c3d Created using Colaboratory
035d617 Created using Colaboratory
aa71b67 Created using Colaboratory
91e86c8 Created using Colaboratory
a73ca09 Created using Colaboratory
923e5ad Created using Colaboratory
9df9044 Created using Colaboratory
61383ed Created using Colaboratory
964890a Created using Colaboratory
6b8f5f9 Delete NotrBook directory
92011ff Created using Colaboratory
2b4004e Created using Colaboratory
c24eb71 Created using Colaboratory
cb2f8e9 Created using Colaboratory
d517bca Created using Colaboratory
Auto_Making/API_8_3_Financial_Model_Prompt.ipynb (new file):
@@ -0,0 +1,292 @@
{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "view-in-github",
        "colab_type": "text"
      },
      "source": [
        "<a href=\"https://colab.research.google.com/github/So-AI-love/ChatDev/blob/main/Auto_Making/API_8_3_Financial_Model_Prompt.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
      ]
    },
    {
      "cell_type": "code",
      "source": [],
      "metadata": {
        "id": "Khj3UP-O_6aQ"
      },
      "execution_count": 1,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "!pip install openai pdfkit python-docx"
      ],
      "metadata": {
        "id": "eo0OhdMT_7SQ",
        "outputId": "4988ccb9-3327-4076-ac66-ebd4b522084f",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "execution_count": 2,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Collecting openai\n",
            " Using cached openai-1.3.8-py3-none-any.whl (221 kB)\n",
            "Collecting pdfkit\n",
            " Downloading pdfkit-1.0.0-py3-none-any.whl (12 kB)\n",
            "Collecting python-docx\n",
            " Downloading python_docx-1.1.0-py3-none-any.whl (239 kB)\n",
            "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m239.6/239.6 kB\u001b[0m \u001b[31m2.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hRequirement already satisfied: anyio<5,>=3.5.0 in /usr/local/lib/python3.10/dist-packages (from openai) (3.7.1)\n",
            "Requirement already satisfied: distro<2,>=1.7.0 in /usr/lib/python3/dist-packages (from openai) (1.7.0)\n",
            "Collecting httpx<1,>=0.23.0 (from openai)\n",
            " Downloading httpx-0.25.2-py3-none-any.whl (74 kB)\n",
            "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m75.0/75.0 kB\u001b[0m \u001b[31m1.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hRequirement already satisfied: pydantic<3,>=1.9.0 in /usr/local/lib/python3.10/dist-packages (from openai) (1.10.13)\n",
            "Requirement already satisfied: sniffio in /usr/local/lib/python3.10/dist-packages (from openai) (1.3.0)\n",
            "Requirement already satisfied: tqdm>4 in /usr/local/lib/python3.10/dist-packages (from openai) (4.66.1)\n",
            "Requirement already satisfied: typing-extensions<5,>=4.5 in /usr/local/lib/python3.10/dist-packages (from openai) (4.5.0)\n",
            "Requirement already satisfied: lxml>=3.1.0 in /usr/local/lib/python3.10/dist-packages (from python-docx) (4.9.3)\n",
            "Requirement already satisfied: idna>=2.8 in /usr/local/lib/python3.10/dist-packages (from anyio<5,>=3.5.0->openai) (3.6)\n",
            "Requirement already satisfied: exceptiongroup in /usr/local/lib/python3.10/dist-packages (from anyio<5,>=3.5.0->openai) (1.2.0)\n",
            "Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from httpx<1,>=0.23.0->openai) (2023.11.17)\n",
            "Collecting httpcore==1.* (from httpx<1,>=0.23.0->openai)\n",
            " Downloading httpcore-1.0.2-py3-none-any.whl (76 kB)\n",
            "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m76.9/76.9 kB\u001b[0m \u001b[31m8.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hCollecting h11<0.15,>=0.13 (from httpcore==1.*->httpx<1,>=0.23.0->openai)\n",
            " Downloading h11-0.14.0-py3-none-any.whl (58 kB)\n",
            "\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m58.3/58.3 kB\u001b[0m \u001b[31m6.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
            "\u001b[?25hInstalling collected packages: pdfkit, python-docx, h11, httpcore, httpx, openai\n",
            "\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\n",
            "llmx 0.0.15a0 requires cohere, which is not installed.\n",
            "llmx 0.0.15a0 requires tiktoken, which is not installed.\u001b[0m\u001b[31m\n",
            "\u001b[0mSuccessfully installed h11-0.14.0 httpcore-1.0.2 httpx-0.25.2 openai-1.3.8 pdfkit-1.0.0 python-docx-1.1.0\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "source": [
        " # @title (Please insert your request to be done by this code at below form:👇👇)\n",
        "TOPIC = \"PersonalityInteractions\" # @param {type:\"string\"}\n",
        "PARAGRAPH = \"Use game theory to understand how different personality types (Dark Triad, Dark Tetrad, Dark Empathy) might interact within a society \\u003CINFO> PowerPoint\" # @param {type:\"string\"}\n",
        "role = \"startup Entrepreneur\"# @param {type:\"string\"}\n",
        "\n",
        "\n",
        "Your_Email = \"[email protected]\" # @param {type:\"string\"}\n",
        "\n",
        "openai_api = \"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\" # @param {type:\"string\"}\n",
        "\n",
        "\n",
        "#!export OPENAI_API_KEY = openai_api\n",
        "\n",
        "import os\n",
        "\n",
        "os.environ['OPENAI_API_KEY'] = openai_api #'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'"
      ],
      "metadata": {
        "id": "vRIB557vAbP0"
      },
      "execution_count": 8,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "!pip install python-dotenv"
      ],
      "metadata": {
        "id": "3BF-3djRAkVu",
        "outputId": "8a4f229b-ae56-4156-bf32-05bafa9c6e83",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "execution_count": 4,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Collecting python-dotenv\n",
            " Downloading python_dotenv-1.0.0-py3-none-any.whl (19 kB)\n",
            "Installing collected packages: python-dotenv\n",
            "Successfully installed python-dotenv-1.0.0\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "'''\n",
        "This file contains the ChatGP class that generates a business plan using OpenAI's GPT model.\n",
        "'''\n",
        "import openai\n",
        "import pdfkit\n",
        "from docx import Document\n",
        "from dotenv import load_dotenv\n",
        "import os\n",
        "class ChatGP:\n",
        "    def __init__(self):\n",
        "        load_dotenv()\n",
        "        self.api_key = os.getenv('OPENAI_API_KEY')\n",
        "    def generate_business_plan(self, topic, description):\n",
        "        prompts = [\n",
        "            \"1. Executive Summary:\",\n",
        "            \"2. Company Description:\",\n",
        "            \"3. Market Analysis:\",\n",
        "            \"4. Organization and Management:\",\n",
        "            \"5. Product or Service Line:\",\n",
        "            \"6. Marketing and Sales Strategy:\",\n",
        "            \"7. Funding Request:\",\n",
        "            \"8. Financial Projections:\",\n",
        "            \"9. Appendix:\",\n",
        "            \"10. Conclusion:\"\n",
        "        ]\n",
        "        results = []\n",
        "        for i, prompt in enumerate(prompts):\n",
        "            response = self.generate_response(prompt, topic, description)\n",
        "            result = f\"{i+1}. {prompt}\\n{response}\"\n",
        "            results.append(result)\n",
        "        self.save_business_plan(results)\n",
        "        return results\n",
        "    def generate_response(self, prompt, topic, description):\n",
        "        openai.api_key = self.api_key\n",
        "        response = openai.Completion.create(\n",
        "            engine='text-davinci-003',\n",
        "            prompt=f\"{prompt} {topic} {description}\",\n",
        "            max_tokens=100\n",
        "        )\n",
        "        return response.choices[0].text.strip()\n",
        "    def save_business_plan(self, results):\n",
        "        doc = Document()\n",
        "        for result in results:\n",
        "            title, content = result.split('\\n', 1)\n",
        "            subtitle, numbering = title.split('. ', 1)\n",
        "            doc.add_heading(subtitle, level=1)\n",
        "            doc.add_heading(numbering, level=2)\n",
        "            doc.add_paragraph(content)\n",
        "        doc.save('business_plan.docx')\n",
        "        pdfkit.from_file('business_plan.docx', 'business_plan.pdf')"
      ],
      "metadata": {
        "id": "DlE3kHlfAL0n"
      },
      "execution_count": 5,
      "outputs": []
    },
    {
      "cell_type": "code",
      "source": [
        "topic = TOPIC # request.form['topic']\n",
        "description = PARAGRAPH # request.form['description']\n",
        "chatgp = ChatGP()\n",
        "results = chatgp.generate_business_plan(topic, description)"
      ],
      "metadata": {
        "id": "_bQqVCZnEDWD",
        "outputId": "5e21176c-f20b-432a-e38c-e891d2bca747",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "execution_count": 9,
      "outputs": [
        {
          "output_type": "error",
          "ename": "APIRemovedInV1",
          "evalue": "ignored",
          "traceback": [
            "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
            "\u001b[0;31mAPIRemovedInV1\u001b[0m Traceback (most recent call last)",
            "\u001b[0;32m<ipython-input-9-5374b87b775e>\u001b[0m in \u001b[0;36m<cell line: 4>\u001b[0;34m()\u001b[0m\n\u001b[1;32m 2\u001b[0m \u001b[0mdescription\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mPARAGRAPH\u001b[0m \u001b[0;31m# request.form['description']\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 3\u001b[0m \u001b[0mchatgp\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mChatGP\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 4\u001b[0;31m \u001b[0mresults\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mchatgp\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgenerate_business_plan\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mtopic\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdescription\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m",
            "\u001b[0;32m<ipython-input-5-2b8956535385>\u001b[0m in \u001b[0;36mgenerate_business_plan\u001b[0;34m(self, topic, description)\u001b[0m\n\u001b[1;32m 26\u001b[0m \u001b[0mresults\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 27\u001b[0m \u001b[0;32mfor\u001b[0m \u001b[0mi\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mprompt\u001b[0m \u001b[0;32min\u001b[0m \u001b[0menumerate\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprompts\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 28\u001b[0;31m \u001b[0mresponse\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mgenerate_response\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprompt\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtopic\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdescription\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 29\u001b[0m \u001b[0mresult\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34mf\"{i+1}. {prompt}\\n{response}\"\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 30\u001b[0m \u001b[0mresults\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mappend\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mresult\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
            "\u001b[0;32m<ipython-input-5-2b8956535385>\u001b[0m in \u001b[0;36mgenerate_response\u001b[0;34m(self, prompt, topic, description)\u001b[0m\n\u001b[1;32m 33\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0mgenerate_response\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mprompt\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtopic\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mdescription\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 34\u001b[0m \u001b[0mopenai\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mapi_key\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mapi_key\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 35\u001b[0;31m response = openai.Completion.create(\n\u001b[0m\u001b[1;32m 36\u001b[0m \u001b[0mengine\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m'text-davinci-003'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 37\u001b[0m \u001b[0mprompt\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34mf\"{prompt} {topic} {description}\"\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
            "\u001b[0;32m/usr/local/lib/python3.10/dist-packages/openai/lib/_old_api.py\u001b[0m in \u001b[0;36m__call__\u001b[0;34m(self, *_args, **_kwargs)\u001b[0m\n\u001b[1;32m 37\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 38\u001b[0m \u001b[0;32mdef\u001b[0m \u001b[0m__call__\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m*\u001b[0m\u001b[0m_args\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mAny\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m**\u001b[0m\u001b[0m_kwargs\u001b[0m\u001b[0;34m:\u001b[0m \u001b[0mAny\u001b[0m\u001b[0;34m)\u001b[0m \u001b[0;34m->\u001b[0m \u001b[0mAny\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 39\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mAPIRemovedInV1\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0msymbol\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_symbol\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 40\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 41\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
            "\u001b[0;31mAPIRemovedInV1\u001b[0m: \n\nYou tried to access openai.Completion, but this is no longer supported in openai>=1.0.0 - see the README at https://github.com/openai/openai-python for the API.\n\nYou can run `openai migrate` to automatically upgrade your codebase to use the 1.0.0 interface. \n\nAlternatively, you can pin your installation to the old version, e.g. `pip install openai==0.28`\n\nA detailed migration guide is available here: https://github.com/openai/openai-python/discussions/742\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "'''\n",
        "This file contains the main Flask application for the ChatGP business plan generator.\n",
        "'''\n",
        "from flask import Flask, render_template, request, redirect, url_for\n",
        "#from chatgp import ChatGP\n",
        "\n",
        "\n",
        "app = Flask(__name__)\n",
        "@app.route('/')\n",
        "def index():\n",
        "    return render_template('index.html')\n",
        "@app.route('/generate', methods=['POST'])\n",
        "def generate():\n",
        "    topic = request.form['topic']\n",
        "    description = request.form['description']\n",
        "    chatgp = ChatGP()\n",
        "    results = chatgp.generate_business_plan(topic, description)\n",
        "    return render_template('results.html', results=results)\n",
        "@app.route('/send_email', methods=['POST'])\n",
        "def send_email():\n",
        "    email = request.form['email']\n",
        "    # Add code to send the generated business plan to the specified email address\n",
        "    return redirect(url_for('index'))\n",
        "if __name__ == '__main__':\n",
        "    app.run()"
      ],
      "metadata": {
        "id": "t80qXWMOANAH",
        "outputId": "aa8dba64-8517-4e8a-be29-65b12a1b2587",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "execution_count": 6,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            " * Serving Flask app '__main__'\n",
            " * Debug mode: off\n"
          ]
        },
        {
          "output_type": "stream",
          "name": "stderr",
          "text": [
            "INFO:werkzeug:\u001b[31m\u001b[1mWARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.\u001b[0m\n",
            " * Running on http://127.0.0.1:5000\n",
            "INFO:werkzeug:\u001b[33mPress CTRL+C to quit\u001b[0m\n"
          ]
        }
      ]
    }
  ],
  "metadata": {
    "colab": {
      "provenance": [],
      "include_colab_link": true
    },
    "kernelspec": {
      "display_name": "Python 3",
      "name": "python3"
    },
    "accelerator": "TPU"
  },
  "nbformat": 4,
  "nbformat_minor": 0
}
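The last two executed cells show why the notebook currently fails: pip installs openai 1.3.8, but the ChatGP class still calls openai.Completion, which was removed in openai>=1.0.0, hence the APIRemovedInV1 traceback. Below is a minimal sketch (not part of this PR) of how generate_response could be ported to the 1.x client; it assumes OPENAI_API_KEY is already set in the environment and uses gpt-3.5-turbo-instruct as a stand-in for text-davinci-003, which OpenAI has since retired.

# Sketch only, not from the PR: generate_response ported to the openai>=1.0.0 client.
# Assumes OPENAI_API_KEY is set in the environment; "gpt-3.5-turbo-instruct" is a
# suggested replacement for the retired "text-davinci-003".
import os
from openai import OpenAI

def generate_response(prompt: str, topic: str, description: str) -> str:
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])  # key comes from the environment, not the notebook
    response = client.completions.create(
        model="gpt-3.5-turbo-instruct",
        prompt=f"{prompt} {topic} {description}",
        max_tokens=100,
    )
    return response.choices[0].text.strip()

Alternatively, the notebook could pin the old interface with pip install openai==0.28, as the traceback itself suggests. Note also that pdfkit is a wkhtmltopdf wrapper and converts HTML rather than .docx, so the final pdfkit.from_file('business_plan.docx', ...) call would still need a separate docx-to-PDF step once the API call is fixed.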
DockerFile (new file):
@@ -0,0 +1,27 @@
# Start with a Python 3.9 base image
FROM python:3.9-slim

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install necessary libraries for GUI support
RUN apt-get update && apt-get install -y python3-tk x11-apps

# Install the project dependencies
RUN pip install --no-cache-dir -r requirements.txt

# Install HuggingFace Transformers and Uvicorn server
RUN pip install transformers uvicorn

# Set the environment variable for OpenAI API key
# (you'll need to provide the actual key when running the container)
ENV OPENAI_API_KEY=your_OpenAI_API_key

# Expose the port for Uvicorn server
EXPOSE 7860

# Command to run the Uvicorn server with your FastAPI application
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "7860"]
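Since the ENV line above only bakes a placeholder into the image, the real key would normally be supplied when the container is started. A possible build-and-run sequence is sketched below; the image tag chatdev-colab is illustrative, not taken from the PR.

# Hypothetical usage of the DockerFile above; adjust the tag and file name to your setup.
docker build -f DockerFile -t chatdev-colab .
# Inject the real key at run time instead of hardcoding it in the image.
docker run -e OPENAI_API_KEY="$OPENAI_API_KEY" -p 7860:7860 chatdev-colab

Passing the key with -e (or --env-file) keeps it out of the image layers and out of version control.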
(The diff for one more changed file in this pull request could not be loaded.)
Review comment:
Careful, you have hardcoded OpenAI keys in here. If they are still valid, they could be hijacked.
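One way to address this, sketched below, is to stop typing the key into the notebook's form field and instead prompt for it at run time (or use Colab's Secrets panel, where available), so that only the environment variable ever holds it. The variable names mirror the notebook's, but the snippet is not part of the PR; the key visible in this diff should be revoked in any case.

# Sketch only: read the OpenAI key at run time instead of hardcoding it in the notebook.
import os
from getpass import getpass

openai_api = getpass("Paste your OpenAI API key: ")  # input is not echoed or saved with the notebook
os.environ["OPENAI_API_KEY"] = openai_api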