
Commit

Merge pull request #16 from luiyen/llama2
llama2-7b-chat-hf
luiyen authored Aug 27, 2023
2 parents d84f4cf + e85b26e commit a79475f
Showing 4 changed files with 17 additions and 16 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/test-action.yml
@@ -53,7 +53,7 @@ jobs:
          githubRepository: ${{ github.repository }}
          githubPullRequestNumber: ${{ github.event.pull_request.number }}
          gitCommitHash: ${{ github.event.pull_request.head.sha }}
-         repoId: "microsoft/codereviewer"
+         repoId: "meta-llama/Llama-2-7b-chat-hf"
          temperature: "0.2"
          maxNewTokens: "250"
          topK: "50"
2 changes: 1 addition & 1 deletion README.md
@@ -69,7 +69,7 @@ jobs:
          githubRepository: ${{ github.repository }}
          githubPullRequestNumber: ${{ github.event.pull_request.number }}
          gitCommitHash: ${{ github.event.pull_request.head.sha }}
-         repoId: "microsoft/codereviewer"
+         repoId: "meta-llama/Llama-2-7b-chat-hf"
          temperature: "0.2"
          maxNewTokens: "250"
          topK: "50"
2 changes: 1 addition & 1 deletion action.yml
@@ -23,7 +23,7 @@ inputs:
  repoId:
    description: "LLM model"
    required: true
-   default: "microsoft/codereviewer"
+   default: "meta-llama/Llama-2-7b-chat-hf"
  maxNewTokens:
    description: "The maximum number of new tokens to generate. This does not include the input length; it is an estimate of the size of the generated text you want. Each new token slows down the request, so look for a balance between response time and the length of the generated text."
    required: false
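The inputs above (repoId, temperature, maxNewTokens, topK) are standard text-generation parameters. As a rough illustration only, assuming the action ultimately forwards them to the Hugging Face Inference API, an equivalent direct call with huggingface_hub might look like this (the token and prompt are placeholders):

# Hedged sketch, not the action's actual client code: how the inputs above
# could map onto a Hugging Face Inference API text-generation call.
from huggingface_hub import InferenceClient

client = InferenceClient(model="meta-llama/Llama-2-7b-chat-hf", token="hf_xxx")  # placeholder token
review = client.text_generation(
    "Provide a concise summary of the bug found in this diff: <diff here>",  # placeholder prompt
    temperature=0.2,     # low temperature keeps reviews focused and repeatable
    max_new_tokens=250,  # caps the length of the generated review
    top_k=50,            # sample only from the 50 most likely next tokens
)
print(review)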
27 changes: 14 additions & 13 deletions entrypoint.py
@@ -86,13 +86,14 @@ def get_review(
    )
    for chunked_diff in chunked_diff_list:
        question = chunked_diff
-       template = """Provide a concise summary of the bug found in the code, describing its characteristics,
-       location, and potential effects on the overall functionality and performance of the application.
-       Present the potential issues and errors first, followed by the most important findings, in your summary.
-       Important: Include the block of code / diff in the summary, along with the line number.
-       ```
-       {question}
-       ```
+       template = """Provide a concise summary of the bug found in the code, describing its characteristics,
+       location, and potential effects on the overall functionality and performance of the application.
+       Present the potential issues and errors first, followed by the most important findings, in your summary.
+       Important: Include the block of code / diff in the summary, along with the line number.
+       Diff:
+       {question}
        """

        prompt = PromptTemplate(template=template, input_variables=["question"])
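For context, the loop this hunk edits reviews the diff chunk by chunk. A minimal sketch under stated assumptions: chunked_diff_list, llm, and template stand in for objects built elsewhere in entrypoint.py, and the classic LangChain API of this era is assumed.

from langchain import LLMChain, PromptTemplate

# Hedged sketch of the map step: summarize each diff chunk independently
# and collect the partial reviews for a second, consolidating pass.
chunked_reviews = []
for chunked_diff in chunked_diff_list:
    prompt = PromptTemplate(template=template, input_variables=["question"])
    llm_chain = LLMChain(prompt=prompt, llm=llm)
    chunked_reviews.append(llm_chain.run(question=chunked_diff))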
@@ -106,12 +106,12 @@ def get_review(

    question = "\n".join(chunked_reviews)
    template = """Summarize the following file changed in a pull request submitted by a developer on GitHub,
-   focusing on major modifications, additions, deletions, and any significant updates within the files.
-   Do not include the file name in the summary and list the summary with bullet points.
-   Important: Include the block of code / diff in the summary, along with the line number.
-   ```
-   {question}
-   ```
+   focusing on major modifications, additions, deletions, and any significant updates within the files.
+   Do not include the file name in the summary and list the summary with bullet points.
+   Important: Include the block of code / diff in the summary, along with the line number.
+   Diff:
+   {question}
    """
    prompt = PromptTemplate(template=template, input_variables=["question"])
    llm_chain = LLMChain(prompt=prompt, llm=llm)
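This second hunk is the reduce step: the per-chunk reviews are joined and summarized once more. The invocation itself sits in the collapsed lines below the hunk; under the same assumptions as the sketch above, it would be roughly:

# Hedged sketch; the exact collapsed code is not shown in this diff.
question = "\n".join(chunked_reviews)
final_review = llm_chain.run(question=question)  # one consolidated summary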
