
[Example] ggml: increase llama ctx-size for tinyllama CI job #100

Merged 1 commit into dev on Feb 22, 2024

Conversation

@dm4 (Member) commented Feb 22, 2024

No description provided.

@juntao (Member) commented Feb 22, 2024

Hello, I am a code review bot on flows.network. Here are my reviews of code commits in this PR.


Commit 6510caf1d16d3e80acbfa7f38be97297fe723b67

Key Changes:

  1. Increased the ctx-size option value from 512 to 1024 in the main.rs file.
  2. No change in the size of the wasmedge-ggml-llama.wasm file.

Potential Problems:

  1. Increased ctx-size: Doubling the ctx-size option from 512 to 1024 enlarges the context window, which increases memory consumption and may affect inference performance. This change should be tested to confirm it does not cause memory-related failures or performance degradation in the CI job.

  2. Binary Patch in Wasm File: The binary patch in the wasmedge-ggml-llama.wasm file indicates changes at the binary level. Verify that these changes are intentional and do not introduce unexpected behavior or security vulnerabilities.

  3. Maintenance of Binary Files: Keeping binary files in version control makes changes hard to track and review. Document the reason for each binary change and consider alternatives, such as building the .wasm artifact in CI instead of committing it.

Overall, the increased ctx-size parameter is the primary change in this patch, and it should be tested to rule out memory or performance regressions. The binary patch in the Wasm file should also be reviewed and validated.
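To illustrate the kind of change under review, here is a minimal Rust sketch of passing a larger ctx-size to the ggml backend. The JSON metadata shape and the `build_metadata` helper are assumptions for illustration; the actual wasmedge-ggml-llama main.rs may construct its options differently.

```rust
/// Sketch: building the llama.cpp option metadata with the larger
/// context size. `ctx-size` is the option this PR raised from 512
/// to 1024; the JSON layout here is a hypothetical example of how
/// wasmedge-ggml examples pass options to the WASI-NN plugin.
fn build_metadata(ctx_size: u32) -> String {
    format!(r#"{{"ctx-size": {}}}"#, ctx_size)
}

fn main() {
    // The PR's new value: a 1024-token context window.
    let metadata = build_metadata(1024);
    println!("{}", metadata);
}
```

A larger context lets tinyllama handle longer prompts in the CI job at the cost of proportionally more memory for the KV cache, which is why the review flags memory testing.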

@dm4 dm4 requested a review from hydai February 22, 2024 06:53
@hydai hydai merged commit b10b7a1 into dev Feb 22, 2024
3 checks passed
@hydai hydai deleted the dm4/llama-ctx-size branch February 22, 2024 07:31