Llama code review looping #825

Open

aidando73 wants to merge 1 commit into main from aidand-llama-code-review-demo

Conversation

@aidando73 (Contributor) commented Dec 22, 2024

What does this PR do?

Thought this would be an interesting example to have: it showcases structured outputs, tool calls, and looping with Llama.

Here's a demo:

output-2024-12-22-compressed.mp4

Let me know what you think @mreso @wukaixingxp: do you think there's value in having this demo here?

Feature/Issue validation/testing

See README.md

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Thanks for contributing 🎉!

CODE_REVIEW_CYCLES = 5

# Works:
MODEL_ID = "meta-llama/Llama-3.3-70B-Instruct"
@aidando73 (Contributor, Author) commented:
Note that 3.3-70B support for the fireworks provider is not yet released, but it will be in the next release: meta-llama/llama-stack#654
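
For readers skimming the diff, here is a minimal sketch of how a bounded review loop might use these two constants. This is not the PR's actual implementation; `request_review` and `apply_review` are hypothetical placeholders standing in for the demo's llama-stack calls.

```python
# Hypothetical sketch of a bounded code-review loop (not the PR's real code).
CODE_REVIEW_CYCLES = 5
MODEL_ID = "meta-llama/Llama-3.3-70B-Instruct"

def request_review(code: str) -> dict:
    # Placeholder: in the demo this would ask the model (via llama-stack)
    # for a structured review, e.g. {"approved": bool, "comments": [...]}.
    return {"approved": True, "comments": []}

def apply_review(code: str, review: dict) -> str:
    # Placeholder: in the demo this would ask the model to revise the code
    # according to the review comments (possibly via tool calls).
    return code

def review_loop(code: str) -> str:
    # Cap the number of review/revise cycles so the loop always terminates.
    for _ in range(CODE_REVIEW_CYCLES):
        review = request_review(code)
        if review["approved"]:
            break
        code = apply_review(code, review)
    return code
```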

@aidando73 force-pushed the aidand-llama-code-review-demo branch from e91f058 to 08e30ee on December 22, 2024 07:55
### Running the demo

We'll be using the fireworks llama-stack distribution to run this example, but you can use most other llama-stack distributions (instructions [here](https://llama-stack.readthedocs.io/en/latest/distributions/index.html)).
(Note that not all distributions support structured outputs yet, e.g., Ollama.)
@aidando73 (Contributor, Author) commented:

PR incoming for Ollama though: meta-llama/llama-stack#680
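
For context, a structured-output request through the llama-stack Python client looks roughly like the sketch below. The exact parameter names (`model_id`, the `response_format` wrapper, the base URL/port, and the response shape) are assumptions that vary between llama-stack releases, so check the docs for your distribution.

```python
# Rough sketch of a structured-output request against a running llama-stack
# distribution (e.g., the fireworks one). Parameter names below are assumptions
# and may differ in your installed llama-stack client version.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:5000")  # assumed local server address

# Illustrative JSON schema the model's reply should conform to.
review_schema = {
    "type": "object",
    "properties": {
        "approved": {"type": "boolean"},
        "comments": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["approved", "comments"],
}

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Review this diff: ..."}],
    response_format={"type": "json_schema", "json_schema": review_schema},  # assumed shape
)
print(response)
```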
