Messed up Python indentation on Mixtral; is it an issue with the model or OobaTGWUI? Or do I just have some setting wrong? #5808
Unanswered · TiagoTiago asked this question in Q&A
Often I get results that seem to use 3 spaces instead of the more conventional 4, occasionally a mix of 3 and 4, and a few times the indentation is flattened out entirely. Looking at the console, it doesn't appear to be markdown formatting mangling the output; the raw text shows the same number of spaces.
I haven't noticed this issue with any other models that are at least somewhat competent at writing Python code, but I don't think I've tried other MoE models yet.
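To confirm what the model is actually emitting, independent of how the UI renders it, a quick check like the one below can tally the leading-space counts in the raw output. This is just a diagnostic sketch (the `indent_widths` helper is hypothetical, not part of text-generation-webui): if it reports widths like `{0, 3, 6}`, the model itself is producing 3-space indentation.

```python
def indent_widths(code: str) -> set[int]:
    """Return the set of leading-space counts across non-blank lines."""
    widths = set()
    for line in code.splitlines():
        stripped = line.lstrip(" ")
        if stripped:  # skip blank lines, which carry no indentation signal
            widths.add(len(line) - len(stripped))
    return widths

# Example: a snippet indented with 3 spaces, as described in the report.
sample = "def f():\n   x = 1\n   return x\n"
print(indent_widths(sample))  # → {0, 3}
```

Running this on text copied straight from the console (rather than the rendered chat window) would help separate a model-side tokenization/sampling issue from a UI rendering issue.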