Hello, authors.
Where can the ChatLaw-MoE model weights be downloaded? The Hugging Face link you provide does not contain the model weight files.
Alternatively, could you release the training dataset so that the model training can be reproduced?
Without either of these, the evaluation results in your paper cannot be reproduced, and I have to remain skeptical of the reported experimental results.
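For reference, this is a minimal sketch of how one can verify what a Hub repo actually contains, using the `huggingface_hub` client; the repo id below is a placeholder, not the actual link from your README:

```python
# Minimal sketch: list a Hub repo's files and check for weight files.
# Assumes huggingface_hub is installed; REPO_ID is a placeholder,
# not the real ChatLaw-MoE repo id.
from huggingface_hub import list_repo_files

REPO_ID = "some-org/ChatLaw-MoE"  # placeholder repo id
WEIGHT_SUFFIXES = (".bin", ".safetensors", ".pt", ".ckpt")

files = list_repo_files(REPO_ID)
weights = [f for f in files if f.endswith(WEIGHT_SUFFIXES)]

print(f"{len(files)} files in repo; weight files: {weights or 'none found'}")
```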
Just ran the 13B model and the output was all gibberish.
The paper claims a 4×7B MoE can beat GPT-4 on legal capability, which I honestly doubt... With neither the data nor the MoE weights available, there is no way to reproduce it.