
Model exported with export_model cannot be used directly in cpp_infer #12

Open
hanliangwei opened this issue Jan 4, 2024 · 0 comments

Comments

@hanliangwei

[screenshot: download links for the inference model and the training model]

I built cpp_infer with CMake on Windows and downloaded the inference model shown in the image above; that works correctly.
I then downloaded the training model shown in the image above and exported it directly with:
python tools/export_model.py -c configs/rec/PP-OCRv3/ch_PP-OCRv3_rec_distillation.yml -o Global.pretrained_model=./ckpt/best_accuracy Global.save_inference_dir=./inference_model/ch_PP-OCRv3_rec/
The exported model runs fine for inference in Python, but in C++ inference it crashes, inside the recognition interface of paddle_inference.dll.
The .pdmodel file is also noticeably small. How should I deal with this?
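As a quick sanity check before feeding the exported directory to the C++ predictor, one could list the exported Paddle inference artifacts and their sizes; a distillation config may export multiple sub-model directories (e.g. a Student branch), and a suspiciously small .pdmodel can mean the C++ side is pointed at the wrong files. This is a minimal sketch, not part of PaddleOCR; the directory path is taken from the export command above, and the file layout is an assumption:

```python
import os

def list_inference_files(inference_dir):
    """Walk an exported inference directory and return the relative paths
    and byte sizes of Paddle inference artifacts (.pdmodel / .pdiparams)."""
    results = {}
    for root, _dirs, files in os.walk(inference_dir):
        for name in files:
            if name.endswith((".pdmodel", ".pdiparams")):
                path = os.path.join(root, name)
                results[os.path.relpath(path, inference_dir)] = os.path.getsize(path)
    return results

if __name__ == "__main__":
    # Hypothetical: path matches Global.save_inference_dir from the export command.
    for rel_path, size in sorted(list_inference_files("./inference_model/ch_PP-OCRv3_rec/").items()):
        print(f"{rel_path}: {size} bytes")
```

If the listing shows nested sub-directories rather than a single inference.pdmodel/inference.pdiparams pair at the top level, the C++ predictor likely needs to be pointed at the specific sub-model directory.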
