
How to deploy mistralrs on Android for large model inference? #779

Open
sopaco opened this issue Sep 17, 2024 · 1 comment
Labels
new feature New feature or request

Comments


sopaco commented Sep 17, 2024

How to deploy mistralrs on Android for large model inference?

@sopaco sopaco added the new feature New feature or request label Sep 17, 2024
EricLBuehler (Owner) commented

@sopaco sorry for the delay! I have not investigated this yet, but I think the way to do it would be to broadly follow the installation steps inside an app like Termux, and then try to run a model. Perhaps something small like Llama 3.2 1B would work!
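For anyone attempting this, a rough sketch of those steps inside Termux might look like the following. This is an untested assumption, not a verified procedure: the package names come from the standard Termux repository, and the final run command is illustrative, so check the mistral.rs README for the current binary name and flags.

```shell
# Inside Termux on Android (assumed environment; adjust for your device).
# Install a Rust toolchain and git from the Termux package repository.
pkg update
pkg install -y rust git

# Fetch and build mistral.rs from source. On-device compilation can take
# a long time and needs several GB of free storage.
git clone https://github.com/EricLBuehler/mistral.rs.git
cd mistral.rs
cargo build --release

# Run the server with a small model such as Llama 3.2 1B.
# (Exact subcommands/flags may differ; see the mistral.rs README.)
./target/release/mistralrs-server -i plain -m meta-llama/Llama-3.2-1B-Instruct
```

Cross-compiling from a desktop with a tool like `cargo-ndk` and copying the binary over may be faster than building on the phone itself.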
