I'm trying to keep the Wllama instance alive (not setting it to null) when it's not needed, or when I want to load another model.
If a model is already loaded, though, I run into the error above.
I've tried unloading the existing model first, but that doesn't seem to cut it:
```js
if (typeof window.llama_cpp_app.loadModelFromUrl != 'undefined') {
  if (typeof window.llama_cpp_app.isModelLoaded != 'undefined') {
    let a_model_is_loaded = await window.llama_cpp_app.isModelLoaded();
    console.warn("WLLAMA: need to unload a model first?: ", a_model_is_loaded);
    if (a_model_is_loaded && typeof window.llama_cpp_app.unloadModel != 'undefined') {
      console.log("wllama: unloading old loaded model first");
      await window.llama_cpp_app.unloadModel();
    }
  }
  await window.llama_cpp_app.loadModelFromUrl(model_url, model_settings);
}
```
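For what it's worth, a workaround that sidesteps the unload-then-reload path entirely is to discard the instance and construct a fresh one per model load. This is a minimal sketch under a few assumptions: the `@wllama/wllama` ESM package, its `Wllama(CONFIG_PATHS)` constructor, and an `exit()` method that frees the wasm resources (check your wllama version's README for the exact config-path keys). `swapModel`, `model_url`, and `model_settings` are hypothetical names, not part of the wllama API.

```js
import { Wllama } from '@wllama/wllama';

// Paths to the wllama wasm builds; adjust to wherever the assets are served
// from. Key names here are an assumption -- verify against your version's README.
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': '/esm/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/esm/multi-thread/wllama.wasm',
};

// Hypothetical helper: tear down the old instance, then load the new model
// on a fresh one instead of reusing the existing instance.
async function swapModel(oldInstance, model_url, model_settings) {
  if (oldInstance) {
    // exit() is assumed to free the wasm module and any loaded model.
    await oldInstance.exit();
  }
  const wllama = new Wllama(CONFIG_PATHS);
  await wllama.loadModelFromUrl(model_url, model_settings);
  return wllama;
}
```

This trades a little startup cost per swap for a clean slate, so whatever state a failed unload leaves behind can't affect the next load.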