This repository has been archived by the owner on Sep 27, 2024. It is now read-only.

Issue with LM Studio when using 2 GPUs for Layer Offloading #63

Open
SalsaDura opened this issue Sep 24, 2024 · 0 comments

Comments

@SalsaDura

Hi everyone,

I’m encountering a problem with LM Studio when trying to use two GPUs for partial layer offloading. Everything works perfectly when I use just one GPU (RTX 4070 Ti Super), but as soon as I add a second one (RTX 3080), I get an error: "(Exit code: 0). Some model operation failed. Try a different model and/or config."

I’ve checked my gpu-preferences.json file and set gpuType to nvidia, and I’m using the latest version of LM Studio (0.2.31). My GPU drivers are also up to date. Has anyone else faced this issue, or does anyone have suggestions on how to resolve it? Any help would be greatly appreciated!
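For reference, the relevant setting in my gpu-preferences.json looks roughly like this (I've omitted the other fields and am recalling the structure from memory, so treat it as a sketch rather than the exact file contents):

```json
{
  "gpuType": "nvidia"
}
```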

Thanks in advance!
