I upgraded from 0.2.31 to 0.3.2 and there are several issues that make it feel like a downgrade.
The garbage-can icon for deleting chats should appear on hover like before, with a confirmation prompt. Having to click the three dots and then Delete every time is unintuitive, and it caused me to delete my whole Migrated chats folder when I intended to delete only the latest chat above it.
I recovered the files with an undelete app, but they're either corrupted or I'm doing something wrong, because they all appear as Unknown chats when I launch LM Studio. Those are the .json files in the conversation folder. In the chats folder, the contents seem intact when I open them in a text editor, but copying these into the conversation folder also did nothing.
How do I restore these to how they were before my mistake?
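In case it helps with triage, here is a minimal sketch of how the recovered files could be checked before copying them back. The folder path and the idea that "Unknown chats" means unparseable JSON are my assumptions, not confirmed LM Studio behavior:

```python
import json
from pathlib import Path

def check_recovered_chats(folder: Path) -> dict:
    """Return {filename: True/False} for whether each recovered .json
    file still parses as valid JSON.

    Files truncated or zero-filled by the undelete tool will fail here,
    which might explain them showing up as "Unknown" chats (assumption).
    """
    results = {}
    for f in sorted(folder.glob("*.json")):
        try:
            json.loads(f.read_text(encoding="utf-8"))
            results[f.name] = True
        except (json.JSONDecodeError, UnicodeDecodeError):
            results[f.name] = False
    return results

# Hypothetical path -- adjust to wherever your recovered files live.
print(check_recovered_chats(Path("recovered/conversations")))
```

If the recovered files fail this check, the undelete was incomplete and no amount of copying will restore them.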
It no longer shows the token count while I type or paste into the chat. Please bring this back; it's crucial for knowing how close I am to the context window when I paste an entire novel/post/document. Yes, it shows the token count AFTER each prompt, but I need to know how many tokens the initial paste will be beforehand.
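As a stopgap while the live counter is gone, a rough estimate can be made with the common ~4-characters-per-token rule of thumb for English text. This is only a heuristic of mine, not the model's actual tokenizer, and it can be noticeably off for code or non-English text:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters-per-token rule of
    thumb for English text. Approximation only -- the model's real
    tokenizer may differ, especially for code or non-English input."""
    return max(1, len(text) // 4)

# A 4000-character paste is very roughly 1000 tokens.
print(estimate_tokens("a" * 4000))
```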
Why does it only let me select half the threads my hardware is capable of using? I have a 6-core CPU with hyperthreading, and the settings now cap the selection at 3 when I could select 6 before. Performance hasn't taken a huge hit, but I notice increased lag between responses.
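For reference, the OS reports logical threads rather than physical cores, so a quick check like the sketch below shows what the hardware actually exposes. The halving assumes 2-way SMT/hyperthreading, which is my assumption and not true of every CPU:

```python
import os

def report_threads() -> tuple:
    """Return (logical_threads, assumed_physical_cores).

    logical_threads is what the OS exposes (12 on a 6-core CPU with
    hyperthreading). The physical count is derived by halving, which
    assumes 2-way SMT -- an assumption, not a guarantee.
    """
    logical = os.cpu_count() or 1
    assumed_physical = max(1, logical // 2)  # assumes 2-way hyperthreading
    return logical, assumed_physical

print(report_threads())
```

If the new slider caps at 3 on a machine reporting 12 logical threads, it is counting neither logical threads nor physical cores, which does look like a regression.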
Why does the app keep processing after it finishes a response to my prompt? When I press Send after typing my reply, it says I have to wait for it to finish, even though it already has. I can press "Stop generating" and then send, but this seems to cause a longer delay before the LLM's next response. The previous version of LM Studio I used did not do this.
Why can we no longer save our settings as a profile? It's annoying having to re-enter the system prompt for every single new chat I start.